# Thread: MATH2601 Higher Linear Algebra

1. ## Re: MATH2601 Linear Algebra/Group Theory Questions

So that part makes sense now.

How do we complete the proof if |H| = 1?

2. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
So that part makes sense now.

How do we complete the proof if |H| = 1?
That can only happen if a = e (the identity). Take a to be any other element in G (there must be at least one other element since G has prime order, which implies |G| is at least 2), and the result will follow (since |H| won't be able to be 1).
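Not from the original post, but a quick concrete illustration of why this works: in a group of prime order, any non-identity element generates the whole group. The snippet below (a hypothetical example using the additive group Z_5) checks this directly; the function name `cyclic_subgroup` is my own.

```python
# Sketch: in a group of prime order p, any non-identity element a
# generates the whole group, so |<a>| = p (never 1 unless a = e).
# Hypothetical example using the additive group Z_5 with addition mod 5.

def cyclic_subgroup(a, p):
    """Return the subgroup <a> generated by a in (Z_p, +)."""
    elem, seen = 0, set()
    while elem not in seen:
        seen.add(elem)
        elem = (elem + a) % p
    return seen

p = 5
print(len(cyclic_subgroup(0, p)))  # identity alone generates {0}, so |H| = 1
for a in range(1, p):              # every non-identity element
    print(a, sorted(cyclic_subgroup(a, p)))  # each generates all of Z_5
```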

3. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$\text{Let }V\text{ be a finite-dimensional vector space and }T:V\to V\text{ be a linear transformation}\\ \text{Are the following true or false}$

The questions were

$\\\text{a) If }<\textbf{u}\mid T(\textbf{v})>\, =0\text{ for all }\textbf{u},\textbf{v}\in V,\text{ then }T(\textbf{v})=\textbf{0}\forall \textbf{v}\in V\\ \text{b) If }<\textbf{v}\mid T(\textbf{v})>\, =0\text{ for all }\textbf{v}\in V,\text{ then }T(\textbf{v})=\textbf{0}\forall \textbf{v}\in V$

For the first one I claimed it was true by using a uniqueness result
$<\textbf{x}\mid \textbf{y}>\, = \, <\textbf{x}\mid \textbf{z}> \, \implies \textbf{y}=\textbf{z}$

and by pairing T(v) with 0. But I can't figure out why this argument does not work for the second one?

4. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Why do you think that "uniqueness result" is true? Add anything orthogonal to x to y and you won't change the inner product of x with it.

(Also the truth of b) is (perhaps surprisingly) dependent on the field your vector space is over.)

5. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
$\text{Let }V\text{ be a finite-dimensional vector space and }T:V\to V\text{ be a linear transformation}\\ \text{Are the following true or false}$

The questions were

$\\\text{a) If }<\textbf{u}\mid T(\textbf{v})>\, =0\text{ for all }\textbf{u},\textbf{v}\in V,\text{ then }T(\textbf{v})=\textbf{0}\forall \textbf{v}\in V\\ \text{b) If }<\textbf{v}\mid T(\textbf{v})>\, =0\text{ for all }\textbf{v}\in V,\text{ then }T(\textbf{v})=\textbf{0}\forall \textbf{v}\in V$

For the first one I claimed it was true by using a uniqueness result
$<\textbf{x}\mid \textbf{y}>\, = \, <\textbf{x}\mid \textbf{z}> \, \implies \textbf{y}=\textbf{z}$

and by pairing T(v) with 0. But I can't figure out why this argument does not work for the second one?
For a), we can do it like this: since T is from V -> V and < u, T(v) > = 0 for all u, v in V, for each v, just take u = T(v), and the result will follow.
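To see concretely why b) can fail over the reals (my own sketch, not from the posts above): rotation by 90 degrees in R^2 satisfies < v | T(v) > = 0 for every v, yet T is not the zero map.

```python
import numpy as np

# Rotation by 90 degrees counter-clockwise in R^2.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# <v | Rv> = 0 for every v, even though R is not the zero map:
# Rv = (-v2, v1), so v . Rv = -v1*v2 + v2*v1 = 0.
rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(2)
    print(np.dot(v, R @ v))  # 0.0 every time
```

Over C, by contrast, < v | T(v) > = 0 for all v does force T = 0, which is the field dependence mentioned earlier in the thread.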

6. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Is there an intuitive explanation for this?

Let T be a linear map on a finite-dimensional inner product space V

Then T is an isometry iff T is unitary

7. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$V=\{p \in \mathbb{P}_3 \mid p(2) = 0\}\text{ (assumed to be a vector space)}$

$\text{RTP: }B=\{(t-2),(t-2)^2,(t-2)^3\}\text{ is a basis for }V$

So the question is obviously easy first-year stuff. I'd prove linear independence and then use dim(V) = |B| = 3 to deduce that it's a basis.

However, for the linear independence step

$c_1(t-2)+c_2(t-2)^2+c_3(t-2)^3 = 0\, \forall t$

I just want a validity check because I'm having second thoughts, mostly because the solutions made no remark on this.
I differentiated w.r.t. t and then subbed in t=2 to prove c1 = 0. (And then repeated this to show c2 = c3 = 0.) Is this ok?

8. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
$V=\{p \in \mathbb{P}_3 \mid p(2) = 0\}\text{ (assumed to be a vector space)}$

$\text{RTP: }B=\{(t-2),(t-2)^2,(t-2)^3\}\text{ is a basis for }V$

So the question is obviously easy first-year stuff. I'd prove linear independence and then use dim(V) = |B| = 3 to deduce that it's a basis.

However, for the linear independence step

$c_1(t-2)+c_2(t-2)^2+c_3(t-2)^3 = 0\, \forall t$

I just want a validity check because I'm having second thoughts, mostly because the solutions made no remark on this.
I differentiated w.r.t. t and then subbed in t=2 to prove c1 = 0. (And then repeated this to show c2 = c3 = 0.) Is this ok?
Yeah, that's OK for showing linear independence.
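An alternative numerical sanity check (my own sketch, not the differentiation argument above): write each (t-2)^k in the standard basis {1, t, t^2, t^3} and check that the coefficient matrix has full row rank.

```python
import numpy as np

# Coefficients of (t-2), (t-2)^2, (t-2)^3 in the basis {1, t, t^2, t^3}.
coeffs = np.array([
    [-2,  1,  0, 0],   # (t-2)   = -2 + t
    [ 4, -4,  1, 0],   # (t-2)^2 =  4 - 4t + t^2
    [-8, 12, -6, 1],   # (t-2)^3 = -8 + 12t - 6t^2 + t^3
])

rank = np.linalg.matrix_rank(coeffs)
print(rank)  # 3: full row rank, so the three polynomials are independent
```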

9. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
Is there an intuitive explanation for this?

Let T be a linear map on a finite-dimensional inner product space V

Then T is an isometry iff T is unitary
In an inner product space, the norm is defined in terms of the inner product. This means that any operator that preserves the inner product will preserve the norm.

Less obvious is the fact that in an inner product space, the inner product can be written in terms of the induced norm (*). Consequently anything that preserves the norm will preserve the inner product.

Of course, you don't need to prove (*) in order to answer this particular question, but it kind of hits at the heart of the relationship between inner products and induced norms on a real/complex inner product space and is a good exercise.

Note also that finite dimensionality is not required in any of these arguments.
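For reference, the identity (*) is the polarization identity. A sketch of how it reads in the real and complex cases (using the complex convention of linearity in the first argument):

```latex
% Real inner product space:
\langle \mathbf{u} \mid \mathbf{v} \rangle
  = \tfrac{1}{4}\left( \|\mathbf{u}+\mathbf{v}\|^{2} - \|\mathbf{u}-\mathbf{v}\|^{2} \right)

% Complex inner product space (linear in the first argument):
\langle \mathbf{u} \mid \mathbf{v} \rangle
  = \tfrac{1}{4}\sum_{k=0}^{3} i^{k}\, \|\mathbf{u} + i^{k}\mathbf{v}\|^{2}
```

Expanding each norm as an inner product and cancelling terms is the "good exercise" referred to above.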

10. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$T:\mathbb{P}_2 \to \mathbb{R}^3, \, T(p) = (p(a),p^\prime(b),p(c))$

$\text{Found in b): w.r.t. the standard bases, the matrix of T is}\\ A=\begin{pmatrix}1&a&a^2\\ 0&1&2b\\ 1&c&c^2\end{pmatrix}$

$\text{Deduced in c): Using the fact that }T\text{ is invertible iff }A\text{ is invertible}\\ T\text{ is invertible provided }a\neq c\text{ and }a+c\neq 2b$

The question is an extension of c). How can I explain this answer using first-year calculus?

11. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
$T:\mathbb{P}_2 \to \mathbb{R}^3, \, T(p) = (p(a),p^\prime(b),p(c))$

$\text{Found in b): w.r.t. the standard bases, the matrix of T is}\\ A=\begin{pmatrix}1&a&a^2\\ 0&1&2b\\ 1&c&c^2\end{pmatrix}$

$\text{Deduced in c): Using the fact that }T\text{ is invertible iff }A\text{ is invertible}\\ T\text{ is invertible provided }a\neq c\text{ and }a+c\neq 2b$

The question is an extension of c). How can I explain this answer using first-year calculus?
$\noindent It is clear that a necessary condition for T to be invertible is that a\neq c (otherwise the first and last entries of T(p) are always the same, so T cannot be onto). Note also that with a\neq c, we can't have b = \frac{a+c}{2}, because if this is the case, since there exists more than one polynomial with roots at a and c, and these polynomials also have p'(b) = 0 (essentially since the vertex of the parabola is at b), T will map more than one polynomial to (0,0,0)^{\top}, and hence won't be one-to-one.$

$\noindent To see that these conditions are also sufficient for T to be invertible, note that if a\neq c and a + c \neq 2b, then T is one-to-one. For if p(a) = q(a), p'(b) = q'(b) and p(c) = q(c), the polynomial r(t) := p(t) - q(t) has roots at a and c and has slope 0 at a point \emph{that is not the midpoint of these roots} (b), since b \neq \frac{a+c}{2}. Hence the quadratic r must be the zero polynomial, i.e. p=q, so T is one-to-one.$

$\noindent See if you can see why T also will be onto (or if you're willing to use some linear algebra at this point, you could say it's a one-to-one linear map between two equi-dimensional spaces, so is automatically onto as well).$
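A quick numerical check of the invertibility condition (my own sketch, not part of the thread): the determinant of the matrix A from b) factors as (c - a)(a + c - 2b), which vanishes exactly when a = c or a + c = 2b.

```python
import numpy as np

def A(a, b, c):
    # Matrix of T with respect to the standard bases, as found in b).
    return np.array([[1.0, a,   a**2],
                     [0.0, 1.0, 2*b ],
                     [1.0, c,   c**2]])

# det A = (c - a)(a + c - 2b): check at random parameter values.
rng = np.random.default_rng(1)
for _ in range(5):
    a, b, c = rng.standard_normal(3)
    print(np.isclose(np.linalg.det(A(a, b, c)), (c - a) * (a + c - 2*b)))

# A degenerate case: a + c = 2b makes A singular.
print(np.isclose(np.linalg.det(A(0.0, 1.0, 2.0)), 0.0))  # True
```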

12. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$\text{Let }T:\mathbb{R}^2\to \mathbb{R}^3\text{ be linear and suppose }\\T(\textbf{v})=(1,2,3)\text{ and }T(\textbf{w})=(2,4,6)$

$\text{Must }\textbf{v}\text{ and }\textbf{w}\text{ be linearly dependent?}$

13. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
$\text{Let }T:\mathbb{R}^2\to \mathbb{R}^3\text{ be linear and suppose }\\T(\textbf{v})=(1,2,3)\text{ and }T(\textbf{w})=(2,4,6)$

$\text{Must }\textbf{v}\text{ and }\textbf{w}\text{ be linearly dependent?}$
No.

14. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by InteGrand
No.
Okay good I agree. But what would be an easy method to generate a counterexample?

15. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
Okay good I agree. But what would be an easy method to generate a counterexample?
$\noindent Take a map whose image is just \mathrm{span}\left\{(1,2,3)^{\top}\right\}. Then T would map all non-zero vectors to multiples of (1,2,3)^{\top}, so wouldn't imply that those \mathbf{v} and \mathbf{w} are dependent. For example, take T to be defined by T(\mathbf{x}) = A\mathbf{x}, where A = \begin{bmatrix}1 & 2\\ 2 & 4 \\ 3 & 6\end{bmatrix}. Then take \mathbf{v} = \mathbf{e}_{1} and \mathbf{w} = \mathbf{e}_{2}.$
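The suggested counterexample can be checked directly (a minimal numpy sketch of the map T(x) = Ax described above):

```python
import numpy as np

# Rank-1 matrix whose image is span{(1, 2, 3)^T}.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

v = np.array([1.0, 0.0])  # e_1
w = np.array([0.0, 1.0])  # e_2

print(A @ v)  # [1. 2. 3.]
print(A @ w)  # [2. 4. 6.]

# v and w are linearly independent even though T(w) = 2 T(v).
print(np.linalg.matrix_rank(np.column_stack([v, w])))  # 2
```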

16. ## Re: MATH2601 Linear Algebra/Group Theory Questions

I don't know much about Graph Theory and Group Theory but are they two different topics? Or Different names but same subjects?

I can't be bothered Googling it

17. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by davidgoes4wce
I don't know much about Graph Theory and Group Theory but are they two different topics? Or Different names but same subjects?

I can't be bothered Googling it
Two different topics.

18. ## Re: MATH2601 Linear Algebra/Group Theory Questions

I was wondering if given the trace and the determinant of a matrix could you write down a unique matrix satisfying these conditions, or a simple formula for the family of matrices satisfying it?

Mostly asking for the 2x2 case

19. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
I was wondering if given the trace and the determinant of a matrix could you write down a unique matrix satisfying these conditions, or a simple formula for the family of matrices satisfying it?

Mostly asking for the 2x2 case
No, the values of the trace and determinant of a real or complex matrix do not uniquely specify the matrix.

$\noindent For 2\times 2, let A = \begin{bmatrix}a & b \\ c & d\end{bmatrix} and suppose the trace is given to be \alpha and the determinant \beta. This is equivalent to a + d = \alpha and ad - bc = \beta. So these are the conditions that make A have given trace \alpha and determinant \beta.$

(Note that for a 2x2 complex matrix, the trace and determinant will uniquely specify the eigenvalues of the matrix though.)
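A concrete illustration (my own example, not from the post above): these two different 2x2 matrices share trace 2 and determinant 1, and, as noted, they therefore also share eigenvalues.

```python
import numpy as np

# Two different 2x2 matrices with the same trace (2) and determinant (1).
M1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])
M2 = np.array([[1.0, 1.0],
               [0.0, 1.0]])

print(np.trace(M1), np.linalg.det(M1))  # 2.0 1.0
print(np.trace(M2), np.linalg.det(M2))  # 2.0 1.0
print(np.array_equal(M1, M2))           # False

# Both have characteristic polynomial t^2 - 2t + 1 = (t - 1)^2,
# so both have eigenvalue 1 (repeated) -- same spectrum, different matrices.
print(np.linalg.eigvals(M1), np.linalg.eigvals(M2))
```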

20. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$A=\begin{bmatrix}3&1\\-2&0\end{bmatrix}\\ \text{Find all diagonalisable matrices }B\text{ such that }B^2=A$

I then realised that B may share the same eigenvectors as A, and have eigenvalues equal to the square root of those of A. But I'm not sure where to proceed from there.

21. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan

$A=\begin{bmatrix}3&1\\-2&0\end{bmatrix}\\ \text{Find all diagonalisable matrices }B\text{ such that }B^2=A$

I then realised that B may share the same eigenvectors as A, and have eigenvalues equal to the square root of those of A. But I'm not sure where to proceed from there.
$\noindent Here's some hints. If B is a diagonalisable matrix such that B^{2} = A, then write B = PDP^{-1} (so P is an invertible matrix whose columns are eigenvectors of B, with corresponding eigenvalues in the diagonal entries of the diagonal matrix D). Then B^{2} = PD^{2}P^{-1} = A. So the eigenvalues of A are the squares of those of B and the corresponding eigenvectors are the eigenvectors from B.$
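Following those hints numerically (my own sketch): A has eigenvalues 1 and 2 with eigenvectors (1, -2) and (1, -1), so each of the four sign choices for the square roots of the eigenvalues yields a diagonalisable B with B^2 = A.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [-2.0, 0.0]])

# Diagonalise A: eigenvalues 1 and 2, eigenvectors (1,-2) and (1,-1).
P = np.array([[1.0, 1.0],
              [-2.0, -1.0]])
D = np.diag([1.0, 2.0])
print(np.allclose(P @ D @ np.linalg.inv(P), A))  # True

# Each sign choice for the square roots of the eigenvalues gives one B.
for s1 in (1.0, -1.0):
    for s2 in (1.0, -1.0):
        B = P @ np.diag([s1, s2 * np.sqrt(2.0)]) @ np.linalg.inv(P)
        print(np.allclose(B @ B, A))  # True for all four choices
```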

22. ## Re: MATH2601 Linear Algebra/Group Theory Questions

$\\\text{Let }T:V\to V\text{ be linear, }\\W_1\text{ a subspace of }V\\ W_2\text{ a subspace of }W_1$

$\text{Suppose }W_1\text{ is invariant under }T.\\ \text{Must }W_2\text{ be invariant under }T\text{ and why/why not?}$

23. ## Re: MATH2601 Linear Algebra/Group Theory Questions

Originally Posted by leehuan
$\\\text{Let }T:V\to V\text{ be linear, }\\W_1\text{ a subspace of }V\\ W_2\text{ a subspace of }W_1$

$\text{Suppose }W_1\text{ is invariant under }T.\\ \text{Must }W_2\text{ be invariant under }T\text{ and why/why not?}$
No, it need not be. Say V = R^2, W1 = R^2 (= V), and let W2 be the line (t, 0) (the x-axis). Take T to be the rotation by 90 degrees counter-clockwise about the origin. Then T is a linear map from V to V, so T(W1) is a subspace of W1 = V = R^2, and W2 is a subspace of W1 (which is a subspace of V), but clearly W2 is not invariant under T (e.g. the point (1, 0) in W2 does not get mapped to a point in W2 by T; it gets mapped to (0, 1)).
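The rotation example above, checked numerically (a minimal sketch):

```python
import numpy as np

# Rotation by 90 degrees counter-clockwise about the origin.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

x = np.array([1.0, 0.0])  # a point of W2 (the x-axis)
print(T @ x)  # [0. 1.] -- not on the x-axis, so W2 is not invariant under T
```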

24. ## Re: MATH2601 Higher Linear Algebra

$V\text{ is a finite dimensional V.S. over }\mathbb{C}\text{ and }T:V\to V\text{ is linear}$

$\\\text{Suppose }T^2 = T = T^*\text{ (i.e. idempotent and self-adjoint)}\\ \text{Prove that there exists a subspace }W\text{ of }V\text{ such that }\\ T(\textbf{v}) = \text{proj}_{W}\textbf{v}$

My approach thus far: Write $\textbf{v} = \textbf{x}+\textbf{y}, \quad \textbf{x}\in W\text{ and }\textbf{y} \in W^\perp$

$<\textbf{x} \mid \textbf{y}>\, = 0 \implies \textbf{x} \perp \textbf{v} - \textbf{x}$

Is this a dead end? Because I don't see how I can use what I know about T here

25. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
$V\text{ is a finite dimensional V.S. over }\mathbb{C}\text{ and }T:V\to V\text{ is linear}$

$\\\text{Suppose }T^2 = T = T^*\text{ (i.e. idempotent and self-adjoint)}\\ \text{Prove that there exists a subspace }W\text{ of }V\text{ such that }\\ T(\textbf{v}) = \text{proj}_{W}\textbf{v}$

My approach thus far: Write $\textbf{v} = \textbf{x}+\textbf{y}, \quad \textbf{x}\in W\text{ and }\textbf{y} \in W^\perp$

$<\textbf{x} \mid \textbf{y}>\, = 0 \implies \textbf{x} \perp \textbf{v} - \textbf{x}$

Is this a dead end? Because I don't see how I can use what I know about T here
Claim: W := im(T) is such a subspace.

Proof: Exercise.
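A numerical sanity check of the claim (my own hypothetical example, not the exercise itself): an orthogonal projection matrix satisfies P^2 = P = P*, and Pv is exactly the projection of v onto W = im(P).

```python
import numpy as np

# Orthogonal projection onto W = span{(1, 1)} in R^2.
w = np.array([1.0, 1.0]) / np.sqrt(2.0)
P = np.outer(w, w)  # P = w w^T

print(np.allclose(P @ P, P))  # idempotent: P^2 = P
print(np.allclose(P.T, P))    # self-adjoint (real case): P* = P

# P v is the orthogonal projection of v onto W = im(P):
# proj_W (3, 1) = <v, w> w = (2, 2).
v = np.array([3.0, 1.0])
print(P @ v)  # [2. 2.]
```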
