# Thread: MATH2601 Higher Linear Algebra

1. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by InteGrand
Claim: W := im(T) is such a subspace.

Proof: Exercise.
Where does the inspiration come from that it just happens to be the image that satisfies this criterion? o.O

2. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
Where does the inspiration come from that it just happens to be the image that satisfies this criterion? o.O
Well, for one thing, if $T(\mathbf{v}) = \mathrm{proj}_{W}(\mathbf{v})$ for all $\mathbf{v} \in V$, we need $T(\mathbf{v})\in W$ for all $\mathbf{v} \in V$ (because by definition $\mathrm{proj}_{W}(\mathbf{v})\in W$). Now, as we know, a subspace of $V$ that contains $T(\mathbf{v})$ for all $\mathbf{v} \in V$ is $\mathrm{im}(T)$ (in fact, any subspace with this property must contain the image, i.e. the image of $T$ is the "smallest" subspace with this property). So it would make sense to try the image of $T$. (And if there were any other subspace $W$ that worked, then since that $W$ would have to contain $\mathrm{im}(T)$, it would have to be the case that $\mathrm{im}(T)$ works too. So just try $\mathrm{im}(T)$.)

3. ## Re: MATH2601 Higher Linear Algebra

$A=\begin{pmatrix}3 &-1&2\\ -1&3&2\\ 2&2&0\end{pmatrix}$

$\text{Found in part iv): }A=QDQ^T\text{ where }\\ Q=\begin{pmatrix}\frac{1}{\sqrt3} & \frac{1}{\sqrt2} & -\frac{1}{\sqrt5}\\ \frac{1}{\sqrt3} & -\frac{1}{\sqrt2} & -\frac{1}{\sqrt 6}\\ \frac{1}{\sqrt 3}& 0 & \frac{2}{\sqrt6}\end{pmatrix}\\ D = \begin{pmatrix}4 &0&0\\ 0&4&0\\ 0&0&-2\end{pmatrix}$

$\text{v) Write down an expression for a matrix }B\text{ such that }B^2=A$

4. ## Re: MATH2601 Higher Linear Algebra

$(Q \sqrt{D} Q^T)(Q \sqrt{D} Q^T) = Q \sqrt{D} (Q^T Q) \sqrt{D} Q^T = Q D Q^T = A$

5. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
$A=\begin{pmatrix}3 &-1&2\\ -1&3&2\\ 2&2&0\end{pmatrix}$

$\text{Found in part iv): }A=QDQ^T\text{ where }\\ Q=\begin{pmatrix}\frac{1}{\sqrt3} & \frac{1}{\sqrt2} & -\frac{1}{\sqrt5}\\ \frac{1}{\sqrt3} & -\frac{1}{\sqrt2} & -\frac{1}{\sqrt 6}\\ \frac{1}{\sqrt 3}& 0 & \frac{2}{\sqrt6}\end{pmatrix}\\ D = \begin{pmatrix}4 &0&0\\ 0&4&0\\ 0&0&-2\end{pmatrix}$

$\text{v) Write down an expression for a matrix }B\text{ such that }B^2=A$
If $A = PDP^{-1}$ in general ($D$ diagonal), then defining $B = PD^{\frac{1}{2}}P^{-1}$, we have $B^{2} = A$. Here $D^{\frac{1}{2}}$ is a square root of $D$: a diagonal matrix whose diagonal entries are square roots of the corresponding entries of $D$. In your example, you can do this if you are willing to accept complex entries (a $\sqrt{2}i$, for example). You can find out more about square roots of matrices here:

https://en.wikipedia.org/wiki/Square_root_of_a_matrix
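As a numerical sanity check of this recipe (a sketch assuming NumPy; $Q$ below uses $-1/\sqrt{6}$ in the top-right entry, matching the typo fix mentioned later in the thread, and the complex square root $\sqrt{2}i$ for the eigenvalue $-2$):

```python
import numpy as np

A = np.array([[3., -1., 2.],
              [-1., 3., 2.],
              [2., 2., 0.]])

s3, s2, s6 = np.sqrt(3), np.sqrt(2), np.sqrt(6)
# Orthonormal eigenvectors of A (columns), for eigenvalues 4, 4, -2.
Q = np.array([[1/s3,  1/s2, -1/s6],
              [1/s3, -1/s2, -1/s6],
              [1/s3,  0.,    2/s6]])

# Square root of D = diag(4, 4, -2): take a square root of each
# diagonal entry, accepting the complex value sqrt(2)*i for -2.
sqrtD = np.diag([2., 2., np.sqrt(2) * 1j])

B = Q @ sqrtD @ Q.T            # B = Q D^{1/2} Q^T
assert np.allclose(Q.T @ Q, np.eye(3))   # Q really is orthogonal
assert np.allclose(B @ B, A)             # B^2 recovers A
```

Since $Q^TQ = I$, the two inner factors collapse and $B^2 = QDQ^T = A$, exactly as in the post above.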

6. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by InteGrand
If $A = PDP^{-1}$ in general ($D$ diagonal), then defining $B = PD^{\frac{1}{2}}P^{-1}$, we have $B^{2} = A$. Here $D^{\frac{1}{2}}$ is a square root of $D$: a diagonal matrix whose diagonal entries are square roots of the corresponding entries of $D$. In your example, you can do this if you are willing to accept complex entries (a $\sqrt{2}i$, for example). You can find out more about square roots of matrices here:

https://en.wikipedia.org/wiki/Square_root_of_a_matrix
Woah, how do you do it so fast?

7. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by boredofstudiesuser1
Woah, how do you do it so fast?
He's a machine!

8. ## Re: MATH2601 Higher Linear Algebra

I feel bad lol. I had the same idea as InteGrand; I just mucked up my MATLAB input when I went to check my answer
_______________

$\text{Is this identity true?}\\ \text{proj}_W\textbf{v} = \textbf{v} - \text{proj}_{W^\perp}\textbf{v}$

9. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
I feel bad lol. I had the same idea as InteGrand, I just mucked up my matlab input when I went to check my answer
_______________

$\text{Is this identity true?}\\ \text{proj}_W\textbf{v} = \textbf{v} - \text{proj}_{W^\perp}\textbf{v}$
Yes
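A small numerical illustration of the identity (a sketch assuming NumPy; the plane $W$ and the vector $\mathbf{v}$ here are arbitrary choices for illustration):

```python
import numpy as np

# W = column space of M (a plane in R^3); W-perp = span of the normal n.
M = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
n = np.cross(M[:, 0], M[:, 1])               # normal vector to the plane

P_W    = M @ np.linalg.inv(M.T @ M) @ M.T    # projection onto W
P_perp = np.outer(n, n) / (n @ n)            # projection onto W-perp

v = np.array([3., -1., 2.])
# proj_W v = v - proj_{W-perp} v, since P_W + P_perp = I on R^3.
assert np.allclose(P_W + P_perp, np.eye(3))
assert np.allclose(P_W @ v, v - P_perp @ v)
```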

10. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
I feel bad lol. I had the same idea as InteGrand, I just mucked up my matlab input when I went to check my answer
_______________

$\text{Is this identity true?}\\ \text{proj}_W\textbf{v} = \textbf{v} - \text{proj}_{W^\perp}\textbf{v}$
It's ok, we'll call you a machine too if it makes you feel better.

11. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
I feel bad lol. I had the same idea as InteGrand, I just mucked up my matlab input when I went to check my answer
I don't know if this was the reason why, but in the Q you typed above, there's a typo (top-right entry should have 6 rather than 5 in the square root).

12. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by InteGrand
I don't know if this was the reason why, but in the Q you typed above, there's a typo (top-right entry should have 6 rather than 5 in the square root).
Oops. Nah I think that was just a typo as I typed it on the forums

13. ## Re: MATH2601 Higher Linear Algebra

This one's a bit long...

$\\B\in M_{n,n}(\mathbb{C}) \text{ satisfies}\\ \text{nullity}(B^{m-1})< n\text{ and nullity}(B^m) = n\\ \text{for some integer }m$

$\text{Fix }\textbf{v}\in \mathbb{C}^n \backslash \ker(B^{m-1})$

Proven in i): 0 is the only eigenvalue of B (so B is nilpotent)

$\text{ii) By considering the Jordan form of }B\text{, or otherwise, prove that }\\\det (B+I)=1$

$\text{iii) Prove that for }k=1,\dots,m-1\\ B^k \textbf{v} \in \ker(B^{m-k})\backslash \ker(B^{m-k-1})$

14. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
This one's a bit long...

$\\B\in M_{n,n}(\mathbb{C}) \text{ satisfies}\\ \text{nullity}(B^{m-1})< n\text{ and nullity}(B^m) = n\\ \text{for some integer }m$

$\text{Fix }\textbf{v}\in \mathbb{C}^n \backslash \ker(B^{m-1})$

Proven in i): 0 is the only eigenvalue of B (so B is nilpotent)

$\text{ii) By considering the Jordan form of }B\text{, or otherwise, prove that }\\\det (B+I)=1$

$\text{iii) Prove that for }k=1,\dots,m-1\\ B^k \textbf{v} \in \ker(B^{m-k})\backslash \ker(B^{m-k-1})$
For (ii), let $B = PJP^{-1}$, where $J$ is the Jordan form of $B$; then use what you know about the eigenvalues of $B$ and the fact that $B + I = P(J+I)P^{-1}$ to deduce the result.

And I think (iii) is simpler than you (probably) think.
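The claim in (ii) can be checked numerically on a hypothetical nilpotent example (a sketch assuming NumPy; any strictly upper-triangular matrix has $0$ as its only eigenvalue):

```python
import numpy as np

# A strictly upper-triangular 4x4 matrix: nilpotent, only eigenvalue 0.
B = np.array([[0., 5., -2., 1.],
              [0., 0.,  3., 7.],
              [0., 0.,  0., 4.],
              [0., 0.,  0., 0.]])

assert np.allclose(np.linalg.matrix_power(B, 4), 0)   # B^4 = 0
# B + I is unipotent: all its eigenvalues are 1, so det(B + I) = 1.
assert np.isclose(np.linalg.det(B + np.eye(4)), 1.0)
```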

15. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by InteGrand
$\noindent For (ii), let B = PJP^{-1} where J is the Jordan form of B, then use what you know about the eigenvalues of B and the fact that B + I= P(J+I)P^{-1} to deduce the result.$

$\noindent And I think (iii) is simpler than you (probably) think.$
Oh, of course. Once I drew out the Jordan chain again and looked carefully at what the question gave, (iii) made sense.
_________________________________________

$\\\text{Suppose that }A\text{ is a }9\times 9\text{ matrix with only one eigenvalue }\lambda\text{, and that}\\ \text{nullity}(A-\lambda I)=4\text{ and nullity}(A-\lambda I)^2 = 7$

$\text{Show that there exist constant }9\times 9\text{ matrices }M_0,M_1,M_2,M_3\text{ such that}\\ A^n = \lambda^nM_0+n\lambda^nM_1+n^2\lambda^nM_2+n^3 \lambda^nM_3\\\text{ for all }n$

Tools permitted if useful: Binomial theorem for matrices that commute in multiplication, Cayley-Hamilton theorem
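The intended mechanism can be illustrated on a small hypothetical example: if $A = \lambda I + N$ with $N$ nilpotent (here a single $4\times 4$ Jordan block, so $N^4 = 0$), the binomial theorem for commuting matrices truncates after the $N^3$ term, and each coefficient $\binom{n}{j}\lambda^{n-j}$ is $\lambda^n$ times a polynomial in $n$ of degree $j$ (for $\lambda \neq 0$), which is where the $\lambda^n M_0 + n\lambda^n M_1 + n^2\lambda^n M_2 + n^3\lambda^n M_3$ shape comes from. A sketch assuming NumPy:

```python
import numpy as np
from math import comb

lam = 2.0
# N: the 4x4 nilpotent shift block, N^4 = 0; N commutes with lam*I.
N = np.diag(np.ones(3), k=1)
A = lam * np.eye(4) + N

n = 7
# Binomial theorem for commuting matrices: only j = 0..3 survive.
expansion = sum(comb(n, j) * lam**(n - j) * np.linalg.matrix_power(N, j)
                for j in range(4))
assert np.allclose(np.linalg.matrix_power(A, n), expansion)
```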

16. ## Re: MATH2601 Higher Linear Algebra

No more questions for this sem after tomorrow.
__________________

$\\\text{I forgot how to use my field axioms.}\\ \text{Prove that }a0=0\text{ for }a\in \mathbb{F}$

17. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
No more questions for this sem after tomorrow.
__________________

$\\\text{I forgot how to use my field axioms.}\\ \text{Prove that }a0=0\text{ for }a\in \mathbb{F}$
Hint: Use the axioms to show that a + a0 = a.
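Following the hint, one way the argument can go (a sketch; the axiom used at each step is named on the right):

```latex
\begin{align*}
a + a0 &= a1 + a0      && \text{(multiplicative identity)}\\
       &= a(1 + 0)     && \text{(distributive law)}\\
       &= a1           && \text{(additive identity)}\\
       &= a.           && \text{(multiplicative identity)}
\end{align*}
% Adding the additive inverse -a to both sides of a + a0 = a
% and using associativity of + gives a0 = 0.
```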

18. ## Re: MATH2601 Higher Linear Algebra

This is a highly open-ended question and everyone's opinion might be different.

What's the easiest proof (or would be a very easy proof) of the Cauchy-Schwarz inequality to memorise?

19. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
This is a highly open-ended question and everyone's opinion might be different.

What's the easiest proof (or would be a very easy proof) of the Cauchy-Schwarz inequality to memorise?
Well you wrote one up here before, so maybe you'd find that easiest to "memorise" for yourself:

Originally Posted by leehuan
$\text{They showed one of the many proofs of it in my lecture last sem.}$

\begin{align*}p(\lambda)&=|\textbf{a}-\lambda\textbf{b}|^2\\ &=(\textbf{a}-\lambda\textbf{b})\cdot (\textbf{a}-\lambda\textbf{b})\\ &= |\textbf{a}|^2-2\lambda \textbf{a}\cdot\textbf{b}+\lambda^2 |\textbf{b}|^2 \end{align*}\\ \text{And note that }p(\lambda)\ge 0

$\\ \text{From 2U methods, we see that }p\text{ is minimised when }\lambda = \frac{\textbf{a}\cdot\textbf{b}}{|\textbf{b}|^2}\\ \text{Substituting back in gives }\min_{\lambda \in \mathbb R}p(\lambda)=|\textbf{a}|^2-\frac{(\textbf{a}\cdot \textbf{b})^2}{|\textbf{b}|^2}$

$\\\text{So since }\min_{\lambda\in\mathbb R}p(\lambda )\ge 0\text{, we get }(\textbf{a}\cdot \textbf{b})^2\le |\textbf{a}|^2|\textbf{b}|^2\\ \text{Taking square roots gives }|\textbf{a}\cdot \textbf{b}|\le |\textbf{a}||\textbf{b}|$

I did not even know that there was a sum form until doing past papers for 1251. Then I had to figure out why the sum and vector forms were equivalent.
Note that it needs to be adapted slightly to deal with the complex case, but it's not too big a deal.

You can also probably find many proofs online. There are twelve proofs here, but they seem to only be for the case of R^n: http://www.uni-miskolc.hu/~matsefi/O...rticle1_19.pdf .
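The quoted discriminant-style argument can also be checked numerically (a sketch assuming NumPy; the vectors are random draws in $\mathbb{R}^5$):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(200):
    a, b = rng.normal(size=5), rng.normal(size=5)
    # Minimiser of p(lambda) = |a - lambda*b|^2, as in the 2U argument.
    lam = (a @ b) / (b @ b)
    p_min = np.linalg.norm(a - lam * b) ** 2
    # p_min = |a|^2 - (a.b)^2/|b|^2 >= 0 rearranges to Cauchy-Schwarz.
    assert np.isclose(p_min, a @ a - (a @ b) ** 2 / (b @ b))
    assert abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b) + 1e-9
```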

20. ## Re: MATH2601 Higher Linear Algebra

_________________

$\\\text{Suppose }Q\in M_{n,n}\text{ is unitary. Prove that all its eigenvalues }\lambda\text{ satisfy}\\ |\lambda|=1$

21. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
_________________

$\\\text{Suppose }Q\in M_{n,n}\text{ is unitary. Prove that all its eigenvalues }\lambda\text{ satisfy}\\ |\lambda|=1$
Note that if $\lambda$ is an eigenvalue of $Q$ with unit eigenvector $\vec{v}$, then

\begin{align*}\lambda &= \lambda\left\langle \vec{v},\vec{v}\right\rangle \\ &=\langle \lambda\vec{v},\vec{v}\rangle \\ &= \langle Q \vec{v},\vec{v}\rangle \\ &= \langle \vec{v},Q^{*} \vec{v}\rangle \\ &= \langle \vec{v}, Q^{-1}\vec{v}\rangle \\ &= \langle \vec{v}, \lambda^{-1} \vec{v}\rangle \\ &= \overline{\lambda^{-1}}\langle \vec{v},\vec{v}\rangle \\ &= \left(\overline{\lambda}\right)^{-1}.\end{align*}

So $\lambda = \left( \overline{\lambda}\right)^{-1} \Rightarrow \lambda\overline{\lambda} = 1\Rightarrow |\lambda|^{2} = 1 \Rightarrow |\lambda| = 1$.

Facts used include:

• $Q^{*} = Q^{-1}$ (as $Q$ is unitary)

• As $Q$ is unitary, it is invertible and so cannot have a zero eigenvalue, so $\lambda \neq 0$

• If $A$ is an invertible square complex matrix and $A\vec{u} = t \vec{u}$ for some vector $\vec{u}$ and scalar $t$, then $A^{-1}\vec{u} = t^{-1} \vec{u}$

• The definition of eigenvalue and eigenvector, basic properties of adjoints and inner products, and $\langle \vec{v}, \vec{v} \rangle = 1$, since $\vec{v}$ is a unit eigenvector.
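A quick numerical check of the result (a sketch assuming NumPy; a random unitary $Q$ is built from the QR decomposition of a random complex matrix, whose $Q$ factor is unitary):

```python
import numpy as np

rng = np.random.default_rng(1)
# The Q factor of a QR decomposition of a random complex matrix is unitary.
Z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q, _ = np.linalg.qr(Z)

assert np.allclose(Q.conj().T @ Q, np.eye(4))     # Q* Q = I
eigvals = np.linalg.eigvals(Q)
assert np.allclose(np.abs(eigvals), 1.0)          # every |lambda| = 1
```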

22. ## Re: MATH2601 Higher Linear Algebra

This is just some personal fun

$\\\text{For the V.S. }(\mathbb{R}^+, \oplus, \otimes, \mathbb{R})\\ \text{where }x\oplus y = x\times y\\ x\otimes y = y^x\\ \text{Does there exist an inner product we can define to make it an inner product space?}$

23. ## Re: MATH2601 Higher Linear Algebra

Originally Posted by leehuan
This is just some personal fun

$\\\text{For the V.S. }(\mathbb{R}^+, \oplus, \otimes, \mathbb{R}^+)\\ \text{where }x\oplus y = x\times y\\ x\otimes y = y^x\\ \text{Does there exist an inner product we can define to make it an inner product space?}$
Yes. (I assume you meant the field to be R.)

24. ## Re: MATH2601 Higher Linear Algebra

Just in general, suppose $V$ is a vector space over a field $\mathbb{F}$, and let $\phi$ be a function and $W$ a set such that $\phi : V \to W$ is a bijection (so of course the inverse $\phi^{-1} : W \to V$ is also a bijection). Then we can make $W$ a vector space over $\mathbb{F}$ that is isomorphic to $V$, by *defining* vector addition in $W$ as

$w_{1} \oplus w_{2} = \phi \left(\phi^{-1}\left(w_{1}\right)+ \phi^{-1} \left(w_{2}\right)\right)\quad\text{for all } w_{1}, w_{2} \in W,$

where the "+" on the RHS is the addition operation in the vector space $V$, and scalar multiplication by

$\alpha \otimes w = \phi\left(\alpha * \phi^{-1}(w)\right)\quad\text{for all } \alpha \in \mathbb{F} \text{ and } w\in W,$

where the "*" refers to scalar multiplication in $V$.

Under these definitions, you can easily confirm that $W$ is a vector space over $\mathbb{F}$ and is isomorphic to $V$ (with $\phi^{-1}$ providing an isomorphism; in fact, this is the reason we chose to define $\oplus$ and $\otimes$ in this way -- it essentially makes $\phi^{-1}$ a bijective *linear* map, i.e. a vector space isomorphism. Of course $\phi$ also provides an isomorphism.)

Now with these definitions, you can also easily show that if $V$ is an *inner product* space over $\mathbb{F}$ (which is either $\mathbb{R}$ or $\mathbb{C}$), then $W$ is also an inner product space over $\mathbb{F}$ (as we would expect, since it is isomorphic to $V$), with an inner product on $W$ being

$\langle w_{1}, w_{2} \rangle_{W} \stackrel{\text{def}}{=} \langle \phi^{-1} \left(w_{1}\right), \phi^{-1}\left(w_{2}\right) \rangle_{V}\quad\text{for all } w_{1}, w_{2} \in W,$

where $\langle \cdot, \cdot \rangle_{V}$ is the inner product of the inner product space $V$.

(The intuition is that since $W$ is isomorphic to $V$, it is natural for it to have an inner product that is just that of $V$, except that you plug into $\langle \cdot, \cdot \rangle_{V}$ the "re-labelled" versions of $w_{1}, w_{2}$ in $V$, namely $\phi^{-1} \left(w_{1}\right)$ and $\phi^{-1} \left(w_{2}\right)$. You can show as an exercise that this is indeed an inner product.)

In the example you gave, the known vector space (and inner product space) was $V = \mathbb{R}$ (with field $\mathbb{F} = \mathbb{R}$). The set $W$ was $W = \mathbb{R}^{+}$, and the bijection was $\phi : V \to W$ (i.e. $\phi : \mathbb{R} \to \mathbb{R}^{+}$) given by $\phi (v) = e^{v}$ for all $v \in V = \mathbb{R}$. (The inverse mapping was $\phi^{-1}: \mathbb{R}^{+} \to \mathbb{R}$, given by $\phi^{-1} (w) = \ln w$ for all $w\in \mathbb{R}^{+}$, i.e. just the inverse function of the exponential.)

In your example, then, from the preceding discussion, we would want to define addition on $W = \mathbb{R}^{+}$ as $w_{1}\oplus w_{2} = \phi \left(\phi^{-1}\left(w_{1}\right)+ \phi^{-1} \left(w_{2}\right)\right)$ with $\phi$ being the exponential function. This is indeed what was done, as

\begin{align*}\phi \left(\phi^{-1}\left(w_{1}\right)+ \phi^{-1} \left(w_{2}\right)\right) &= \exp \left(\ln w_{1} + \ln w_{2}\right) \\ &= \exp \left(\ln \left(w_{1} w_{2}\right) \right) \\ &= w_{1}w_{2},\end{align*}

which is how the addition was defined. Similarly, we would want scalar multiplication to be defined by $\alpha \otimes w = \phi\left(\alpha * \phi^{-1}(w)\right)$, and indeed it is, since

\begin{align*}\phi\left(\alpha * \phi^{-1}(w)\right) &= \exp \left(\alpha \ln w\right)\\ &= \exp \left(\ln \left(w^{\alpha}\right)\right) \\ &= w^{\alpha}. \checkmark \end{align*}

In other words, assuming you've proved the assertions made earlier in this post, the vector space (with the addition and scalar multiplication you gave) is isomorphic to $\mathbb{R}$. Since $\mathbb{R}$ is an inner product space with inner product $\langle v_{1}, v_{2}\rangle_{\mathbb{R}} = v_{1} v_{2}$ for $v_{1}, v_{2} \in \mathbb{R}$ (just ordinary multiplication), then, assuming you have proved the assertion about being isomorphic to an inner product space, $W$ is also an inner product space over $\mathbb{R}$, with an inner product

\begin{align*}\langle w_{1}, w_{2}\rangle_{W} &\stackrel{\text{def}}{=} \langle \phi^{-1} \left(w_{1}\right), \phi^{-1}\left(w_{2}\right) \rangle_{\mathbb{R}} \\ &\stackrel{\text{def}}{=} \phi^{-1}\left(w_{1}\right)\phi^{-1}\left(w_{2}\right) \\ &= \left(\ln w_{1}\right)\left(\ln w_{2}\right).\end{align*}

In other words, defining $\langle w_{1}, w_{2}\rangle_{W} = \left(\ln w_{1}\right)\left(\ln w_{2}\right)$ (the product of the logs) for all $w_{1}, w_{2} \in \mathbb{R}^{+}$, we have that $W$ is an inner product space with this as an inner product.
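The proposed inner product can be spot-checked against the exotic operations (a sketch in plain Python; the particular positive reals and the scalar are arbitrary test values):

```python
import math

def oplus(x, y):       # vector addition in W = R^+:  x (+) y = xy
    return x * y

def otimes(alpha, x):  # scalar multiplication:  alpha (x) w = w^alpha
    return x ** alpha

def ip(w1, w2):        # candidate inner product: product of the logs
    return math.log(w1) * math.log(w2)

w1, w2, w3, alpha = 2.5, 7.0, 0.3, -1.7
# Linearity in the first slot: <(alpha(x)w1) (+) w3, w2> = alpha<w1,w2> + <w3,w2>
lhs = ip(oplus(otimes(alpha, w1), w3), w2)
rhs = alpha * ip(w1, w2) + ip(w3, w2)
assert math.isclose(lhs, rhs)
# Positive definiteness: <w,w> = (ln w)^2 >= 0, zero only at w = 1
# (which is the zero vector of this space).
assert ip(w1, w1) > 0 and ip(1.0, 1.0) == 0.0
```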

25. ## Re: MATH2601 Higher Linear Algebra

Hopefully you don't mind if I post a question here. (taking MATH2601 this semester)

Suppose that G is a group with precisely three distinct elements e (the identity), a and b.
a) Prove that ab = e (Hint: eliminate other possibilities).
b) Prove that a^2 = b.
c) Deduce that G = {e, a, a^2} and hence that G is isomorphic to the group.

(How do you get LaTeX to work here?)
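Not a substitute for the proof, but the claims can be brute-forced (a sketch in Python; elements are coded 0 = e, 1 = a, 2 = b, and every possible Cayley table with identity e is tested for the group axioms):

```python
from itertools import product

E, A, B = 0, 1, 2
valid = []
# Identity row/column are forced; try all fillings of the other four cells.
for p, q, r, s in product(range(3), repeat=4):
    op = [[E, A, B],
          [A, p, q],
          [B, r, s]]
    assoc = all(op[op[x][y]][z] == op[x][op[y][z]]
                for x in range(3) for y in range(3) for z in range(3))
    inverses = all(any(op[x][y] == E for y in range(3)) for x in range(3))
    if assoc and inverses:
        valid.append(op)

assert len(valid) == 1            # exactly one group of order 3
table = valid[0]
assert table[A][B] == E           # ab = e    (part a)
assert table[A][A] == B           # a^2 = b   (part b)
```

The single surviving table is the cyclic group of order 3, which is what part (c) asks you to deduce.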
