Pavel Grinfeld’s Linear Algebra 2

Lecture 18a: Eigenvalue decomposition

He begins with the matrix  \begin{bmatrix} 5&2& 0 \\ 2& 5& 0 \\ 4&-1&4 \end{bmatrix}. It’s easy to determine all three eigenvalues. Since every row sums to 7, one eigenvalue is \lambda_1 =7 and the corresponding eigenvector is \begin{bmatrix} 1\\1\\1 \end{bmatrix}. Another eigenvalue is obviously 4, since it sits on the diagonal and every other entry in its column is zero; thus \lambda_2=4 with eigenvector \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}. The last one can be determined from the trace, i.e. 7+4+x=5+5+4 \Rightarrow x = 3, so \lambda_3 = 3, and the corresponding eigenvector works out to \begin{bmatrix} -1 \\ 1 \\ 5 \end{bmatrix}.

Now we use these pairs to construct new matrices, because we know that Ax = \lambda x holds for each eigenpair. Placing the three eigenvectors as the columns of a matrix X gives \begin{bmatrix} 5&2& 0 \\ 2& 5& 0 \\ 4&-1&4 \end{bmatrix}\begin{bmatrix} 1&0&-1 \\ 1&0&1 \\ 1&1&5 \end{bmatrix} = \begin{bmatrix} 1&0&-1 \\ 1&0&1 \\ 1&1&5 \end{bmatrix}\begin{bmatrix} 7&0&0 \\ 0&4&0 \\ 0&0&3 \end{bmatrix}, i.e. AX = X\Lambda where \Lambda is the diagonal matrix of eigenvalues. Since X is invertible, A = X\Lambda X^{-1}, which is the eigenvalue decomposition.
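
Here is a quick numpy check of this decomposition (not from the lecture, just a sanity check), with the eigenvectors as the columns of X and the eigenvalues on the diagonal of L:

import numpy as np

A = np.array([[5, 2, 0],
              [2, 5, 0],
              [4, -1, 4]])
X = np.array([[1, 0, -1],
              [1, 0, 1],
              [1, 1, 5]])
L = np.diag([7, 4, 3])

print(np.allclose(A @ X, X @ L))                   # True: A X = X L
print(np.allclose(A, X @ L @ np.linalg.inv(X)))    # True: A = X L X^{-1}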

Lecture 17i: Easy eigenvalues, constructing a matrix with given eigenvalues and eigenvectors

Lecture 17h: Easy eigenvalues, the grand finale

In this lecture he determines the eigenvalues and eigenvectors of the following matrix

\begin{bmatrix} 1 & 3 & 0 & 7 & 0 \\ 2 & 6 & 0 & 3 & 0 \\ 0 & 0 & 6 & 5 & 0 \\ 5 & 15 & 0 & -9 & 0 \\3 & 9 & 0 & -10 & 9 \end{bmatrix}
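
Some of the eigenvalues can be read off by inspection: the third and fifth columns have their only nonzero entry on the diagonal, so e_3 and e_5 are eigenvectors with eigenvalues 6 and 9, and since the second column is 3 times the first, the matrix is singular and 0 is also an eigenvalue. A quick numpy computation (my own check, not part of the lecture) recovers the rest as well:

import numpy as np

A = np.array([[1, 3, 0, 7, 0],
              [2, 6, 0, 3, 0],
              [0, 0, 6, 5, 0],
              [5, 15, 0, -9, 0],
              [3, 9, 0, -10, 9]])

# Easy eigenvalues by inspection: 6 (column 3), 9 (column 5),
# 0 (columns 1 and 2 are proportional, so A is singular).
print(np.round(np.sort(np.linalg.eigvals(A).real), 6))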

Lecture 17a: Easy eigenvalues

Here he begins with a diagonal matrix \begin{bmatrix} 3&0&0 \\ 0 & 7 & 0 \\ 0 & 0 &-8 \end{bmatrix}. It has three eigenvalues, namely the entries on its diagonal, and the corresponding eigenvectors are \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\begin{bmatrix} 0 \\ 1\\ 0 \end{bmatrix},\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} respectively. However, if we take the matrix \begin{bmatrix} 7&0&0 \\ 0 & -8 & 0 \\ 0 & 0 &3 \end{bmatrix}, we get the same eigenvectors \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\begin{bmatrix} 0 \\ 1\\ 0 \end{bmatrix},\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}. Notice the eigenvectors are the same, but the eigenvalue and eigenvector pairs are different.
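
A tiny numpy illustration of the pairing (my addition): np.linalg.eig returns the eigenvalues together with an eigenvector matrix whose i-th column goes with the i-th eigenvalue.

import numpy as np

D = np.array([[7, 0, 0],
              [0, -8, 0],
              [0, 0, 3]])

vals, vecs = np.linalg.eig(D)
print(vals)    # the diagonal entries 7, -8, 3
print(vecs)    # the standard basis vectors; column i is paired with vals[i]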

Lecture 16l: Similarity transformations preserve the trace and determinant. In this lecture he considers the matrix $\begin{bmatrix} 1&3 \\ -2 & 6 \end{bmatrix}$ and asks which of the following three matrices are similar to it: \begin{bmatrix} 8&3 \\ -6 & 1 \end{bmatrix}, \begin{bmatrix} 5&1 \\ -2&2 \end{bmatrix}, \begin{bmatrix} 10&4 \\ -8 &-2 \end{bmatrix}.
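
Since similarity preserves both the trace and the determinant, a quick check settles it: the original matrix has trace 7 and determinant 12, and only the second candidate matches both (and since the shared characteristic polynomial \lambda^2-7\lambda+12 has the distinct roots 3 and 4, both matrices are similar to the diagonal matrix with entries 3 and 4, hence to each other). A numpy version of the check, added by me:

import numpy as np

A = np.array([[1, 3], [-2, 6]])
candidates = [np.array([[8, 3], [-6, 1]]),
              np.array([[5, 1], [-2, 2]]),
              np.array([[10, 4], [-8, -2]])]

print(np.trace(A), round(np.linalg.det(A)))        # 7 12
for B in candidates:
    print(np.trace(B), round(np.linalg.det(B)))    # only the second gives 7 12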

Lecture 16k: Eigenvalues, Eigenvectors and Similarity Transformations In this lecture he asks us to consider the matrix XAX^{-1}. This matrix is not A, but it is related to A. What we discover is that this matrix has the same eigenvalues as A but different eigenvectors. Let Av= \lambda v and set u = Xv, so that X^{-1}u = v. Then XAX^{-1}u = XAv \Rightarrow XAX^{-1}u = X\lambda v \Rightarrow XAX^{-1}u = \lambda Xv \Rightarrow XAX^{-1}u = \lambda u. Thus u = Xv is the eigenvector and \lambda still remains the eigenvalue.
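
A small numerical illustration, with a 2×2 example of my own choosing rather than one from the lecture: XAX^{-1} has the same eigenvalues as A, and if v is an eigenvector of A then u = Xv is an eigenvector of XAX^{-1}.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # eigenvalues 3 and 1
X = np.array([[1.0, 2.0],
              [0.0, 1.0]])       # any invertible matrix will do

B = X @ A @ np.linalg.inv(X)

print(np.sort(np.linalg.eigvals(A)))    # approximately [1. 3.]
print(np.sort(np.linalg.eigvals(B)))    # approximately [1. 3.]: the same eigenvalues

v = np.array([1.0, 1.0])                # A v = 3 v
u = X @ v
print(np.allclose(B @ u, 3 * u))        # True: u = Xv is an eigenvector of B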

Lecture 16j: Eigenvectors of A^2 and A^{-1} In this lecture he starts with Ax= \lambda x\Rightarrow AAx= A\lambda x \Rightarrow A^2x= \lambda^2 x. Thus the eigenvalues of A^2 are \lambda^2, with the same eigenvectors. Similarly, Ax= \lambda x\Rightarrow A^{-1}Ax= A^{-1}\lambda x \Rightarrow Ix= \lambda A^{-1} x \Rightarrow A^{-1}x = \frac{1}{\lambda}x. Thus the eigenvalues of A^{-1} are \frac{1}{\lambda}. Note that in this case it is important that \lambda \ne 0.
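
A quick numeric check of both facts (my own example, not from the lecture), using a matrix whose eigenvalues are 7 and 3:

import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 5.0]])                            # eigenvalues 7 and 3

print(np.sort(np.linalg.eigvals(A)))                  # approximately [3. 7.]
print(np.sort(np.linalg.eigvals(A @ A)))              # approximately [9. 49.]: the squares
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))   # approximately [1/7, 1/3]: the reciprocals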

Lecture 16i: Defective Matrix \begin{bmatrix} 2&3 \\ 0 & 2 \end{bmatrix}. Here both eigenvalues are 2, but there is only one eigenvector corresponding to this repeated eigenvalue. So defective matrices are those for which the algebraic multiplicity is not the same as the geometric multiplicity. Another example of a defective matrix is \begin{bmatrix} -16&12 \\ -27 &20 \end{bmatrix}; again it has the repeated eigenvalue 2.
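
One way to confirm that the second matrix is also defective (my addition): its trace is 4 and its determinant is 4, so the characteristic polynomial is (\lambda-2)^2, and the rank of A-2I shows that the eigenspace of the repeated eigenvalue 2 is only one-dimensional.

import numpy as np

B = np.array([[-16, 12],
              [-27, 20]])

print(np.trace(B), round(np.linalg.det(B)))        # 4 4, so the char. poly. is (x - 2)^2
print(np.linalg.matrix_rank(B - 2 * np.eye(2)))    # 1, so only one independent eigenvector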

Lecture 16h: A 3×3 eigenvalue and eigenvector example

Here he chose the matrix \begin{bmatrix} -5&4&-2 \\ -24 &17&-8\\-36&24&-11 \end{bmatrix}
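
The notes don't record the outcome, so here is my own quick computation: the characteristic polynomial works out to -(\lambda-1)^2(\lambda+1), so the eigenvalues are 1 (twice) and -1, and since A - I has rank 1 the repeated eigenvalue still has a two-dimensional eigenspace, so the matrix is not defective. A numpy check:

import numpy as np

A = np.array([[-5, 4, -2],
              [-24, 17, -8],
              [-36, 24, -11]])

vals, vecs = np.linalg.eig(A)
print(np.round(vals, 6))                       # eigenvalues 1, 1, -1, up to rounding
print(np.linalg.matrix_rank(A - np.eye(3)))    # 1, so the eigenspace of 1 is 2-dimensional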

Lecture 16e:

Lecture 16d: Eigenvalues and eigenvectors of 2×2 matrices He does a few examples.

Lecture 16c+: Sum is trace and product is determinant This is a proof which excludes complex eigenvalues and the defective case.
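
As a quick illustration in the 2×2 case (my addition, not part of the lecture notes): \det(A-\lambda I) = \lambda^2 - \mathrm{tr}(A)\,\lambda + \det(A) = (\lambda-\lambda_1)(\lambda-\lambda_2), so comparing coefficients gives \lambda_1+\lambda_2 = \mathrm{tr}(A) and \lambda_1\lambda_2 = \det(A).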

Lecture 16c: A detailed example of determining eigenvalues and eigenvectors Here he takes up the example \begin{bmatrix} 17&-6 \\45&-16 \end{bmatrix} and points out that the sum of the eigenvalues equals the trace and their product equals the determinant of the matrix.
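
Working it out (my addition): the characteristic polynomial is \lambda^2 - \lambda - 2 = (\lambda-2)(\lambda+1), so the eigenvalues are 2 and -1; their sum 1 is the trace and their product -2 is the determinant. A numpy check:

import numpy as np

A = np.array([[17, -6],
              [45, -16]])

vals = np.linalg.eigvals(A)
print(np.round(np.sort(vals), 6))               # [-1.  2.]
print(np.sum(vals), np.trace(A))                # both 1 (up to rounding)
print(np.prod(vals), round(np.linalg.det(A)))   # both -2 (up to rounding)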

Lecture 16b: Algebraic derivation of the Eigenvalue algorithm Here he talks about Ax = \lambda x \Rightarrow (A-\lambda I)x=0, and since we need a nonzero solution x, this forces |A-\lambda I|=0.

Lecture 16a: The Eigenvalue Algorithm Here he solves in detail the example \begin{bmatrix} 2&1 \\1&2 \end{bmatrix}, whose eigenvalues are 3 and 1 with eigenvectors \begin{bmatrix} 1\\1 \end{bmatrix} and \begin{bmatrix} 1\\-1 \end{bmatrix}.

Lecture 15o: Null Space

Lecture 15n: Why Eigenvalues and Eigenvectors are so important The reason is that any vector can be written in the form v=\alpha_1 e_1+\alpha_2 e_2 +...+\alpha_n e_n, where the e_i are eigenvectors. The transformation of v is then given by T(v)=\alpha_1 T(e_1)+\alpha_2 T(e_2) +...+\alpha_n T(e_n) =\alpha_1\lambda_1 e_1+\alpha_2\lambda_2 e_2+...+\alpha_n\lambda_n e_n. Thus the conclusion is that if a vector is represented in an eigenbasis, then computing its transformation becomes very simple and boils down to multiplying each coefficient by the corresponding eigenvalue.
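
A small sketch of this point (my addition), using the transformation T(a,b,c) = (b,a,2c) from Lecture 15k further down and its eigenbasis: writing an arbitrary vector v in the eigenbasis and scaling each coefficient by its eigenvalue gives the same answer as applying the matrix.

import numpy as np

# Matrix of T(a,b,c) = (b, a, 2c); its eigenvectors (1,1,0), (1,-1,0), (0,0,1)
# have eigenvalues 1, -1, 2 respectively.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])
E = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 1.0]])        # eigenvectors as columns
lam = np.array([1.0, -1.0, 2.0])

v = np.array([4.0, 2.0, 5.0])          # an arbitrary vector
alpha = np.linalg.solve(E, v)          # coefficients of v in the eigenbasis

Tv = E @ (lam * alpha)                 # scale each coefficient by its eigenvalue
print(np.allclose(Tv, A @ v))          # True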

Lecture 15m+: The identity crisis, I mean transformation

In this lecture he takes the identity matrix. Any vector multiplied by the identity matrix gives back the same vector. Thus the identity matrix has only one eigenvalue, 1, with multiplicity n, and every nonzero vector is an eigenvector. So if the identity matrix is n \times n, then there are n linearly independent eigenvectors.

Lecture 15m: All transformations in Rn can be represented as a matrix product In this lecture he revisits the two transformations T \left( \begin{bmatrix} a\\b\\c\end{bmatrix} \right)=\begin{bmatrix} b\\a\\2c\end{bmatrix} and T \left( \begin{bmatrix} a\\b\\c\end{bmatrix} \right)=\begin{bmatrix} \frac{a+b}{2}\\ \frac{a+b}{2}\\c\end{bmatrix} and shows that these can be represented by matrix multiplication. The first can be written as \begin{bmatrix} 0 &1&0\\ 1&0& 0 \\ 0&0&2 \end{bmatrix}\begin{bmatrix} a\\b\\c\end{bmatrix}=\begin{bmatrix} b\\a\\2c\end{bmatrix} and similarly the second can be expressed as \begin{bmatrix} \frac{1}{2} & \frac{1}{2}&0\\ \frac{1}{2}& \frac{1}{2}& 0 \\ 0&0&1 \end{bmatrix}\begin{bmatrix} a\\b\\c\end{bmatrix}=\begin{bmatrix} \frac{a+b}{2}\\ \frac{a+b}{2}\\c\end{bmatrix}
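
With the matrices written down, the eigenvalues found in Lectures 15k and 15l can be double-checked numerically (my addition):

import numpy as np

A = np.array([[0.0, 1.0, 0.0],     # T(a,b,c) = (b, a, 2c) from Lecture 15k
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])
B = np.array([[0.5, 0.5, 0.0],     # T(a,b,c) = ((a+b)/2, (a+b)/2, c) from Lecture 15l
              [0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0]])

print(np.round(np.sort(np.linalg.eigvals(A)), 6))   # [-1.  1.  2.]
print(np.round(np.sort(np.linalg.eigvals(B)), 6))   # [ 0.  1.  1.]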

Lecture 15l: Another linear transformation in Rn In this lecture he introduces the transformation T \left( \begin{bmatrix} a\\b\\c\end{bmatrix} \right)=\begin{bmatrix} \frac{a+b}{2}\\ \frac{a+b}{2}\\c\end{bmatrix}. This is a linear transformation. Now T\left(\begin{bmatrix} 1\\2\\3\end{bmatrix}\right)= \begin{bmatrix}\frac{3}{2}\\\frac{3}{2}\\3\end{bmatrix} and T\left(\begin{bmatrix} 7\\7\\9\end{bmatrix}\right)=\begin{bmatrix} 7\\7\\9\end{bmatrix}. So it’s obvious that \begin{bmatrix} 7\\7\\9\end{bmatrix} is an eigenvector with eigenvalue \lambda=1. We prefer to record this eigenspace with the simpler vectors \begin{bmatrix} 0 \\ 0 \\1 \end{bmatrix} with \lambda=1 and \begin{bmatrix} 1\\1\\ 0 \end{bmatrix} with \lambda=1. The last eigenvector is \begin{bmatrix} 1\\-1\\ 0 \end{bmatrix}, and it corresponds to eigenvalue \lambda =0.

Lecture 15k: A linear transformation in Rn In this lecture he introduces the transformation T \left( \begin{bmatrix} a\\b\\c\end{bmatrix} \right)=\begin{bmatrix} b\\a\\2c\end{bmatrix}. This is a linear transformation. Now T\left(\begin{bmatrix} 1\\2\\3\end{bmatrix}\right)= \begin{bmatrix} 2\\1 \\ 6\end{bmatrix} and T\left(\begin{bmatrix} 7\\7\\9\end{bmatrix}\right)=\begin{bmatrix} 7\\7\\18\end{bmatrix}. So it’s obvious that \begin{bmatrix} 7\\7\\9\end{bmatrix} is not an eigenvector. However, the two results give a clue that \begin{bmatrix} 0\\ 0\\1 \end{bmatrix} is an eigenvector with eigenvalue 2. The second eigenvector is \begin{bmatrix} 1\\ 1\\ 0 \end{bmatrix} with eigenvalue 1, and the third is \begin{bmatrix} 1\\ -1\\ 0 \end{bmatrix} with eigenvalue -1. So we found three different eigenvalues and three different eigenvectors.

Lecture 15j: Examples of non-linear Transformations This is one of my favorite lectures in this series. Here he takes four different transformations acting on a function f: T_1(f) = f^2(x), T_2(f)=f(x^2), T_3(f)=\sin(x)f(x), T_4(f)=f(x)+x. Of these, only T_2 and T_3 are linear; the others are non-linear. What comes as a surprise is that T_3(f)=\sin(x)f(x) is linear, because linearity here is about the dependence on f, not on x: \sin(x)\left(f(x)+g(x)\right) = \sin(x)f(x)+\sin(x)g(x), even though \sin itself is not a linear function of x.

Lecture 15i: Dilation and wavelets This topic was popularized by Gilbert Strang. Take for example the transformation Df(x) = f(2x-1). Then Dx^2 = (2x-1)^2=4x^2-4x+1, D(x-1) = (2x-1)-1 = 2x-2 = 2(x-1), and D(x^2+x-1) =(2x-1)^2+(2x-1)-1 = (2x-1)^2+2x-2. Notice that we have discovered an eigenpair of D: the eigenvalue is 2 and the corresponding eigenvector (eigenfunction) is x-1.
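
A quick sympy check (my addition) that x-1 really is an eigenfunction of the dilation operator with eigenvalue 2:

import sympy as sp

x = sp.symbols('x')
f = x - 1
Df = f.subs(x, 2*x - 1)          # apply D: replace x by 2x - 1
print(sp.expand(Df))             # 2*x - 2
print(sp.simplify(Df - 2*f))     # 0, so D(x - 1) = 2*(x - 1)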

Lecture 15h: Derivative as linear transformation Here he considers D and D^2, moving beyond geometric vectors and asking the same kind of questions: are these operators linear? What are their eigenvalues? Note that the derivative of a polynomial always goes down by one degree, so a non-constant polynomial cannot be an eigenvector of D. However, a constant function has derivative zero, so constants are eigenvectors with eigenvalue 0. Similarly, for a linear polynomial ax+b, its D^2 is zero, so it is an eigenvector of D^2 with eigenvalue 0. The other functions he considers are e^{px}: here D(e^{px}) =pe^{px}, so there are infinitely many eigenvalues, one for every value of p, and D^2(e^{px}) =p^2e^{px}. Finally, for \sin(px) we have D(\sin(px)) = p\cos(px), so \sin(px) is not an eigenvector of D, but D^2(\sin(px)) = -p^2\sin(px), so it is an eigenvector of D^2 with eigenvalue -p^2.
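
The same claims can be checked with sympy (my addition):

import sympy as sp

x, p = sp.symbols('x p')
print(sp.diff(sp.exp(p*x), x))       # p*exp(p*x): eigenvalue p for D
print(sp.diff(sp.sin(p*x), x))       # p*cos(p*x): sin(px) is not an eigenvector of D
print(sp.diff(sp.sin(p*x), x, 2))    # -p**2*sin(p*x): eigenvalue -p^2 for D^2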

Lecture 15g: Geometric transformation in space

Lecture 15f: The translation by vector a We see that it is not a linear transformation (for a \ne 0 it does not satisfy T(u+v)=T(u)+T(v)).

Lecture 15e: There is no such lecture right now

Lecture 15d: The Projection Transformation So the take-home point is about projection onto a line. There are two extreme cases: one is when the vector is along the line, in which case we get Px = x, and the other is when it is perpendicular to the line, in which case we have Px = 0. So we have two eigenvalues, 1 and 0. Also note that repeatedly applying the projection doesn’t change anything, so PPx = Px. Thus P^2=P.
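
A concrete 2×2 sketch (my own example, not from the lecture): the matrix of orthogonal projection onto the line spanned by a unit vector u is P = uu^T, and it shows both properties at once.

import numpy as np

u = np.array([3.0, 4.0]) / 5.0                       # unit vector along the line
P = np.outer(u, u)                                   # projection matrix P = u u^T

print(np.allclose(P @ P, P))                         # True: P^2 = P
print(np.round(np.sort(np.linalg.eigvals(P)), 6))    # [0. 1.]: the two eigenvalues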

Lecture 15c: The Reflection Transformation and Introduction to Eigenvalues This tells us that doing the reflection twice gives back the original vector, i.e. P^2x=x \Rightarrow P^2=I. Note that this does not mean P=I; rather, any eigenvalue must satisfy \lambda^2 = 1, so \lambda = \pm 1: vectors along the line of reflection have eigenvalue 1, and vectors perpendicular to it have eigenvalue -1.

Lecture 15b: What makes a transformation linear Here he starts with the idea of a linear transformation by giving an example of someone who goes to Europe, where the cost of eggs is 4 euros, coffee is 2 euros, and the exchange rate is 1 euro = 1.5 dollars. Suppose you want to calculate the amount in dollars; there are two ways to do this problem. Either convert the total into euros first, 4+2 = 6 euros, and then into dollars, which gives 9, or convert each item into dollars (6 and 3) and then add them, which again gives 9. The fact that both routes give the same answer is exactly T(u+v) = T(u)+T(v).
