Eigenvalues and Eigenvectors

Posted by Amit Rajan on Thursday, April 14, 2022

18.1 Eigenvalues and Eigenvectors

A matrix usually acts on a vector like a function: in goes a vector $x$, out comes a vector $Ax$. For a specific matrix $A$, if $Ax$ is parallel to $x$, i.e. $Ax = \lambda x$, the vectors $x$ are called eigenvectors and the constant $\lambda$ is called an eigenvalue.

Any vector $x$ and constant $\lambda$ satisfying the equation $Ax = \lambda x$ form an eigenvector-eigenvalue pair. The eigenvalues and eigenvectors of some commonly used matrices are as follows:

  • Projection Matrix: A projection matrix $P$ takes a vector $b$ and projects it as $Pb$, where $Pb$ lies in the plane onto which $P$ projects. If the vector $x$ is already in that plane, its projection $Px (= x)$ is the vector itself. This means that all the vectors in the plane of the projection matrix $P$ are eigenvectors with eigenvalue $\lambda = 1$. Apart from this, any $x$ perpendicular to the plane has projection $0$ and hence is an eigenvector with eigenvalue $\lambda = 0$.

  • Permutation Matrix: Let us take the permutation matrix $A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ as an example; its eigenvalues and eigenvectors are $\lambda_1 = 1, x_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}; \lambda_2 = -1, x_2 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$.

  • Rotation Matrix: Let $Q = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$, then $\begin{vmatrix} -\lambda & -1 \\ 1 & -\lambda \end{vmatrix} = 0 \implies \lambda^2 + 1 = 0 \implies \lambda = \pm i$. Here the eigenvalues are not real. It should be noted that the eigenvalues are complex conjugates of each other. Intuitively, finding the eigenvectors of this rotation matrix is like finding vectors which, when rotated by $90^\circ$, come out parallel to themselves. The only real vector with this property is the zero vector.

  • Triangular Matrix: Let $Q = \begin{bmatrix} 3 & 1 \\ 0 & 3 \end{bmatrix}$, then $\begin{vmatrix} 3-\lambda & 1 \\ 0 & 3-\lambda \end{vmatrix} = 0 \implies (3-\lambda)^2 = 0 \implies \lambda_1 = 3, \lambda_2 = 3$. We get $x_1 = x_2 = \begin{bmatrix} 1\\ 0 \end{bmatrix}$. Hence, when eigenvalues are repeated, we may have a shortage of independent eigenvectors. These examples are checked numerically in the sketch after this list.
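
These claims can be verified with a quick numerical check (a minimal sketch using NumPy; the matrices are exactly the ones from the list above, and the printed values are indicative since the ordering of eigenvalues returned by `np.linalg.eig` may vary):

```python
import numpy as np

# Permutation matrix: eigenvalues 1 and -1, eigenvectors along (1, 1) and (-1, 1).
P = np.array([[0, 1],
              [1, 0]])
vals, vecs = np.linalg.eig(P)
print(vals)   # [ 1. -1.]
print(vecs)   # columns are normalised multiples of (1, 1) and (-1, 1)

# Rotation by 90 degrees: the eigenvalues form the complex-conjugate pair +i, -i.
Q = np.array([[0, -1],
              [1,  0]])
print(np.linalg.eig(Q)[0])   # [0.+1.j 0.-1.j]

# Triangular matrix with repeated eigenvalue 3: shortage of independent eigenvectors.
T = np.array([[3, 1],
              [0, 3]])
vals, vecs = np.linalg.eig(T)
print(vals)   # [3. 3.]
print(vecs)   # both columns are numerically parallel to (1, 0)
```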

One of the most important facts about eigenvalues is: the sum of the eigenvalues of a matrix equals the sum of the entries down its diagonal (called the trace), and the product of the eigenvalues equals the determinant.
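
This fact is easy to check numerically. Below is a minimal sketch (assuming NumPy; the $3 \times 3$ matrix is just an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace, product equals the determinant.
print(np.isclose(eigenvalues.sum(), np.trace(A)))        # True
print(np.isclose(eigenvalues.prod(), np.linalg.det(A)))  # True
```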

18.2 How to solve $Ax = \lambda x$

We can rewrite the equation $Ax = \lambda x$ as $(A - \lambda I)x = 0$. For this equation to have any solution other than $x = 0$, the matrix $A - \lambda I$ has to be singular, i.e. $|A - \lambda I| = 0$. Solving this determinant equation gives the eigenvalues $\lambda$; finding the corresponding $x$ then amounts to finding the null space of $A - \lambda I$.

Example: Take $A = \begin{bmatrix} 3 & 1 \\ 1 & 3 \end{bmatrix}$. Then, $|A - \lambda I| = \begin{vmatrix} 3-\lambda & 1 \\ 1 & 3-\lambda \end{vmatrix} = (3-\lambda)^2 - 1 = \lambda^2 - 6\lambda + 8$. In the characteristic polynomial $\lambda^2 - 6\lambda + 8$, $6$ is the trace (sum of diagonal elements) and $8$ is the determinant of the matrix $A$. Solving $\lambda^2 - 6\lambda + 8 = 0$, we get the eigenvalues $\lambda_1 = 4, \lambda_2 = 2$. The eigenvector $x_1$ can be obtained by solving the equation $(A - \lambda_1 I)x_1 = 0 \implies (A - 4I)x_1 = 0 \implies \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}x_1 = 0 \implies x_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$. Similarly, for $\lambda_2$, $x_2 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$.
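
The same computation can be reproduced numerically (a sketch for verification only, assuming NumPy; `np.poly` returns the coefficients of the characteristic polynomial and `np.linalg.eig` returns the eigenpairs):

```python
import numpy as np

A = np.array([[3, 1],
              [1, 3]])

# Coefficients of |A - lambda*I| = lambda^2 - 6*lambda + 8
print(np.poly(A))   # [ 1. -6.  8.]

vals, vecs = np.linalg.eig(A)
print(vals)         # [4. 2.] (order may vary)
print(vecs)         # columns are multiples of (1, 1) and (-1, 1)

# Check that A x = lambda x holds for each eigenpair.
for lam, x in zip(vals, vecs.T):
    print(np.allclose(A @ x, lam * x))   # True, True
```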

If we look at the matrix in this example, it is $3I$ away from the permutation matrix discussed above. It should be noted that the eigenvalues of the new matrix $A$ can be obtained by adding $3$ to the eigenvalues of the permutation matrix, with the eigenvectors remaining the same. This fact can be demonstrated mathematically as well.

If $Ax = \lambda x$, then $(A + 3I)x = \lambda x + 3x = (\lambda + 3)x$. Hence for the new matrix $A + 3I$, the eigenvalues are $\lambda + 3$ with the eigenvectors unchanged.
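
A quick numerical confirmation of this shift argument (again a minimal sketch assuming NumPy):

```python
import numpy as np

P = np.array([[0, 1],
              [1, 0]])       # permutation matrix with eigenvalues 1, -1
A = P + 3 * np.eye(2)        # the matrix [[3, 1], [1, 3]] from the example

print(np.linalg.eigvals(P))  # e.g. [ 1. -1.]
print(np.linalg.eigvals(A))  # e.g. [ 4.  2.]  -> each eigenvalue shifted by 3

# Every eigenvector of P is still an eigenvector of A = P + 3I,
# with its eigenvalue shifted by 3.
vals_P, vecs_P = np.linalg.eig(P)
for lam, x in zip(vals_P, vecs_P.T):
    print(np.allclose(A @ x, (lam + 3) * x))   # True, True
```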