Videos for Section 4.1


Up to now, we've been reviewing the structure of vector spaces and linear transformations. Now we start on the heart of the course: eigenvalues and eigenvectors. As with all mathematical topics, you should ask yourself three questions:

  1. What is it?
  2. How do you compute it?
  3. What is it good for?

Section 4.1 is all about the first question. We define eigenvalues, eigenvectors, and eigenspaces. Sections 4.2-4.6 are about the second question. The rest of the book is about the third.


If $A$ is an $n \times n$ matrix, and if ${\bf x}$ is a non-zero vector such that $$A{\bf x} = \lambda {\bf x}$$ for some scalar $\lambda$, then we say that ${\bf x}$ is an eigenvector of $A$ with eigenvalue $\lambda$. For each eigenvalue $\lambda$, the set of all solutions to $A{\bf x} = \lambda {\bf x}$ is called the eigenspace for $\lambda$ and is denoted $E_\lambda$. (Note that $E_\lambda$ contains the zero vector, even though ${\bf 0}$ itself is not an eigenvector.)
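
As a quick illustration (this particular matrix is just an example of ours, not one from the text), take $$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \qquad \text{and} \qquad {\bf x} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$ Then $$A{\bf x} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3{\bf x},$$ so ${\bf x}$ is an eigenvector of $A$ with eigenvalue $\lambda = 3$, and ${\bf x}$ lies in the eigenspace $E_3$.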


If $L: V \to V$ is an operator, the definitions are similar. If ${\bf v} \in V$ is a non-zero vector such that $L({\bf v}) = \lambda {\bf v}$, then ${\bf v}$ is an eigenvector of $L$ with eigenvalue $\lambda$. The eigenvalues and eigenvectors of the operator $L$ are closely related to those of the matrix $[L]_\mathcal{B}$, where $\mathcal{B}$ is a basis for $V$.
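
Here is a quick sketch of that connection, assuming the notation $[{\bf v}]_\mathcal{B}$ for the coordinate vector of ${\bf v}$ relative to $\mathcal{B}$: if $L({\bf v}) = \lambda {\bf v}$ with ${\bf v} \ne {\bf 0}$, then taking coordinates gives $$[L]_\mathcal{B}\,[{\bf v}]_\mathcal{B} = [L({\bf v})]_\mathcal{B} = \lambda\,[{\bf v}]_\mathcal{B},$$ so the coordinate vector $[{\bf v}]_\mathcal{B}$ is an eigenvector of the matrix $[L]_\mathcal{B}$ with the same eigenvalue $\lambda$.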