Eigenvalues and eigenvectors

Definition

Let \(T\) be a linear transformation from a vector space \(V\) over a field \(F\) into itself. In other words,

\[ T: V \to V \]

Let \(\mathbf{v}\) be a vector in \(V\) that is not the zero vector. The vector \(\mathbf{v}\) is an eigenvector of \(T\) if \(T(\mathbf{v})\) is a scalar multiple of \(\mathbf{v}\). This can be written as

\[ T(\mathbf{v}) = \lambda \mathbf{v} \]

for some scalar \(\lambda\) in the field \(F\).

If the vector space \(V\) is finite-dimensional, then the linear transformation \(T\) can be represented (with respect to a chosen basis) as a square matrix \(A\), and the vector \(\mathbf{v}\) as a column vector, so the eigenvalue equation becomes

\[ A \mathbf{v} = \lambda \mathbf{v} \]

Geometrically, an eigenvector corresponding to a real nonzero eigenvalue points in a direction that is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.
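The matrix form of the definition can be checked numerically. The sketch below, using NumPy, computes the eigenvalues and eigenvectors of a hypothetical \(2 \times 2\) example matrix (chosen here for illustration, not taken from the text) and verifies \(A \mathbf{v} = \lambda \mathbf{v}\) for each pair:

```python
import numpy as np

# Hypothetical example: scales the x-axis by 2 and reflects the y-axis.
A = np.array([[2.0, 0.0],
              [0.0, -1.0]])

# eigvals[i] is the eigenvalue paired with the eigenvector eigvecs[:, i]
eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # Verify the defining equation A v = lambda v
    assert np.allclose(A @ v, lam * v)

print(eigvals)  # the two eigenvalues, 2 and -1, in some order
```

Here the eigenvalue \(2\) stretches its eigenvector by a factor of two, while the eigenvalue \(-1\) reverses the direction of its eigenvector, matching the geometric description above.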