Eigenvalues and eigenvectors¶
Definition¶
Let \(T\) be a linear transformation from a vector space \(V\) over a field \(F\) into itself. In other words,

\[T : V \to V.\]
Let \(\mathbf{v}\) be a vector in \(V\) that is not the zero vector. The vector \(\mathbf{v}\) is an eigenvector of \(T\) if \(T(\mathbf{v})\) is a scalar multiple of \(\mathbf{v}\). This can be written as

\[T(\mathbf{v}) = \lambda \mathbf{v}\]
for some scalar \(\lambda\) in the field \(F\).
If the vector space \(V\) is finite-dimensional, then the linear transformation \(T\) can be represented as a square matrix \(A\), and the vector \(\mathbf{v}\) by a column vector, so the eigenvalue equation becomes a matrix multiplication:

\[A \mathbf{v} = \lambda \mathbf{v}.\]
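As a concrete sketch of the matrix form, the following NumPy snippet computes the eigenpairs of a small symmetric matrix (the matrix \(A\) here is an assumed example, not taken from the text) and checks the defining equation \(A\mathbf{v} = \lambda\mathbf{v}\) for each pair:

```python
import numpy as np

# Hypothetical 2x2 matrix chosen purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals[i] is the eigenvalue paired with the column eigvecs[:, i].
eigvals, eigvecs = np.linalg.eig(A)

# Verify A v = lambda v for every eigenpair.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(eigvals)
```

For this particular matrix the eigenvalues are \(3\) and \(1\); `np.linalg.eig` returns eigenvectors as columns, normalized to unit length.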
Geometrically, an eigenvector corresponding to a real nonzero eigenvalue points in a direction that is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.
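This stretching and reversal can be seen directly with a diagonal matrix (an assumed example for this sketch), whose eigenvalues sit on the diagonal and whose eigenvectors are the coordinate axes:

```python
import numpy as np

# Illustrative diagonal matrix: eigenvalues 2 and -1.
A = np.array([[2.0,  0.0],
              [0.0, -1.0]])

v1 = np.array([1.0, 0.0])  # eigenvector for eigenvalue 2
v2 = np.array([0.0, 1.0])  # eigenvector for eigenvalue -1

print(A @ v1)  # stretched by a factor of 2, same direction
print(A @ v2)  # same line, but direction reversed
```

Applying \(A\) doubles the length of `v1` while leaving its direction unchanged, and maps `v2` to its negative, illustrating the sign reversal described above.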