In [[linear algebra]], the '''eigenvectors''' (from the [[German language|German]] ''eigen'', meaning "inherent, characteristic") of a [[linear operator]] are non-zero [[vector (mathematics)|vector]]s which, when operated on by the operator, result in a [[scalar (mathematics)|scalar]] multiple of themselves. The scalar is then called the [[eigenvalue]] associated with the eigenvector.

In [[applied mathematics]] and [[physics]] the eigenvectors of a [[matrix (mathematics)|matrix]] or a [[differential operator]] often have important physical significance. In [[classical mechanics]] the eigenvectors of the governing equations typically correspond to natural modes of vibration in a body, and the eigenvalues to their frequencies. In [[quantum mechanics]], operators correspond to observable variables, eigenvectors are also called '''eigenstates''', and the eigenvalues of an operator represent those values of the corresponding variable that have non-zero probability of occurring.

== Examples ==

Intuitively, for [[linear transformation]]s of two-dimensional space '''R'''<sup>2</sup>, eigenvectors behave as follows (see the sketch after the list):

* rotation (by an angle that is not a multiple of 180°): no real eigenvectors; complex eigenvalue/eigenvector pairs exist
* reflection: eigenvectors are perpendicular and parallel to the line of symmetry; the eigenvalues are -1 and 1, respectively
* uniform scaling: all vectors are eigenvectors, and the eigenvalue is the scale factor
* projection onto a line: eigenvectors with eigenvalue 1 are parallel to the line, eigenvectors with eigenvalue 0 are parallel to the direction of projection
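
These four cases are easy to check numerically. A minimal sketch, assuming the Python library NumPy is available:

<syntaxhighlight lang="python">
import numpy as np

theta = np.pi / 3  # rotate by 60 degrees; any angle not a multiple of 180 works
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[1.0,  0.0],   # reflection across the x-axis
                       [0.0, -1.0]])
scaling = 3.0 * np.eye(2)             # uniform scaling by a factor of 3
projection = np.array([[1.0, 0.0],    # orthogonal projection onto the x-axis
                       [0.0, 0.0]])

for name, m in [("rotation", rotation), ("reflection", reflection),
                ("scaling", scaling), ("projection", projection)]:
    print(name, np.linalg.eigvals(m))
# rotation:   cos(theta) +/- i*sin(theta), a complex conjugate pair
# reflection: 1 and -1
# scaling:    3 and 3
# projection: 1 and 0
</syntaxhighlight>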

== Definition ==

Formally, we define eigenvectors and eigenvalues as follows:

If '''A''' : ''V'' → ''V'' is a linear operator on some [[vector space]] ''V'', '''v''' is a non-zero [[vector (mathematics)|vector]] in ''V'' and ''c'' is a scalar (possibly zero) such that

: <math>\mathbf{A} \mathbf{v} = c \mathbf{v},</math>

then we say that '''v''' is an eigenvector of the operator '''A''', and its associated eigenvalue is <math>c</math>. Note that if '''v''' is an eigenvector with [[eigenvalue]] <math>c</math>, then any non-zero multiple of '''v''' is also an eigenvector with eigenvalue <math>c</math>. In fact, all the eigenvectors with associated eigenvalue <math>c</math>, together with '''0''', form a subspace of ''V'', the '''[[eigenspace]]''' for the eigenvalue <math>c</math>.
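
Numerically, an eigenspace can be obtained as the null space of <math>A - cI</math>. A minimal sketch, assuming NumPy; the diagonal test matrix and the tolerance are illustrative choices:

<syntaxhighlight lang="python">
import numpy as np

def eigenspace_basis(A, c, tol=1e-10):
    """Return an orthonormal basis of the eigenspace of A for the
    eigenvalue c, computed as the null space of (A - c*I) via the SVD."""
    n = A.shape[0]
    _, s, vh = np.linalg.svd(A - c * np.eye(n))
    # Right-singular vectors whose singular value is (numerically) zero
    # span the null space of A - c*I, i.e. the eigenspace for c.
    return vh[s <= tol].T

A = np.diag([2.0, 2.0, 5.0])
print(eigenspace_basis(A, 2.0))  # two basis columns: a 2-dimensional eigenspace
print(eigenspace_basis(A, 5.0))  # one basis column
</syntaxhighlight>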

== Identifying eigenvectors ==

For example, consider the [[matrix (mathematics)|matrix]]

: <math>A =
\begin{bmatrix}
\; 0 & 1 & -1 \\
\; 1 & 1 & \; 0 \\
-1 & 0 & \; 1
\end{bmatrix}
</math>

which represents a linear operator '''R'''<sup>3</sup> → '''R'''<sup>3</sup>. One can check that

: <math>A \begin{bmatrix} \; 1 \\ \; 1 \\ -1 \end{bmatrix}
= \begin{bmatrix} \; 2 \\ \; 2 \\ -2 \end{bmatrix}
= 2 \begin{bmatrix} \; 1 \\ \; 1 \\ -1 \end{bmatrix}
</math>

and therefore 2 is an eigenvalue of '''A''' and we have found a corresponding eigenvector.
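
This check is easy to reproduce numerically; a minimal sketch, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0, 1, -1],
              [ 1, 1,  0],
              [-1, 0,  1]])
v = np.array([1, 1, -1])

print(A @ v)                      # [ 2  2 -2]
print(np.allclose(A @ v, 2 * v))  # True: v is an eigenvector with eigenvalue 2
</syntaxhighlight>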

== The characteristic polynomial ==

An important tool for describing eigenvalues of square matrices is the [[characteristic polynomial]]: saying that ''c'' is an eigenvalue of '''A''' is equivalent to saying that the system of linear equations <math>(A - cI)\mathbf{x} = \mathbf{0}</math> (where <math>I</math> is the [[identity matrix]]) has a non-zero solution <math>\mathbf{x}</math> (namely an eigenvector), which in turn is equivalent to the [[determinant]] <math>\det(A - cI)</math> being zero. The function <math>p(c) = \det(A - cI)</math> is a [[polynomial]] in <math>c</math>, since determinants are defined as sums of products.

This is the ''characteristic polynomial'' of '''A'''; its zeros are precisely the eigenvalues of '''A'''. If '''A''' is an ''n''-by-''n'' matrix, then its characteristic polynomial has degree ''n'', and '''A''' can therefore have at most ''n'' eigenvalues.

Returning to the example above, we can compute all of the eigenvalues of '''A''' by first determining its characteristic polynomial:

:<math>p(x) = \det(A - xI) =
\begin{vmatrix}
-x & 1 & -1 \\
\;1 & 1-x & 0 \\
-1 & 0 & 1-x
\end{vmatrix}
= -x^3 + 2x^2 + x - 2,
</math>

and because <math>p(x) = -(x - 2)(x - 1)(x + 1)</math>, we see that the eigenvalues of '''A''' are 2, 1 and -1. The [[Cayley-Hamilton theorem]] states that every square matrix satisfies its own characteristic polynomial.
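
The same numbers fall out of a numerical computation. A minimal sketch, assuming NumPy; note that <code>np.poly</code> uses the convention <math>\det(xI - A)</math>, which is <math>-p(x)</math> in the sign convention above:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0, 1, -1],
              [ 1, 1,  0],
              [-1, 0,  1]])

coeffs = np.poly(A)      # coefficients of det(xI - A) = x^3 - 2x^2 - x + 2
print(coeffs)            # [ 1. -2. -1.  2.]
print(np.roots(coeffs))  # the eigenvalues 2, -1 and 1, in some order
</syntaxhighlight>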

(In practice, eigenvalues of large matrices are not computed using the characteristic polynomial. Faster and more numerically stable methods are available, for instance the [[QR algorithm]].)
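
To illustrate the idea behind such methods, here is a sketch of the ''unshifted'' QR iteration, assuming NumPy. The example matrix above has eigenvalues 1 and -1 of equal absolute value, which the unshifted iteration cannot separate, so a different symmetric matrix is used here; practical implementations add shifts and a preliminary Hessenberg reduction:

<syntaxhighlight lang="python">
import numpy as np

def qr_eigenvalues(A, iterations=500):
    """Unshifted QR iteration: factor M = QR, form M' = RQ, repeat.
    When the eigenvalues have distinct absolute values, the iterates
    approach a triangular matrix whose diagonal holds the eigenvalues."""
    M = np.array(A, dtype=float)
    for _ in range(iterations):
        q, r = np.linalg.qr(M)
        M = r @ q  # RQ = Q^T M Q is similar to M, so eigenvalues are preserved
    return np.sort(np.diag(M))

B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(qr_eigenvalues(B))              # approx [1.382  3.618], i.e. (5 -/+ sqrt(5))/2
print(np.sort(np.linalg.eigvals(B)))  # the library routine agrees
</syntaxhighlight>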

== Complex eigenvectors ==

Note that if '''A''' is a [[real number|real]] matrix, the characteristic polynomial will have real coefficients, but not all of its roots will necessarily be real. The [[complex number|complex]] eigenvalues will all be associated with complex eigenvectors.

In general, if '''v'''<sub>1</sub>, ..., '''v'''<sub>''m''</sub> are eigenvectors corresponding to ''different'' eigenvalues λ<sub>1</sub>, ..., λ<sub>''m''</sub>, then the vectors '''v'''<sub>1</sub>, ..., '''v'''<sub>''m''</sub> are necessarily [[linear independence|linearly independent]].
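
For the example matrix above this is easy to verify; a minimal sketch, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0, 1, -1],
              [ 1, 1,  0],
              [-1, 0,  1]])
w, vecs = np.linalg.eig(A)          # columns of vecs are eigenvectors
print(w)                            # three different eigenvalues: 2, 1, -1
print(np.linalg.matrix_rank(vecs))  # 3, so the eigenvectors are linearly independent
</syntaxhighlight>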

The [[spectral theorem]] for symmetric matrices states that if '''A''' is a real symmetric ''n''-by-''n'' matrix, then all of its eigenvalues are real, and there exist ''n'' linearly independent eigenvectors for '''A''' which all have length 1 and are mutually [[orthogonality|orthogonal]].

Our example matrix from above is symmetric, and three mutually orthogonal eigenvectors of '''A''' are

:<math>v_1 = \begin{bmatrix}\; 1 \\ \;1 \\ -1 \end{bmatrix},\quad v_2 = \begin{bmatrix}\; 0\;\\ 1 \\ 1 \end{bmatrix},\quad v_3 = \begin{bmatrix}\; 2 \\ -1 \\ \; 1 \end{bmatrix}.</math>

These three vectors form a [[basis (linear algebra)|basis]] of '''R'''<sup>3</sup>. With respect to this basis, the linear map represented by '''A''' takes a particularly simple form: every vector '''x''' in '''R'''<sup>3</sup> can be written uniquely as

: <math>\mathbf{x} = x_1 \mathbf{v}_1 + x_2 \mathbf{v}_2 + x_3 \mathbf{v}_3,</math>

and then we have

:<math> \mathbf{A x} = 2x_1 \mathbf{v}_1 + x_2 \mathbf{v}_2 - x_3 \mathbf{v}_3.</math>
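
A minimal sketch of this change of basis, assuming NumPy; the test vector is an arbitrary choice:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0, 1, -1],
              [ 1, 1,  0],
              [-1, 0,  1]])
B = np.column_stack([[1, 1, -1], [0, 1, 1], [2, -1, 1]]).astype(float)

print(B.T @ B)  # diagonal matrix: v1, v2, v3 are mutually orthogonal

x = np.array([1.0, 2.0, 3.0])  # an arbitrary vector
c = np.linalg.solve(B, x)      # its coordinates x1, x2, x3 in the eigenbasis
# A acts diagonally in this basis: multiply the coordinates by 2, 1, -1
print(np.allclose(A @ x, B @ (np.array([2, 1, -1]) * c)))  # True
</syntaxhighlight>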

== Infinite-dimensional spaces ==

The concept of eigenvectors can be extended to [[linear operator]]s acting on infinite-dimensional [[Banach space]]s or [[Hilbert space]]s.

There are operators on Banach spaces which have no eigenvectors at all. For example, consider the two-sided [[shift operator]] on the Hilbert space <math>\ell^2(\mathbb{Z})</math>: any candidate eigenvector would fail to be square-summable, so none exist. However, every bounded linear operator on a Banach space ''V'' does have a non-empty '''spectrum'''. The spectrum <math>\sigma(T)</math> of the operator ''T'' : ''V'' → ''V'' is defined as

:<math>\sigma(T) = \{ \lambda \in \mathbb{C} : \lambda I - T \text{ is not invertible} \},</math>

where <math>I</math> denotes the identity operator.

Then <math>\sigma(T)</math> is a [[compact set|compact]] set of complex numbers, and it is non-empty. When <math>T</math> is a [[compact operator]], every non-zero element of the spectrum is an eigenvalue; in particular, when <math>T</math> is an operator between finite-dimensional spaces as above, the spectrum of <math>T</math> coincides with the set of its eigenvalues.
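
The contrast between the finite- and infinite-dimensional pictures can be illustrated numerically: the ''N''-by-''N'' cyclic shift is a finite-dimensional analogue of the two-sided shift, and its eigenvalues (the ''N''-th roots of unity) fill out the unit circle as ''N'' grows, matching the fact that the spectrum of the two-sided shift is the whole unit circle even though it has no eigenvectors. A minimal sketch, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

N = 64
S = np.roll(np.eye(N), 1, axis=0)   # cyclic shift: (S x)_k = x_{(k-1) mod N}

w = np.linalg.eigvals(S)
print(np.allclose(np.abs(w), 1.0))  # True: all eigenvalues lie on the unit circle
</syntaxhighlight>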

The spectrum of an operator is an important property in [[functional analysis]].

{{Linear_algebra}}