Eigenvector

In linear algebra, the '''eigenvectors''' (from the German ''eigen'' meaning "own") of a linear operator are non-zero vectors which, when operated on by the operator, result in a scalar multiple of themselves. The scalar is then called the eigenvalue associated with the eigenvector.

In applied mathematics and physics the eigenvectors of a matrix or a differential operator often have important physical significance. In mechanics the eigenvectors of the governing equations typically correspond to natural modes of vibration in a body, and the eigenvalues to their frequencies. In quantum mechanics, operators correspond to observable variables, eigenvectors are also called '''eigenstates''', and the eigenvalues of an operator represent those values of the corresponding variable that have non-zero probability of being measured.

== Definition ==
Formally, we define eigenvectors and eigenvalues as follows:
If ''A'' : ''V'' &rarr; ''V'' is a linear operator on some vector space ''V'', ''v'' is a non-zero vector in ''V'' and ''&lambda;'' is a scalar (possibly zero) such that

: <math>Av = \lambda v\;</math>

then we say that ''v'' is an eigenvector of the operator ''A'', and its associated eigenvalue is ''&lambda;''. Note that if ''v'' is an eigenvector with eigenvalue ''&lambda;'', then any non-zero multiple of ''v'' is also an eigenvector with eigenvalue ''&lambda;''. In fact, all the eigenvectors with associated eigenvalue ''&lambda;'', together with ''0'', form a subspace of ''V'', the '''eigenspace''' for the eigenvalue ''&lambda;''.
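
As a concrete illustration of the definition, the defining equation can be checked numerically; a minimal sketch, assuming NumPy is available (the diagonal matrix and the vector below are just illustrative choices):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2., 0.],
              [0., 3.]])
v = np.array([1., 0.])

# v is an eigenvector of A with eigenvalue 2:
print(np.allclose(A @ v, 2 * v))              # True
# any non-zero scalar multiple of v is also an eigenvector, with the same eigenvalue:
print(np.allclose(A @ (5 * v), 2 * (5 * v)))  # True
</syntaxhighlight>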

== Examples ==
For linear transformations of two-dimensional space '''R'''<sup>2</sup> we can discern the following special cases (a short numerical check of several of them follows the list):
* translation: no eigenvectors. Example: the map ''Av'' = ''v'' + ''a'' with ''a'' &ne; 0 (strictly speaking an affine rather than a linear map) sends no non-zero vector to a scalar multiple of itself.
* rotation: no real eigenvectors (complex eigenvalue and eigenvector pairs exist). Example: rotation by 90&deg;, <math>A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}</math>.
* reflection: eigenvectors are perpendicular and parallel to the line of reflection, with eigenvalues -1 and 1 respectively. Example: reflection across the ''x''-axis, <math>A = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}</math>.
* uniform scaling: all non-zero vectors are eigenvectors, and the eigenvalue is the scale factor. Example: ''Ax'' = ''cx''.
* projection onto a line: vectors on the line are eigenvectors with eigenvalue 1, and vectors perpendicular to the line are eigenvectors with eigenvalue 0. Example: orthogonal projection onto the ''y''-axis, <math>A = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}</math>.
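
A quick numerical look at several of these cases; a minimal sketch, assuming NumPy is available (the matrices are the illustrative examples listed above):

<syntaxhighlight lang="python">
import numpy as np

examples = {
    "rotation by 90 degrees":       np.array([[0., -1.], [1.,  0.]]),
    "reflection across the x-axis": np.array([[1.,  0.], [0., -1.]]),
    "projection onto the y-axis":   np.array([[0.,  0.], [0.,  1.]]),
}

for name, M in examples.items():
    eigenvalues, eigenvectors = np.linalg.eig(M)
    print(name, eigenvalues)

# rotation:   eigenvalues 1j and -1j (no real eigenvectors)
# reflection: eigenvalues 1 and -1
# projection: eigenvalues 0 and 1
</syntaxhighlight>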

== Identifying eigenvectors ==

For example, consider the matrix

: <math>A =
\begin{bmatrix}
\; 0 & 1 & -1 \\
\; 1 & 1 & \; 0 \\
-1 & 0 & \; 1
\end{bmatrix}
</math>

which represents a linear operator '''R'''<sup>3</sup> &rarr; '''R'''<sup>3</sup>. One can check that

: <math>A \begin{bmatrix} \; 1 \\ \; 1 \\ -1 \end{bmatrix}
= \begin{bmatrix} \; 2 \\ \; 2 \\ -2 \end{bmatrix}
= 2 \begin{bmatrix} \; 1 \\ \; 1 \\ -1 \end{bmatrix}
</math>

and therefore 2 is an eigenvalue of ''A'' and we have found a corresponding eigenvector.
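
The same check can be carried out numerically; a minimal sketch, assuming NumPy is available:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0., 1., -1.],
              [ 1., 1.,  0.],
              [-1., 0.,  1.]])
v = np.array([1., 1., -1.])

print(A @ v)                      # [ 2.  2. -2.]
print(np.allclose(A @ v, 2 * v))  # True: v is an eigenvector of A with eigenvalue 2
</syntaxhighlight>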

== The characteristic polynomial ==

An important tool for describing eigenvalues of square matrices is the characteristic polynomial: saying that ''&lambda;'' is an eigenvalue of ''A'' is equivalent to stating that the system of linear equations (''A'' - ''&lambda;''id<sub>''V''</sub>) ''v'' = 0 (where id<sub>''V''</sub> is the identity operator) has a non-zero solution ''v'' (namely an eigenvector), and so it is equivalent to the determinant det(''A'' - ''&lambda;'' id<sub>''V''</sub>) being zero. The function ''p''(''&lambda;'') = det(''A'' - ''&lambda;''id<sub>''V''</sub>) is a polynomial in ''&lambda;'' since determinants are defined as sums of products.
This is the ''characteristic polynomial'' of ''A''; its zeros are precisely the eigenvalues of ''A''.
If ''A'' is an ''n''-by-''n'' matrix, then its characteristic polynomial has degree ''n'' and ''A'' can therefore have at most ''n'' eigenvalues.

Returning to the example above, if we wanted to compute all of ''A'''s eigenvalues, we could determine the characteristic polynomial first:

:<math>p(x) = \det( A - xI) =
\begin{vmatrix}
-x & 1 & -1 \\
1 & 1-x & 0 \\
-1 & 0 & 1-x \end{vmatrix} = -x^3 + 2x^2 + x - 2 = -(x - 2) (x - 1) (x + 1)</math>
and we see that the eigenvalues of ''A'' are 2, 1 and -1.
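
The hand computation can be double-checked symbolically, for instance with the SymPy library; a minimal sketch, assuming SymPy is available (note that SymPy's <code>charpoly</code> uses the convention det(''&lambda;''I - ''A''), which here differs from ''p'' above only by an overall sign, so the roots are the same):

<syntaxhighlight lang="python">
from sympy import Matrix, symbols, factor

lam = symbols('lambda')
A = Matrix([[ 0, 1, -1],
            [ 1, 1,  0],
            [-1, 0,  1]])

p = A.charpoly(lam).as_expr()  # lambda**3 - 2*lambda**2 - lambda + 2
print(factor(p))               # factors as (lambda - 2)*(lambda - 1)*(lambda + 1)
print(A.eigenvals())           # {2: 1, 1: 1, -1: 1}  (eigenvalue: multiplicity)
</syntaxhighlight>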

The Cayley-Hamilton theorem states that every square matrix satisfies its own characteristic polynomial, that is ''p''(''A'') = 0.
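
For the example matrix above this is easy to verify numerically; a minimal sketch, assuming NumPy is available:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0., 1., -1.],
              [ 1., 1.,  0.],
              [-1., 0.,  1.]])
I = np.eye(3)

# p(A) = -A^3 + 2A^2 + A - 2I should be the zero matrix
pA = -(A @ A @ A) + 2 * (A @ A) + A - 2 * I
print(np.allclose(pA, np.zeros((3, 3))))  # True
</syntaxhighlight>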

(In practice, eigenvalues of large matrices are not computed using the characteristic polynomial. Faster and more numerically stable methods are available, for instance the QR algorithm.)
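
For instance, NumPy's <code>numpy.linalg.eigvals</code> (which delegates to LAPACK routines of this kind) recovers the eigenvalues of the example matrix directly; a minimal sketch, assuming NumPy is available:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 0., 1., -1.],
              [ 1., 1.,  0.],
              [-1., 0.,  1.]])

print(np.linalg.eigvals(A))  # 2., 1. and -1., in some order
</syntaxhighlight>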

== Complex eigenvectors ==

Note that if ''A'' is a real matrix, the characteristic polynomial will have real coefficients, but not all its roots will necessarily be real. The complex (non-real) eigenvalues will all be associated with complex eigenvectors.

In general, if ''v''<sub>1</sub>, ..., ''v''<sub>''m''</sub> are eigenvectors with different eigenvalues &lambda;<sub>1</sub>, ..., &lambda;<sub>''m''</sub>, then the vectors ''v''<sub>1</sub>, ..., ''v''<sub>''m''</sub> are necessarily linearly independent.

The spectral theorem for symmetric matrices states that, if ''A'' is a real symmetric ''n''-by-''n'' matrix, then all its eigenvalues are real, and there exist ''n'' linearly independent eigenvectors for ''A'' which are mutually orthogonal.

Our example matrix from above is symmetric, and three mutually orthogonal eigenvectors of ''A'' are

:<math>v_1 = \begin{bmatrix}\; 1 \\ \;1 \\ -1 \end{bmatrix},\quad v_2 = \begin{bmatrix}\; 0\;\\ 1 \\ 1 \end{bmatrix},\quad v_3 = \begin{bmatrix}\; 2 \\ -1 \\ \; 1 \end{bmatrix}.</math>

These three vectors form a basis of '''R'''<sup>3</sup>. With respect to this basis, the linear map represented by ''A'' takes a particularly simple form: every vector ''x'' in '''R'''<sup>3</sup> can be written uniquely as
:<math>x = x_1 v_1 + x_2 v_2 + x_3 v_3</math>
and then we have
:<math>A x = 2x_1 v_1 + x_2 v_2 - x_3 v_3.</math>
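
This change of basis can be verified numerically; a minimal sketch, assuming NumPy is available (the test vector ''x'' below is an arbitrary illustrative choice):

<syntaxhighlight lang="python">
import numpy as np

A  = np.array([[0., 1., -1.], [1., 1., 0.], [-1., 0., 1.]])
v1 = np.array([1., 1., -1.])
v2 = np.array([0., 1., 1.])
v3 = np.array([2., -1., 1.])

x = np.array([3., -2., 5.])                         # arbitrary test vector
# coordinates of x in the orthogonal (not normalized) eigenbasis
x1, x2, x3 = (x @ v / (v @ v) for v in (v1, v2, v3))

print(np.allclose(x, x1*v1 + x2*v2 + x3*v3))        # True: the expansion of x
print(np.allclose(A @ x, 2*x1*v1 + x2*v2 - x3*v3))  # True: A acts eigenvalue by eigenvalue
</syntaxhighlight>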

==Decomposition theorem==
An ''n'' by ''n'' matrix has ''n'' linearly independent real eigenvectors if and only if it can be decomposed into the form

:<math>A=U \Lambda U^{-1}\;</math>

where the columns of ''U'' are eigenvectors of ''A'' and &Lambda; is a diagonal matrix with all of the eigenvalues on the diagonal. If ''A'' is real symmetric, then ''U'' can be chosen to be orthogonal, and if ''A'' is Hermitian, then ''U'' can be chosen to be unitary. Such a matrix ''U'' does not always exist; for example

:<math>A=\left( \begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix} \right)</math>

has only one 1-dimensional eigenspace. In such a case, the Jordan normal form must be used.
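
Both the symmetric example and the defective example can be checked numerically; a minimal sketch, assuming NumPy is available (<code>numpy.linalg.eig</code> returns the eigenvalues together with a matrix whose columns are eigenvectors):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[0., 1., -1.], [1., 1., 0.], [-1., 0., 1.]])
w, U = np.linalg.eig(A)                            # eigenvalues w, eigenvector columns U
Lam = np.diag(w)
print(np.allclose(U @ Lam @ np.linalg.inv(U), A))  # True: A = U Lam U^{-1}
print(np.allclose(U.T @ U, np.eye(3)))             # True: A is symmetric, so U is orthogonal

B = np.array([[1., 1.],
              [0., 1.]])                           # the defective example above
print(np.linalg.eigvals(B))                        # [1. 1.]: a repeated eigenvalue
# the eigenspace for 1 is the null space of B - I, which is only 1-dimensional:
print(np.linalg.matrix_rank(B - np.eye(2)))        # 1, so the null space has dimension 2 - 1 = 1
</syntaxhighlight>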

== Infinite-dimensional spaces ==

The concept of eigenvectors can be extended to linear operators acting on infinite-dimensional Hilbert spaces or Banach spaces.

There are operators on Banach spaces which have no eigenvectors at all. For example, take the shift operator on the Hilbert space <math>\ell^2(\mathbf{Z})</math>, given by <math>(Sx)_n = x_{n-1}</math>; it is easy to see that any potential eigenvector can't be square-summable, so none exist: the equation <math>Sx = \lambda x</math> forces <math>x = 0</math> when <math>\lambda = 0</math>, and forces <math>x_n = \lambda^{-n} x_0</math> for all ''n'' when <math>\lambda \neq 0</math>, which is square-summable over '''Z''' only if <math>x_0 = 0</math>. However, any bounded linear operator on a Banach space ''V'' does have non-empty '''spectrum'''. The spectrum &sigma;(''A'') of the operator ''A'' : ''V'' &rarr; ''V'' is defined as

:<math>\sigma(A) = \{ \lambda \in \mathbf{C} | A - \lambda \mathrm{id}_V \mbox{ is not invertible }\}.</math>

Then &sigma;(''A'') is a compact set of complex numbers, and it is non-empty. When ''A'' is a matrix (that is, an operator between finite-dimensional spaces as above), the spectrum of ''A'' is the same as the set of its eigenvalues.

The spectrum of an operator is an important property in functional analysis.
