Advanced Math / Eigenvalues
Hi Mr. Patton,
I am an engineering graduate. I have done vector calculus and matrix algebra. One thing I have never understood is eigenvectors and eigenvalues. I know how to compute the answers but have NO idea what they mean. Please help me understand what they mean, and how and when they should be used in applications.
I agree that eigenvalues and eigenvectors can be a bit of a mystery.
I’ve come to think of eigenvectors as representing the basic structure of a linear transformation (LT). A basic theorem of linear algebra says that any LT on a finite-dimensional vector space can be represented as a matrix, and matrices are where we usually first meet eigenvalues and eigenvectors. The usual introduction is something relatively opaque: the eigenvectors x are the solutions to
Ax = ax
where a is an eigenvalue (a constant). This says that x is a special vector that just gets stretched by the LT. That's fine, but so what? Well, in the somewhat special case of a symmetric LT, say T, the eigenvectors of T form a basis of the vector space it acts on. Among other things, this means that the matrix associated with T can be diagonalized using a matrix E whose rows are the eigenvectors, with the diagonal elements being the (real) eigenvalues of T:
ETE^-1 = D = diagonal matrix.
This is very handy computationally, since algebraic manipulations of T can be carried out very simply on D and then transformed back; taking the square root of T, for example, reduces to taking the square roots of the eigenvalues.
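Here is a minimal sketch of this in NumPy (the matrix T is made up for illustration; note that np.linalg.eigh returns the eigenvectors as the columns of E, so the diagonalization reads E^T T E = D, which is equivalent to the form above since E is orthogonal):

```python
import numpy as np

# A small symmetric matrix T (chosen arbitrarily for illustration)
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's routine for symmetric matrices: it returns the
# eigenvalues in ascending order and the eigenvectors as columns of E
vals, E = np.linalg.eigh(T)

# Diagonalization: E^T T E = D (E is orthogonal, so E^T = E^-1)
D = E.T @ T @ E
print(np.round(D, 10))   # diagonal matrix of the eigenvalues 1 and 3

# Matrix square root: take the square roots of the eigenvalues
# on the diagonal, then transform back
sqrtT = E @ np.diag(np.sqrt(vals)) @ E.T
print(np.allclose(sqrtT @ sqrtT, T))   # True
```

The same pattern works for any function of T (powers, inverse, exponential): apply the function to the eigenvalues and sandwich the result between E and E^T.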
For symmetric matrices, the eigenvector basis is orthonormal (orthogonal unit vectors), which makes things even nicer. A covariance matrix fits this criterion, for instance, and gives a more concrete interpretation of eigenvectors and eigenvalues: the eigenvectors correspond to the orthogonal directions about which the data scatter, and the eigenvalues give the variance of the scatter along each direction. In 2-D, this is an ellipse, with the two eigenvectors showing the directions of greatest and least scatter. See http://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#E
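A sketch of this with synthetic 2-D data (the sample size, spread values, and rotation angle here are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data: strong scatter along one direction (std 3),
# weak scatter along the perpendicular one (std 0.5), rotated 30 degrees
n = 5000
data = rng.normal(size=(n, 2)) * [3.0, 0.5]     # axis-aligned scatter
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
data = data @ R.T                               # rotate the cloud

C = np.cov(data, rowvar=False)     # 2x2 symmetric covariance matrix
vals, vecs = np.linalg.eigh(C)     # eigenvalues in ascending order

# Largest eigenvalue ~ variance along the main scatter direction (3^2 = 9),
# smallest ~ variance across it (0.5^2 = 0.25); the eigenvectors recover
# the rotated axes (up to sign)
print(vals)          # roughly [0.25, 9]
print(vecs[:, -1])   # roughly (cos 30deg, sin 30deg), up to sign
```

This is exactly the computation behind principal component analysis: the top eigenvector of the covariance matrix is the first principal component.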
Singular value decomposition (SVD) is another good example of how the ‘eigenstructure’ of an LT can be used. Without going into detail (see http://www.ams.org/samplings/feature-column/fcarc-svd ), a more general class of LTs, say M, can be analyzed and manipulated using the eigenvectors of the (by construction) symmetric matrix M^tM.
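A small NumPy check of that connection (the matrix M is random, just for illustration): the eigenvalues of M^tM are the squared singular values of M.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(5, 3))     # a general (rectangular) matrix

# SVD: M = U diag(s) V^T, with singular values s in descending order
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# M^T M is symmetric by construction; its eigenvalues are the squared
# singular values of M (eigh returns them in ascending order)
vals, vecs = np.linalg.eigh(M.T @ M)

print(np.allclose(np.sort(s**2), vals))   # True
```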
Perhaps an easier interpretation of eigenvalues and eigenvectors comes from looking at solutions to differential equations with boundary conditions, which describe a whole host of physics and other real-world problems. Here eigenvectors become the more general eigenfunctions. It is easy to imagine the solutions for a stretched string fixed at both ends (e.g., a guitar string) being sinusoidal (the eigenfunctions) with discrete frequencies (the eigenvalues). The LT in this case is a differential operator (linear, of course), and this opens up a whole area of study called Sturm-Liouville theory (see http://people.uncw.edu/hermanr/mat463/ODEBook/Book/SL.pdf ).
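The string picture can even be checked numerically by discretizing the operator -d^2/dx^2 on a grid with fixed ends (a standard finite-difference sketch; the grid size n is arbitrary). The eigenvectors of the resulting tridiagonal matrix are sampled sine waves, and the eigenvalues approach the continuous ones, (k*pi)^2:

```python
import numpy as np

# Discretize -d^2/dx^2 on (0, 1) with zero boundary conditions
n = 100                      # interior grid points
h = 1.0 / (n + 1)            # grid spacing
L = (np.diag(2.0 * np.ones(n)) -
     np.diag(np.ones(n - 1), 1) -
     np.diag(np.ones(n - 1), -1)) / h**2

vals, vecs = np.linalg.eigh(L)   # eigenvalues in ascending order

# The continuous eigenfunctions are sin(k*pi*x) with eigenvalues
# (k*pi)^2; the lowest discrete eigenvalue should be close to pi^2
x = np.arange(1, n + 1) * h
print(vals[0], np.pi**2)     # both close to 9.87
# vecs[:, 0] is (up to sign and scale) sin(pi * x) sampled on the grid
```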
So in summary, eigenvalues and eigenvectors seem to correspond to the structure and resonance of linear transformations. Hope this helps.