# On-Line Geometric Modeling Notes: Eigenvalues and Eigenvectors


**EIGENVALUES AND EIGENVECTORS**

Kenneth I. Joy
Visualization and Graphics Research Group
Department of Computer Science
University of California, Davis

In engineering applications, eigenvalue problems are among the most important problems connected with matrices. In this section we give the basic definitions of eigenvalues and eigenvectors and some of the basic results of their use.

## What are Eigenvalues and Eigenvectors?

Let $A$ be an $n \times n$ matrix and consider the vector equation

$$A\vec{v} = \lambda\vec{v}$$

where $\lambda$ is a scalar value. It is clear that if $\vec{v} = \vec{0}$, we have a solution for any value of $\lambda$. A value of $\lambda$ for which the equation has a solution with $\vec{v} \neq \vec{0}$ is called an *eigenvalue* or *characteristic value* of the matrix $A$. The corresponding solutions $\vec{v} \neq \vec{0}$ are called *eigenvectors* or *characteristic vectors* of $A$. In the problem above, we are looking for vectors $\vec{v}$ that, when multiplied by the matrix $A$, give a scalar multiple of themselves. The set of eigenvalues of $A$ is commonly called the *spectrum* of $A$, and the largest of the absolute values of the eigenvalues is called the *spectral radius* of $A$.

## How Do We Calculate the Eigenvalues?

It is easy to see that the equation

$$A\vec{v} = \lambda\vec{v}$$
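The defining equation can be checked numerically. The following is a minimal sketch in pure Python, using a made-up diagonal $2 \times 2$ matrix (not an example from the notes) whose eigenvalues can be read off its diagonal:

```python
# Check the defining equation A v = lambda v for a small matrix with
# known eigenvalues, and compute the spectral radius.

def mat_vec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2.0, 0.0],
     [0.0, 3.0]]             # diagonal, so its eigenvalues are 2 and 3

pairs = [(2.0, [1.0, 0.0]),  # (eigenvalue, eigenvector)
         (3.0, [0.0, 1.0])]

for lam, v in pairs:
    Av = mat_vec(A, v)
    scaled = [lam * x for x in v]
    assert Av == scaled      # A v equals lambda v

# The spectrum is {2, 3}; the spectral radius is the largest absolute value.
spectral_radius = max(abs(lam) for lam, _ in pairs)
print(spectral_radius)       # -> 3.0
```

Note that any nonzero scalar multiple of each eigenvector would pass the same check.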


can be rewritten as

$$(A - \lambda I)\vec{v} = \vec{0}$$

where $I$ is the identity matrix. A matrix equation of this form can have a nonzero solution $\vec{v}$ only if the determinant of the matrix is zero (otherwise, by Cramer's Rule, $\vec{v} = \vec{0}$ is the unique solution) – that is, only if

$$\det(A - \lambda I) = 0$$

Since $\det(A - \lambda I)$ is a polynomial in $\lambda$, commonly called the *characteristic polynomial*, we only need to find the roots of this polynomial to find the eigenvalues. We note that to get a complete set of eigenvalues, one may have to extend the scope of this discussion into the field of complex numbers.

## How Do We Calculate the Eigenvectors?

The eigenvalues must be determined first. Once these are known, the corresponding eigenvectors can be calculated directly from the linear system

$$(A - \lambda I)\vec{v} = \vec{0}$$

It should be noted that if $\vec{v}$ is an eigenvector, then so is $k\vec{v}$ for any nonzero scalar $k$.

## Right Eigenvectors

Given an eigenvalue $\lambda$, the eigenvector $\vec{r}$ that satisfies

$$A\vec{r} = \lambda\vec{r}$$

is sometimes called a *(right) eigenvector* for the matrix $A$ corresponding to the eigenvalue $\lambda$. If $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues and $\vec{r}_1, \vec{r}_2, \ldots, \vec{r}_n$ are the corresponding right eigenvectors, then it is easy to see that the set of right eigenvectors forms a basis of a vector space. If this vector space is of dimension $n$, then we can construct an $n \times n$ matrix $R$ whose columns are the components of the right eigenvectors, which has the
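For a $2 \times 2$ matrix the characteristic polynomial is a quadratic, $\lambda^2 - (a+d)\lambda + (ad - bc)$, so both steps can be carried out in closed form. The sketch below (pure Python, with an invented example matrix, assuming real distinct eigenvalues) finds the roots and then an eigenvector for each:

```python
import math

# For A = [[a, b], [c, d]], the characteristic polynomial is
#   lambda^2 - (a + d) lambda + (a d - b c),
# so the eigenvalues come from the quadratic formula; for b != 0,
# the vector (b, lambda - a) solves (A - lambda I) v = 0.

def eig_2x2(a, b, c, d):
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4.0 * det       # assumed non-negative (real eigenvalues)
    lam1 = (tr + math.sqrt(disc)) / 2.0
    lam2 = (tr - math.sqrt(disc)) / 2.0
    return [(lam, [b, lam - a]) for lam in (lam1, lam2)]

pairs = eig_2x2(4.0, 1.0, 2.0, 3.0)  # lambda^2 - 7 lambda + 10 -> roots 5, 2
for lam, (x, y) in pairs:
    # Verify A v = lambda v componentwise.
    assert abs(4.0 * x + 1.0 * y - lam * x) < 1e-12
    assert abs(2.0 * x + 3.0 * y - lam * y) < 1e-12

print([lam for lam, _ in pairs])     # -> [5.0, 2.0]
```

For larger matrices the roots are not available in closed form, which is why practical eigenvalue solvers use iterative methods rather than the characteristic polynomial.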


property that

$$AR = R\Lambda$$

where $\Lambda$ is the diagonal matrix

$$\Lambda = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}$$

whose diagonal elements are the eigenvalues. By appropriate numbering of the eigenvalues and eigenvectors, it is possible to arrange the columns of the matrix $R$ so that $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$.

## Left Eigenvectors

A vector $\vec{l}$ so that

$$\vec{l}^{\,T} A = \lambda \vec{l}^{\,T}$$

is called a *left eigenvector* for $A$ corresponding to the eigenvalue $\lambda$. If $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues and $\vec{l}_1, \vec{l}_2, \ldots, \vec{l}_n$ are the corresponding left eigenvectors, then it is easy to see that the set of left eigenvectors forms a basis of a vector space. If this vector space is of dimension $n$, then we can construct an $n \times n$ matrix $L$ whose rows are the components of the left eigenvectors, which has the property that

$$LA = \Lambda L$$

It is possible to choose the left eigenvectors $\vec{l}_1, \vec{l}_2, \ldots$ and right eigenvectors $\vec{r}_1, \vec{r}_2, \ldots$ so that

$$\vec{l}_i \cdot \vec{r}_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise} \end{cases}$$

This is easily done if we define $L = R^{-1}$ and define the components of the left eigenvectors to be the
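These identities are easy to test numerically. The following pure-Python sketch uses a made-up $2 \times 2$ matrix with eigenvalues 5 and 2 (not an example from the notes) and checks $AR = R\Lambda$, $LA = \Lambda L$, and the biorthogonality condition with $L = R^{-1}$:

```python
# Verify A R = R Lambda, L A = Lambda L, and l_i . r_j = delta_ij
# for a worked 2x2 example, taking L to be the inverse of R.

A    = [[4.0, 1.0], [2.0, 3.0]]
lams = [5.0, 2.0]
R    = [[1.0,  1.0],           # columns are the right eigenvectors
        [1.0, -2.0]]           # (1, 1) for 5 and (1, -2) for 2

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Inverse of the 2x2 matrix R; its rows are the left eigenvectors.
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
L = [[ R[1][1] / det, -R[0][1] / det],
     [-R[1][0] / det,  R[0][0] / det]]

Lam = [[lams[0], 0.0], [0.0, lams[1]]]
I   = [[1.0, 0.0], [0.0, 1.0]]

def close(X, Y, tol=1e-12):
    return all(abs(X[i][j] - Y[i][j]) < tol for i in range(2) for j in range(2))

assert close(matmul(A, R), matmul(R, Lam))   # A R = R Lambda
assert close(matmul(L, A), matmul(Lam, L))   # L A = Lambda L
assert close(matmul(L, R), I)                # l_i . r_j = delta_ij
print("checks pass")
```

The tolerance in `close` allows for floating-point rounding; the identities hold exactly in exact arithmetic.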


elements of the respective rows of $R^{-1}$. Beginning with

$$AR = R\Lambda$$

and multiplying both sides on the left by $R^{-1}$, we obtain

$$R^{-1}AR = \Lambda$$

and multiplying on the right by $R^{-1}$, we have

$$R^{-1}A = \Lambda R^{-1}$$

which implies that any row of $R^{-1}$ satisfies the properties of a left eigenvector.

## Diagonalization of a Matrix

Given an $n \times n$ matrix $A$, we say that $A$ is *diagonalizable* if there is a matrix $X$ so that

$$X^{-1}AX = \Lambda$$

where

$$\Lambda = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}$$

It is clear from the above discussions that if all the eigenvalues are real and distinct, then we can use the matrix $R$ of right eigenvectors as $X$.
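Diagonalization can be sketched the same way. The pure-Python example below uses an invented $2 \times 2$ matrix with distinct real eigenvalues 5 and 2, takes $X$ to be the matrix of right eigenvectors, and confirms that $X^{-1}AX$ comes out diagonal:

```python
# Diagonalization: with X whose columns are right eigenvectors,
# X^{-1} A X should be the diagonal matrix of eigenvalues.

A = [[4.0, 1.0], [2.0, 3.0]]
X = [[1.0,  1.0],              # columns: eigenvectors for 5 and 2
     [1.0, -2.0]]

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Inverse of the 2x2 matrix X.
det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
Xinv = [[ X[1][1] / det, -X[0][1] / det],
        [-X[1][0] / det,  X[0][0] / det]]

D = matmul(Xinv, matmul(A, X))           # X^{-1} A X
D = [[round(D[i][j], 10) for j in range(2)] for i in range(2)]
print(D)                                 # -> [[5.0, 0.0], [0.0, 2.0]]
```

The rounding step only cleans up floating-point noise; if the eigenvalues were repeated or complex, $X$ might not be invertible over the reals and this construction could fail.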


All contents copyright (c) 1996, 1997, 1998, 1999, 2000 Computer Science Department, University of California, Davis All rights reserved.


