# Generalized Companion Matrices for Polynomials not expressed in Monomial Bases





Robert M. Corless and Gurjeet Litt
Ontario Research Centre for Computer Algebra

## 1 Introduction

This short note gives formulae (derived by linear transformations) for the entries in companion matrices for polynomials not expressed in the monomial basis $1, x, x^2, \ldots, x^n$. Most proofs are omitted because verification a posteriori is simple. One reason these formulae may be interesting is that the condition number of the eigenvalue problem for the generalized companion matrix is, in some cases, smaller than that for the monomial-basis companion matrix.

### 1.1 History

Several people have rediscovered the fact that companion matrix pencils can be deduced for matrix polynomials expressed in orthogonal polynomial bases, and have variously called these matrices "comrade" matrices or "colleague" matrices. This overly cute naming just obscures a literature search, where one would like to be able to find references simply by looking for "companion matrix" or "generalized companion matrix". The most general result seems to be that of Barnett, who noticed that, given a basis of polynomials $\phi_k(x)$ expressible by a three-term recurrence relation (as is always the case for orthogonal polynomials, for example), one can write down a generalized companion matrix pencil for the polynomial
$$p(x) = \sum_{k=0}^{n} a_k\,\phi_k(x),$$
that is, a pair of $n$ by $n$ sparse matrices, say $C$ and $B$, such that $p(x) = \det(xB - C)$. Of course this can be done by expressing each $\phi_k(x)$ in terms of the monomial basis, but that means that the entries of $C$ and $B$ are linear combinations of the $a_k$, and we would like to keep those linear combinations as simple as possible. These results may be used not just for orthogonal polynomials, but for polynomials satisfying any kind of three-term recurrence.

The monomial basis fits into this category, with the trivial three-term recurrence $x^{k+1} = x\cdot x^k + 0\cdot x^{k-1}$, and so does an arbitrary Newton basis $\phi_k(x) = \prod_{j=1}^{k}(x - r_j)$, because $\phi_{k+1}(x) = (x - r_{k+1})\,\phi_k(x) + 0\cdot\phi_{k-1}(x)$ is again a trivial three-term recurrence relation. This fact seems not to have been noticed. A discovery, apparently new to this paper, is that we may also write down a (quite different) companion matrix form for polynomials expressed in the Lagrange basis, which does not have a three-term recurrence relation. This allows direct expression of an eigenvalue problem for matrix pencils to give roots of polynomials (or eigenvalues of matrix polynomials) directly, given the values of the matrix polynomial at $n+1$ distinct points. This direct formulation may be quite useful for applications where interpolation gives efficiency gains to the algorithm, and it may also be possible to choose interpolation points that give improved conditioning for the resulting eigenproblem.

## 2 Monomial basis companion matrix pencil

If $p(x) = a_n x^n + a_{n-1}x^{n-1} + \cdots + a_1 x + a_0$, then put
$$C_0 = \begin{bmatrix} 0 & 1 & & & \\ & 0 & 1 & & \\ & & \ddots & \ddots & \\ & & & 0 & 1 \\ -a_0 & -a_1 & \cdots & -a_{n-2} & -a_{n-1} \end{bmatrix} \quad\text{and}\quad B_0 = \operatorname{diag}(1, 1, \ldots, 1, a_n).$$
Then a straightforward computation, say by expansion by minors along the first column, shows that
$$p(x) = \det(xB_0 - C_0). \tag{1}$$
That is, the eigenvalues of the matrix pencil $(C_0, B_0)$ are the roots of $p(x)$. This useful observation is the foundation of a stable though somewhat inefficient method for computing roots of polynomials: given a polynomial, construct its companion matrix pencil, and find the eigenvalues by a standard routine [3, 1]. This is how the `roots` command in Matlab works. Explaining the observed stability of this method is not trivial, and is not completely resolved even yet.

## 3 Newton basis companion matrix pencil

Suppose however that we are given $p(x)$ expressed as
$$p(x) = c_0 + c_1(x - r_1) + c_2(x - r_1)(x - r_2) + \cdots + c_n(x - r_1)(x - r_2)\cdots(x - r_n).$$
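The monomial-basis pencil of Section 2 is easy to check numerically. The sketch below is not from the paper; it assumes NumPy and SciPy, and the helper name `companion_pencil` is ours. It builds $(C_0, B_0)$ and confirms that the generalized eigenvalues are the roots of $p$:

```python
# Sketch: verify the monomial-basis companion pencil (C0, B0) numerically.
import numpy as np
from scipy.linalg import eig

def companion_pencil(a):
    """Pencil (C0, B0) for p(x) = a[0] + a[1]*x + ... + a[n]*x**n."""
    n = len(a) - 1
    C0 = np.diag(np.ones(n - 1), 1)   # 1's on the superdiagonal
    C0[-1, :] = -np.asarray(a[:-1])   # last row: -a_0, ..., -a_{n-1}
    B0 = np.eye(n)
    B0[-1, -1] = a[-1]                # corner carries the leading coefficient
    return C0, B0

# p(x) = (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6
a = [-6.0, 11.0, -6.0, 1.0]
C0, B0 = companion_pencil(a)
roots = eig(C0, B0, right=False)      # generalized eigenvalues of (C0, B0)
print(np.sort(roots.real))            # ≈ [1. 2. 3.]
```

Dividing through by $a_n$ first would reduce this to a standard eigenvalue problem for a single companion matrix, which is essentially what Matlab's `roots` does.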


This uses the Newton basis $1,\ (x - r_1),\ (x - r_1)(x - r_2),\ \ldots,\ (x - r_1)\cdots(x - r_n)$ that occurs, for example, on interpolation at the points $r_1, r_2, \ldots, r_{n+1}$. The coefficients $c_k$ are divided differences [2]. To find a companion matrix for $p(x)$ expressed in this basis, we could of course simply convert to the monomial basis and then use the previous form. This is unsatisfactory because the conversion process may introduce further unwanted rounding errors if we use floating-point arithmetic. But because the $c_k$ are linearly related to the $a_k$, it ought to be possible to express the matrix pencil directly in terms of the $c_k$.

**Theorem 1**: Put
$$C_1 = \begin{bmatrix} r_1 & 1 & & & \\ & r_2 & 1 & & \\ & & \ddots & \ddots & \\ & & & r_{n-1} & 1 \\ -c_0 & -c_1 & \cdots & -c_{n-2} & c_n r_n - c_{n-1} \end{bmatrix} \quad\text{and}\quad B_1 = \operatorname{diag}(1, 1, \ldots, 1, c_n).$$
Then $p(x) = \det(xB_1 - C_1)$.

**Proof**: Expansion by minors along the first column, and induction.

**Remark 1**: The $r_k$ do not have to be distinct, and therefore this form works even when the $c_k$ have been determined by solving a confluent interpolation system.

## 4 A GCM using values of $p$ at $n+1$ points

If we suppose that $p(x_k) = p_k$ for $1 \le k \le n+1$, and moreover now also suppose that $x_j \neq x_k$ if $j \neq k$, then we can transform the companion matrix pencil $(C_1, B_1)$ of the previous section into one where the $p_k$ appear instead of the $c_k$ (which, after all, are linear in the $p_k$ through divided differences).

**Theorem 2**: Put
$$C_2 = \begin{bmatrix} x_1 & & & & 1 \\ & x_2 & & & 1 \\ & & \ddots & & \vdots \\ & & & x_{n-1} & 1 \\ \alpha_1 & \alpha_2 & \cdots & \alpha_{n-1} & E \end{bmatrix} \quad\text{and}\quad B_2 = \operatorname{diag}(1, 1, \ldots, 1, D),$$
where
$$\alpha_k = -\,p_k \prod_{\substack{j=1 \\ j \neq k}}^{n-1} (x_k - x_j)^{-1}, \qquad 1 \le k \le n-1, \tag{2}$$
and $D$ and $E$ solve the two linear equations
$$(x_m D - E)\prod_{j=1}^{n-1}(x_m - x_j) = p_m + \sum_{k=1}^{n-1} \alpha_k \prod_{\substack{j=1 \\ j \neq k}}^{n-1}(x_m - x_j), \qquad m = n,\ n+1. \tag{3}$$
Then if we put $\hat p(x) = \det(xB_2 - C_2)$, we have $\hat p(x_k) = p_k$ for $1 \le k \le n+1$, and hence $\hat p = p$, since both have degree at most $n$.

**Proof**: We distinguish two cases: (i) $1 \le k \le n-1$, and (ii) $k = n$ or $k = n+1$.

Case (i): Expansion by minors of a $5 \times 5$ example is clarifying; in general, the arrowhead structure of $xB_2 - C_2$ gives
$$\det(xB_2 - C_2) = (xD - E)\prod_{j=1}^{n-1}(x - x_j) - \sum_{k=1}^{n-1} \alpha_k \prod_{\substack{j=1 \\ j \neq k}}^{n-1}(x - x_j).$$
At $x = x_k$ with $1 \le k \le n-1$, every term but one vanishes, and
$$\hat p(x_k) = -\alpha_k \prod_{\substack{j=1 \\ j \neq k}}^{n-1}(x_k - x_j) = p_k,$$
by the definition of $\alpha_k$.


Case (ii): $x = x_n$ or $x = x_{n+1}$. Expansion by minors along the last row gives the same expression for $\det(xB_2 - C_2)$ as above. We wish to calculate $D$ and $E$ so that the results are $p_n$ and $p_{n+1}$ when $x = x_n$ or $x = x_{n+1}$ respectively. These conditions are precisely the two linear equations (3); since $x_n \neq x_{n+1}$, solving these two linear equations and simplifying the results gives the formulae for $D$ and $E$.

## 5 Extension to matrix polynomials

All of the companion matrix pencils discussed in the previous sections extend straightforwardly to matrix polynomials
$$P(x) = x^n P_n + x^{n-1}P_{n-1} + \cdots + P_0$$
or
$$P(x) = P_0 + (x - r_1)P_1 + (x - r_1)(x - r_2)P_2 + \cdots + (x - r_1)\cdots(x - r_n)P_n,$$
when $P_0, \ldots, P_n$ are all $s \times s$ matrices. The companion forms are obtained from the scalar forms by replacing each scalar coefficient $a_k$ or $c_k$ by the corresponding matrix coefficient $P_k$, each value $p_k$ by the matrix value $P(x_k)$, each $1$ by the $s \times s$ identity matrix $I$, and each node $r_k$ or $x_k$ by $r_k I$ or $x_k I$; the entries corresponding to (2) and (3) are then sums of matrices. In all cases,
$$\det P(x) = \det(xB_0 - C_0) = \det(xB_1 - C_1) = \det(xB_2 - C_2)$$
as before.

## 6 Companion matrices using three-term recurrence relations

Suppose the $\phi_k(x)$ satisfy, for $k \ge 0$, the following recurrence relations:
$$\phi_0(x) = 1, \qquad \alpha_0\,\phi_1(x) + \beta_0\,\phi_0(x) = x\,\phi_0(x),$$
$$\alpha_k\,\phi_{k+1}(x) + \beta_k\,\phi_k(x) + \gamma_k\,\phi_{k-1}(x) = x\,\phi_k(x), \qquad k \ge 1.$$
For example, rewriting the familiar recurrence relation $T_{k+1}(x) = 2x\,T_k(x) - T_{k-1}(x)$ for the Chebyshev polynomials gives $\tfrac{1}{2}T_{k+1}(x) + 0\cdot T_k(x) + \tfrac{1}{2}T_{k-1}(x) = x\,T_k(x)$. The choice $(\alpha_k, \beta_k, \gamma_k) = (1, 0, 0)$ gives $\phi_k(x) = x^k$, the monomial basis, whilst $(\alpha_k, \beta_k, \gamma_k) = (1, r_{k+1}, 0)$ gives $\phi_k(x) = (x - r_1)(x - r_2)\cdots(x - r_k)$, the Newton basis.

**Theorem 3**: If $P(x) = \sum_{k=0}^{n} P_k\,\phi_k(x)$, where each $P_k$ is an $s \times s$ matrix, then
$$\det P(x) = (\alpha_0\alpha_1\cdots\alpha_{n-1})^{-s}\,\det(xB_{\mathrm{orth}} - C_{\mathrm{orth}}),$$
where
$$C_{\mathrm{orth}} = \begin{bmatrix} \beta_0 I & \alpha_0 I & & & \\ \gamma_1 I & \beta_1 I & \alpha_1 I & & \\ & \ddots & \ddots & \ddots & \\ & & \gamma_{n-2} I & \beta_{n-2} I & \alpha_{n-2} I \\ -\alpha_{n-1}P_0 & \cdots & -\alpha_{n-1}P_{n-3} & \gamma_{n-1}P_n - \alpha_{n-1}P_{n-2} & \beta_{n-1}P_n - \alpha_{n-1}P_{n-1} \end{bmatrix},$$
where


$I$ is an $s \times s$ identity matrix, and $B_{\mathrm{orth}}$ is $\operatorname{diag}(I, I, \ldots, I, P_n)$.

**Proof**: Let $v$ be an $s \times 1$ eigenvector of the matrix polynomial $P(x)$ corresponding to the eigenvalue $\lambda$, so $P(\lambda)v = 0$. Then consider the vector $w = [\phi_0(\lambda)v,\ \phi_1(\lambda)v,\ \ldots,\ \phi_{n-1}(\lambda)v]$. A straightforward computation shows that $[C_{\mathrm{orth}} - \lambda B_{\mathrm{orth}}]\,w = 0$. Evaluation of $\det(xB_{\mathrm{orth}} - C_{\mathrm{orth}})$ near $x = \lambda$ gives the leading coefficient as $\det P_n$, which is $(\alpha_0\cdots\alpha_{n-1})^s$ times the leading coefficient of $\det P(x)$. The result follows.

**Remark**: We have just found a formula for the right eigenvectors in the three-term recurrence case, generalizing the known formula $[v,\ \lambda v,\ \ldots,\ \lambda^{n-1}v]$ for the monomial basis. A short computation shows that for $(C_1, B_1)$ the right eigenvector is
$$[v,\ (\lambda - r_1)v,\ (\lambda - r_1)(\lambda - r_2)v,\ \ldots,\ (\lambda - r_1)\cdots(\lambda - r_{n-1})v].$$

## 7 Left eigenvectors

To evaluate the condition number of a given simple eigenvalue $\lambda$, we need the left eigenvector $u$ corresponding to $\lambda$ also. Then the condition number of the eigenvalue $\lambda$ of the matrix pencil $(C, B)$ corresponding to the left and right eigenvectors $u$ and $v$ is
$$\kappa(\lambda) = \frac{|\lambda|\,\lVert u\rVert\,\lVert v\rVert}{|u^{*}Bv|}. \tag{4}$$
We can derive this equation by considering perturbations of $(C, B)$ to $(\tilde C, \tilde B) = (C + s\lambda E,\ B + sF)$ and differentiating with respect to $s$ at $s = 0$. The left eigenvectors for the monomial basis pencil are known in simple form [ ], and hence we know that they can be found here as well.

## 8 Left eigenvectors in the interpolation basis

We look for vectors $u$ such that $u^{*}C_2 = \lambda u^{*}B_2$. Suppose $u^{*} = [u_1, u_2, \ldots, u_{n-1}, 1]$. Then comparing the $k$th entries of $u^{*}C_2$ and $\lambda u^{*}B_2$ gives $u_k(\lambda - x_k) = \alpha_k$, or $u_k = \alpha_k/(\lambda - x_k)$, when $1 \le k \le n-1$. If some $x_k$ happened to be an eigenvalue, we would have $p_k = 0$ and hence $\alpha_k = 0$ for that eigenvalue. Therefore the left eigenvector in the interpolation basis is
$$u^{*} = \left[\frac{\alpha_1}{\lambda - x_1},\ \frac{\alpha_2}{\lambda - x_2},\ \ldots,\ \frac{\alpha_{n-1}}{\lambda - x_{n-1}},\ 1\right].$$
Using the infinity norm, for example,
$$\lVert u\rVert_\infty = \max\left(1,\ \max_{1 \le k \le n-1}\frac{|\alpha_k|}{|\lambda - x_k|}\right).$$
We do not at this time know how to choose the $x_k$ (and hence the $\alpha_k$) so that the eigencondition numbers are good. That these condition numbers vary with the choice of the $x_k$ is obvious ($\lVert u\rVert_\infty = 1$ if each $x_k$ is a distinct eigenvalue, for example). The purpose of this note is to lay these formulae in front of the reader, in the hope that they will be of some use.
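To make the three-term recurrence construction of Section 6 concrete, here is a sketch of ours (not from the paper). It assumes NumPy and SciPy, takes the scalar case $s = 1$, and uses the Chebyshev choice $\alpha_0 = 1$ and $(\alpha_k, \beta_k, \gamma_k) = (\tfrac12, 0, \tfrac12)$ for $k \ge 1$; the helper name `chebyshev_colleague_pencil` is ours. It builds $(C_{\mathrm{orth}}, B_{\mathrm{orth}})$ for a Chebyshev series and checks the pencil eigenvalues against the known roots of $T_3$:

```python
# Sketch: a colleague-style pencil for p(x) = sum_k a[k]*T_k(x), built from
# the recurrence alpha_k T_{k+1} + beta_k T_k + gamma_k T_{k-1} = x T_k
# with alpha_0 = 1, beta_k = 0, and alpha_k = gamma_k = 1/2 for k >= 1.
import numpy as np
from scipy.linalg import eig

def chebyshev_colleague_pencil(a):
    """Pencil (C, B) whose eigenvalues are the roots of sum_k a[k]*T_k(x)."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    alpha = np.array([1.0] + [0.5] * (n - 1))
    gamma = np.array([0.0] + [0.5] * (n - 1))
    C = np.zeros((n, n))
    for k in range(n - 1):            # rows encoding x*T_k for k < n-1
        C[k, k + 1] = alpha[k]
        if k > 0:
            C[k, k - 1] = gamma[k]
    # Last row: substitute a_n*T_n = -(a_0*T_0 + ... + a_{n-1}*T_{n-1}).
    C[n - 1, : n - 1] -= alpha[n - 1] * a[: n - 1]
    C[n - 1, n - 1] -= alpha[n - 1] * a[n - 1]
    C[n - 1, n - 2] += gamma[n - 1] * a[n]
    B = np.eye(n)
    B[-1, -1] = a[n]                  # corner carries the leading coefficient
    return C, B

a = [0.0, 0.0, 0.0, 1.0]              # p(x) = T_3(x), roots cos((2j+1)*pi/6)
C, B = chebyshev_colleague_pencil(a)
w = np.sort(eig(C, B, right=False).real)
print(w)                              # ≈ [-0.866, 0, 0.866]
```

Working directly in the Chebyshev basis this way avoids the conversion to monomial coefficients that Section 3 warns against.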
One example use is in the solution of the problem of the intersection of a polynomially parametric curve with a surface defined by $\det M = 0$, where $M$ is a matrix of polynomials. See question 4.16 of Demmel and the references there for applications. Using the formulation of this note, we may sample the curve at a finite number of points, set up the matrix pencil, and search for real eigenvalues.

## References

[1] Gene Golub and Charles Van Loan. *Matrix Computations*. Johns Hopkins University Press, 2nd edition, 1989.

[2] Nicholas J. Higham. *Accuracy and Stability of Numerical Algorithms*. Society for Industrial and Applied Mathematics, Philadelphia, PA, USA, 1996.

[3] Lloyd N. Trefethen and D. Bau. *Numerical Linear Algebra*. SIAM, 1997.
