
##### Description

A note on companion matrices. Miroslav Fiedler, Academy of Sciences of the Czech Republic, Institute of Computer Science, Pod vodárenskou věží 2, 182 07 Praha 8, Czech Republic. Received 8 October 2002; accepted 12 April 2003.



Linear Algebra and its Applications 372 (2003) 325–331
www.elsevier.com/locate/laa

# A note on companion matrices

Miroslav Fiedler

Academy of Sciences of the Czech Republic, Institute of Computer Science, Pod vodárenskou věží 2, 182 07 Praha 8, Czech Republic

Received 8 October 2002; accepted 12 April 2003. Submitted by H. Schneider.

E-mail address: fiedler@math.cas.cz (M. Fiedler). Research supported by grant A1030003. 0024-3795/$ - see front matter © 2003 Elsevier Inc. All rights reserved. doi:10.1016/S0024-3795(03)00548-2

**Abstract.** We show that the usual companion matrix of a polynomial of degree $n$ can be factored into a product of $n$ matrices, $n - 1$ of them being the identity matrix in which a $2 \times 2$ identity submatrix in two consecutive rows (and columns) is replaced by an appropriate $2 \times 2$ matrix, the remaining one being the identity matrix with the last entry replaced by a possibly different entry. By a certain similarity transformation, we obtain a simple new companion matrix in a pentadiagonal form. Some generalizations are also possible. © 2003 Elsevier Inc. All rights reserved.

*AMS classification:* 15A23; 15A57; 65F15

*Keywords:* Companion matrix; Characteristic polynomial; Pentadiagonal matrix; Zeros of polynomials

## 1. Introduction

Let

$$p(x) = x^n + a_1 x^{n-1} + \cdots + a_n \tag{1}$$

be a polynomial with coefficients over an arbitrary field. As is well known, the matrix

$$A = \begin{pmatrix} -a_1 & -a_2 & \cdots & -a_{n-1} & -a_n \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix} \tag{2}$$


has the property that $\det(xI - A) = p(x)$. The matrix $A$, or some of its modifications, is called a companion matrix of the polynomial $p(x)$, since its characteristic polynomial is $p(x)$. In the sequel, we find another simple matrix which is similar to $A$ and thus has the same property with respect to the polynomial $p(x)$.

## 2. Results

We start with a simple lemma.

**Lemma 2.1.** For $k = 1, \ldots, n - 1$ denote by $A_k$ the matrix

$$A_k = I_{k-1} \oplus C_k \oplus I_{n-k-1}, \tag{3}$$

where $C_k$ is the $2 \times 2$ matrix

$$C_k = \begin{pmatrix} -a_k & 1 \\ 1 & 0 \end{pmatrix}, \tag{4}$$

and by $A_n$ the matrix

$$A_n = \operatorname{diag}(1, \ldots, 1, -a_n). \tag{5}$$

Then $A_1 A_2 \cdots A_n = A$.

*Proof.* Follows easily by induction: for $k = 1, \ldots, n - 1$ one checks that $A_1 A_2 \cdots A_k = M_k \oplus I_{n-k-1}$, where $M_k$ is the $(k+1) \times (k+1)$ matrix with first row $(-a_1, \ldots, -a_k, 1)$ and with the identity matrix $I_k$ occupying the remaining rows and the first $k$ columns. Multiplying finally by $A_n = \operatorname{diag}(1, \ldots, 1, -a_n)$ scales the last column by $-a_n$ and yields the matrix $A$ from (2). □

**Lemma 2.2.** All the matrices obtained as products $A_{i_1} A_{i_2} \cdots A_{i_n}$ for some permutation $(i_1, \ldots, i_n)$ of $(1, \ldots, n)$ have the same spectrum, including multiplicities. They are even all similar.
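The factorization in Lemma 2.1 and the permutation invariance in Lemma 2.2 are easy to check numerically. The following sketch is ours, not from the paper; it assumes NumPy and real coefficients, while the lemmas hold over any field:

```python
import numpy as np

def factor(k, n, a):
    """A_k from Lemma 2.1: the identity with the 2x2 block [[-a_k, 1], [1, 0]]
    in rows/columns k, k+1 (1-based) for k < n, and A_n = diag(1, ..., 1, -a_n)."""
    A = np.eye(n)
    if k < n:
        A[k - 1:k + 1, k - 1:k + 1] = [[-a[k - 1], 1.0], [1.0, 0.0]]
    else:
        A[n - 1, n - 1] = -a[n - 1]
    return A

a = [2.0, -3.0, 1.0, 4.0, -2.0]            # p(x) = x^5 + 2x^4 - 3x^3 + x^2 + 4x - 2
n = len(a)
factors = [factor(k, n, a) for k in range(1, n + 1)]

# Lemma 2.1: A_1 A_2 ... A_n equals the usual companion matrix (2)
companion = np.zeros((n, n))
companion[0, :] = [-c for c in a]          # first row: -a_1, ..., -a_n
companion[1:, :-1] = np.eye(n - 1)         # subdiagonal identity block
prod = np.linalg.multi_dot(factors)
print(np.allclose(prod, companion))        # → True

# Lemma 2.2: reordering the factors leaves the characteristic polynomial unchanged
perm = [3, 0, 4, 2, 1]                     # an arbitrary permutation
shuffled = np.linalg.multi_dot([factors[i] for i in perm])
print(np.allclose(np.poly(shuffled), np.poly(prod)))   # → True
```

The second check compares characteristic polynomials rather than sorted eigenvalue lists, which sidesteps ordering issues with complex eigenvalues.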


*Proof.* Clearly, $A_i A_j = A_j A_i$ if $|i - j| \geqslant 2$. This allows us to bring every such product to a form in which each pair $(k, k+1)$ is either ordered "positively", i.e. $k$ precedes $k + 1$, or "negatively", i.e. $k + 1$ precedes $k$. Moreover, by a well-known theorem [4, Theorem 1.3.20], if $A$ and $B$ are square matrices, then $AB$ and $BA$ have the same spectrum, including multiplicities; these matrices are even similar if one of the matrices $A, B$ is non-singular. This allows us to "rotate" the permutation which determines the product without changing the spectrum, in the sense that the permutation $(i_1, i_2, \ldots, i_n)$ can be replaced by $(i_2, \ldots, i_n, i_1)$. It thus suffices to prove that the matrix corresponding to any permutation can be obtained by these operations from the matrix $A(1, 2, \ldots, n)_n$; here, we denote by $A(i_1, \ldots, i_n)_n$ the matrix $A_{i_1} \cdots A_{i_n}$, the subscript meaning the number of elements in the permutation. Observe that all resulting matrices are similar, since at most one of the factors, the matrix $A_n$ from (5), can be singular.

We prove the assertion by induction with respect to $n$. It is immediate that for $n = 2$ the assertion is true. Let thus $n > 2$ and suppose the assertion is true for $n - 1$. Let $(i_1, \ldots, i_n)$ be a permutation. We distinguish two cases.

*Case 1.* The pair $(1, 2)$ is positive. Since $A_1$ commutes with all factors $A_j$ for $j \geqslant 3$, we can move the factor $A_1$ to the right until it hits $A_2$, so that $A(\ldots, 1, \ldots, 2, \ldots)_n$ has the same spectrum as $A(\ldots, 12, \ldots)_n$ (the notation $12$ expressing that $A_1$ and $A_2$ are adjacent). We now take the pair $12$ as one element, denote it by $1$, and diminish the indices of all the remaining elements by one; this yields a product of $n - 1$ factors of the same kind. By the induction hypothesis, the resulting permutation of $(1, \ldots, n - 1)$ can be obtained by these operations from $(1, 2, \ldots, n - 1)$. Going back, we reconstruct the chain of operations which brings $A(1, 2, \ldots, n)_n$ into $A(i_1, \ldots, i_n)_n$.

*Case 2.* The pair $(1, 2)$ is negative. By rotation, we can arrange that $1$ becomes the first element in the permutation. The pair $(1, 2)$ is then positive and, by Case 1, the assertion is correct. □

**Theorem 2.3.** All matrices $A(i_1, \ldots, i_n)_n$ for any permutation $(i_1, \ldots, i_n)$ are companion matrices of $p(x)$ and are similar to the matrix $A$ in (2). In particular, this holds for the matrix $BC$, where $B$ is the matrix $A_1 A_3 A_5 \cdots$ and $C$ is the matrix $A_2 A_4 \cdots$, the $A_k$ being the matrices from (3) and (5). The matrix $B$ is the direct sum of the matrices $C_1, C_3$, etc., the matrix $C$ the direct sum of the $1 \times 1$ identity matrix and the matrices $C_2, C_4$, etc. from (4). For $n$ even,


the matrix $C$ ends with the block $(-a_n)$; for $n$ odd, the matrix $B$ ends with the block $(-a_n)$. The matrix $BC$ is pentadiagonal and contains the same entries as the usual companion matrix (2).

*Proof.* The first part follows from Lemma 2.2, since by Lemma 2.1 the matrix $A(1, 2, \ldots, n)_n$ is the usual companion matrix. The last assertion follows from the fact that both matrices $B$ and $C$ are tridiagonal and from the comment in the following example. □

**Example 2.4.** Let us present explicitly the matrices $B$, $C$, and $BC$ for $n = 5$ and $n = 6$ (the even and odd cases slightly differ). For $n = 5$:

$$B = \begin{pmatrix} -a_1 & 1 & & & \\ 1 & 0 & & & \\ & & -a_3 & 1 & \\ & & 1 & 0 & \\ & & & & -a_5 \end{pmatrix}, \quad C = \begin{pmatrix} 1 & & & & \\ & -a_2 & 1 & & \\ & 1 & 0 & & \\ & & & -a_4 & 1 \\ & & & 1 & 0 \end{pmatrix},$$

$$BC = \begin{pmatrix} -a_1 & -a_2 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & -a_3 & 0 & -a_4 & 1 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & -a_5 & 0 \end{pmatrix}.$$

For $n = 6$:

$$B = \begin{pmatrix} -a_1 & 1 & & & & \\ 1 & 0 & & & & \\ & & -a_3 & 1 & & \\ & & 1 & 0 & & \\ & & & & -a_5 & 1 \\ & & & & 1 & 0 \end{pmatrix}, \quad C = \begin{pmatrix} 1 & & & & & \\ & -a_2 & 1 & & & \\ & 1 & 0 & & & \\ & & & -a_4 & 1 & \\ & & & 1 & 0 & \\ & & & & & -a_6 \end{pmatrix},$$

$$BC = \begin{pmatrix} -a_1 & -a_2 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & -a_3 & 0 & -a_4 & 1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & -a_5 & 0 & -a_6 \\ 0 & 0 & 0 & 1 & 0 & 0 \end{pmatrix}.$$
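Theorem 2.3 and Example 2.4 can be illustrated numerically. The sketch below is ours (the helper name `fiedler_pentadiagonal` is hypothetical); it assembles $BC$ from the factors of Lemma 2.1 and confirms that it is pentadiagonal with characteristic polynomial $p$:

```python
import numpy as np

def fiedler_pentadiagonal(a):
    """BC from Theorem 2.3 for p(x) = x^n + a_1 x^{n-1} + ... + a_n (n >= 4):
    B = A_1 A_3 A_5 ..., C = A_2 A_4 ..., with the factors A_k of Lemma 2.1."""
    n = len(a)
    def A(k):
        M = np.eye(n)
        if k < n:
            M[k - 1:k + 1, k - 1:k + 1] = [[-a[k - 1], 1.0], [1.0, 0.0]]
        else:
            M[n - 1, n - 1] = -a[n - 1]
        return M
    B = np.linalg.multi_dot([A(k) for k in range(1, n + 1, 2)])
    C = np.linalg.multi_dot([A(k) for k in range(2, n + 1, 2)])
    return B @ C

a = [1.0, -2.0, 3.0, -4.0, 5.0]
P = fiedler_pentadiagonal(a)

# same characteristic polynomial as the usual companion matrix
print(np.allclose(np.poly(P), [1.0] + a))            # → True

# pentadiagonal: all entries vanish outside the band |i - j| <= 2
i, j = np.indices(P.shape)
print(np.allclose(P[np.abs(i - j) > 2], 0.0))        # → True
```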


We see that (and this is true in general) the first two rows of $BC$ contain non-zero entries in the first three columns only:

$$\begin{pmatrix} -a_1 & -a_2 & 1 \\ 1 & 0 & 0 \end{pmatrix};$$

the following pairs of rows $2k + 1$ and $2k + 2$ contain non-zero entries only in the four columns with indices $2k, \ldots, 2k + 3$, and the submatrices

$$\begin{pmatrix} -a_{2k+1} & 0 & -a_{2k+2} & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix}$$

in these rows and columns contain two entries $-a_i$ and two ones; the remaining entries are zero. The last two rows in the even case contain non-zeros only in the last three columns:

$$\begin{pmatrix} -a_{n-1} & 0 & -a_n \\ 1 & 0 & 0 \end{pmatrix};$$

the last row in the odd case is as in the example above: $(0, \ldots, 0, -a_n, 0)$.

**Remark 2.5.** The matrix $BC$ from Theorem 2.3 can be transformed by a permutational similarity, starting with the odd rows and columns and continuing with the even rows and columns, to the block form

$$\begin{pmatrix} Z_{11} & Z_{12} \\ Z_{21} & Z_{22} \end{pmatrix},$$

where

$$Z_{11} = \begin{pmatrix} -a_1 & 1 & & \\ & 0 & 1 & \\ & & \ddots & \ddots \\ & & & 0 \end{pmatrix}, \quad Z_{12} = \begin{pmatrix} -a_2 & & \\ -a_3 & -a_4 & \\ & -a_5 & -a_6 \\ & & \ddots & \ddots \end{pmatrix},$$

$$Z_{21} = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 \\ \vdots & & & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix}, \quad Z_{22} = \begin{pmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{pmatrix}.$$

If $n$ is even, $Z_{12}$ and $Z_{21}$ are square. If $n$ is odd, $Z_{12}$ is of size $\frac{n+1}{2} \times \frac{n-1}{2}$.
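The odd-even block structure of Remark 2.5 can be seen directly by permuting the $n = 5$ matrix $BC$ of Example 2.4. A sketch of ours, assuming NumPy, with arbitrary sample values for $a_1, \ldots, a_5$:

```python
import numpy as np

a1, a2, a3, a4, a5 = 1.0, 2.0, 3.0, 4.0, 5.0
BC = np.array([[-a1, -a2, 1, 0, 0],       # BC for n = 5 from Example 2.4
               [1, 0, 0, 0, 0],
               [0, -a3, 0, -a4, 1],
               [0, 1, 0, 0, 0],
               [0, 0, 0, -a5, 0]])

perm = [0, 2, 4, 1, 3]                    # odd rows/columns first, then even
Z = BC[np.ix_(perm, perm)]
Z11, Z12 = Z[:3, :3], Z[:3, 3:]
Z21, Z22 = Z[3:, :3], Z[3:, 3:]

print(np.array_equal(Z11, [[-a1, 1, 0], [0, 0, 1], [0, 0, 0]]))      # → True
print(np.array_equal(Z12, [[-a2, 0], [-a3, -a4], [0, -a5]]))         # → True
print(np.array_equal(Z21, [[1, 0, 0], [0, 0, 0]]))                   # → True
print(np.array_equal(Z22, [[0, 0], [1, 0]]))                         # → True
```

Since $n = 5$ is odd here, $Z_{12}$ is $3 \times 2$ and $Z_{21}$ is $2 \times 3$, as the remark states.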


## 3. Comments

Note that the presented companion matrix cannot be obtained (for $n > 2$) by permutational similarity from the usual companion matrix. Similarly to the results in [1,2,5], the new companion matrix can be used in estimates of the roots of $p(x)$, possibly also in computations of the roots using algorithms for the eigenvalues. Since

$$\det(xI - BC) = \det B \cdot \det(xB^{-1} - C)$$

for non-singular $B$, and the matrix $B$ in Theorem 2.3 can easily be inverted (each block $C_k$ has inverse $\begin{pmatrix} 0 & 1 \\ 1 & a_k \end{pmatrix}$), we obtain:

**Theorem 3.1.** The roots of $p(x)$ coincide with the roots of an equation in determinantal form in which the matrix is symmetric and tridiagonal; for $n$ even,

$$\det \begin{pmatrix} -1 & x & & & & \\ x & xa_1 + a_2 & -1 & & & \\ & -1 & 0 & x & & \\ & & x & xa_3 + a_4 & -1 & \\ & & & \ddots & \ddots & \ddots \\ & & & & x & xa_{n-1} + a_n \end{pmatrix} = 0$$

(for $n$ odd, the last diagonal entry becomes $-x/a_n$, assuming $a_n \neq 0$).

We have now the following lemma, the proof of which, by checking, is immediate.

**Lemma 3.2.** The matrix

$$G = \begin{pmatrix} -a & 1 \\ 1 & 0 \end{pmatrix}$$

has the spectral decomposition $G = QDQ^{\mathrm T}$, where, for $w = \sqrt{a^2 + 4}$,

$$D = \operatorname{diag}\!\left( \frac{-a + w}{2}, \frac{-a - w}{2} \right)$$

is diagonal and $Q$ is the orthogonal matrix whose columns are the normalized eigenvectors $(\mu, 1)^{\mathrm T}/\sqrt{\mu^2 + 1}$ for the two eigenvalues $\mu = \frac{-a \pm w}{2}$. In addition, the modulus of $G$, i.e. the positive semidefinite matrix $|G|$ for which $GG^{\mathrm T} = |G|^2$, is

$$|G| = \frac{1}{w} \begin{pmatrix} a^2 + 2 & -a \\ -a & 2 \end{pmatrix};$$

the spectral decomposition of $|G|$ is $|G| = Q\widetilde D Q^{\mathrm T}$, where $\widetilde D = \operatorname{diag}\!\left( \frac{w - a}{2}, \frac{w + a}{2} \right)$.
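Both results admit quick numerical sanity checks. In the sketch below (ours, assuming NumPy and real coefficients), the explicit tridiagonal matrix for $n = 4$ and the closed form $(G^2 + I)/w$ for the modulus are our reconstructions from the identity $\det(xI - BC) = \det B \cdot \det(xB^{-1} - C)$, not quoted verbatim from the paper:

```python
import numpy as np

# Theorem 3.1 for n = 4 (det B = 1 here): the tridiagonal determinant
# reproduces p(x) = x^4 + a_1 x^3 + a_2 x^2 + a_3 x + a_4.
a1, a2, a3, a4 = 1.0, 2.0, 3.0, 4.0
def T(x):
    return np.array([[-1.0, x, 0.0, 0.0],
                     [x, x * a1 + a2, -1.0, 0.0],
                     [0.0, -1.0, 0.0, x],
                     [0.0, 0.0, x, x * a3 + a4]])
p = lambda x: x**4 + a1 * x**3 + a2 * x**2 + a3 * x + a4
print(np.isclose(np.linalg.det(T(2.0)), p(2.0)))     # → True (both equal 42)

# Lemma 3.2: |G| = (G^2 + I)/w is the PSD square root of G G^T = G^2,
# with eigenvalues (w - a)/2 and (w + a)/2.
a = 1.7
w = np.sqrt(a * a + 4.0)
G = np.array([[-a, 1.0], [1.0, 0.0]])
mod_G = (G @ G + np.eye(2)) / w
print(np.allclose(mod_G @ mod_G, G @ G))             # → True
print(np.allclose(np.linalg.eigvalsh(mod_G),
                  [(w - a) / 2, (w + a) / 2]))       # → True
```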


Using this lemma, one can find explicitly the moduli of the matrices $B$ and $C$ from Theorem 2.3. The singular values of $BC$ are the eigenvalues of $|BC|$; since $B$ and $C$ are symmetric, $(BC)(BC)^{\mathrm T} = BC^2B$, so they can also be obtained as the square roots of the eigenvalues of the symmetric heptadiagonal matrix $BC^2B$ and used for estimation of the roots.

In our opinion, a particularly interesting feature of the matrices $B$ and $C$ from Theorem 2.3 is the following: if $g(x)$ and $h(x)$ are polynomials for which $p(x) = g(x^2) + x h(x^2)$, then for $n$ odd, $B$ depends on the coefficients of $g$ only, whereas $C$ depends on the coefficients of $h$. For $n$ even, it is the other way round.

It is also easy to find explicitly the QR-, RQ-, etc., decompositions of both matrices $B$ and $C$ from Theorem 2.3 and use them for manipulation. As a sample, one obtains in this way the QR-decomposition of another companion matrix, which is a banded matrix with a small number of bands.

Observe also that some of the matrices considered above have superdiagonal rank (sometimes, subdiagonal rank) one. Here, as in [3], we call the subdiagonal rank (respectively, superdiagonal rank) of a square matrix the order of its maximal non-singular submatrix all of whose entries are in the subdiagonal (respectively, superdiagonal) part. In fact, in the matrix $R$, the superdiagonal rank is one even if we add to the superdiagonal part all the even diagonal positions (in the sense of [3]). In some of the mentioned cases, a similar property holds for the subdiagonal or superdiagonal rank, too.

## References

[1] S. Barnett, Congenial matrices, Linear Algebra Appl. 41 (1981) 277–298.
[2] L. Brand, Applications of the companion matrix, Amer. Math. Monthly 75 (1968) 146–152.
[3] M. Fiedler, Structure ranks of matrices, Linear Algebra Appl. 179 (1993) 119–128.
[4] R.A. Horn, C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, 1985.
[5] H. Linden, Bounds for the zeros of polynomials from eigenvalues and singular values of some companion matrices, Linear Algebra Appl. 271 (1998) 41–82.