# The Jordan Canonical Form


The Jordan canonical form describes the structure of an arbitrary linear transformation on a finite-dimensional vector space over an algebraically closed field. Here we develop it using only the most basic concepts of linear algebra, with no reference to determinants or ideals of polynomials.

THEOREM 1. Let $v_1, \dots, v_n$ be linearly independent vectors in a vector space. If they are in the span of $w_1, \dots, w_m$, then $n \le m$.

Proof. We prove the following claim: let $v_1, \dots, v_n$ be linearly independent vectors in a vector space. For all $k$ with $0 \le k \le n$ and all vectors $w_{k+1}, \dots, w_m$, if $v_1, \dots, v_n$ are in the span of $v_1, \dots, v_k, w_{k+1}, \dots, w_m$, then $n \le m$.

The proof of the claim is by induction on $m - k$. For $m - k = 0$ the claim is obvious, since then the independent vectors $v_1, \dots, v_n$ lie in the span of $v_1, \dots, v_k$ alone, which forces $n \le k = m$. Suppose the claim is true for $m - k - 1$, and suppose that $v_1, \dots, v_n$ are in the span of the vectors $v_1, \dots, v_k, w_{k+1}, \dots, w_m$. Then in particular we have

$$v_{k+1} = a_1 v_1 + \cdots + a_k v_k + b_{k+1} w_{k+1} + \cdots + b_m w_m. \tag{1}$$

For some $j$ we must have $b_j \ne 0$, since $v_1, \dots, v_n$ are linearly independent, so we can solve (1) for $w_j$ as a linear combination of

$$v_1, \dots, v_{k+1}, w_{k+1}, \dots, w_{j-1}, w_{j+1}, \dots, w_m. \tag{2}$$

Hence the vectors (2) span $v_1, \dots, v_n$. By the induction hypothesis, $n \le (k + 1) + (m - k - 1) = m$. This proves the claim. The case $k = 0$ of the claim gives the theorem.

By this theorem, any two bases of a finite-dimensional vector space have the same number of elements, the dimension of the vector space.
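Theorem 1 can be checked numerically: if independent vectors $v_i$ lie in the span of vectors $w_j$, adjoining the $v$'s to the $w$'s does not raise the rank, so the number of independent $v$'s is at most $m$. The sketch below is illustrative only (the vectors and the `rank` helper are not from the original text); it uses exact rational arithmetic.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix, given as a list of rows, by Gaussian
    elimination over the rationals (exact arithmetic)."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# w1, w2 span a plane in Q^4.
w = [[1, 0, 1, 0], [0, 1, 0, 1]]
# v1 = w1 + w2 and v2 = w1 - w2 are independent vectors in that span.
v = [[1, 1, 1, 1], [1, -1, 1, -1]]

assert rank(v) == 2            # the v's are linearly independent: n = 2
assert rank(w + v) == rank(w)  # each v lies in the span of the w's: m = 2
# Consistent with Theorem 1: n = 2 <= m = 2, and no third independent
# vector can be added to v while staying inside the span of the w's.
```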

Let $T$ be a linear transformation on the finite-dimensional vector space $V$ over the field $F$. An annihilating polynomial for $T$ is a non-zero polynomial $p$ such that $p(T) = 0$.

THEOREM 2. Let $T$ be a linear transformation on the finite-dimensional vector space $V$. Then there exists an annihilating polynomial for $T$.

Proof. Let $e_1, \dots, e_n$ be a basis for $V$. For each $i$ with $1 \le i \le n$, the $n + 1$ vectors $e_i, T e_i, \dots, T^n e_i$ lie in the $n$-dimensional space $V$, so by Theorem 1 there exist scalars $a_{i0}, \dots, a_{in}$, not all 0, such that

$$a_{i0} e_i + a_{i1} T e_i + \cdots + a_{in} T^n e_i = 0.$$

That is, $p_i(T) e_i = 0$, where $p_i(x) = a_{i0} + a_{i1} x + \cdots + a_{in} x^n$. Let $p$ be the product of all the $p_i$. Then $p$ is an annihilating polynomial for $T$, since $p(T) e_i = 0$ for each basis vector $e_i$.

We denote the null space of the linear transformation $T$ by $N(T)$ and its range by $R(T)$.

THEOREM 3. Let $T$ be a linear transformation on the finite-dimensional vector space $V$. Then $N(T)$ and $R(T)$ are linear subspaces of $V$ invariant under $T$, with

$$\dim N(T) + \dim R(T) = \dim V. \tag{3}$$

If $N(T) \cap R(T) = \{0\}$, then

$$V = N(T) \oplus R(T) \tag{4}$$

is a decomposition of $V$ as a direct sum of subspaces invariant under $T$.

Proof. It is clear that $N(T)$ and $R(T)$ are linear subspaces of $V$ invariant under $T$. Let $e_1, \dots, e_k$ be a basis for $N(T)$ and extend it by the vectors $e_{k+1}, \dots, e_n$ to a basis for $V$. Then $T e_{k+1}, \dots, T e_n$ are a basis for $R(T)$: they span $R(T)$, and if $c_{k+1} T e_{k+1} + \cdots + c_n T e_n = 0$, then $c_{k+1} e_{k+1} + \cdots + c_n e_n \in N(T)$, so all of the coefficients are 0. This proves (3), from which (4) follows if $N(T) \cap R(T) = \{0\}$.

THEOREM 4. Let $T$ be a linear transformation on a non-zero finite-dimensional vector space $V$ over an algebraically closed field $F$. Then $T$ has an eigenvector.

Proof. By Theorem 2 there exists an annihilating polynomial $p$ for $T$. Since $F$ is algebraically closed, $p$ is a non-zero scalar multiple of $(x - c_1) \cdots (x - c_r)$ for some scalars $c_1, \dots, c_r$. Let $v$ be a non-zero vector and let $k$ be the least number such that

$$(T - c_1 I) \cdots (T - c_k I) v = 0.$$

If $k = 1$, then $v$ is an eigenvector with the eigenvalue $c_1$; otherwise, $w = (T - c_1 I) \cdots (T - c_{k-1} I) v$ is an eigenvector with the eigenvalue $c_k$.
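The eigenvector construction in the proof of Theorem 4 is effective: apply the factors $(T - c_k I)$ to a non-zero vector one at a time, and the last non-zero vector in the sequence is an eigenvector for the root whose factor annihilates it. A minimal sketch, with a made-up matrix and helper names chosen for illustration:

```python
from fractions import Fraction

def mat_vec(A, x):
    """Apply the matrix A (list of rows) to the vector x."""
    return [sum(a * t for a, t in zip(row, x)) for row in A]

def shift(A, c):
    """Return A - c*I."""
    n = len(A)
    return [[A[i][j] - (c if i == j else 0) for j in range(n)] for i in range(n)]

def eigenvector_from_roots(A, roots, v):
    """The construction in Theorem 4's proof: apply the factors
    (T - c I) to a non-zero v one at a time; since the factors
    commute, the vector alive just before the product vanishes
    is an eigenvector for the root that kills it."""
    u = v
    for c in roots:
        w = mat_vec(shift(A, c), u)
        if all(t == 0 for t in w):
            return u, c  # (T - cI)u = 0 with u != 0
        u = w
    raise ValueError("the given roots do not annihilate v")

# Illustrative matrix: T = [[2, 1], [0, 3]] is annihilated by (x - 2)(x - 3).
A = [[Fraction(2), Fraction(1)], [Fraction(0), Fraction(3)]]
u, c = eigenvector_from_roots(A, [Fraction(2), Fraction(3)],
                              [Fraction(1), Fraction(1)])
assert mat_vec(A, u) == [c * t for t in u]  # T u = c u, so u is an eigenvector
```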
THEOREM 5. Let $T$ be a linear transformation on the finite-dimensional vector space $V$ with an eigenvalue $c$. Let

$$N_c = \{ v \in V : (T - cI)^k v = 0 \text{ for some } k \}. \tag{5}$$

Then there exists $h$ such that

$$N_c = N((T - cI)^h) \tag{6}$$

and

$$V = N((T - cI)^h) \oplus R((T - cI)^h) \tag{7}$$

is a decomposition of $V$ as a direct sum of subspaces invariant under $T$.

Proof. If $v \in N_c$ and $a$ is a scalar, then $a v \in N_c$; if $u \in N_c$ (so that $(T - cI)^j u = 0$ for some $j$) and $v \in N_c$ (so that $(T - cI)^k v = 0$ for some $k$), then $u + v \in N_c$ (since $(T - cI)^i (u + v) = 0$ whenever $i \ge j, k$). Thus $N_c$ is a linear subspace of $V$. It has a finite basis, since $V$ is finite-dimensional, so there is an $h$ such that $(T - cI)^h$ annihilates each basis element and consequently all $v$ in $N_c$. This proves (6).

I claim that $N((T - cI)^h)$ and $R((T - cI)^h)$ have intersection $\{0\}$. Suppose that $v$ is in both spaces. Then $v = (T - cI)^h u$ for some $u$, since $v$ is in $R((T - cI)^h)$. Since $v$ is in $N((T - cI)^h)$,

$$(T - cI)^{2h} u = (T - cI)^h v = 0,$$

so $u \in N_c$ by the definition (5) of $N_c$. Hence $(T - cI)^h u = 0$ by (6), so $v = 0$. This proves the claim. By Theorem 3 we have (7). Each of the two spaces is invariant under $T - cI$, and under $cI$, so also under $T = (T - cI) + cI$.

THEOREM 6. Let $T$ be a linear transformation on the finite-dimensional vector space $V$ over the algebraically closed field $F$, and let the scalars $c_1, \dots, c_r$ be the distinct eigenvalues of $T$. Then there exist numbers $h_i$, for $1 \le i \le r$, such that

$$V = N((T - c_1 I)^{h_1}) \oplus \cdots \oplus N((T - c_r I)^{h_r}) \tag{8}$$

is a direct sum decomposition of $V$ into subspaces invariant under $T$.

Proof. From Theorem 5, by induction on the number of distinct eigenvalues.

A linear transformation $T$ is nilpotent of degree $d$ in case $T^d = 0$ but $T^{d-1} \ne 0$; it is nilpotent in case it is nilpotent of degree $d$ for some $d$. Notice that on each of the subspaces of the direct sum decomposition (8), the operator $T$ is a scalar multiple of $I$ plus a nilpotent operator. Thus our remaining task is to find the structure of a nilpotent operator.
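The exponent $h$ of Theorem 5 can be observed numerically: the nullities $\dim N((T - cI)^k)$ increase strictly and then stabilize. A sketch with a made-up $4 \times 4$ matrix (lower-triangular Jordan form with the single eigenvalue 2, so $N_c$ is all of $V$ and $h = 3$ here); the helpers are illustrative, not from the original:

```python
from fractions import Fraction

def rank(rows):
    """Rank by Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Illustrative T: single eigenvalue c = 2, blocks of sizes 3 and 1.
T = [[2, 0, 0, 0],
     [1, 2, 0, 0],
     [0, 1, 2, 0],
     [0, 0, 0, 2]]
c, n = 2, 4
M = [[Fraction(T[i][j] - (c if i == j else 0)) for j in range(n)]
     for i in range(n)]

P = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]  # (T - cI)^0
nullities = []
for k in range(1, n + 1):
    P = mat_mul(P, M)
    nullities.append(n - rank(P))  # dim N((T - cI)^k)

# Strict growth, then stabilization at h = 3.
assert nullities == [2, 3, 4, 4]
```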
THEOREM 7. Let $T$ be nilpotent of degree $d$ on the vector space $V$. Then we have the strict inclusions

$$N(T) \subset N(T^2) \subset \cdots \subset N(T^{d-1}) \subset N(T^d) = V. \tag{9}$$

Proof. The inclusions are obvious. They are strict inclusions because by definition there is a vector $v$ in $V$ such that $T^d v = 0$ but $T^{d-1} v \ne 0$. Then $T^{d-j} v$ is in $N(T^j)$ but not in $N(T^{j-1})$.

We say that the vectors $v_1, \dots, v_k$ are linearly independent of the linear subspace $W$ in case $a_1 v_1 + \cdots + a_k v_k$ is in $W$ only if $a_1 = \cdots = a_k = 0$.

THEOREM 8. Let $T$ be a nilpotent linear transformation of degree $d$ on the finite-dimensional vector space $V$. Then there exist a number $s$ and vectors $u_1, \dots, u_s$ such that the non-zero vectors of the form $T^j u_i$, for $0 \le j < d$ and $1 \le i \le s$, are a basis for $V$. Any vectors linearly independent of $N(T^{d-1})$ can be included among the $u_1, \dots, u_s$.

For each $i$, let $V_i$ be the subspace with basis $u_i, T u_i, \dots, T^{d_i - 1} u_i$, where $d_i$ is the least number such that $T^{d_i} u_i = 0$. Then

$$V = V_1 \oplus \cdots \oplus V_s \tag{10}$$

is a direct sum decomposition of $V$ into $d_i$-dimensional subspaces invariant under $T$, and $T$ is nilpotent of degree $d_i$ on $V_i$.

For each $j$, let $s_j$ be the number of subspaces in the decomposition (10) of dimension at least $j$. Then $s_j = \dim N(T^j) - \dim N(T^{j-1})$, so the number of subspaces in (10) of any given dimension is determined uniquely by $T$.

Proof. We prove the statements of the first paragraph by induction on $d$. For $d = 1$, we have $T = 0$ and the result is trivial. Suppose that the result holds for $d - 1$, and consider a nilpotent linear transformation $T$ of degree $d$. Given vectors linearly independent of $N(T^{d-1})$, extend them to a maximal such set $u_1, \dots, u_t$ (so that they together with any basis for $N(T^{d-1})$ are a basis for $V$). Then the vectors $T u_1, \dots, T u_t$ are in $N(T^{d-1})$ and are linearly independent of $N(T^{d-2})$, for if $a_1 T u_1 + \cdots + a_t T u_t \in N(T^{d-2})$, then $a_1 u_1 + \cdots + a_t u_t \in N(T^{d-1})$, so $a_1 = \cdots = a_t = 0$. Now $T$ restricted to $N(T^{d-1})$ is nilpotent of degree $d - 1$, so by the induction hypothesis there are vectors $u_{t+1}, \dots, u_s$, including $T u_1, \dots, T u_t$ among them, such that the non-zero vectors of the form $T^j u_i$, for $t < i \le s$, are a basis for $N(T^{d-1})$. Adjoin the vectors $u_1, \dots, u_t$ to them; then this is a basis for $V$ of the desired form. Now the statements of the second paragraph follow directly.

(See the following example, in which $T$ is nilpotent of degree 5 on a 24-dimensional space. The bottom rows are $N(T), N(T^2), \dots$.)

[Diagram: the 24 basis vectors $T^j u_i$, arranged in columns of decreasing height, one column for each chain $u_i, T u_i, \dots, T^{d_i - 1} u_i$; the bottom $j$ rows are a basis for $N(T^j)$.]

We have done all the work necessary to establish the Jordan canonical form; it remains only to put the pieces together. It is convenient to express the result in matrix language. Let $J(k; c)$ be the $k \times k$ lower triangular matrix with $c$ along the diagonal, 1 everywhere immediately below the diagonal, and 0 everywhere else:

$$J(k; c) = \begin{pmatrix} c & & & \\ 1 & c & & \\ & \ddots & \ddots & \\ & & 1 & c \end{pmatrix}.$$

Such a matrix is called a Jordan block. Notice that in the decomposition (10), the matrix of $T$ on $V_i$, with respect to the basis described in Theorem 8, is the Jordan block $J(d_i; 0)$. (With the basis in reverse order, the entries 1 are immediately above the diagonal. Either convention is acceptable.) A matrix that is a direct sum of Jordan blocks is in Jordan form.

THEOREM 9. Let $T$ be a linear transformation on the finite-dimensional vector space $V$ over the algebraically closed field $F$. Then there exists a basis of $V$ such that the matrix of $T$ is in Jordan form. This matrix is unique except for the order of the Jordan blocks.

Proof. By Theorems 6 and 8.

The proof shows that the same result holds for a field $F$ that is not algebraically closed, provided that $T$ has some annihilating polynomial that factors into first degree factors.

http://math.princeton.edu/~nelson/217/jordan.pdf
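The uniqueness formula of Theorem 8 can be exercised directly: from the nullities of the powers of a nilpotent matrix, recover how many Jordan blocks there are of each size. A sketch in which $T$ is assembled from blocks of sizes 3, 2, 1 (sizes chosen arbitrarily for illustration; the helpers are not from the original):

```python
from fractions import Fraction

def rank(rows):
    """Rank by Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def jordan_block(k, c):
    """The k x k lower-triangular Jordan block J(k; c)."""
    return [[c if i == j else (1 if i == j + 1 else 0) for j in range(k)]
            for i in range(k)]

def direct_sum(blocks):
    n = sum(len(b) for b in blocks)
    M = [[Fraction(0)] * n for _ in range(n)]
    off = 0
    for b in blocks:
        k = len(b)
        for i in range(k):
            for j in range(k):
                M[off + i][off + j] = Fraction(b[i][j])
        off += k
    return M

# Nilpotent T = J(3; 0) + J(2; 0) + J(1; 0), degree d = 3 on a 6-dim space.
T = direct_sum([jordan_block(3, 0), jordan_block(2, 0), jordan_block(1, 0)])
n, d = len(T), 3

P = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]  # T^0 = I
nullity = [0]  # dim N(T^0)
for j in range(1, d + 1):
    P = mat_mul(P, T)
    nullity.append(n - rank(P))  # dim N(T^j)

# s_j = dim N(T^j) - dim N(T^{j-1}) counts the blocks of size at least j.
s = [nullity[j] - nullity[j - 1] for j in range(1, d + 1)]
assert s == [3, 2, 1]
# Blocks of size exactly j are s_j - s_{j+1}: one each of sizes 1, 2, 3.
exact = [s[j] - (s[j + 1] if j + 1 < d else 0) for j in range(d)]
assert exact == [1, 1, 1]
```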