Multivariable Control - Algebraic Riccati Equation

Let $A$, $R$, and $Q$ be real $n \times n$ matrices with $R$ and $Q$ symmetric. An algebraic Riccati equation (ARE) is

$$A^T X + X A + X R X + Q = 0.$$

We associate a $2n \times 2n$ matrix, called the Hamiltonian matrix, with the ARE:

$$H = \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix}.$$

The Hamiltonian matrix has some useful properties. The eigenvalues of $H$ are symmetric about the imaginary axis. To prove this assertion, first note that $H$ has a special type of skew symmetry. In particular, let

$$J = \begin{bmatrix} 0 & I \\ -I & 0 \end{bmatrix};$$

then

$$JH = \begin{bmatrix} -Q & -A^T \\ -A & -R \end{bmatrix},$$

which is a symmetric matrix. Moreover, $J^T = J^{-1} = -J$, which means that $J^2 = -I$. So we can readily see that

$$-H^T = J^{-1} H J.$$

This is, of course, a similarity transformation, so the eigenvalues of $-H^T$ are also the eigenvalues of $H$. But we also know that if $\lambda$ is an eigenvalue of $H$ then $\bar{\lambda}$ is an eigenvalue of $H^T$ (where $\bar{\lambda}$ is the complex conjugate of $\lambda$). This implies that $-\bar{\lambda}$ must also be an eigenvalue of $H$, which means the eigenvalues of $H$ are symmetric with respect to the imaginary axis.

Let $\lambda$ be an eigenvalue of $H$ with associated eigenvector $x$. The pair $(\lambda, x)$ generates a one-dimensional subspace $\{\alpha x\}$, so that if $v$ lies in this subspace then $Hv$ does also. We therefore say that the subspace is $H$-invariant. We now show how to construct solutions to the ARE using $H$-invariant subspaces. Let $\mathcal{V}$ be an $n$-dimensional $H$-invariant subspace of $\mathbb{C}^{2n}$ and let $X_1$ and $X_2$ be two matrices in $\mathbb{C}^{n \times n}$ such that

$$\mathcal{V} = \operatorname{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}.$$

Since $\mathcal{V}$ is $H$-invariant, there is a matrix $\Lambda$ such that

$$\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda.$$

Let us assume $X_1$ is invertible; we can then post-multiply by $X_1^{-1}$ to obtain

$$\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} X_1 \Lambda X_1^{-1},$$

where we let $X = X_2 X_1^{-1}$.
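These properties are easy to check numerically. The sketch below (Python/NumPy; the matrices $A$, $R$, $Q$ are small examples chosen purely for illustration, not taken from the notes) builds $H$ and $J$, verifies the symmetry relations, and confirms that the spectrum of $H$ is mirrored across the imaginary axis.

```python
import numpy as np

# Illustrative data (not from the notes): n = 2, with R and Q symmetric.
A = np.array([[1.0, 2.0],
              [0.0, -3.0]])
R = np.array([[0.0, 0.0],
              [0.0, -1.0]])
Q = np.array([[1.0, 0.0],
              [0.0, 2.0]])
n = A.shape[0]

# Hamiltonian matrix H = [[A, R], [-Q, -A^T]] and J = [[0, I], [-I, 0]].
H = np.block([[A, R],
              [-Q, -A.T]])
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

print(np.allclose(J @ H, (J @ H).T))   # True: JH is symmetric
print(np.allclose(-J @ H @ J, -H.T))   # True: J^{-1} H J = -H^T (recall J^{-1} = -J)

# For every eigenvalue lambda of H, -conj(lambda) is also an eigenvalue.
eigs = np.linalg.eigvals(H)
print(all(np.min(np.abs(eigs + lam.conj())) < 1e-9 for lam in eigs))   # True
```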
If we now pre-multiply by $\begin{bmatrix} X & -I \end{bmatrix}$ we obtain

$$\begin{bmatrix} X & -I \end{bmatrix} \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} X & -I \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} X_1 \Lambda X_1^{-1} = 0.$$

Simplifying the left-hand side of the above equation reduces it to

$$X A + A^T X + X R X + Q = 0,$$

which is our algebraic Riccati equation. This implies, therefore, that $X = X_2 X_1^{-1}$ solves the ARE. Note that the solution is independent of the choice of basis spanning $\mathcal{V}$. If we were to choose any other basis spanning $\mathcal{V}$, represented by the image space of $\begin{bmatrix} X_1 P \\ X_2 P \end{bmatrix}$ where $P$ is an invertible transformation, then it is clear that

$$X = (X_2 P)(X_1 P)^{-1} = X_2 P P^{-1} X_1^{-1} = X_2 X_1^{-1},$$

thereby showing that any other choice of basis for $\mathcal{V}$ results in the same matrix $X$ satisfying the ARE.

The converse of the above result holds also: if $X$ solves the ARE, we claim we can always find $X_1$ and $X_2$ with $X_1$ invertible such that $X = X_2 X_1^{-1}$ and the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$ span an $n$-dimensional invariant subspace of $H$. To prove this assertion, let $\Lambda = A + RX$ and note that

$$X \Lambda = X A + X R X.$$

Using the ARE, the above equation becomes

$$X \Lambda = -Q - A^T X.$$

We can now rewrite the equation for $\Lambda$ and the last equation in matrix form as

$$\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} \Lambda.$$

This implies that $\operatorname{Im} \begin{bmatrix} I \\ X \end{bmatrix}$ is an $n$-dimensional $H$-invariant subspace, so the result is satisfied by simply taking $X_1 = I$ and $X_2 = X$. Note that this is useful because it shows that solving the ARE is equivalent to solving a system of linear algebraic equations.

Note that there may be many solutions to a given ARE, obtained by making different selections for the basis of $\mathcal{V}$, i.e., different $n$-dimensional $H$-invariant subspaces. In a small $2 \times 2$ example with $Q = 0$, for instance, the zero matrix is always one solution, and one can readily verify that distinct nonzero solutions exist as well, one for each suitable choice of invariant subspace. If we use the ARE solver in MATLAB, however, we only get a single one of these solutions (in that example, the zero solution).
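The invariant-subspace construction is just as direct to code up. In the sketch below (Python/NumPy, reusing the illustrative matrices above), we pick $n$ eigenvectors of $H$, stack them as $X_1$ and $X_2$, and form $X = X_2 X_1^{-1}$ whenever $X_1$ is invertible; different selections generally produce different solutions of the same ARE, which is exactly the multiplicity of solutions discussed above, while a dedicated ARE solver returns only a single, particular solution.

```python
import itertools
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, -3.0]])
R = np.array([[0.0, 0.0],
              [0.0, -1.0]])
Q = np.array([[1.0, 0.0],
              [0.0, 2.0]])
n = A.shape[0]
H = np.block([[A, R],
              [-Q, -A.T]])

def are_residual(X):
    """Norm of A^T X + X A + X R X + Q (zero when X solves the ARE)."""
    return np.linalg.norm(A.T @ X + X @ A + X @ R @ X + Q)

# Any n eigenvectors of H span an H-invariant subspace; each selection whose
# X1 block is invertible yields a (possibly different) ARE solution.
eigvals, eigvecs = np.linalg.eig(H)
for idx in itertools.combinations(range(2 * n), n):
    cols = list(idx)
    X1, X2 = eigvecs[:n, cols], eigvecs[n:, cols]
    if abs(np.linalg.det(X1)) < 1e-9:
        continue                       # X1 singular: this subspace gives no solution
    X = X2 @ np.linalg.inv(X1)
    print(cols, "eigenvalues:", np.round(eigvals[cols], 3),
          "residual:", f"{are_residual(X):.1e}")
```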

In general, we are interested in whether our ARE has stabilizing solutions; in other words, is $A + RX$ a Hurwitz matrix? Assume $H$ has no eigenvalues on the $j\omega$-axis. By the symmetry properties discussed earlier, $H$ must then have $n$ eigenvalues in the open right half of the complex plane and $n$ in the open left half. This means that the spectrum of $H$ can be partitioned into two sets of stable and unstable eigenvalues. Consider the two $n$-dimensional subspaces spanned by the eigenvectors associated with the stable and unstable eigenvalues of $H$:

$$\mathcal{X}_-(H) = \text{stable subspace}, \qquad \mathcal{X}_+(H) = \text{unstable subspace}.$$

Let us consider the stable subspace $\mathcal{X}_-(H)$ and determine a basis so that

$$\mathcal{X}_-(H) = \operatorname{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}.$$

If $X_1^{-1}$ exists, then we can set $X = X_2 X_1^{-1}$, which is uniquely determined by the Hamiltonian matrix $H$. We therefore introduce the operator $\operatorname{Ric}: \mathbb{R}^{2n \times 2n} \to \mathbb{R}^{n \times n}$ as a map from the Hamiltonian matrix onto the ARE solution associated with the stabilizing subspace $\mathcal{X}_-(H)$ of $H$. The domain of the Ric operator is denoted as $\operatorname{dom}(\operatorname{Ric})$, and the value that this operator takes for a specified Hamiltonian matrix is denoted as $\operatorname{Ric}(H)$. We can now state and prove the following theorem.

Theorem: Suppose $H \in \operatorname{dom}(\operatorname{Ric})$ and $X = \operatorname{Ric}(H)$. Then the following hold:

1. $X$ is real and symmetric;
2. $X$ satisfies the ARE $A^T X + X A + X R X + Q = 0$;
3. $A + RX$ is Hurwitz.

Proof: Consider $X_1$ and $X_2$ constructed as discussed above. Note that there exists a stable real matrix $H_-$ (the restriction of $H$ to $\mathcal{X}_-(H)$) such that

$$H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} H_-.$$

If we premultiply by $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^* J$ we obtain

$$\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^* J H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^* J \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} H_-.$$

Recall that $JH$ is symmetric, so the right and left hand sides of the above equation are symmetric, and we can conclude that

$$(X_1^* X_2 - X_2^* X_1) H_- = -H_-^* (X_1^* X_2 - X_2^* X_1).$$

If we let $W = X_1^* X_2 - X_2^* X_1$, then the above equation can be rewritten as

$$W H_- + H_-^* W = 0,$$

which is a Lyapunov equation. Because $H_-$ is Hurwitz, we can show that $W = 0$, which implies that $X_1^* X_2 = X_2^* X_1$ and thereby shows that $X_1^* X_2$ is symmetric. Note that since $X_1$ is nonsingular,

$$X = X_2 X_1^{-1} = (X_1^{-1})^* (X_1^* X_2) X_1^{-1},$$

and since we know $X_1^* X_2$ is symmetric, we can conclude that $X$ is symmetric as well.

The second assertion is established by starting with

$$H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} H_-.$$

Post-multiplying by $X_1^{-1}$ we obtain

$$H \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} X_1 H_- X_1^{-1} \qquad (1)$$

and pre-multiplying by $\begin{bmatrix} X & -I \end{bmatrix}$ we obtain

$$\begin{bmatrix} X & -I \end{bmatrix} H \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} X & -I \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} X_1 H_- X_1^{-1} = 0.$$

Expanding out the left-hand side we obtain the ARE. To show the third assertion, premultiply equation (1) by $\begin{bmatrix} I & 0 \end{bmatrix}$ to obtain

$$A + R X = X_1 H_- X_1^{-1}.$$

This shows that $A + RX$ has the same eigenvalues as $H_-$, and so $A + RX$ is Hurwitz.
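The theorem also indicates how $\operatorname{Ric}(H)$ can be computed in practice: take an ordered real Schur decomposition of $H$, let the leading Schur vectors span the stable invariant subspace $\mathcal{X}_-(H)$, and form $X = X_2 X_1^{-1}$. The sketch below (Python/SciPy, reusing the same illustrative matrices as before) does this and then checks the three assertions of the theorem.

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[1.0, 2.0],
              [0.0, -3.0]])
R = np.array([[0.0, 0.0],
              [0.0, -1.0]])
Q = np.array([[1.0, 0.0],
              [0.0, 2.0]])
n = A.shape[0]
H = np.block([[A, R],
              [-Q, -A.T]])

def ric(H):
    """ARE solution built from the stable invariant subspace of H.

    Returns None when H falls outside dom(Ric), i.e. when H has imaginary-axis
    eigenvalues or the X1 block is (numerically) singular.
    """
    n = H.shape[0] // 2
    # Real Schur form with the open-left-half-plane eigenvalues ordered first;
    # the first n Schur vectors then span the stable invariant subspace.
    T, Z, sdim = schur(H, sort='lhp')
    if sdim != n:
        return None
    X1, X2 = Z[:n, :n], Z[n:, :n]
    if np.linalg.cond(X1) > 1e12:
        return None
    return X2 @ np.linalg.inv(X1)

X = ric(H)
print(np.allclose(X, X.T))                              # True:  X is symmetric
print(np.linalg.norm(A.T @ X + X @ A + X @ R @ X + Q))  # ~0:    X satisfies the ARE
print(np.linalg.eigvals(A + R @ X).real)                # all negative: A + RX is Hurwitz
```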

When is $H$ in $\operatorname{dom}(\operatorname{Ric})$? We can obtain testable conditions if we are willing to assume that $H$ has no imaginary eigenvalues. Recall that this allows us to partition the spectrum of $H$ into stable and unstable eigenvalues, and we can then generate an $n$-dimensional $H$-invariant subspace, $\mathcal{X}_-(H)$, from the stable eigenvalues. The main result we will prove is that $H \in \operatorname{dom}(\operatorname{Ric})$ if and only if $(A, R)$ is stabilizable, assuming that $R$ is either positive semi-definite or negative semi-definite.

To prove this assertion, note that $H \in \operatorname{dom}(\operatorname{Ric})$ means there exist matrices $X_1$ and $X_2$ such that

$$\mathcal{X}_-(H) = \operatorname{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$$

with $X_1$ invertible. This can only occur if $\ker(X_1) = \{0\}$ (the trivial subspace), so we will focus on the kernel of $X_1$. First note that $\ker(X_1)$ is $H_-$-invariant. Now assume that $\ker(X_1)$ is not trivial; this means there exists a pair $(\lambda, x)$ with $x \neq 0$ and $x \in \ker(X_1)$ such that

$$H_- x = \lambda x.$$

So consider the second block row of the relation $H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} H_-$, namely

$$-Q X_1 - A^T X_2 = X_2 H_-.$$

Let $x \in \ker(X_1)$ as above and multiply through by $x$ to obtain

$$-Q X_1 x - A^T X_2 x = X_2 H_- x.$$

The first term is zero and the last term reduces to $\lambda X_2 x$. We can therefore simplify this to

$$(A^T + \lambda I) X_2 x = 0,$$

which says that $(\lambda, X_2 x)$ is an eigenvalue/eigenvector pair for $-A^T$ (provided $X_2 x \neq 0$). One can also show that $R X_2 x = 0$, which means that

$$(X_2 x)^* \begin{bmatrix} A + \bar{\lambda} I & R \end{bmatrix} = 0.$$

If $(A, R)$ is stabilizable, then $\begin{bmatrix} A + \bar{\lambda} I & R \end{bmatrix}$ has full row rank (note that $-\bar{\lambda}$ lies in the open right half plane because $H_-$ is stable), so $X_2 x = 0$. We also know $X_1 x = 0$, which implies $x = 0$ (since the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$ are linearly independent), contradicting $x \neq 0$. So $\ker(X_1)$ is trivial and $X_1$ is invertible.
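The stabilizability condition is itself easy to test with the PBH rank criterion: $(A, R)$ is stabilizable exactly when $\begin{bmatrix} A - \lambda I & R \end{bmatrix}$ has full row rank at every eigenvalue $\lambda$ of $A$ in the closed right half plane. A short sketch (Python/NumPy, reusing the same illustrative $A$ and $R$):

```python
import numpy as np

def is_stabilizable(A, R, tol=1e-9):
    """PBH test: rank [A - lam*I, R] == n for every eigenvalue lam with Re(lam) >= 0."""
    n = A.shape[0]
    for lam in np.linalg.eigvals(A):
        if lam.real >= -tol:
            pencil = np.hstack([A - lam * np.eye(n), R])
            if np.linalg.matrix_rank(pencil, tol=tol) < n:
                return False
    return True

A = np.array([[1.0, 2.0],
              [0.0, -3.0]])
R = np.array([[0.0, 0.0],
              [0.0, -1.0]])
# True: together with R <= 0 and the absence of imaginary-axis eigenvalues of H,
# this is what places the earlier example Hamiltonian in dom(Ric).
print(is_stabilizable(A, R))
```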
If we apply the above results to the specific Hamiltonians associated with the AREs used in $\mathcal{H}_2$ and $\mathcal{H}_\infty$ synthesis, we obtain the following results. In particular, consider the Hamiltonian matrix

$$H = \begin{bmatrix} A & -B B^T \\ -C^T C & -A^T \end{bmatrix}.$$

This is associated with the Full-Information problem. Using our prior results, we see that the associated ARE has a stabilizing, positive semi-definite solution if $(A, B)$ is stabilizable and $(C, A)$ is detectable.
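As a concrete instance of the full-information case, the sketch below assembles this Hamiltonian for a double integrator with position output (matrices chosen purely for illustration, not taken from the notes) and feeds it to the same Ric construction used earlier; the solution returned is positive semi-definite and the closed-loop matrix $A - BB^T X$ is Hurwitz.

```python
import numpy as np
from scipy.linalg import schur

def ric(H):
    """Stabilizing ARE solution from the stable invariant subspace of H (as above)."""
    n = H.shape[0] // 2
    T, Z, sdim = schur(H, sort='lhp')
    assert sdim == n, "H has imaginary-axis eigenvalues"
    X1, X2 = Z[:n, :n], Z[n:, :n]
    return X2 @ np.linalg.inv(X1)

# Illustrative full-information data: a double integrator with position output.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Full-information Hamiltonian: the (1,2) block is R = -B B^T, the (2,1) block is -Q = -C^T C.
H = np.block([[A, -B @ B.T],
              [-C.T @ C, -A.T]])

X = ric(H)
print(np.round(X, 3))                              # [[1.414 1.   ] [1.    1.414]]
print(np.all(np.linalg.eigvals(X).real >= -1e-9))  # True: X is positive semi-definite
print(np.linalg.eigvals(A - B @ B.T @ X).real)     # both negative: closed loop is Hurwitz
```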

The following Hamiltonian arises in the output-feedback problem:

$$H = \begin{bmatrix} A - B (D^T D)^{-1} D^T C & -B (D^T D)^{-1} B^T \\ -C^T \bigl(I - D (D^T D)^{-1} D^T\bigr) C & -\bigl(A - B (D^T D)^{-1} D^T C\bigr)^T \end{bmatrix}.$$

The ARE associated with this Hamiltonian matrix has a stabilizing solution if $(A, B)$ is stabilizable and

$$\begin{bmatrix} A - j\omega I & B \\ C & D \end{bmatrix}$$

has full column rank for all $\omega$ (the detectability condition). If we return to our original output-feedback (OF) problem, with the plant's state-space realization being

$$G(s) = \left[\begin{array}{c|cc} A & B_1 & B_2 \\ \hline C_1 & D_{11} & D_{12} \\ C_2 & D_{21} & D_{22} \end{array}\right],$$

we saw that the assumptions we placed on the OF problem to obtain a solution were:

1. $(A, B_2)$ stabilizable and $(C_2, A)$ detectable;
2. orthogonality assumptions;
3. the matrix $\begin{bmatrix} A - j\omega I & B_2 \\ C_1 & D_{12} \end{bmatrix}$ has full column rank and the matrix $\begin{bmatrix} A - j\omega I & B_1 \\ C_2 & D_{21} \end{bmatrix}$ has full row rank for all $\omega$.

We now see that the first and third assumptions are required for the existence of a stabilizing solution to the ARE.
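The rank conditions in the third assumption can also be checked numerically. For a square subsystem whose pencil determinant is not identically zero, $\begin{bmatrix} A - j\omega I & B \\ C & D \end{bmatrix}$ loses column rank exactly at the invariant zeros, that is, at the finite generalized eigenvalues of the pencil $(M, N)$ formed below. A sketch (Python/SciPy) with a small single-input, single-output example of our own choosing, whose only zero is at $s = -1$:

```python
import numpy as np
from scipy.linalg import eig

# Illustrative SISO data (not from the notes): C (sI - A)^{-1} B + D = (s + 1)/s^2,
# which has a single invariant zero at s = -1.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 1.0]])
D = np.array([[0.0]])
n = A.shape[0]

# [A - sI, B; C, D] = M - s N, so it loses rank exactly at the finite
# generalized eigenvalues of the pencil (M, N).
M = np.block([[A, B],
              [C, D]])
N = np.block([[np.eye(n), np.zeros_like(B)],
              [np.zeros_like(C), np.zeros_like(D)]])
candidate_zeros = eig(M, N, right=False)
finite_zeros = candidate_zeros[np.isfinite(candidate_zeros)]
print(finite_zeros)                              # approximately [-1]
print(np.all(np.abs(finite_zeros.real) > 1e-9))  # True: full column rank on the jw-axis
```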