EE263 Autumn 2007-08, Stephen Boyd

Lecture 6: Least-squares applications

- least-squares data fitting
- growing sets of regressors
- system identification
- growing sets of measurements and recursive least-squares

Least-squares data fitting

we are given:
- functions $f_1, \ldots, f_n : S \to \mathbf{R}$, called regressors or basis functions
- data or measurements $(s_i, g_i)$, $i = 1, \ldots, m$, where $s_i \in S$ and (usually) $m \gg n$

problem: find coefficients $x_1, \ldots, x_n \in \mathbf{R}$ so that
$$x_1 f_1(s_i) + \cdots + x_n f_n(s_i) \approx g_i, \qquad i = 1, \ldots, m$$
i.e., find a linear combination of the functions that fits the data

least-squares fit: choose $x$ to minimize the total square fitting error
$$\sum_{i=1}^m \bigl( x_1 f_1(s_i) + \cdots + x_n f_n(s_i) - g_i \bigr)^2$$

- using matrix notation, the total square fitting error is $\|Ax - g\|^2$, where $A_{ij} = f_j(s_i)$
- hence, the least-squares fit is given by $x = (A^T A)^{-1} A^T g$ (assuming $A$ is skinny, full rank)
- the corresponding function is $f_{\mathrm{lsfit}}(s) = x_1 f_1(s) + \cdots + x_n f_n(s)$
- applications:
  - interpolation, extrapolation, smoothing of data
  - developing a simple, approximate model of data

Least-squares polynomial fitting

problem: fit a polynomial of degree $< n$,
$$p(t) = a_0 + a_1 t + \cdots + a_{n-1} t^{n-1},$$
to data $(t_i, y_i)$, $i = 1, \ldots, m$

- basis functions are $f_j(t) = t^{j-1}$, $j = 1, \ldots, n$
- matrix $A$ has form $A_{ij} = t_i^{j-1}$:
$$A = \begin{bmatrix} 1 & t_1 & t_1^2 & \cdots & t_1^{n-1} \\ 1 & t_2 & t_2^2 & \cdots & t_2^{n-1} \\ \vdots & & & & \vdots \\ 1 & t_m & t_m^2 & \cdots & t_m^{n-1} \end{bmatrix}$$
(called a Vandermonde matrix)

assuming $t_k \neq t_l$ for $k \neq l$ and $m \geq n$, $A$ is full rank:
- suppose $Aa = 0$
- the corresponding polynomial $p(t) = a_0 + \cdots + a_{n-1} t^{n-1}$ vanishes at the $m$ points $t_1, \ldots, t_m$
- by the fundamental theorem of algebra $p$ can have no more than $n-1$ zeros, so $p$ is identically zero, and $a = 0$
- hence the columns of $A$ are independent, i.e., $A$ is full rank

Example

- fit $g(t) = 4t/(1 + 10t^2)$ with a polynomial
- $m = 100$ points between $t = 0$ and $t = 1$
- least-squares fits for degrees 1, 2, 3, 4 have RMS errors $.135$, $.076$, $.025$, $.005$, respectively

[figure: four panels plotting the degree-1 through degree-4 polynomial fits $p_1(t), \ldots, p_4(t)$ against the data on $0 \leq t \leq 1$]

Growing sets of regressors

consider the family of least-squares problems
$$\text{minimize} \quad \Bigl\| \sum_{i=1}^p x_i a_i - y \Bigr\|$$
for $p = 1, \ldots, n$ ($a_1, \ldots, a_p$ are called regressors)

- approximate $y$ by a linear combination of $a_1, \ldots, a_p$
- project $y$ onto $\mathop{\mathrm{span}}\{a_1, \ldots, a_p\}$
- regress $y$ on $a_1, \ldots, a_p$
- as $p$ increases, we get a better fit, so the optimal residual decreases
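The polynomial-fitting example above can be sketched in a few lines of NumPy. The function $g(t) = 4t/(1+10t^2)$, the point count $m = 100$, and the degrees follow the slides' example; the evenly spaced sample grid is an assumption, since the slides do not specify how the points are placed.

```python
import numpy as np

def ls_polyfit(t, y, n):
    """Least-squares fit of a polynomial of degree < n to data (t_i, y_i),
    via the Vandermonde matrix A_ij = t_i^(j-1)."""
    A = np.vander(t, n, increasing=True)        # columns 1, t, t^2, ..., t^(n-1)
    a, *_ = np.linalg.lstsq(A, y, rcond=None)   # solves min ||A a - y||
    return a

m = 100
t = np.linspace(0, 1, m)          # assumed grid: m evenly spaced points in [0, 1]
g = 4 * t / (1 + 10 * t**2)

for deg in (1, 2, 3, 4):
    a = ls_polyfit(t, g, deg + 1)
    fit = np.vander(t, deg + 1, increasing=True) @ a
    rms = np.sqrt(np.mean((fit - g) ** 2))
    print(f"degree {deg}: RMS fitting error {rms:.3f}")
```

Since the degree-$d$ polynomials are a subspace of the degree-$(d{+}1)$ polynomials, the RMS error can only decrease as the degree grows, matching the slide's sequence of errors.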
the solution for each $p$ is given by
$$x_{\mathrm{ls}}(p) = (A_p^T A_p)^{-1} A_p^T y = R_p^{-1} Q_p^T y$$
where
- $A_p = [a_1 \ \cdots \ a_p] \in \mathbf{R}^{m \times p}$ is the first $p$ columns of $A$
- $A_p = Q_p R_p$ is the QR factorization of $A_p$
- $R_p \in \mathbf{R}^{p \times p}$ is the leading $p \times p$ submatrix of $R$
- $Q_p = [q_1 \ \cdots \ q_p]$ is the first $p$ columns of $Q$

Norm of optimal residual versus $p$

the plot of the optimal residual versus $p$ shows how well $y$ can be matched by a linear combination of $a_1, \ldots, a_p$, as a function of $p$

[figure: residual norm versus $p$ for $p = 0, \ldots, 7$, decreasing from $\|y\|$ at $p = 0$ through $\min_{x_1} \|x_1 a_1 - y\|$ down to $\min_{x_1, \ldots, x_7} \|\sum_{i=1}^7 x_i a_i - y\|$]

Least-squares system identification

we measure input $u(t)$ and output $y(t)$ for $t = 0, \ldots, N$ of an unknown system

[diagram: $u(t) \to$ unknown system $\to y(t)$]

system identification problem: find a reasonable model for the system based on the measured I/O data $u$, $y$

example with scalar $u$, $y$ (vector $u$, $y$ readily handled): fit the I/O data with a moving-average (MA) model with $n$ delays
$$\hat y(t) = h_0 u(t) + h_1 u(t-1) + \cdots + h_n u(t-n)$$
where $h_0, \ldots, h_n \in \mathbf{R}$

we can write the model or predicted output as
$$\begin{bmatrix} \hat y(n) \\ \hat y(n+1) \\ \vdots \\ \hat y(N) \end{bmatrix} =
\begin{bmatrix} u(n) & u(n-1) & \cdots & u(0) \\ u(n+1) & u(n) & \cdots & u(1) \\ \vdots & \vdots & & \vdots \\ u(N) & u(N-1) & \cdots & u(N-n) \end{bmatrix}
\begin{bmatrix} h_0 \\ h_1 \\ \vdots \\ h_n \end{bmatrix}$$

- the model prediction error is $e = \bigl( y(n) - \hat y(n), \ \ldots, \ y(N) - \hat y(N) \bigr)$
- least-squares identification: choose the model (i.e., $h$) that minimizes the norm of the model prediction error $\|e\|$
- ... a least-squares problem (with variables $h$)

Example

[figure: measured input $u(t)$ and output $y(t)$ for $t = 0, \ldots, 70$]

for $n = 7$ we obtain the MA model with
$$(h_0, \ldots, h_7) = (.024, .282, .418, .354, .243, .487, .208, .441)$$
with relative prediction error $\|e\|/\|y\| = 0.37$

[figure: solid: $y(t)$, actual output; dashed: $\hat y(t)$, predicted from model]

Model order selection

question: how large should $n$ be?
- obviously the larger $n$, the smaller the prediction error on the data used to form the model
- this suggests using the largest possible model order for the smallest prediction error

[figure: relative prediction error $\|e\|/\|y\|$ versus $n$, decreasing toward zero as $n$ grows to 50]

difficulty: for $n$ too large, the predictive ability of the model on other I/O data (from the same system) becomes worse

Cross-validation

evaluate the model's predictive performance on another I/O data set, not used to develop the model

model validation data set:
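The least-squares MA identification step can be sketched with synthetic data, since the slides' measured $u$, $y$ are not reproduced here. The model order $n = 7$ and data length $N = 70$ follow the example; the random input, the "true" coefficients, and the noise level are assumptions made so that the recovered $h$ can be checked.

```python
import numpy as np

# Least-squares identification of the MA model
#   yhat(t) = h0 u(t) + h1 u(t-1) + ... + hn u(t-n)
# on synthetic data: generate u at random, produce y from an assumed
# "true" h plus small noise, then recover h by least squares.
rng = np.random.default_rng(0)
N, n = 70, 7
u = rng.standard_normal(N + 1)
h_true = rng.standard_normal(n + 1)              # assumed true MA coefficients
y = np.convolve(u, h_true)[: N + 1]              # exact MA output, t = 0, ..., N
y = y + 0.01 * rng.standard_normal(N + 1)        # small measurement noise

# rows t = n, ..., N of the model matrix: [u(t), u(t-1), ..., u(t-n)]
A = np.column_stack([u[n - k : N + 1 - k] for k in range(n + 1)])
h_ls, *_ = np.linalg.lstsq(A, y[n:], rcond=None)

e = y[n:] - A @ h_ls                             # model prediction error
print("relative prediction error:", np.linalg.norm(e) / np.linalg.norm(y[n:]))
```

Because the data really are generated by an MA model here, the relative prediction error is tiny; on the slides' real measurements (where the true system is not an order-7 MA model) it is 0.37.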
[figure: validation input $u(t)$ and output $y(t)$ for $t = 0, \ldots, 70$]

now check the prediction error of the models (developed using the modeling data) on the validation data:

[figure: relative prediction error versus $n$, for both validation data and modeling data; the modeling-data error keeps decreasing with $n$, while the validation-data error eventually grows]

the plot suggests $n = 10$ is a good choice

for $n = 50$ the actual and predicted outputs on the system identification and model validation data are:

[figure: two panels, one per data set; solid: $y(t)$, dashed: predicted $y(t)$]

the loss of predictive ability when $n$ is too large is called model overfit or overmodeling

Growing sets of measurements

least-squares problem in `row' form:
$$\text{minimize} \quad \|Ax - y\|^2 = \sum_{i=1}^m (a_i^T x - y_i)^2$$
where $a_i^T$ are the rows of $A$ ($a_i \in \mathbf{R}^n$)

- $x \in \mathbf{R}^n$ is some vector to be estimated
- each pair $a_i$, $y_i$ corresponds to one measurement
- the solution is
$$x_{\mathrm{ls}} = \left( \sum_{i=1}^m a_i a_i^T \right)^{-1} \sum_{i=1}^m y_i a_i$$
- suppose that $a_i$ and $y_i$ become available sequentially, i.e., $m$ increases with time

Recursive least-squares

we can compute
$$x_{\mathrm{ls}}(m) = \left( \sum_{i=1}^m a_i a_i^T \right)^{-1} \sum_{i=1}^m y_i a_i$$
recursively:

- initialize $P(0) = 0 \in \mathbf{R}^{n \times n}$, $q(0) = 0 \in \mathbf{R}^n$
- for $m = 0, 1, \ldots$:
$$P(m+1) = P(m) + a_{m+1} a_{m+1}^T, \qquad q(m+1) = q(m) + y_{m+1} a_{m+1}$$
- if $P(m)$ is invertible, we have $x_{\mathrm{ls}}(m) = P(m)^{-1} q(m)$
- $P(m)$ is invertible $\iff$ $a_1, \ldots, a_m$ span $\mathbf{R}^n$ (so, once $P(m)$ becomes invertible, it stays invertible)

Fast update for recursive least-squares

we can calculate $P(m+1)^{-1} = \bigl( P(m) + a_{m+1} a_{m+1}^T \bigr)^{-1}$ efficiently from $P(m)^{-1}$ using the rank one update formula
$$\bigl( P + aa^T \bigr)^{-1} = P^{-1} - \frac{1}{1 + a^T P^{-1} a}\, (P^{-1} a)(P^{-1} a)^T$$

- valid when $P = P^T$, and $P$ and $P + aa^T$ are both invertible
- gives an $O(n^2)$ method for computing $P(m+1)^{-1}$ from $P(m)^{-1}$
- standard methods for computing $P(m+1)^{-1}$ from $P(m+1)$ are $O(n^3)$

Verification of rank one update formula

$$\begin{aligned}
\bigl( P + aa^T \bigr) & \left( P^{-1} - \frac{1}{1 + a^T P^{-1} a}\, (P^{-1} a)(P^{-1} a)^T \right) \\
&= I + aa^T P^{-1} - \frac{1}{1 + a^T P^{-1} a}\, P (P^{-1} a)(P^{-1} a)^T - \frac{1}{1 + a^T P^{-1} a}\, aa^T (P^{-1} a)(P^{-1} a)^T \\
&= I + aa^T P^{-1} - \frac{1}{1 + a^T P^{-1} a}\, aa^T P^{-1} - \frac{a^T P^{-1} a}{1 + a^T P^{-1} a}\, aa^T P^{-1} \\
&= I + aa^T P^{-1} - \frac{1 + a^T P^{-1} a}{1 + a^T P^{-1} a}\, aa^T P^{-1} \\
&= I
\end{aligned}$$
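The recursive update above, with the rank-one inverse formula, can be sketched and checked against the batch least-squares solution. The problem size, random measurements, and the choice to initialize from the first $n$ rows (so that $P$ is already invertible when updates begin) are assumptions for illustration.

```python
import numpy as np

# Recursive least-squares: maintain P(m) = sum a_i a_i^T and q(m) = sum y_i a_i,
# and update P(m)^{-1} directly via the rank-one formula
#   (P + a a^T)^{-1} = P^{-1} - (P^{-1} a)(P^{-1} a)^T / (1 + a^T P^{-1} a),
# which costs O(n^2) per measurement instead of O(n^3) for a fresh inverse.
rng = np.random.default_rng(1)
n, m = 4, 30
a_rows = rng.standard_normal((m, n))   # measurement vectors a_i^T (rows of A)
y = rng.standard_normal(m)

# initialize from the first n measurements, so P is (almost surely) invertible
P = a_rows[:n].T @ a_rows[:n]
q = a_rows[:n].T @ y[:n]
Pinv = np.linalg.inv(P)

for i in range(n, m):                  # one O(n^2) update per new measurement
    a, yi = a_rows[i], y[i]
    Pa = Pinv @ a                      # P^{-1} a
    Pinv = Pinv - np.outer(Pa, Pa) / (1 + a @ Pa)
    q = q + yi * a

x_rls = Pinv @ q                       # x_ls(m) = P(m)^{-1} q(m)
x_batch, *_ = np.linalg.lstsq(a_rows, y, rcond=None)
print("max difference from batch solution:", np.max(np.abs(x_rls - x_batch)))
```

The recursive and batch answers agree to numerical precision, since both compute $(A^T A)^{-1} A^T y$; the recursive version just never re-factorizes the full problem.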