Rank-Sparsity Incoherence for Matrix Decomposition
V. Chandrasekaran, S. Sanghavi, P. A. Parrilo, and A. S. Willsky
[...]

a matrix have several implications in complexity theory [19]. Similarly, in a system identification setting the low-rank matrix represents a system with a small model order while the sparse matrix represents a system with a sparse impulse response. Decomposing a system into such simpler components can be used to provide a simpler, more efficient description.

1.1. Our results. Formally the decomposition problem we are interested in can be defined as follows:

Problem. Given C = A* + B*, where A* is an unknown sparse matrix and B* is an unknown low-rank matrix, recover A* and B* from C using no additional information on the sparsity pattern and/or the rank of the components.

In the absence of any further assumptions, this decomposition problem is fundamentally ill-posed. Indeed, there are a number of scenarios in which a unique splitting of C into "low-rank" and "sparse" parts may not exist; for example, the low-rank matrix may itself be very sparse, leading to identifiability issues. In order to characterize when such a decomposition is possible we develop a notion of rank-sparsity incoherence, an uncertainty principle between the sparsity pattern of a matrix and its row/column spaces. This condition is based on quantities involving the tangent spaces to the algebraic variety of sparse matrices and the algebraic variety of low-rank matrices [16].

Two natural identifiability problems may arise. The first one occurs if the low-rank matrix itself is very sparse. In order to avoid such a problem we impose certain conditions on the row/column spaces of the low-rank matrix. Specifically, for a matrix M let T(M) be the tangent space at M with respect to the variety of all matrices with rank less than or equal to rank(M). Operationally, T(M) is the span of all matrices with row-space contained in the row-space of M or with column-space contained in the column-space of M; see (3.2) for a formal characterization. Let ξ(M) be defined as follows:

    ξ(M) ≜ max_{N ∈ T(M), ‖N‖ ≤ 1} ‖N‖_∞.    (1.1)

Here ‖·‖ is the spectral norm (i.e., the largest singular value), and ‖·‖_∞ denotes the largest entry in magnitude. Thus ξ(M) being small implies that (appropriately scaled) elements of the tangent space T(M) are "diffuse", i.e., these elements are not too sparse; as a result M cannot be very sparse. As shown in Proposition 4 (see Section 4.3), a low-rank matrix M with row/column spaces that are not closely aligned with the coordinate axes has small ξ(M).

The other identifiability problem may arise if the sparse matrix has all its support concentrated in one column; the entries in this column could negate the entries of the corresponding low-rank matrix, thus leaving the rank and the column space of the low-rank matrix unchanged. To avoid such a situation, we impose conditions on the sparsity pattern of the sparse matrix so that its support is not too concentrated in any row/column. For a matrix M let Ω(M) be the tangent space at M with respect to the variety of all matrices with number of non-zero entries less than or equal to |support(M)|. The space Ω(M) is simply the set of all matrices that have support contained within the support of M; see (3.4). Let μ(M) be defined as follows:

    μ(M) ≜ max_{N ∈ Ω(M), ‖N‖_∞ ≤ 1} ‖N‖.    (1.2)

The quantity μ(M) being small for a matrix implies that the spectrum of any element of the tangent space Ω(M) is "diffuse", i.e., the singular values of these elements are

[...]

the sparse matrix A*, and not the singular values of B* or the values of the non-zero entries of A*. The reason for this is that the non-zero entries of A* and the singular values of B* play no role in the subgradient conditions with respect to the ℓ1 norm and the nuclear norm. In the sequel we discuss concrete classes of sparse and low-rank matrices that have small μ and ξ respectively. We also show that when the sparse and low-rank matrices A* and B* are drawn from certain natural random ensembles, then the sufficient conditions of Theorem 2 are satisfied with high probability; consequently, (1.3) provides exact recovery with high probability for such matrices.
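The convex program (1.3) referenced here is not printed in this excerpt, but the reparameterization (5.2) given later (with t = γ/(1+γ)) identifies it as minimizing γ‖A‖_1 + ‖B‖_* subject to A + B = C. A minimal CVXPY sketch of that relaxation, assuming an observed matrix C and a user-chosen trade-off γ; the function name, solver choice, and toy instance are illustrative, not the authors' setup:

    import cvxpy as cp
    import numpy as np

    def sparse_plus_low_rank(C, gamma):
        """Solve min_{A,B} gamma*||A||_1 + ||B||_*  s.t. A + B = C  (cf. (1.3))."""
        A = cp.Variable(C.shape)
        B = cp.Variable(C.shape)
        objective = cp.Minimize(gamma * cp.sum(cp.abs(A)) + cp.normNuc(B))
        problem = cp.Problem(objective, [A + B == C])
        problem.solve()  # any SDP-capable solver installed with CVXPY will do
        return A.value, B.value

    if __name__ == "__main__":
        # Small synthetic instance (sizes and gamma are arbitrary choices).
        rng = np.random.default_rng(0)
        n = 25
        B_true = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))  # rank 2
        A_true = np.zeros((n, n))
        A_true.flat[rng.choice(n * n, size=25, replace=False)] = rng.standard_normal(25)
        A_hat, B_hat = sparse_plus_low_rank(A_true + B_true, gamma=0.3)

The γ-selection heuristic of Section 5 can be layered on top of this routine by sweeping the trade-off parameter.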
1.2. Previous work using incoherence. The concept of incoherence was studied in the context of recovering sparse representations of vectors from a so-called "overcomplete dictionary" [9]. More concretely, consider a situation in which one is given a vector formed by a sparse linear combination of a few elements from a combined time-frequency dictionary, i.e., a vector formed by adding a few sinusoids and a few "spikes"; the goal is to recover the spikes and sinusoids that compose the vector from the infinitely many possible solutions. Based on a notion of time-frequency incoherence, the ℓ1 heuristic was shown to succeed in recovering sparse solutions [8]. Incoherence is also a concept that is implicitly used in recent work under the title of compressed sensing, which aims to recover "low-dimensional" objects such as sparse vectors [3, 11] and low-rank matrices [22, 4] given incomplete observations. Our work is closer in spirit to that in [9], and can be viewed as a method to recover the "simplest explanation" of a matrix given an "overcomplete dictionary" of sparse and low-rank matrix atoms.

1.3. Outline. In Section 2 we elaborate on the applications mentioned previously, and discuss the implications of our results for each of these applications. Section 3 formally describes conditions for fundamental identifiability in the decomposition problem based on the quantities ξ and μ defined in (1.1) and (1.2). We also provide a proof of the rank-sparsity uncertainty principle of Theorem 1. We prove Theorem 2 in Section 4, and also provide concrete classes of sparse and low-rank matrices that satisfy the sufficient conditions of Theorem 2. Section 5 describes the results of simulations of our approach applied to synthetic matrix decomposition problems. We conclude with a discussion in Section 6. The Appendix provides additional details and proofs.

2. Applications. In this section we describe several applications that involve decomposing a matrix into sparse and low-rank components.

2.1. Graphical modeling with latent variables. We begin with a problem in statistical model selection. In many applications large covariance matrices are approximated as low-rank matrices based on the assumption that a small number of latent factors explain most of the observed statistics (e.g., principal component analysis). Another well-studied class of models is that described by graphical models [18], in which the inverse of the covariance matrix (also called the precision or concentration or information matrix) is assumed to be sparse (typically this sparsity is with respect to some graph). We describe a model selection problem involving graphical models with latent variables. Let the covariance matrix of a collection of jointly Gaussian variables be denoted by Σ_(o h), where o represents observed variables and h represents unobserved, hidden variables. The marginal statistics corresponding to the observed variables o are given by the marginal covariance matrix Σ_o, which is simply a submatrix of the full covariance matrix Σ_(o h).
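The excerpt breaks off before this modeling discussion reaches its conclusion, but the mechanism being set up is the standard Gaussian marginalization identity: when the joint precision matrix over observed and hidden variables is sparse, the marginal precision of the observed variables equals the observed block minus a correction whose rank is at most the number of hidden variables. A small NumPy check of that identity under an assumed block partition (the names and toy construction are illustrative, not the paper's notation):

    import numpy as np

    rng = np.random.default_rng(1)
    n_obs, n_hid = 8, 2            # few hidden variables -> low-rank correction
    n = n_obs + n_hid

    # A sparse symmetric coupling pattern, made positive definite by diagonal dominance.
    K = np.zeros((n, n))
    for _ in range(12):
        i, j = rng.choice(n, size=2, replace=False)
        K[i, j] = K[j, i] = 0.2 * rng.standard_normal()
    K += np.diag(np.abs(K).sum(axis=1) + 1.0)      # joint precision matrix (sparse, PD)

    Ko, Koh = K[:n_obs, :n_obs], K[:n_obs, n_obs:]
    Kho, Kh = K[n_obs:, :n_obs], K[n_obs:, n_obs:]

    # Marginal precision of the observed variables via the Schur complement:
    Sigma = np.linalg.inv(K)                            # joint covariance
    marg_prec = np.linalg.inv(Sigma[:n_obs, :n_obs])    # inverse of the marginal covariance
    schur = Ko - Koh @ np.linalg.inv(Kh) @ Kho          # sparse block minus low-rank correction

    print(np.allclose(marg_prec, schur))                                    # True
    print(np.linalg.matrix_rank(Koh @ np.linalg.inv(Kh) @ Kho) <= n_hid)    # True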
Suppose, however, that we parameterize

[...]

are commonly modeled using the Hopkins integral [15], which gives the output intensity at a point as a function of the input transmission via a quadratic form. In many applications the operator in this quadratic form can be well-approximated by a (finite) positive semi-definite matrix. Optical systems described by a low-pass filter are called coherent imaging systems, and the corresponding system matrices have small rank. For systems that are not perfectly coherent various methods have been proposed to find an optimal coherent decomposition [21], and these essentially identify the best approximation of the system matrix by a matrix of lower rank. At the other end are incoherent optical systems that allow some high frequencies, and are characterized by system matrices that are diagonal. As most real-world imaging systems are some combination of coherent and incoherent, it was suggested in [12] that optical systems are better described by a sum of coherent and incoherent systems rather than by the best coherent (i.e., low-rank) approximation as in [21]. Thus, decomposing an imaging system into coherent and incoherent components involves splitting the optical system matrix into low-rank and diagonal components. Identifying these simpler components has important applications in tasks such as optical microlithography [21, 15].

3. Rank-Sparsity Incoherence. Throughout this paper, we restrict ourselves to square n×n matrices to avoid cluttered notation. All our analysis extends to rectangular n_1×n_2 matrices, if we simply replace n by max(n_1, n_2).

3.1. Identifiability issues. As described in the introduction, the matrix decomposition problem can be fundamentally ill-posed. We describe two situations in which identifiability issues arise. These examples suggest the kinds of additional conditions that are required in order to ensure that there exists a unique decomposition into sparse and low-rank matrices.

First, let A* be any sparse matrix and let B* = e_i e_j^T, where e_i represents the i-th standard basis vector. In this case, the low-rank matrix B* is also very sparse, and a valid sparse-plus-low-rank decomposition might be Â = A* + e_i e_j^T and B̂ = 0. Thus, we need conditions that ensure that the low-rank matrix is not too sparse. One way to accomplish this is to require that the quantity ξ(B*) be small. As will be discussed in Section 4.3, if the row and column spaces of B* are "incoherent" with respect to the standard basis, i.e., the row/column spaces are not aligned closely with any of the coordinate axes, then ξ(B*) is small.

Next, consider the scenario in which B* is any low-rank matrix and A* = −v e_1^T, with v being the first column of B*. Thus, C = A* + B* has zeros in the first column, rank(C) = rank(B*), and C has the same column space as B*. Therefore, a reasonable sparse-plus-low-rank decomposition in this case might be B̂ = B* + A* and Â = 0. Here rank(B̂) = rank(B*). Requiring that a sparse matrix A* have small μ(A*) avoids such identifiability issues. Indeed we show in Section 4.3 that sparse matrices with "bounded degree" (i.e., few non-zero entries per row/column) have small μ.

3.2. Tangent-space identifiability. We begin by describing the sets of sparse and low-rank matrices. These sets can be considered either as differentiable manifolds (away from their singularities) or as algebraic varieties; we emphasize the latter viewpoint here. Recall that an algebraic variety is defined as the zero set of a system of polynomial equations [16]. The variety of rank-constrained matrices is defined as:

    P(k) ≜ { M ∈ R^{n×n} | rank(M) ≤ k }.    (3.1)

[...]

Thus, both μ(A*) and ξ(B*) being small implies that the tangent spaces Ω(A*) and T(B*) intersect transversally; consequently, we can exactly recover (A*, B*) given Ω(A*) and T(B*). As we shall see, the condition required in Theorem 2 (see Section 4.2) for exact recovery using the convex program (1.3) will be simply a mild tightening of the condition required above for unique decomposition given the tangent spaces.

3.3. Rank-sparsity uncertainty principle. Another important consequence of Proposition 1 is that we have an elementary proof of the following rank-sparsity uncertainty principle.

Theorem 1. For any matrix M ≠ 0, we have that

    ξ(M) μ(M) ≥ 1,

where ξ(M) and μ(M) are as defined in (1.1) and (1.2) respectively.

Proof: Given any M ≠ 0 it is clear that M ∈ Ω(M) ∩ T(M), i.e., M is an element of both tangent spaces. However ξ(M) μ(M) < 1 would imply from Proposition 1 that Ω(M) ∩ T(M) = {0}, which is a contradiction. Consequently, we must have that ξ(M) μ(M) ≥ 1.

Hence, for any matrix M ≠ 0 both ξ(M) and μ(M) cannot be simultaneously small. Note that Proposition 1 is an assertion involving μ and ξ for (in general) different matrices, while Theorem 1 is a statement about μ and ξ for the same matrix. Essentially the uncertainty principle asserts that no matrix can be too sparse while having "diffuse" row and column spaces. An extreme example is the matrix e_i e_j^T, which has the property that ξ(e_i e_j^T) μ(e_i e_j^T) = 1.
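The two failure modes of Section 3.1 are easy to reproduce numerically; the first one also involves the extreme matrix e_i e_j^T appearing in the uncertainty principle above. A short NumPy sketch, with arbitrary sizes and indices (the specific numbers are not from the paper):

    import numpy as np

    n = 10
    e = np.eye(n)
    rng = np.random.default_rng(2)

    # Example 1: B* = e_i e_j^T is simultaneously rank-one and 1-sparse, so
    # (A* + e_i e_j^T, 0) is as valid a sparse-plus-low-rank splitting as (A*, B*).
    i, j = 3, 7
    B_star = np.outer(e[i], e[j])
    A_star = np.zeros((n, n))
    A_star[rng.choice(n, 5), rng.choice(n, 5)] = 1.0
    C = A_star + B_star
    print(np.linalg.matrix_rank(B_star), np.count_nonzero(B_star))          # 1, 1

    # Example 2: A* = -v e_1^T, with v the first column of a low-rank B*, zeroes
    # out the first column of C without changing rank(C) or the column space of B*.
    B_star = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))       # rank 2
    A_star = -np.outer(B_star[:, 0], e[0])
    C = A_star + B_star
    print(np.linalg.matrix_rank(C) == np.linalg.matrix_rank(B_star))         # True
    print(np.allclose(C[:, 0], 0))                                           # True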
4. Exact Decomposition Using Semidefinite Programming. We begin this section by studying the optimality conditions of the convex program (1.3), after which we provide a proof of Theorem 2 with simple conditions that guarantee exact decomposition. Next we discuss concrete classes of sparse and low-rank matrices that satisfy the conditions of Theorem 2, and can thus be uniquely decomposed using (1.3).

4.1. Optimality conditions. The orthogonal projection onto the space Ω(A*) is denoted P_{Ω(A*)}, which simply sets to zero those entries with support not inside support(A*). The subspace orthogonal to Ω(A*) is denoted Ω(A*)^c, and it consists of matrices with complementary support, i.e., supported on support(A*)^c. The projection onto Ω(A*)^c is denoted P_{Ω(A*)^c}.

Similarly the orthogonal projection onto the space T(B*) is denoted P_{T(B*)}. Letting B* = UΣV^T be the SVD of B*, we have the following explicit relation for P_{T(B*)}:

    P_{T(B*)}(M) = P_U M + M P_V − P_U M P_V.    (4.1)

Here P_U = U U^T and P_V = V V^T. The space orthogonal to T(B*) is denoted T(B*)^⊥, and the corresponding projection is denoted P_{T(B*)^⊥}(M). The space T(B*)^⊥ consists of matrices with row-space orthogonal to the row-space of B* and column-space orthogonal to the column-space of B*. We have that

    P_{T(B*)^⊥}(M) = (I_{n×n} − P_U) M (I_{n×n} − P_V),    (4.2)

where I_{n×n} is the n×n identity matrix.

[...]

4.3. Sparse and low-rank matrices with μ(A*) ξ(B*) < 1/6. We discuss concrete classes of sparse and low-rank matrices that satisfy the sufficient condition of Theorem 2 for exact decomposition. We begin by showing that sparse matrices with "bounded degree", i.e., a bounded number of non-zeros per row/column, have small μ.

Proposition 3. Let A ∈ R^{n×n} be any matrix with at most deg_max(A) non-zero entries per row/column, and with at least deg_min(A) non-zero entries per row/column. With μ(A) as defined in (1.2), we have that

    deg_min(A) ≤ μ(A) ≤ deg_max(A).

See Appendix B for the proof. Note that if A ∈ R^{n×n} has full support, i.e., Ω(A) = R^{n×n}, then μ(A) = n. Therefore, a constraint on the number of non-zeros per row/column provides a useful bound on μ. We emphasize here that simply bounding the number of non-zero entries in A does not suffice; the sparsity pattern also plays a role in determining the value of μ.

Next we consider low-rank matrices that have small ξ. Specifically, we show that matrices with row and column spaces that are incoherent with respect to the standard basis have small ξ. We measure the incoherence of a subspace S ⊆ R^n as follows:

    β(S) ≜ max_i ‖P_S e_i‖_2,    (4.6)

where e_i is the i-th standard basis vector, P_S denotes the projection onto the subspace S, and ‖·‖_2 denotes the vector ℓ2 norm. This definition of incoherence also played an important role in the results in [4]. A small value of β(S) implies that the subspace S is not closely aligned with any of the coordinate axes. In general for any k-dimensional subspace S, we have that

    √(k/n) ≤ β(S) ≤ 1,

where the lower bound is achieved, for example, by a subspace that spans any k columns of an n×n orthonormal Hadamard matrix, while the upper bound is achieved by any subspace that contains a standard basis vector. Based on the definition of β(S), we define the incoherence of the row/column spaces of a matrix B ∈ R^{n×n} as

    inc(B) ≜ max{ β(row-space(B)), β(column-space(B)) }.    (4.7)

If the SVD of B is B = UΣV^T, then row-space(B) = span(V) and column-space(B) = span(U). We show in Appendix B that matrices with incoherent row/column spaces have small ξ; the proof technique for the lower bound here was suggested by Ben Recht [23].

Proposition 4. Let B ∈ R^{n×n} be any matrix with inc(B) defined as in (4.7), and ξ(B) defined as in (1.1). We have that

    inc(B) ≤ ξ(B) ≤ 2 inc(B).

If B ∈ R^{n×n} is a full-rank matrix or a matrix such as e_1 e_1^T, then ξ(B) = 1. Therefore, a bound on the incoherence of the row/column spaces of B is important in order to bound ξ.
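The projection (4.1) and the quantities appearing in Propositions 3 and 4 are directly computable from an SVD and a support pattern. A NumPy sketch (function names are mine, not the paper's; the rank tolerance is an assumption):

    import numpy as np

    def tangent_space_projection(B, M, tol=1e-10):
        """P_T(B)(M) = P_U M + M P_V - P_U M P_V, cf. (4.1), with U, V from the SVD of B."""
        U, s, Vt = np.linalg.svd(B)
        r = int(np.sum(s > tol))
        PU = U[:, :r] @ U[:, :r].T
        PV = Vt[:r, :].T @ Vt[:r, :]
        return PU @ M + M @ PV - PU @ M @ PV

    def incoherence(B, tol=1e-10):
        """inc(B) from (4.6)-(4.7): max over basis vectors of their projection onto row/column space."""
        U, s, Vt = np.linalg.svd(B)
        r = int(np.sum(s > tol))
        beta_col = np.max(np.linalg.norm(U[:, :r], axis=1))   # ||P_U e_i||_2 = norm of i-th row of U_r
        beta_row = np.max(np.linalg.norm(Vt[:r, :], axis=0))  # ||P_V e_i||_2 = norm of i-th column of Vt_r
        return max(beta_col, beta_row)

    def degree_bounds(A):
        """(deg_min(A), deg_max(A)): min/max number of non-zeros per row/column, cf. Proposition 3."""
        degs = np.concatenate([np.count_nonzero(A, axis=1), np.count_nonzero(A, axis=0)])
        return int(degs.min()), int(degs.max())

Proposition 4 then sandwiches ξ(B) between incoherence(B) and twice that value, while Proposition 3 sandwiches μ(A) between the two degree numbers returned above.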
Using Propositions 3 and 4 along with Theorem 2 we have the

[...]

with very high probability. Applying these two results in conjunction with Corollary 3, we have that sparse and low-rank matrices drawn from the random sparsity model and the random orthogonal model can be uniquely decomposed with high probability.

Corollary 4. Suppose that a rank-k matrix B* ∈ R^{n×n} is drawn from the random orthogonal model, and that A* ∈ R^{n×n} is drawn from the random sparsity model with m non-zero entries. Given C = A* + B*, there exists a range of values for γ (given by (4.8)) so that (Â, B̂) = (A*, B*) is the unique optimum of the SDP (1.3) with high probability provided

    m ≲ n^{1.5} / (log n · √(max(k, log n))).

Thus, for matrices B* with rank k sufficiently smaller than n, the SDP (1.3) yields exact recovery with high probability even when the size of the support of A* is super-linear in n.

During final preparation of this manuscript we learned of related contemporaneous work [30] that specifically studies the problem of decomposing random sparse and low-rank matrices. In addition to the assumptions of our random sparsity and random orthogonal models, [30] also requires that the non-zero entries of A* have independently chosen signs that are ±1 with equal probability, while the left and right singular vectors of B* are chosen independently of each other. For this particular specialization of our more general framework, the results in [30] improve upon our bound in Corollary 4.

Implications for the matrix rigidity problem. Corollary 4 has implications for the matrix rigidity problem discussed in Section 2. Recall that R_M(k) is the smallest number of entries of M that need to be changed to reduce the rank of M below k (the changes can be of arbitrary magnitude). A generic matrix M ∈ R^{n×n} has rigidity R_M(k) = (n−k)^2 [27]. However, special structured classes of matrices can have low rigidity. Consider a matrix M formed by adding a sparse matrix drawn from the random sparsity model with support size O(n/log n), and a low-rank matrix drawn from the random orthogonal model with rank εn for some fixed ε > 0. Such a matrix has rigidity R_M(εn) = O(n/log n), and one can recover the sparse and low-rank components that compose M with high probability by solving the SDP (1.3). To see this, note that

    n/log n ≲ n^{1.5} / (log n · √(max(εn, log n))) = n^{1.5} / (log n · √(εn)),

which satisfies the sufficient condition of Corollary 4 for exact recovery. Therefore, while the rigidity of a matrix is NP-hard to compute in general [7], for such low-rigidity matrices M one can compute the rigidity R_M(εn); in fact the SDP (1.3) provides a certificate of the sparse and low-rank matrices that form the low-rigidity matrix M.

5. Simulation Results. We confirm the theoretical predictions in this paper with some simple experimental results. We also present a heuristic to choose the trade-off parameter γ. All our simulations were performed using YALMIP [31] and the SDPT3 software [26] for solving SDPs.

In the first experiment we generate random 25×25 matrices according to the random sparsity and random orthogonal models described in Section 4.4. To generate a random rank-k matrix B* according to the random orthogonal model, we generate

[...]

can simply check the stability of the solution (Â, B̂) as γ is varied without knowing the appropriate range for γ in advance. To formalize this scheme we consider the following SDP for t ∈ [0, 1], which is a slightly modified version of (1.3):

    (Â_t, B̂_t) = argmin_{A,B}  t ‖A‖_1 + (1−t) ‖B‖_*
                  s.t.  A + B = C.    (5.2)

There is a one-to-one correspondence between (1.3) and (5.2) given by t = γ/(1+γ). The benefit in looking at (5.2) is that the range of valid parameters is compact, i.e., t ∈ [0, 1], as opposed to the situation in (1.3) where γ ∈ [0, ∞). We compute the difference between solutions for some t and t−ε as follows:

    diff_t = ‖Â_{t−ε} − Â_t‖_F + ‖B̂_{t−ε} − B̂_t‖_F,    (5.3)

where ε > 0 is some small fixed constant, say ε = 0.01.
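A sketch of the stability sweep just described: solve (5.2) over a grid of t and record diff_t from (5.3). This uses CVXPY rather than the YALMIP/SDPT3 setup mentioned above, and the grid spacing equals the illustrative ε = 0.01 from the text:

    import numpy as np
    import cvxpy as cp

    def solve_52(C, t):
        """(5.2): min_{A,B} t*||A||_1 + (1-t)*||B||_*  s.t. A + B = C."""
        A = cp.Variable(C.shape)
        B = cp.Variable(C.shape)
        prob = cp.Problem(cp.Minimize(t * cp.sum(cp.abs(A)) + (1 - t) * cp.normNuc(B)),
                          [A + B == C])
        prob.solve()
        return A.value, B.value

    def stability_sweep(C, eps=0.01):
        """Return (t_grid, diff_t) with diff_t = ||A_{t-eps}-A_t||_F + ||B_{t-eps}-B_t||_F, cf. (5.3)."""
        ts = np.arange(eps, 1.0, eps)
        prev = solve_52(C, max(ts[0] - eps, 0.0))
        diffs = []
        for t in ts:
            cur = solve_52(C, t)
            diffs.append(np.linalg.norm(cur[0] - prev[0], "fro")
                         + np.linalg.norm(cur[1] - prev[1], "fro"))
            prev = cur
        return ts, np.array(diffs)

One would then look for the "middle" plateau where diff_t stays near zero, as described in the text.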
We generate a random A* ∈ R^{25×25} that is 25-sparse and a random B* ∈ R^{25×25} with rank k = 2 as described above. Given C = A* + B*, we solve (5.2) for various values of t. Figure 5.2 shows two curves: one is tol_t (which is defined analogously to tol_γ in (5.1)) and the other is diff_t. Clearly we do not have access to tol_t in practice. However, we see that diff_t is near-zero in exactly three regions. For sufficiently small t the optimal solution to (5.2) is (Â_t, B̂_t) = (A* + B*, 0), while for sufficiently large t the optimal solution is (Â_t, B̂_t) = (0, A* + B*). As seen in the figure, diff_t stabilizes for small and large t. The third "middle" range of stability is where we typically have (Â_t, B̂_t) = (A*, B*). Notice that outside of these three regions diff_t is not close to 0 and in fact changes rapidly. Therefore if a reasonable guess for t (or γ) is not available, one could solve (5.2) for a range of t and choose a solution corresponding to the "middle" range in which diff_t is stable and near zero. A related method to check for stability is to compute the sensitivity of the cost of the optimal solution with respect to γ, which can be obtained from the dual solution.
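A possible way to assemble the 25×25 test instance described above and run the sweep from the previous sketch. The precise random sparsity and random orthogonal ensembles are specified on pages not included in this excerpt, so the generators below are stand-ins with the right sparsity and rank, not the authors' exact models:

    import numpy as np
    # assumes solve_52 / stability_sweep from the previous sketch

    rng = np.random.default_rng(0)
    n, m, k = 25, 25, 2

    # A 25-sparse A*: m entries at uniformly random positions with Gaussian values
    # (a stand-in for the random sparsity model).
    A_star = np.zeros((n, n))
    A_star.flat[rng.choice(n * n, size=m, replace=False)] = rng.standard_normal(m)

    # A rank-2 B* with generic row/column spaces (a stand-in for the random
    # orthogonal model).
    B_star = rng.standard_normal((n, k)) @ rng.standard_normal((k, n))

    C = A_star + B_star
    ts, diffs = stability_sweep(C, eps=0.01)
    # Plateaus of near-zero diffs at small t, at large t, and (typically) in a middle
    # range where (A_t, B_t) = (A*, B*) are what Figure 5.2 in the text illustrates.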
6. Discussion. We have studied the problem of exactly decomposing a given matrix C = A* + B* into its sparse and low-rank components A* and B*. This problem arises in a number of applications in model selection, system identification, complexity theory, and optics. We characterized fundamental identifiability in the decomposition problem based on a notion of rank-sparsity incoherence, which relates the sparsity pattern of a matrix and its row/column spaces via an uncertainty principle. As the general decomposition problem is NP-hard we propose a natural SDP relaxation (1.3) to solve the problem, and provide sufficient conditions on sparse and low-rank matrices so that the SDP exactly recovers such matrices. Our sufficient conditions are deterministic in nature; they essentially require that the sparse matrix must have support that is not too concentrated in any row/column, while the low-rank matrix must have row/column spaces that are not closely aligned with the coordinate axes. Our analysis centers around studying the tangent spaces with respect to the algebraic varieties of sparse and low-rank matrices. Indeed the sufficient conditions for identifiability and for exact recovery using the SDP can also be viewed as requiring that certain tangent spaces have a transverse intersection. We also demonstrated the implications of our results for the matrix rigidity problem.

An interesting problem for further research is the development of special-purpose algorithms that take advantage of structure in (1.3) to provide a more efficient solution than a general-purpose SDP solver. Another question that arises in applications such

[...]

which would allow us to conclude the proof of this proposition. We have the following sequence of inequalities:

    max_{N ∈ T(B*), ‖N‖ ≤ 1} ‖P_{Ω(A*)}(N)‖
        ≤ max_{N ∈ T(B*), ‖N‖ ≤ 1} μ(A*) ‖P_{Ω(A*)}(N)‖_∞
        ≤ max_{N ∈ T(B*), ‖N‖ ≤ 1} μ(A*) ‖N‖_∞
        ≤ μ(A*) ξ(B*).

Here the first inequality follows from the definition (1.2) of μ(A*) as P_{Ω(A*)}(N) ∈ Ω(A*), the second inequality is due to the fact that ‖P_{Ω(A*)}(N)‖_∞ ≤ ‖N‖_∞, and the final inequality follows from the definition (1.1) of ξ(B*).

Proof of Proposition 2. We first show that (A*, B*) is an optimum of (1.3), before moving on to showing uniqueness. Based on subgradient optimality conditions applied at (A*, B*), there must exist a dual Q such that

    Q ∈ γ ∂‖A*‖_1  and  Q ∈ ∂‖B*‖_*.

The second condition in this proposition guarantees the existence of a dual Q that satisfies both these subgradient conditions simultaneously (see (4.4) and (4.5)). Therefore, we have that (A*, B*) is an optimum. Next we show that under the conditions specified in the lemma, (A*, B*) is also a unique optimum. To avoid cluttered notation, in the rest of this proof we let Ω = Ω(A*), T = T(B*), Ω(A*)^c = Ω^c, and T(B*)^⊥ = T^⊥.

Suppose that there is another feasible solution (A* + N_A, B* + N_B) that is also a minimizer. We must have that N_A + N_B = 0 because A* + B* = C = (A* + N_A) + (B* + N_B). Applying the subgradient property at (A*, B*), we have that for any subgradient (Q_A, Q_B) of the function γ‖A‖_1 + ‖B‖_* (at (A*, B*))

    γ‖A* + N_A‖_1 + ‖B* + N_B‖_* ≥ γ‖A*‖_1 + ‖B*‖_* + ⟨Q_A, N_A⟩ + ⟨Q_B, N_B⟩.    (B.2)

Since (Q_A, Q_B) is a subgradient of the function γ‖A‖_1 + ‖B‖_* at (A*, B*), we must have from (4.4) and (4.5) that

    Q_A = γ sign(A*) + P_{Ω^c}(Q_A),  with ‖P_{Ω^c}(Q_A)‖_∞ ≤ γ,
    Q_B = UV' + P_{T^⊥}(Q_B),  with ‖P_{T^⊥}(Q_B)‖ ≤ 1.

Using these conditions we rewrite ⟨Q_A, N_A⟩ and ⟨Q_B, N_B⟩. Based on the existence of the dual Q as described in the lemma, we have that

    ⟨Q_A, N_A⟩ = ⟨γ sign(A*) + P_{Ω^c}(Q_A), N_A⟩
               = ⟨Q − P_{Ω^c}(Q) + P_{Ω^c}(Q_A), N_A⟩
               = ⟨P_{Ω^c}(Q_A) − P_{Ω^c}(Q), N_A⟩ + ⟨Q, N_A⟩,    (B.3)

where we have used the fact that Q = γ sign(A*) + P_{Ω^c}(Q). Similarly, we have that

    ⟨Q_B, N_B⟩ = ⟨UV' + P_{T^⊥}(Q_B), N_B⟩
               = ⟨Q − P_{T^⊥}(Q) + P_{T^⊥}(Q_B), N_B⟩
               = ⟨P_{T^⊥}(Q_B) − P_{T^⊥}(Q), N_B⟩ + ⟨Q, N_B⟩,    (B.4)

where we have used the fact that Q = UV' + P_{T^⊥}(Q). Putting (B.3) and (B.4)

[...]

of the splitting can be concluded because Ω ∩ T = {0}. Let Q_Ω = γ sign(A*) + ε_Ω and Q_T = UV' + ε_T. We then have

    P_Ω(Q̂) = γ sign(A*) + ε_Ω + P_Ω(Q_T) = γ sign(A*) + ε_Ω + P_Ω(UV' + ε_T).

Since P_Ω(Q̂) = γ sign(A*),

    ε_Ω = −P_Ω(UV' + ε_T).    (B.7)

Similarly,

    ε_T = −P_T(γ sign(A*) + ε_Ω).    (B.8)

Next, we obtain the following bound on ‖P_{Ω^c}(Q̂)‖_∞:

    ‖P_{Ω^c}(Q̂)‖_∞ = ‖P_{Ω^c}(UV' + ε_T)‖_∞ ≤ ‖UV' + ε_T‖_∞ ≤ ξ(B*) ‖UV' + ε_T‖ ≤ ξ(B*)(1 + ‖ε_T‖),    (B.9)

where we obtain the second inequality based on the definition of ξ(B*) (since UV' + ε_T ∈ T). Similarly, we can obtain the following bound on ‖P_{T^⊥}(Q̂)‖:

    ‖P_{T^⊥}(Q̂)‖ = ‖P_{T^⊥}(γ sign(A*) + ε_Ω)‖ ≤ ‖γ sign(A*) + ε_Ω‖ ≤ μ(A*) ‖γ sign(A*) + ε_Ω‖_∞ ≤ μ(A*)(γ + ‖ε_Ω‖_∞),    (B.10)

where we obtain the second inequality based on the definition of μ(A*) (since γ sign(A*) + ε_Ω ∈ Ω). Thus, we can bound ‖P_{Ω^c}(Q̂)‖_∞ and ‖P_{T^⊥}(Q̂)‖ by bounding ‖ε_T‖ and ‖ε_Ω‖_∞ respectively (using the relations (B.8) and (B.7)). By definition of ξ(B*) and using (B.7),

    ‖ε_Ω‖_∞ = ‖P_Ω(UV' + ε_T)‖_∞ ≤ ‖UV' + ε_T‖_∞ ≤ ξ(B*) ‖UV' + ε_T‖ ≤ ξ(B*)(1 + ‖ε_T‖),    (B.11)

where the second inequality is obtained because UV' + ε_T ∈ T. Similarly, by definition of μ(A*) and using (B.8),

    ‖ε_T‖ = ‖P_T(γ sign(A*) + ε_Ω)‖ ≤ 2‖γ sign(A*) + ε_Ω‖ ≤ 2μ(A*) ‖γ sign(A*) + ε_Ω‖_∞ ≤ 2μ(A*)(γ + ‖ε_Ω‖_∞),    (B.12)

where the first inequality is obtained because ‖P_T(M)‖ ≤ 2‖M‖, and the second inequality is obtained because γ sign(A*) + ε_Ω ∈ Ω.

[...]

Upper bound. Since the reformulation of μ(A) above involves the maximization of a continuous function over a compact set, the maximum is achieved at some point in the constraint set. Therefore, we have that any optimal (x̂, ŷ) must satisfy the following necessary optimality conditions: there exist Lagrange multipliers λ_1, λ_2 such that

    ∇_x [ Σ_{(i,j) ∈ Ω(A)} x_i y_j ] (x̂, ŷ) = 2 λ_1 x̂,
    ∇_y [ Σ_{(i,j) ∈ Ω(A)} x_i y_j ] (x̂, ŷ) = 2 λ_2 ŷ.

This reduces to the following system of equations:

    Σ_{j : (i,j) ∈ Ω(A)} ŷ_j = 2 λ_1 x̂_i,  ∀ i,    (B.16)
    Σ_{i : (i,j) ∈ Ω(A)} x̂_i = 2 λ_2 ŷ_j,  ∀ j.    (B.17)

Multiplying the first system of equations (B.16) element-wise by x̂ and then summing, we have that

    Σ_i x̂_i Σ_{j : (i,j) ∈ Ω(A)} ŷ_j = Σ_i x̂_i · 2 λ_1 x̂_i  ⟹  Σ_{(i,j) ∈ Ω(A)} x̂_i ŷ_j = 2 λ_1.

Similarly, we have that Σ_{(i,j) ∈ Ω(A)} x̂_i ŷ_j = 2 λ_2, which implies that the Lagrange multipliers are equal to each other and to one-half of the optimal value attained:

    2 λ_1 = 2 λ_2 = Σ_{(i,j) ∈ Ω(A)} x̂_i ŷ_j ≜ 2 λ.

We recall here that the optimal points x̂, ŷ are element-wise non-negative. Let σ denote the element-wise sum of the optimal points x̂, ŷ:

    σ = Σ_i x̂_i + Σ_j ŷ_j.

Summing over all i in (B.16) and all j in (B.17), we have that

    Σ_i Σ_{j : (i,j) ∈ Ω(A)} ŷ_j + Σ_j Σ_{i : (i,j) ∈ Ω(A)} x̂_i = 2 λ σ
    ⟹  Σ_{(i,j) ∈ Ω(A)} ŷ_j + Σ_{(i,j) ∈ Ω(A)} x̂_i = 2 λ σ
    ⟹  Σ_j deg_max(A) ŷ_j + Σ_i deg_max(A) x̂_i ≥ 2 λ σ
    ⟹  deg_max(A) σ ≥ 2 λ σ
    ⟹  deg_max(A) ≥ 2 λ = Σ_{(i,j) ∈ Ω(A)} x̂_i ŷ_j.

Note that we used the fact that σ ≠ 0. Thus, we have that μ(A) ≤ deg_max(A).
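Proposition 3's bounds can also be probed numerically: μ(A) is the largest spectral norm over matrices supported on support(A) with entries bounded by 1 in magnitude, and since the spectral norm is convex this maximum is attained at some ±1 sign pattern, so sampling sign patterns gives a lower estimate to compare against deg_min(A) and deg_max(A). A small sanity-check sketch (not part of the paper's argument; the circulant test support is my choice):

    import numpy as np

    def mu_lower_estimate(A, samples=2000, seed=0):
        """Monte Carlo lower bound on mu(A) = max{||N|| : N in Omega(A), ||N||_inf <= 1}.

        The maximum is attained at a +/-1 sign pattern on support(A) (spectral norm is
        convex, the feasible set is a box), so sampling patterns can only underestimate."""
        rng = np.random.default_rng(seed)
        S = (A != 0).astype(float)
        best = np.linalg.norm(S, 2)   # the all-plus-ones pattern is one candidate
        for _ in range(samples):
            best = max(best, np.linalg.norm(S * rng.choice([-1.0, 1.0], size=A.shape), 2))
        return best

    # A support with exactly d non-zeros in every row and column (circulant bands),
    # so deg_min(A) = deg_max(A) = d and Proposition 3 pins mu(A) to exactly d; the
    # estimate reaches d because the all-ones pattern has spectral norm d here.
    n, d = 12, 3
    A = np.zeros((n, n))
    for shift in range(d):
        A[np.arange(n), (np.arange(n) + shift) % n] = 1.0
    print(mu_lower_estimate(A))   # 3.0, matching deg_min(A) = deg_max(A) = 3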
[...]

Lower bound. Next we prove a lower bound on ξ(B). Recall the definition of the tangent space T(B) from (3.2). We restrict our attention to elements of the tangent space T(B) of the form P_U M = U U^T M for M unitary (an analogous argument follows for elements of the form P_V M for M unitary). One can check that

    ‖P_U M‖ = max_{‖x‖_2 = 1, ‖y‖_2 = 1} x^T P_U M y ≤ max_{‖x‖_2 = 1} ‖P_U x‖_2 · max_{‖y‖_2 = 1} ‖M y‖_2 ≤ 1.

Therefore,

    ξ(B) ≥ max_{M unitary} ‖P_U M‖_∞.

Thus, we only need to show that the inequality in line (2) of (B.18) is achieved by some unitary matrix M in order to conclude that ξ(B) ≥ β(U). Define the "most aligned" basis vector with the subspace U as follows:

    i* = argmax_i ‖P_U e_i‖_2.

Let M be any unitary matrix with one of its columns equal to (1/β(U)) P_U e_{i*}, i.e., a normalized version of the projection onto U of the most aligned basis vector. One can check that such a unitary matrix achieves equality in line (2) of (B.18). Consequently, we have that

    ξ(B) ≥ max_{M unitary} ‖P_U M‖_∞ = β(U).

By a similar argument with respect to V, we have the lower bound as claimed in the proposition.

REFERENCES

[1] D. P. Bertsekas, A. Nedic, and A. E. Ozdaglar, Convex Analysis and Optimization, Athena Scientific, Belmont, MA, 2003.
[2] B. Bollobas, Random Graphs, Cambridge University Press, 2001.
[3] E. J. Candes, J. Romberg, and T. Tao, Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, volume 52, number 2, pages 489-509, 2006.
[4] E. J. Candes and B. Recht, Exact matrix completion via convex optimization, submitted for publication, 2008.
[5] V. Chandrasekaran, S. Sanghavi, P. A. Parrilo, and A. S. Willsky, Sparse and low-rank matrix decompositions, Proceedings of the 15th IFAC Symposium on System Identification, 2009.
[6] V. Chandrasekaran, S. Sanghavi, P. A. Parrilo, and A. S. Willsky, Latent-variable Gaussian graphical model selection, in preparation.
[7] B. Codenotti, Matrix rigidity, Linear Algebra and its Applications, volume 304, number 1-3, pages 181-192, 2000.
[8] D. L. Donoho and X. Huo, Uncertainty principles and ideal atomic decomposition, IEEE Transactions on Information Theory, volume 47, number 7, pages 2845-2862, 2001.
[9] D. L. Donoho and M. Elad, Optimal sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization, Proceedings of the National Academy of Sciences, volume 100, pages 2197-2202, 2003.
[10] D. L. Donoho, For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution, Communications on Pure and Applied Mathematics, volume 59, issue 6, pages 797-829, 2006.
[11] D. L. Donoho, Compressed sensing, IEEE Transactions on Information Theory, volume 52, number 4, pages 1289-1306, 2006.