A TUTORIAL ON SUBSPACE CLUSTERING
René Vidal, Johns Hopkins University
In what follows, we describe several methods for building an affinity between pairs of points lying in multiple subspaces. The first two methods (factorization and GPCA) are designed for linear subspaces, though they can be applied to affine subspaces by using homogeneous coordinates. The remaining methods can handle either linear or affine subspaces.

Factorization-based affinity. Interestingly, one of the first subspace clustering algorithms is based on both matrix factorization and spectral clustering. Specifically, the algorithm of Boult and Brown [7] obtains the segmentation of the data from the eigenvectors of the matrix Q = V V^T in (6). Since these eigenvectors are the singular vectors of X, the segmentation is obtained by clustering the rows of V. However, recall that the affinity A_{jk} = Q_{jk} has a number of issues. First, it is not necessarily the case that A_{jk} ≈ 1 when points j and k are in the same subspace. Second, the equation Q_{jk} = 0 is sensitive to noise and is valid only for independent subspaces.

GPCA-based affinity. The GPCA algorithm can also be used to define an affinity between pairs of points. Recall that the derivatives of the polynomials p(x_j) at a point x_j ∈ S_i provide an estimate of the normal vectors to subspace S_i. Therefore, one can use the angles between the subspaces to define an affinity as A_{jk} = \prod_{m=1}^{\min(d_j, d_k)} \cos^2(\theta_{jk}^m), where \theta_{jk}^m is the m-th subspace angle between the bases of the estimated subspaces \hat{S}_j and \hat{S}_k at points j and k, respectively, for j, k = 1, ..., N. The segmentation of the data is then found by applying spectral clustering to the normalized Laplacian.

Local Subspace Affinity (LSA) and Spectral Local Best-fit Flats (SLBF). The LSA [21] and SLBF [22] algorithms are based on the observation that a point and its nearest neighbors (NNs) often belong to the same subspace. Therefore, we can fit an affine subspace \hat{S}_j to each point j and its d-NNs using, e.g., PCA. In practice, we can choose K ≥ d NNs, hence d does not need to be known exactly: we only need an upper bound. Then, if two points j and k lie in the same subspace S_i, their locally estimated subspaces \hat{S}_j and \hat{S}_k should be the same, while if the two points lie in different subspaces, \hat{S}_j and \hat{S}_k should be different. Therefore, we can use a distance between \hat{S}_j and \hat{S}_k to define an affinity between the two points.

The first (optional) step of the LSA and SLBF algorithms is to project the data points onto a subspace of dimension r = rank(X) using the SVD of X. With noisy data, the value of r is determined using model selection techniques. In the case of data drawn from linear subspaces, the LSA algorithm projects the resulting points in R^r onto the hypersphere S^{r-1}. The second step is to compute the K-NNs of each point j and to fit a local affine subspace \hat{S}_j to the point and its neighbors. LSA assumes that K is specified by the user. The K-NNs are then found using the angle between pairs of data points or the Euclidean distance as a metric. PCA is then used to fit the local subspace \hat{S}_j. The subspace dimension d_j is determined using model selection techniques. SLBF determines both the number of neighbors K_j and the subspace \hat{S}_j for point j automatically. It does so by searching for the smallest value of K_j that minimizes a certain fitting error. The third step of LSA is to compute an affinity matrix as

    A_{jk} = \exp\Big( -\sum_{m=1}^{\min(d_j, d_k)} \sin^2(\theta_{jk}^m) \Big),    (20)

where \theta_{jk}^m is the m-th principal angle between the bases of subspaces \hat{S}_j and \hat{S}_k. In the case of data drawn from affine subspaces, A_{jk} would need to be modified to also incorporate a distance between points j and k. SLBF uses an affinity matrix that is applicable to both linear and affine subspaces as

    A_{jk} = \exp(-\hat{d}_{jk} / 2\sigma_j^2) + \exp(-\hat{d}_{jk} / 2\sigma_k^2),    (21)

where \hat{d}_{jk} = \sqrt{ dist(x_j, \hat{S}_k) \, dist(x_k, \hat{S}_j) } and dist(x, S) is the Euclidean distance from point x to subspace S. The segmentation of the data is then found by applying spectral clustering to the normalized Laplacian.

The LSA and SLBF algorithms have two main advantages when compared to GPCA. First, outliers are likely to be "rejected", because they are far from all the points and so they are not considered as neighbors of the inliers. Second, LSA requires only O(n d_max) data points, while GPCA needs O(M_n(d_max + 1)). On the other hand, LSA has two main drawbacks. First, the neighbors of a point could belong to a different subspace. This is more likely to happen near the intersection of two subspaces. Second, the selected neighbors may not span the underlying subspace. Thus, K needs to be small enough so that only points in the same subspace are chosen, and large enough so that the neighbors span the local subspace. SLBF resolves these issues by choosing the size of the neighborhood automatically.

Notice also that both GPCA and LSA are based on a linear projection followed by spectral clustering. While in principle both algorithms can use any linear projection, GPCA prefers to use the smallest possible dimension, r = d_max + 1, so as to reduce the computational complexity. On the other hand, LSA uses a slightly larger dimension, r = rank(X) ≤ \sum_i d_i, because if the dimension of the projection is too small (less than rank(X)), the projected subspaces are not independent and LSA has problems near the intersection of two subspaces. Another major difference is that LSA fits a subspace locally around each projected point, while GPCA uses the gradients of a polynomial that is globally fit to the projected data.

Locally Linear Manifold Clustering (LLMC). The LLMC algorithm [23] is also based on fitting a local subspace to a point and its K-NNs. Specifically, every point j is written as an affine combination of all other points k ≠ j. The coefficients w_{jk} are found in closed form by minimizing the cost

    \sum_{j=1}^N \Big\| x_j - \sum_{k \neq j} w_{jk} x_k \Big\|^2 = \| (I - W) X^T \|_F^2,    (22)

subject to \sum_{k \neq j} w_{jk} = 1 and w_{jk} = 0 if x_k is not a K-NN of x_j.
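To make the closed-form computation of these reconstruction weights concrete, here is a minimal pure-Python sketch for a single point with K = 2 neighbors. It uses the standard locally-linear-embedding solution of the constrained least-squares problem (solve the local Gram system C w = 1, then normalize); the function name, toy data, and regularization constant are illustrative, not from the paper:

```python
def llmc_weights(x, neighbors, reg=1e-6):
    """Closed-form reconstruction weights for one point, LLE/LLMC style.

    Solves min ||x - sum_k w_k n_k||^2 subject to sum_k w_k = 1 via the
    local Gram system C w = 1 followed by normalization. Hard-coded for
    K = 2 neighbors so the 2x2 solve can be written out by hand.
    """
    diffs = [tuple(xi - ni for xi, ni in zip(x, n)) for n in neighbors]
    # Local Gram matrix C[k][l] = (x - n_k) . (x - n_l)
    C = [[sum(a * b for a, b in zip(dk, dl)) for dl in diffs] for dk in diffs]
    trace = C[0][0] + C[1][1]
    C[0][0] += reg * trace        # regularize: C is singular when the
    C[1][1] += reg * trace        # neighbors are collinear with x
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    w = [(C[1][1] - C[0][1]) / det, (C[0][0] - C[1][0]) / det]  # C w = 1
    s = w[0] + w[1]
    return [w[0] / s, w[1] / s]   # enforce sum_k w_k = 1

# Toy data: x lies on the line y = x, halfway between its two neighbors,
# so the affine combination w[0]*n0 + w[1]*n1 reconstructs x exactly.
x = (2.0, 2.0)
nbrs = [(1.0, 1.0), (3.0, 3.0)]
w = llmc_weights(x, nbrs)         # -> [0.5, 0.5]
```

In LLMC these weights would be computed for every point over its K-NNs and assembled into the matrix W of (22).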
With these weights in hand, the affinity matrix A and the matrix L are built as

    A = W + W^T - W^T W   and   L = (I - W)^T (I - W).    (23)

It is shown in [23] that when every point and its K-NNs are always in the same subspace, then there are vectors v in the null space of L with the property that v_j = v_k when points j and k are in the same subspace. However, these vectors are not the only vectors in the null space, and spectral clustering is not directly applicable. In this case, a procedure for properly selecting linear combinations of the eigenvectors of L is needed, as discussed in [23].

A first advantage of LLMC is its robustness to outliers. This is because, as in the case of LSA and SLBF, outliers are often far from the inliers, hence it is unlikely that they are chosen as neighbors of the inliers. Another important advantage of LLMC is that it is also applicable to non-linear subspaces, while all the other methods discussed so far are only applicable to linear (or affine) subspaces. However, LLMC suffers from the same disadvantage as LSA, namely that it is not always the case that a point and its K-NNs are in the same subspace, especially when the subspaces are not independent. Also, properly choosing the number of nearest neighbors is a challenge. These issues could be resolved by choosing the neighborhood automatically, as done by SLBF.

Sparse Subspace Clustering (SSC). SSC [24, 25] is also based on the idea of writing a data point as a linear (affine) combination of neighboring data points. However, the key difference with LSA, SLBF and LLMC is that, instead of choosing neighbors based on the angular or Euclidean distance between pairs of points (which can lead to errors in choosing the neighbors), the neighbors can be any other points in the data set. In principle, this leads to an ill-posed problem with many possible solutions. To resolve this issue, the principle of sparsity is invoked. Specifically, every point is written as a sparse linear (affine) combination of all other data points by minimizing the number of nonzero coefficients w_{jk} subject to x_j = \sum_{k \neq j} w_{jk} x_k (and \sum_{k \neq j} w_{jk} = 1 in the case of affine subspaces). Since this problem is combinatorial, a simpler \ell_1 optimization problem is solved:

    \min_{\{w_{jk}\}} \sum_{k \neq j} |w_{jk}|   s.t.   x_j = \sum_{k \neq j} w_{jk} x_k   and   \sum_{k \neq j} w_{jk} = 1.    (24)

It is shown in [24] and [25] that when the subspaces are either independent or disjoint, the solution to the optimization problem in (24) is such that w_{jk} = 0 whenever points j and k are in different subspaces. In other words, the sparsest representation is obtained when each point is written as a linear (affine) combination of points in its own subspace.

In the case of data contaminated by noise, the SSC algorithm does not attempt to write a data point as an exact linear (affine) combination of other points. Instead, a penalty in the 2-norm of the error is added to the \ell_1 norm. Specifically, the sparse coefficients are found by solving the problem

    \min_{\{w_{jk}\}} \sum_{k \neq j} |w_{jk}| + \mu \Big\| x_j - \sum_{k \neq j} w_{jk} x_k \Big\|_2^2   s.t.   \sum_{k \neq j} w_{jk} = 1,    (25)

where \mu > 0 is a parameter. Obviously, different solutions for \{w_{jk}\} will be obtained for different choices of the parameter. However, we are not interested in the specific values of w_{jk}: all that matters is that, for each point j, the top nonzero coefficients come from points in the same subspace.

In the case of data contaminated with outliers, the SSC algorithm assumes that x_j = \sum_{k \neq j} w_{jk} x_k + e_j, where the vector of outliers e_j is also sparse. The sparse coefficients and the outliers are found by solving the problem

    \min_{\{w_{jk}\}, \{e_j\}} \sum_{k \neq j} |w_{jk}| + \| e_j \|_1 + \mu \Big\| x_j - \sum_{k \neq j} w_{jk} x_k - e_j \Big\|_2^2,    (26)

subject to \sum_{k \neq j} w_{jk} = 1 in the case of affine subspaces. Given a sparse representation for each data point, the graph affinity matrix is defined as

    A = |W| + |W^T|.    (27)

The segmentation is then obtained by applying spectral clustering to the Laplacian.

The SSC algorithm presents several advantages with respect to all the algorithms discussed so far. With respect to factorization-based methods, the affinity in (27) is very robust to noise. This is because the solution changes continuously with the amount of noise. Specifically, with moderate amounts of noise, the top nonzero coefficients will still correspond to points in the same subspace. With larger amounts of noise, some of the nonzero coefficients will come from other subspaces. These mistakes can be handled by spectral clustering, which is also robust to noise (see [56]). With respect to GPCA, SSC is more robust to outliers because, as in the case of LSA, SLBF and LLMC, it is very unlikely that a point in a subspace will write itself as a linear combination of points that are very far from all the subspaces. Also, the computational complexity of SSC does not grow exponentially with the number of subspaces and their dimensions. Nonetheless, it requires solving N optimization problems in N variables, as per (24), (25) or (26), hence it can be slow. With respect to LSA and LLMC, the great advantage of SSC is that the neighbors of a point are automatically chosen, without having to specify the value of K. Indeed, the number of nonzero coefficients should correspond to the dimension of the subspace. More importantly, the SSC algorithm is provably correct for independent and disjoint subspaces, hence its performance is not affected when the NNs of a point (in the traditional sense) do not come from the same subspace containing that point. Another advantage of SSC over GPCA is that it does not require the data to be projected onto a low-dimensional subspace. A possible disadvantage of SSC is that it is provably correct only for independent and disjoint subspaces.
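To see why sparsity selects same-subspace neighbors, consider a toy configuration with two one-dimensional subspaces in R^2. The \ell_1 program in (24) needs a convex solver, so the sketch below instead brute-forces the combinatorial \ell_0 problem that (24) relaxes, which is feasible only for tiny data sets; all function names and data are illustrative:

```python
from itertools import combinations

def solve_subset(x, pts, idxs):
    """Least-squares coefficients writing x in terms of pts[idxs]
    (handles supports of size 1 or 2 in R^2 by direct solve)."""
    if len(idxs) == 1:
        p = pts[idxs[0]]
        w = (x[0] * p[0] + x[1] * p[1]) / (p[0] ** 2 + p[1] ** 2)
        res = ((x[0] - w * p[0]) ** 2 + (x[1] - w * p[1]) ** 2) ** 0.5
        return [w], res
    a, b = pts[idxs[0]], pts[idxs[1]]
    det = a[0] * b[1] - a[1] * b[0]
    if abs(det) < 1e-12:                 # degenerate pair, skip it
        return None, float("inf")
    wi = (x[0] * b[1] - x[1] * b[0]) / det
    wj = (a[0] * x[1] - a[1] * x[0]) / det
    return [wi, wj], 0.0                 # exact solve, zero residual

def sparsest_representation(j, pts, tol=1e-9):
    """Smallest support writing pts[j] in terms of the other points:
    a brute-force l0 stand-in for the l1 program in (24)."""
    x = pts[j]
    others = [k for k in range(len(pts)) if k != j]
    for size in (1, 2):
        for idxs in combinations(others, size):
            w, res = solve_subset(x, pts, idxs)
            if w is not None and res < tol:
                return dict(zip(idxs, w))
    return {}

# Two 1-D subspaces of R^2: three points on the x-axis, two on the y-axis.
pts = [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (0.0, 1.0), (0.0, 2.0)]
rep = sparsest_representation(0, pts)
# Point 0 is recovered from a single point in its own subspace:
# {1: 0.5}, i.e. (1,0) = 0.5 * (2,0).
```

The minimal-support representation found for each point uses only points from its own subspace, which is exactly the property the affinity in (27) exploits.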
Table 1: Classification errors of several subspace clustering algorithms on the Hopkins 155 motion segmentation database. All algorithms use two parameters (d, r), where d is the dimension of the subspaces and r is the dimension of the projection. Affine subspace clustering algorithms treat subspaces as 3-dimensional affine subspaces, i.e., d = 3, while linear subspace clustering algorithms treat subspaces as 4-dimensional linear subspaces, i.e., d = 4. The dimensions of the projections are r = 5, r = 4n, where n is the number of motions, and r = 2F, where F is the number of frames. ALC uses a sparsity-preserving (sp) dimension for the projection. All algorithms use PCA to perform the projection, except for SSC, which uses a random projection with entries drawn from a Bernoulli (SSC-B) or Normal (SSC-N) distribution. The results for GPCA correspond to the spectral-clustering-based GPCA algorithm. LLMC-G denotes LLMC initialized by the algebraic GPCA algorithm.

                 ------------------- Two motions -------------------   ------------------ Three motions ------------------    All
                 Check.(78)  Traffic(31)  Articul.(11)  All(120)       Check.(26)   Traffic(7)   Articul.(2)   All(35)        (155)
                 Mean  Med.  Mean  Med.   Mean  Med.    Mean  Med.     Mean  Med.   Mean  Med.   Mean  Med.    Mean  Med.     Mean  Med.
GPCA(4,5)        6.09  1.03  1.41  0.00   2.88  0.00    4.59  0.38    31.95 32.93  19.83 19.55  16.85 16.85   28.66 28.26    10.34  2.54
GPCA(4n-1,4n)    4.78  0.51  1.63  0.00   6.18  3.20    4.10  0.44    36.99 36.26  39.68 40.92  29.62 29.62   37.11 37.18    11.55  1.36
RANSAC(4,5)      6.52  1.75  2.55  0.21   7.25  2.64    5.56  1.18    25.78 26.00  12.83 11.45  21.38 21.38   22.94 22.03     9.76  3.21
LSA(4,5)         8.84  3.43  2.15  1.00   4.66  1.28    6.73  1.99    30.37 31.98  27.02 34.01  23.11 23.11   29.28 31.63    11.82  4.00
LSA(4,4n)        2.57  0.27  5.43  1.48   4.10  1.22    3.45  0.59     5.80  1.77  25.07 23.79   7.25  7.25    9.73  2.33     4.94  0.90
LLMC(4,5)        4.85  0.00  1.96  0.00   6.16  1.37    4.22  0.00     9.06  7.09   6.45  0.00   5.26  5.26    8.33  3.19     5.15  0.00
LLMC(4,4n)       3.96  0.23  3.53  0.33   6.48  1.30    4.08  0.24     8.48  5.80   6.04  4.09   9.38  9.38    8.04  4.93     4.97  0.87
LLMC-G(4,5)      4.34  0.00  2.13  0.00   6.16  1.37    3.95  0.00     8.87  7.09   5.62  0.00   5.26  5.26    8.02  3.19     4.87  0.00
LLMC-G(4,4n)     2.83  0.00  3.61  0.00   5.94  1.30    3.32  0.00     8.20  5.26   6.04  4.60   8.32  8.32    7.78  4.93     4.37  0.53
MSL              4.46  0.00  2.23  0.00   7.23  0.00    4.14  0.00    10.38  4.61   1.80  0.00   2.71  2.71    8.23  1.76     5.03  0.00
ALC(4,5)         2.56  0.00  2.83  0.30   6.90  0.89    3.03  0.00     6.78  0.92   4.01  1.35   7.25  7.25    6.26  1.02     3.76  0.26
ALC(4,sp)        1.49  0.27  1.75  1.51  10.70  0.95    2.40  0.43     5.00  0.66   8.86  0.51  21.08 21.08    6.69  0.67     3.37  0.49
SCC(3,4)         2.99  0.39  1.20  0.32   7.71  3.67    2.96  0.42     7.72  3.21   0.52  0.28   8.90  8.90    6.34  2.36     3.72   --
SCC(3,4n)        1.76  0.01  0.46  0.16   4.06  1.69    1.63  0.06     6.00  2.22   1.78  0.42   5.65  5.65    5.14  1.67     2.42   --
SCC(3,2F)        1.77  0.00  0.63  0.14   4.02  2.13    1.68  0.07     6.23  1.70   1.11  1.40   5.41  5.41    5.16  1.58     2.47   --
SCC(4,5)         2.31  0.25  0.71  0.26   5.05  1.08    2.15  0.27     5.56  2.03   1.01  0.47   8.97  8.97    4.85  2.01     2.76   --
SCC(4,4n)        1.30  0.04  1.07  0.44   3.68  0.67    1.46  0.16     5.68  2.96   2.35  2.07  10.94 10.94    5.31  2.40     2.33   --
SCC(4,2F)        1.31  0.06  1.02  0.26   3.21  0.76    1.41  0.10     6.31  1.97   3.31  3.31   9.58  9.58    5.90  1.99     2.42   --
SLBF(3,2F)       1.59  0.00  0.20  0.00   0.80  0.00    1.16  0.00     4.57  0.94   0.38  0.00   2.66  2.66    3.63  0.64     1.66   --
SSC-B(4,4n)      0.83  0.00  0.23  0.00   1.63  0.00    0.75  0.00     4.49  0.54   0.61  0.00   1.60  1.60    3.55  0.25     1.45  0.00
SSC-N(4,4n)      1.12  0.00  0.02  0.00   0.62  0.00    0.82  0.00     2.97  0.27   0.58  0.00   1.42  0.00    2.45  0.20     1.24  0.00

Fig. 3: Percentage of sequences for which the classification error is less than or equal to a given percentage of misclassification. The algorithms tested are GPCA(4,5), RANSAC(4,5), LSA(4,4n), LLMC(4,4n), MSL, ALC(4,sp), SCC(4,4n), SSC-N(4,4n).

4. CONCLUSIONS AND FUTURE DIRECTIONS

Over the past few decades, significant progress has been made in clustering high-dimensional data sets distributed around a collection of linear and affine subspaces. This article presented a review of such progress, which included a number of existing subspace clustering algorithms together with an experimental evaluation on the motion segmentation problem in computer vision. While earlier algorithms were designed under the assumptions of perfect data and perfect knowledge of the number of subspaces and their dimensions, throughout the years algorithms started to handle noise, outliers, data with missing entries, unknown number of subspaces and unknown dimensions. In the case of noiseless data drawn from linear subspaces, the theoretical correctness of existing algorithms is well studied, and some algorithms such as GPCA are able to handle an unknown number of subspaces of unknown dimensions in an arbitrary configuration. However, while GPCA is applicable to affine subspaces, a theoretical analysis of GPCA for affine subspaces in the noiseless case is still due. In the case of noisy data, the theoretical correctness of existing algorithms is largely untouched. To the best of our knowledge, the first works in this direction are [45, 59]. By and large, most existing algorithms assume that the number of subspaces and their dimensions are known. While some algorithms can provide estimates for these quantities, their estimates come with no theoretical guarantees. In our view, the development of theoretically sound algorithms for finding the number of subspaces and their dimensions in the presence of noise and outliers is a very important open challenge. On the other hand, it is important to mention that most existing algorithms operate in a batch fashion. In real-time applications, it is important to cluster the data as it is being collected, which motivates the development of online subspace clustering algorithms. The works of [63] and [15] are two examples in this direction. Finally, in our view the grand challenge for the next decade will be to develop clustering algorithms for data drawn from multiple nonlinear manifolds. The works of [64, 65, 66, 67] have already considered the problem of clustering quadratic, bilinear and trilinear surfaces using algebraic algorithms designed for noise-free data. The development of methods that are applicable to more general manifolds with corrupted data is still in its infancy.

5. AUTHOR

René Vidal (rvidal@jhu.edu) received his B.S. degree in Electrical Engineering (highest honors) from the Pontificia Universidad Católica de Chile in 1997 and his M.S. and Ph.D. degrees in Electrical Engineering and Computer Sciences from the University of California at Berkeley in 2000 and 2003, respectively. He was a research fellow at the National ICT Australia in 2003 and joined The Johns Hopkins University in 2004 as a faculty member in the Department of Biomedical Engineering and the Center for Imaging Science. He was co-editor of the book "Dynamical Vision" and has co-authored more than 100 articles in biomedical image analysis, computer vision, machine learning, hybrid systems, and robotics. He is recipient of the 2009 ONR Young Investigator Award, the 2009 Sloan Research Fellowship, the 2005 NSF CAREER Award and the 2004 Best Paper Award Honorable Mention at the European Conference on Computer Vision. He also received the 2004 Sakrison Memorial Prize for "completing an exceptionally documented piece of research", the 2003 Eli Jury award for "outstanding achievement in the area of Systems, Communications, Control, or Signal Processing", the 2002 Student Continuation Award from NASA Ames, the 1998 Marcos Orrego Puelma Award from the Institute of Engineers of Chile, and the 1997 Award of the School of Engineering of the Pontificia Universidad Católica de Chile to the best graduating student of the school. He is a member of the IEEE and the ACM.

6. REFERENCES

[1] A. Yang, J. Wright, Y. Ma, and S. Sastry, "Unsupervised segmentation of natural images via lossy data compression," Computer Vision and Image Understanding, vol. 110, no. 2, pp. 212-225, 2008.
[2] R. Vidal, R. Tron, and R. Hartley, "Multiframe motion segmentation with missing data using PowerFactorization and GPCA," International Journal of Computer Vision, vol. 79, no. 1, pp. 85-105, 2008.
[3] J. Ho, M. H. Yang, J. Lim, K. C. Lee, and D. Kriegman, "Clustering appearances of objects under varying illumination conditions," in IEEE Conf. on Computer Vision and Pattern Recognition, 2003.
[4] W. Hong, J. Wright, K. Huang, and Y. Ma, "Multi-scale hybrid linear models for lossy image representation," IEEE Trans. on Image Processing, vol. 15, no. 12, pp. 3655-3671, 2006.
[5] R. Vidal, S. Soatto, Y. Ma, and S. Sastry, "An algebraic geometric approach to the identification of a class of linear hybrid systems," in Conference on Decision and Control, 2003, pp. 167-172.
[6] L. Parsons, E. Haque, and H. Liu, "Subspace clustering for high dimensional data: a review," ACM SIGKDD Explorations Newsletter, 2004.
[7] T. E. Boult and L. G. Brown, "Factorization-based segmentation of motions," in IEEE Workshop on Motion Understanding, 1991, pp. 179-186.
[8] J. Costeira and T. Kanade, "A multibody factorization method for independently moving objects," Int. Journal of Computer Vision, vol. 29, no. 3, 1998.
[36] K. Kanatani and C. Matsunaga, "Estimating the number of independent motions for multibody motion segmentation," in European Conf. on Computer Vision, 2002, pp. 25-31.
[37] K. Kanatani, "Geometric information criterion for model selection," International Journal of Computer Vision, pp. 171-189, 1998.
[38] L. Zelnik-Manor and M. Irani, "On single-sequence and multi-sequence factorizations," Int. Journal of Computer Vision, vol. 67, no. 3, pp. 313-326, 2006.
[39] Y. Ma, A. Yang, H. Derksen, and R. Fossum, "Estimation of subspace arrangements with applications in modeling and segmenting mixed data," SIAM Review, 2008.
[40] H. Derksen, "Hilbert series of subspace arrangements," Journal of Pure and Applied Algebra, vol. 209, no. 1, pp. 91-98, 2007.
[41] N. Ozay, M. Sznaier, C. Lagoa, and O. Camps, "GPCA with denoising: A moments-based convex approach," in IEEE Conference on Computer Vision and Pattern Recognition, 2010.
[42] A. Yang, S. Rao, A. Wagner, Y. Ma, and R. Fossum, "Hilbert functions and applications to the estimation of subspace arrangements," in IEEE International Conference on Computer Vision, 2005.
[43] K. Huang, Y. Ma, and R. Vidal, "Minimum effective dimension for mixtures of subspaces: A robust GPCA algorithm and its applications," in IEEE Conference on Computer Vision and Pattern Recognition, 2004, vol. II, pp. 631-638.
[44] R. Duda, P. Hart, and D. Stork, Pattern Classification, Wiley, New York, 2nd edition, 2000.
[45] A. Aldroubi and K. Zaringhalam, "Nonlinear least squares in R^N," Acta Applicandae Mathematicae, vol. 107, no. 1-3, pp. 325-337, 2009.
[46] A. Aldroubi, C. Cabrelli, and U. Molter, "Optimal non-linear models for sparsity and sampling," Journal of Fourier Analysis and Applications, vol. 14, no. 5-6, pp. 793-812, 2008.
[47] M. Tipping and C. Bishop, "Probabilistic principal component analysis," Journal of the Royal Statistical Society, vol. 61, no. 3, pp. 611-622, 1999.
[48] A. Dempster, N. Laird, and D. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society B, vol. 39, pp. 1-38, 1977.
[49] C. Archambeau, N. Delannay, and M. Verleysen, "Mixtures of robust probabilistic principal component analyzers," Neurocomputing, vol. 71, no. 7-9, pp. 1274-1282, 2008.
[50] A. Gruber and Y. Weiss, "Multibody factorization with uncertainty and missing data using the EM algorithm," in IEEE Conf. on Computer Vision and Pattern Recognition, 2004, vol. I, pp. 707-714.
[51] J. Paisley and L. Carin, "Nonparametric factor analysis with beta process priors," in International Conference on Machine Learning, 2009.
[52] A. Leonardis, H. Bischof, and J. Maver, "Multiple eigenspaces," Pattern Recognition, vol. 35, no. 11, pp. 2613-2627, 2002.
[53] Z. Fan, J. Zhou, and Y. Wu, "Multibody grouping by inference of multiple subspaces from high-dimensional data using oriented-frames," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 28, no. 1, pp. 91-105, 2006.
[54] M. A. Fischler and R. C. Bolles, "RANSAC random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 26, pp. 381-395, 1981.
[55] J. Yan and M. Pollefeys, "Articulated motion segmentation using RANSAC with priors," in Workshop on Dynamical Vision, 2005.
[56] U. von Luxburg, "A tutorial on spectral clustering," Statistics and Computing, vol. 17, 2007.
[57] S. Agarwal, J. Lim, L. Zelnik-Manor, P. Perona, D. Kriegman, and S. Belongie, "Beyond pairwise clustering," in IEEE Conference on Computer Vision and Pattern Recognition, June 2005, vol. 2, pp. 838-845.
[58] V. Govindu, "A tensor decomposition for geometric grouping and segmentation," in IEEE Conference on Computer Vision and Pattern Recognition, 2005, pp. 1150-1157.
[59] G. Chen and G. Lerman, "Foundations of a multi-way spectral clustering framework for hybrid linear modeling," Foundations of Computational Mathematics, vol. 9, no. 5, 2009.
[60] G. Chen, S. Atev, and G. Lerman, "Kernel spectral curvature clustering (KSCC)," in Workshop on Dynamical Vision, 2009.
[61] R. Tron and R. Vidal, "A benchmark for the comparison of 3-D motion segmentation algorithms," in IEEE Conference on Computer Vision and Pattern Recognition, 2007.
[62] F. Lauer and C. Schnörr, "Spectral clustering of linear subspaces for motion segmentation," in IEEE International Conference on Computer Vision, 2009.
[63] R. Vidal, "Online clustering of moving hyperplanes," in Neural Information Processing Systems, NIPS, 2006.
[64] R. Vidal, Y. Ma, S. Soatto, and S. Sastry, "Two-view multibody structure from motion," International Journal of Computer Vision, vol. 68, no. 1, pp. 7-25, 2006.
[65] R. Vidal and Y. Ma, "A unified algebraic approach to 2-D and 3-D motion segmentation," Journal of Mathematical Imaging and Vision, vol. 25, no. 3, pp. 403-421, 2006.
[66] R. Vidal and R. Hartley, "Three-view multibody structure from motion," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 214-227, 2008.
[67] S. Rao, A. Yang, S. Sastry, and Y. Ma, "Robust algebraic segmentation of mixed rigid-body and planar motions from two views," International Journal of Computer Vision, vol. 88, no. 3, pp. 425-446, 2010.