Uncorrelated Multilinear Principal Component Analysis

…decomposition (TROD) is used to represent a class of images based on variance maximization and (greedy) successive residue calculation. A two-dimensional PCA (2DPCA) is proposed in (Yang et al., 2004) that constructs an image covariance matrix using image matrices as inputs. However, the linear transformation is applied only to the right side of the image matrices, so the image data is projected in one mode only, resulting in poor dimensionality reduction. A more general algorithm named generalized low rank approximation of matrices (GLRAM) was introduced in (Ye, 2005), which applies two linear transforms to both the left and right sides of the input image matrices and results in a better dimensionality reduction than 2DPCA. GLRAM is developed from the perspective of approximation, while the generalized PCA (GPCA) is proposed in (Ye et al., 2004) from the view of variation maximization, as an extension of PCA. Later, the concurrent subspaces analysis (CSA) is formulated in (Xu et al., 2005) for optimal reconstruction of general tensor objects, which can be considered a generalization of GLRAM, and the multilinear PCA (MPCA) introduced in (Lu et al., 2008a) targets variation maximization for general tensor objects in the extension of PCA to the multilinear case, which can be considered a further generalization of GPCA.

However, none of the existing multilinear extensions of PCA mentioned above takes an important property of PCA into account, i.e., PCA derives uncorrelated features, which contain minimum redundancy and ensure independence among features. Instead, most of them produce orthogonal bases in each mode. Although uncorrelated features imply orthogonal projection bases in PCA, this is not necessarily the case for its multilinear extension. With this motivation, this paper investigates a multilinear extension of PCA that can produce uncorrelated features. We propose a novel uncorrelated multilinear PCA (UMPCA) for unsupervised tensor object dimensionality reduction (feature extraction). UMPCA is based on the tensor-to-vector projection (TVP) (Lu et al., 2008b) and it follows the classical PCA derivation of successive variance maximization (Jolliffe, 2002). Thus, a number of elementary multilinear projections (EMPs) are solved to maximize the captured variance under the zero-correlation constraint. The solution is iterative in nature, like many other multilinear algorithms (Xu et al., 2005; Ye et al., 2004; Shashua & Levin, 2001).

The rest of this paper is organized as follows. Section 2 reviews basic multilinear notations and operations, as well as the concept of tensor-to-vector projection. In Sec. 3, the problem of UMPCA is formulated and the solution is derived as a sequential iterative process. Next, Sec. 4 evaluates the effectiveness of UMPCA in the popular face recognition task through comparison with PCA, MPCA and TROD. Finally, the conclusions are drawn in Sec. 5.

Table 1. Notations

  Notation                              Description
  X_m, m = 1, ..., M                    the m-th input tensor sample
  u^{(n)}, n = 1, ..., N                the n-mode projection vector
  {u_p^{(n)T}, n = 1, ..., N}           the p-th EMP, where p is the index of the EMP
  y_m                                   the projection of X_m on the TVP {u_p^{(n)T}, n = 1, ..., N}_{p=1}^{P}
  y_m(p) = y_{m_p} = g_p(m)             the projection of X_m on the p-th EMP {u_p^{(n)T}, n = 1, ..., N}
  g_p                                   the p-th coordinate vector

2. Multilinear Fundamentals

This section introduces the multilinear notations, operations and projections needed in the presentation of UMPCA; for further study of multilinear algebra, (Lathauwer et al., 2000) is a good reference. The important notations used in this paper are listed in Table 1 for handy reference.

2.1. Notations and basic multilinear operations

Due to the multilinear nature of tensor objects, new notations have been introduced in the literature for mathematical analysis. Following the notations in (Lathauwer et al., 2000), we denote vectors by lowercase boldface letters, e.g., x; matrices by uppercase boldface letters, e.g., U; and tensors by calligraphic letters, e.g., A. Their elements are denoted with indices in parentheses. Indices are denoted by lowercase letters and span the range from 1 to the uppercase letter of the index, e.g., n = 1, 2, ..., N. An Nth-order tensor A ∈ R^{I_1 × I_2 × ... × I_N} is addressed by N indices i_n, n = 1, ..., N, and each i_n addresses the n-mode of A. The n-mode product of a tensor A by a matrix U ∈ R^{J_n × I_n}, denoted by A ×_n U, is a tensor with entries:

\[
(\mathcal{A} \times_n \mathbf{U})(i_1, \ldots, i_{n-1}, j_n, i_{n+1}, \ldots, i_N) = \sum_{i_n} \mathcal{A}(i_1, i_2, \ldots, i_N)\,\mathbf{U}(j_n, i_n). \tag{1}
\]

The scalar product of two tensors A, B ∈ R^{I_1 × I_2 × ... × I_N} is defined as
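The n-mode product of Eq. (1) is a plain index contraction, so it can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the paper; the function name and the 1-based `n` convention are ours, chosen to match the paper's notation:

```python
import numpy as np

def mode_n_product(A, U, n):
    """n-mode product A x_n U of Eq. (1): contracts the n-th index
    of the tensor A (1-based mode n) with the rows of U in R^{J_n x I_n}."""
    A_moved = np.moveaxis(A, n - 1, 0)           # bring mode n to the front: (I_n, ...)
    out = np.tensordot(U, A_moved, axes=(1, 0))  # contract I_n, giving (J_n, ...)
    return np.moveaxis(out, 0, n - 1)            # move the new index back to position n
```

For a second-order tensor (a matrix) X, this reduces to the familiar matrix products: mode-1 gives U X and mode-2 gives X U^T, which is a quick way to sanity-check the implementation.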
\[
\langle \mathcal{A}, \mathcal{B} \rangle = \sum_{i_1} \cdots \sum_{i_N} \mathcal{A}(i_1, \ldots, i_N)\,\mathcal{B}(i_1, \ldots, i_N). \tag{2}
\]

A rank-one tensor A equals the outer product of N vectors.

In other words, the UMPCA objective is to determine a set of P EMPs {u_p^{(n)T}, n = 1, ..., N}_{p=1}^{P} that maximize the variance while producing features with zero correlation. Thus, the objective function for the p-th EMP is

\[
\{\mathbf{u}_p^{(n)T}, n = 1, \ldots, N\} = \arg\max \sum_{m=1}^{M} (y_{m_p} - \bar{y}_p)^2,
\quad \text{subject to } \mathbf{u}_p^{(n)T}\mathbf{u}_p^{(n)} = 1 \text{ and } \frac{\mathbf{g}_p^T \mathbf{g}_q}{\|\mathbf{g}_p\|\,\|\mathbf{g}_q\|} = \delta_{pq},\; p, q = 1, \ldots, P, \tag{6}
\]

where δ_{pq} is the Kronecker delta (defined as 1 for p = q and as 0 otherwise).

3.2. The UMPCA algorithm

To solve the UMPCA problem (6), we follow the successive variance maximization approach in the derivation of PCA in (Jolliffe, 2002). The P EMPs {u_p^{(n)T}, n = 1, ..., N}_{p=1}^{P} are determined one by one in P steps, with the p-th step obtaining the p-th EMP:

Step 1: Determine the first EMP {u_1^{(n)T}, n = 1, ..., N} by maximizing S_{T_1}^y without any constraint.

Step 2: Determine the second EMP {u_2^{(n)T}, n = 1, ..., N} by maximizing S_{T_2}^y subject to the constraint that g_2^T g_1 = 0.

Step p (p = 3, ..., P): Determine the p-th EMP {u_p^{(n)T}, n = 1, ..., N} by maximizing S_{T_p}^y subject to the constraint that g_p^T g_q = 0 for q = 1, ..., p − 1.

In order to solve for the p-th EMP {u_p^{(n)T}, n = 1, ..., N}, we need to determine N sets of parameters corresponding to the N projection vectors, u_p^{(1)}, u_p^{(2)}, ..., u_p^{(N)}, one in each mode. Unfortunately, simultaneous determination of these N sets of parameters in all modes is a complicated nonlinear problem without an existing optimal solution, except when N = 1, which is the classical PCA where only one projection vector is to be solved. Therefore, we follow the approach of the alternating least squares (ALS) algorithm (Harshman, 1970) to solve this multilinear problem. For each EMP to be determined, the parameters of the projection vector u_p^{(n)} for each mode n are estimated one mode at a time, conditioned on {u_p^{(n*)}, n* ≠ n}, the parameter values of the projection vectors in the other modes.

To solve for u_p^{(n)} in the n-mode, assuming that {u_p^{(n*)}, n* ≠ n} is given, the tensor samples are projected in these (N − 1) modes {n* ≠ n} first to obtain the vectors

\[
\tilde{\mathbf{y}}_{m_p}^{(n)} = \mathcal{X}_m \times_1 \mathbf{u}_p^{(1)T} \cdots \times_{n-1} \mathbf{u}_p^{(n-1)T} \times_{n+1} \mathbf{u}_p^{(n+1)T} \cdots \times_N \mathbf{u}_p^{(N)T}, \tag{7}
\]

where ỹ_{m_p}^{(n)} ∈ R^{I_n}. This conditional subproblem then becomes determining the u_p^{(n)} that projects the vector samples {ỹ_{m_p}^{(n)}, m = 1, ..., M} onto a line so that the variance is maximized, subject to the zero-correlation constraint, which is a PCA problem with the input samples {ỹ_{m_p}^{(n)}, m = 1, ..., M}. The corresponding total scatter matrix S̃_{T_p}^{(n)} is then defined as

\[
\tilde{\mathbf{S}}_{T_p}^{(n)} = \sum_{m=1}^{M} \left(\tilde{\mathbf{y}}_{m_p}^{(n)} - \bar{\tilde{\mathbf{y}}}_p^{(n)}\right)\left(\tilde{\mathbf{y}}_{m_p}^{(n)} - \bar{\tilde{\mathbf{y}}}_p^{(n)}\right)^T, \tag{8}
\]

where \(\bar{\tilde{\mathbf{y}}}_p^{(n)} = \frac{1}{M}\sum_m \tilde{\mathbf{y}}_{m_p}^{(n)}\). With (8), we are ready to solve for the P EMPs. For p = 1, the u_1^{(n)} that maximizes the total scatter u_1^{(n)T} S̃_{T_1}^{(n)} u_1^{(n)} in the projected space is obtained as the unit eigenvector of S̃_{T_1}^{(n)} associated with the largest eigenvalue. Next, we show how to determine the p-th (p > 1) EMP given the first (p − 1) EMPs.

Given the first (p − 1) EMPs, the p-th EMP aims to maximize the total scatter S_{T_p}^y, subject to the constraint that features projected by the p-th EMP are uncorrelated with those projected by the first (p − 1) EMPs. Let Ỹ_p^{(n)} ∈ R^{I_n × M} be a matrix with ỹ_{m_p}^{(n)} as its m-th column, i.e., Ỹ_p^{(n)} = [ỹ_{1_p}^{(n)}, ỹ_{2_p}^{(n)}, ..., ỹ_{M_p}^{(n)}]; then the p-th coordinate vector is g_p = Ỹ_p^{(n)T} u_p^{(n)}. The constraint that g_p is uncorrelated with {g_q, q = 1, ..., p − 1} can be written as

\[
\mathbf{g}_p^T \mathbf{g}_q = \mathbf{u}_p^{(n)T}\,\tilde{\mathbf{Y}}_p^{(n)}\,\mathbf{g}_q = 0, \quad q = 1, \ldots, p-1. \tag{9}
\]

Thus, u_p^{(n)} (p > 1) can be determined by solving the following constrained optimization problem:

\[
\mathbf{u}_p^{(n)} = \arg\max \; \mathbf{u}_p^{(n)T}\,\tilde{\mathbf{S}}_{T_p}^{(n)}\,\mathbf{u}_p^{(n)}, \quad
\text{subject to } \mathbf{u}_p^{(n)T}\mathbf{u}_p^{(n)} = 1 \text{ and } \mathbf{u}_p^{(n)T}\,\tilde{\mathbf{Y}}_p^{(n)}\,\mathbf{g}_q = 0,\; q = 1, \ldots, p-1. \tag{10}
\]

The solution is given by the following theorem:

Theorem 1. The solution to the problem (10) is the (unit-length) eigenvector corresponding to the largest eigenvalue of the following eigenvalue problem:

\[
\mathbf{\Psi}_p^{(n)}\,\tilde{\mathbf{S}}_{T_p}^{(n)}\,\mathbf{u} = \lambda \mathbf{u}, \tag{11}
\]
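For second-order tensors (N = 2, the case used in the experiments below), one conditional step of this derivation can be sketched in NumPy: the partial projection of Eq. (7), the scatter matrix of Eq. (8), and the constrained maximization of Eq. (10). This is our illustrative sketch, not the paper's implementation: the function name is ours, and the zero-correlation constraint is enforced here by deflating the scatter matrix with an orthogonal projector onto the complement of the constraint vectors Ỹ_p^{(n)} g_q, rather than by forming the matrix of the eigenproblem (11).

```python
import numpy as np

def solve_mode_vector(X, u_other, mode, prev_C=None):
    """One conditional ALS step for N = 2.

    X       : array (M, I1, I2) -- M matrix samples
    u_other : given projection vector for the other mode
    mode    : 1 or 2 -- the mode whose vector is solved for
    prev_C  : optional (I_n, p-1) array whose columns are the constraint
              vectors Y~_p^(n) g_q of Eq. (9) (hypothetical layout, ours)
    """
    # Eq. (7): project each sample in the other mode -> vectors in R^{I_n}
    if mode == 1:
        Y = np.einsum('mij,j->mi', X, u_other)   # contract mode 2
    else:
        Y = np.einsum('mij,i->mj', X, u_other)   # contract mode 1
    # Eq. (8): total scatter matrix of the partially projected samples
    Yc = Y - Y.mean(axis=0)
    S = Yc.T @ Yc
    if prev_C is not None and prev_C.size:
        # Restrict the maximization to the subspace orthogonal to the
        # constraint vectors -- one way to realise Eq. (10)'s constraint
        Q, _ = np.linalg.qr(prev_C)
        P = np.eye(S.shape[0]) - Q @ Q.T
        S = P @ S @ P
    # unit eigenvector associated with the largest eigenvalue
    _, V = np.linalg.eigh(S)
    u = V[:, -1]
    return u / np.linalg.norm(u)
```

Any top eigenvector of the deflated scatter P S P with a nonzero eigenvalue automatically lies in the feasible subspace, so it satisfies the orthogonality constraints of (10) while maximizing the captured variance there.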
3.3. Initialization, projection order and termination

As an iterative algorithm, UMPCA may be affected by the initialization method, the projection order and the termination conditions. Due to the space constraint, these issues, as well as the convergence and computational issues, are not studied here. Instead, we adopt simple implementation strategies for them. First, we use the uniform initialization for UMPCA, where all n-mode projection vectors are initialized to have unit length and the same value along the I_n dimensions in n-mode, which is equivalent to the all-ones vector 1 with proper normalization. Second, as shown in Algorithm 1, the projection order, which is the mode ordering in computing the projection vectors, is from 1-mode to N-mode, as in other multilinear algorithms (Ye, 2005; Xu et al., 2005; Lu et al., 2008a). Third, the iteration is terminated by setting K, the maximum number of iterations.

4. Experimental Evaluation

The proposed UMPCA can potentially benefit various applications involving tensorial data, as mentioned in Sec. 1. Since face recognition has practical importance in security-related applications such as biometric authentication and surveillance, it has been used widely for the evaluation of unsupervised learning algorithms (Shashua & Levin, 2001; Yang et al., 2004; Xu et al., 2005; Ye, 2005). Therefore, in this section, we focus on evaluating the effectiveness of UMPCA on this popular classification task through performance comparison with existing unsupervised dimensionality reduction algorithms.

4.1. The FERET database

The Facial Recognition Technology (FERET) database (Phillips et al., 2000) is widely used for testing face recognition performance, with 14,126 images from 1,199 subjects covering a wide range of variations in viewpoint, illumination, facial expression, race and age. A subset of this database is selected in our experimental evaluation; it consists of those subjects with each subject having at least eight images with at most 15 degrees of pose variation, resulting in 721 face images from 70 subjects. Since our focus here is on the recognition of faces rather than their detection, all face images are manually cropped, aligned (with manually annotated coordinate information of the eyes) and normalized to 80 × 80 pixels, with 256 gray levels per pixel. Figure 1 shows some sample face images from two subjects in this FERET subset.

Figure 1. Examples of face images from two subjects in the FERET subset used in our experimental evaluation.

4.2. Face recognition performance comparison

In the evaluation, we compare the performance of UMPCA against three PCA-based unsupervised learning algorithms: the PCA (eigenface) algorithm (Turk & Pentland, 1991), the MPCA algorithm (Lu et al., 2008a)² and the TROD algorithm (Shashua & Levin, 2001). The number of iterations in TROD and UMPCA is set to ten, with the same (uniform) initialization used. For MPCA, we obtain the full projection and select the most descriptive P features for recognition. The features obtained by these four algorithms are arranged in descending variation captured (measured by the respective total scatter). For classification of the extracted features, we use the nearest neighbor classifier (NNC) with the Euclidean distance measure.

Gray-level face images are naturally second-order tensors (matrices), i.e., N = 2. Therefore, they are input directly as 80 × 80 tensors to the multilinear algorithms (MPCA, TROD, UMPCA), while for PCA, they are vectorized to 6400 × 1 vectors as input. For each subject in a face recognition experiment, L (= 1, 2, 3, 4, 5, 6, 7) samples are randomly selected for unsupervised training and the rest are used for testing. We report the results averaged over ten such random splits (repetitions).

Figures 2 and 3 show the detailed results³ for L = 1 and L = 7, respectively. L = 1 is an extreme small sample size scenario where only one sample per class is available for training, the so-called one training sample (OTS) case important in practice (Wang et al., 2006), and L = 7 is the maximum number of training samples we can use in our experiments. Figures 2(a) and 3(a) plot the correct recognition rates against P, the dimensionality of the subspace, for P = 1, ..., 10, and Figs. 2(b) and 3(b) plot those for P = 15, ..., 80. From the figures, UMPCA outperforms the other three methods in both cases and across all dimensionalities, indicating that the uncorrelated features extracted directly from the tensorial face data are more effective in classification.

²Note that MPCA with N = 2 is equivalent to GPCA.
³Note that for PCA and UMPCA, there are at most 69 features when L = 1 (only 70 faces for training).
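The classification protocol above is simple enough to state in code. A minimal sketch of the nearest neighbor classifier with Euclidean distance used for all four methods (the function and variable names are ours):

```python
import numpy as np

def nnc_predict(train_feats, train_labels, test_feats):
    """Nearest neighbor classifier with Euclidean distance: each test
    feature vector receives the label of its closest training sample."""
    # pairwise squared Euclidean distances, shape (n_test, n_train)
    d2 = ((test_feats[:, None, :] - train_feats[None, :, :]) ** 2).sum(axis=-1)
    return train_labels[d2.argmin(axis=1)]
```

Since only the argmin over training samples matters, the squared distance is sufficient and avoids an unnecessary square root.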
5. Conclusions

This paper proposes a novel uncorrelated multilinear PCA algorithm, where uncorrelated features are extracted directly from the tensorial representation through a tensor-to-vector projection. The algorithm successively maximizes the variance captured by each elementary projection while enforcing the zero-correlation constraint. The solution employs the alternating projection method and is iterative. Experiments on face recognition demonstrate that, compared with other unsupervised learning algorithms including PCA, MPCA and TROD, UMPCA achieves the best results, and it is particularly effective in low-dimensional spaces. Thus, face recognition through unsupervised learning benefits from the proposed UMPCA, and in future research it is worthwhile to investigate whether UMPCA can contribute to other unsupervised learning tasks, such as clustering.

Acknowledgments

The authors would like to thank the anonymous reviewers for their insightful comments. This work is partially supported by the Ontario Centres of Excellence through the Communications and Information Technology Ontario Partnership Program and the Bell University Labs at the University of Toronto.

References

Carroll, J. D., & Chang, J. J. (1970). Analysis of individual differences in multidimensional scaling via an n-way generalization of "Eckart-Young" decomposition. Psychometrika, 35, 283-319.

Faloutsos, C., Kolda, T. G., & Sun, J. (2007). Mining large time-evolving data using matrix and tensor tools. Int. Conf. on Data Mining 2007 Tutorial.

Harshman, R. A. (1970). Foundations of the PARAFAC procedure: Models and conditions for an "explanatory" multi-modal factor analysis. UCLA Working Papers in Phonetics, 16, 1-84.

Jolliffe, I. T. (2002). Principal component analysis, second edition. Springer Series in Statistics.

Koren, Y., & Carmel, L. (2004). Robust linear dimensionality reduction. IEEE Trans. Vis. Comput. Graphics, 10, 459-470.

Lathauwer, L. D., Moor, B. D., & Vandewalle, J. (2000). On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors. SIAM Journal of Matrix Analysis and Applications, 21, 1324-1342.

Law, M. H. C., & Jain, A. K. (2006). Incremental nonlinear dimensionality reduction by manifold learning. IEEE Trans. Pattern Anal. Mach. Intell., 28, 377-391.

Lu, H., Plataniotis, K. N., & Venetsanopoulos, A. N. (2008a). MPCA: Multilinear principal component analysis of tensor objects. IEEE Trans. Neural Netw., 19, 18-39.

Lu, H., Plataniotis, K. N., & Venetsanopoulos, A. N. (2008b). Uncorrelated multilinear discriminant analysis with regularization and aggregation for tensor object recognition. IEEE Trans. Neural Netw., accepted pending minor revision.

Phillips, P. J., Moon, H., Rizvi, S. A., & Rauss, P. (2000). The FERET evaluation methodology for face recognition algorithms. IEEE Trans. Pattern Anal. Mach. Intell., 22, 1090-1104.

Shakhnarovich, G., & Moghaddam, B. (2004). Face recognition in subspaces. Handbook of Face Recognition (pp. 141-168). Springer-Verlag.

Shashua, A., & Levin, A. (2001). Linear image coding for regression and classification using the tensor-rank principle. Proc. IEEE Conf. on Computer Vision and Pattern Recognition (pp. 42-49).

Turk, M., & Pentland, A. (1991). Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3, 71-86.

Wang, J., Plataniotis, K. N., Lu, J., & Venetsanopoulos, A. N. (2006). On solving the face recognition problem with one training sample per subject. Pattern Recognition, 39, 1746-1762.

Xu, D., Yan, S., Zhang, L., Zhang, H.-J., Liu, Z., & Shum, H.-Y. (2005). Concurrent subspaces analysis. Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition (pp. 203-208).

Yang, J., Zhang, D., Frangi, A. F., & Yang, J. (2004). Two-dimensional PCA: A new approach to appearance-based face representation and recognition. IEEE Trans. Pattern Anal. Mach. Intell., 26, 131-137.

Ye, J. (2005). Generalized low rank approximations of matrices. Machine Learning, 61, 167-191.

Ye, J., Janardan, R., & Li, Q. (2004). GPCA: An efficient dimension reduction scheme for image compression and retrieval. The 10th ACM SIGKDD Int. Conf. on Knowledge Discovery and Data Mining (pp. 354-363).
