INTRA-PERSONAL KERNEL SPACE FOR FACE RECOGNITION

Shaohua Kevin Zhou, Rama Chellappa
Center for Automation Research, University of Maryland, College Park, MD 20742

Baback Moghaddam
Mitsubishi Electric Research Laboratories, 201 Broadway, Cambridge, MA 02139

(Partially supported by the DARPA/ONR Grant N00014-03-1-0520.)

ABSTRACT

Intra-personal space modeling proposed by Moghaddam et al. has been successfully applied in face recognition. In their work the regular principal subspaces are derived from the intra-personal space using a principal component analysis and embedded in a probabilistic formulation. In this paper, we derive the principal subspace from the intra-personal kernel space by developing a probabilistic analysis of kernel principal components for face recognition. We test this new algorithm on a subset of the FERET database with illumination and facial expression variations. The recognition performance demonstrates its advantage over other traditional subspace approaches.

1. INTRODUCTION

Subspace representations have been widely used for the face recognition task; for a recent review of face recognition, refer to [4]. Two famous examples are the 'Eigenface' [18] and 'Fisherface' [3, 7] approaches. The 'Eigenface' approach derives its subspace from a principal component analysis (PCA), while the 'Fisherface' approach derives it from a Fisher discriminant analysis (FDA). Both approaches attained satisfactory performances in the FERET test, as documented in [14].

Recently, there has been an increasing trend of applying kernel subspace representations to face recognition [19, 20, 11], where kernel methods such as kernel PCA (KPCA) [15] and kernel FDA (KFDA) [2], the 'kernelized' versions of the PCA and the FDA respectively, are invoked to derive the subspace. By mapping the original data into a high-dimensional, or even infinite-dimensional, feature space, kernel methods are able to capture higher-order statistical dependencies, which typically abound in human facial images captured under different scenarios with variations in pose, illumination, facial expression, etc. This feature space is known as the reproducing kernel Hilbert space (RKHS) [15]. However, the computation involved in the kernel methods is still maintained almost at the same level as that in the non-kernel methods, as guaranteed by the 'kernel trick'.

In this paper, we investigate a 'kernelized' version of the intra-personal space (IPS) algorithm originally proposed by Moghaddam et al. [12]. An intra-personal space is constructed by collecting all the difference images between any two image pairs belonging to the same individual, to capture all intra-personal variations. Using the PCA, the IPS is decomposed into two subspaces, a principal subspace and an error residual subspace, and these two subspaces are embedded in a probabilistic formulation. However, the PCA only accounts for the second-order statistics of the IPS, and the role of the higher-order statistics of the IPS is not clear. This paper attempts to address this issue by replacing the PCA with the KPCA. The replacement is nontrivial, as the ordinary KPCA does not accommodate a probabilistic analysis. We propose a probabilistic analysis of the kernel principal components, which integrates probabilistic PCA (PPCA) [17] into the KPCA.

This paper is structured as follows. In Section 2, we demonstrate the importance of the intra-personal space by comparing it with regular subspace algorithms. We review the relevant theoretical issues regarding the KPCA in Section 3, and present a probabilistic analysis of the kernel principal components in Section 4. Section 5 applies the proposed algorithm to a subset of the FERET database [14] and presents the obtained experimental results. Section 6 concludes the paper.

1.1. Notations

$x$ is a scalar, $\mathbf{x}$ a vector, and $X$ a matrix. $X^T$ denotes the matrix transpose and $\mathrm{tr}(X)$ the matrix trace. $I_m$ denotes an $m \times m$ identity matrix, and $\mathbf{1}$ a vector or matrix of ones. $D[a_1, a_2, \ldots, a_m]$ denotes an $m \times m$ diagonal matrix with diagonal elements $a_1, a_2, \ldots, a_m$. $p(\cdot)$ is a general probability density function. $N(\mu, \Sigma)$ denotes a normal density with mean $\mu$ and covariance matrix $\Sigma$.

2. INTRA-PERSONAL SPACE MODELING

2.1. Intra-Personal Space (IPS)

In this paper, we are only interested in testing the generalization capability of our algorithm. It is our hope that the training stage can learn the intrinsic characteristics of the target space. We follow [14] to define three sets, namely the training, gallery and probe sets. There is no overlap between the training set and the gallery set in terms of identity.

Assume that in the training set each class $c = 1, \ldots, C$ possesses $J_c$ observations, indexed by $j_c = 1, \ldots, J_c$. In total we have $N = \sum_{c=1}^{C} J_c$ images $\{x_{c,j_c}\}$ in the training set. Typically $N \ll d$, where $d$ is the number of pixels in one image. Note that, when the class information is not important in a particular context, we simply drop the notation and denote the training set by $X_{d \times N} = [x_1, \ldots, x_N]$. The gallery set is denoted by $Y_{d \times M} = [y_1, \ldots, y_M]$, where each individual $m$, $m = 1, \ldots, M$, possesses only one observation $y_m$. The recognition algorithm determines the identity of a given probe image $z$ as $\hat{m}$ among the set $\{1, \ldots, M\}$.

Regular subspace algorithms for face recognition proceed as follows. In the training stage, from the training set $X$, $q$ basis vectors ($q \ll N$) forming a subspace projection matrix $U_{d \times q} = [u_1, \ldots, u_q]$ in $R^d$ are learned such that the new representation $\hat{X} = U^T X$ satisfies certain properties. Different properties give rise to different kinds of analysis methods, such as the PCA, the FDA, and independent component analysis (ICA) [8]. In the testing stage, the algorithms usually determine the identity of the probe $z$ as follows:

  $\hat{m} = \arg\min_{m=1,\ldots,M} \| U^T (z - y_m) \|,$   (1)

where $\|\cdot\|$ is a certain norm metric.

In (1), $z - y_m$ plays a crucial role. However, its projection onto the $U$ matrix is not guaranteed to be small even when $z$ and $y_m$ are two members belonging to the same class, because the learning algorithm is not geared towards minimizing such a distance. This is true even for the FDA, as the minimization in the FDA is with respect to the class center, not the class member itself. The distinction between the class center and the class member is significant, since pose/illumination/expression variations might severely deviate a class member from its class center.

To efficiently capture the characteristics of the difference between class members, Moghaddam et al. [12] introduced the intra-personal space (IPS). The IPS is constructed by collecting all the difference images between any two image pairs belonging to the same individual; the construction is meant to capture all the possible intra-personal variations introduced during image acquisition.

Denote the IPS by $\Delta$. Its construction proceeds as follows. From the training set $\{x_{c,j_c}; c = 1, \ldots, C\}$, we construct the differences $\delta_{c,k_c} = x_{c,j_c^1} - x_{c,j_c^2}$, $j_c^1 \neq j_c^2$. Hence for the same individual $c$ we have $K_c = J_c (J_c - 1)/2$ difference images. We thus reach $\Delta = \{\delta_{c,k_c}; c = 1, \ldots, C; k_c = 1, \ldots, K_c\}$, with each $\delta_{c,k_c}$ treated as an i.i.d. realization.

With the availability of the training sample for the IPS, we can learn a probability density function (PDF) on it, say $p(x)$, where $x$ is an arbitrary point lying in the space. Given the gallery set $Y$ and the density $p(x)$, the identity $\hat{m}$ of the probe image $z$ is determined by a maximum likelihood (ML) rule:

  $\hat{m} = \arg\max_{m=1,\ldots,M} p(z - y_m).$   (2)

Similar to the FDA, an extra-personal space (EPS) can be constructed to mimic the between-class difference, and the recognition mechanism then follows a maximum a posteriori (MAP) rule; see [12] for details. This study can therefore be regarded as a 'generalized' discriminant analysis. However, as commented in [12], using only the IPS modeling does not sacrifice recognition performance.
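To make the construction of $\Delta$ concrete, here is a minimal Python sketch that collects the $K_c = J_c(J_c - 1)/2$ difference images per individual from a labeled training set. It is an illustration, not the authors' code; the function name `build_ips` and the toy data are invented for this example.

```python
import numpy as np
from itertools import combinations

def build_ips(images, labels):
    """Collect all difference images delta = x_j1 - x_j2 between
    image pairs of the same individual (Section 2.1).

    images: (N, d) array, one vectorized face image per row.
    labels: length-N array of identity labels c = 1, ..., C.
    Returns an (n_pairs, d) array; each row is treated as one
    i.i.d. sample from the intra-personal space Delta. Each
    unordered pair is kept once, giving K_c = J_c (J_c - 1) / 2
    differences per individual.
    """
    deltas = []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)     # the J_c images of class c
        for j1, j2 in combinations(idx, 2):   # unordered same-class pairs
            deltas.append(images[j1] - images[j2])
    return np.asarray(deltas)

# Toy usage: 3 subjects with 3 images each, image dimension d = 16.
rng = np.random.default_rng(0)
X = rng.normal(size=(9, 16))
y = np.repeat([1, 2, 3], 3)
ips = build_ips(X, y)
print(ips.shape)   # (9, 16): 3 classes x K_c = 3 pairs each
```

The density $p(x)$ of the ML rule (2) is then fitted on the rows of the returned array.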
2.2. Probabilistic subspace density and probabilistic principal component analysis

In [12], a probabilistic subspace (PS) density [13] is used. The PS density decomposes the data space into two subspaces, a principal subspace and an error residual subspace. Suppose that the covariance matrix of the data space is $C$, whose eigenpairs are given by $\{(\lambda_i, u_i)\}_{i=1}^{d}$, with $d$ being the dimensionality of the data space. The PS density is written as

  $p(x) = \left\{ \frac{\exp(-\frac{1}{2} \sum_{i=1}^{q} y_i^2/\lambda_i)}{(2\pi)^{q/2} \prod_{i=1}^{q} \lambda_i^{1/2}} \right\} \left\{ \frac{\exp(-\epsilon^2(x)/2\rho)}{(2\pi\rho)^{(d-q)/2}} \right\},$   (3)

where $y_i = u_i^T x$, $\epsilon^2(x) = \|x\|^2 - \sum_{i=1}^{q} y_i^2$ is the reconstruction error, and

  $\rho = \frac{1}{d-q} \sum_{i=q+1}^{d} \lambda_i = \frac{1}{d-q} \left\{ \mathrm{tr}(C) - \sum_{i=1}^{q} \lambda_i \right\}.$   (4)

In practice, we cannot compute all eigenpairs due to the 'curse of dimensionality'. However, in the PS density we are only interested in the top $q$ eigenpairs. It is very interesting to note that probabilistic PCA (PPCA) [17] is very similar to the PS density; the theory of PPCA is briefly reviewed in Section 4. The key observation is that PPCA relates to the ordinary PCA by the fact that the top $q$ eigenpairs of the covariance matrix are maintained.

We implemented both the PS density and PPCA in the experiments and found that their performances were similar. Thus, in the sequel, we use the PPCA due to its probabilistic interpretation.
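As an illustration of Eqs. (3)-(4), the following sketch evaluates the PS log-density of a mean-subtracted vector given the eigenpairs of the sample covariance. It assumes $d$ is small enough that all eigenpairs are computable (in the paper's setting only the top $q$ are, with $\mathrm{tr}(C)$ supplying the residual sum in (4)); the function name `ps_log_density` and the toy data are ours.

```python
import numpy as np

def ps_log_density(x, evals, evecs, q):
    """Log of the probabilistic subspace density, Eqs. (3)-(4).

    x: mean-subtracted d-vector; evals/evecs: eigenpairs of the
    sample covariance C, sorted in non-increasing order. Only the
    top q eigenpairs enter explicitly; the residual subspace is
    summarized by the single eigenvalue rho of Eq. (4).
    """
    d = evals.size
    lam, U = evals[:q], evecs[:, :q]
    rho = (evals.sum() - lam.sum()) / (d - q)   # Eq. (4)
    y = U.T @ x                                 # principal coordinates y_i
    eps2 = x @ x - y @ y                        # reconstruction error eps^2(x)
    # In-subspace factor of Eq. (3):
    log_in = -0.5 * (y**2 / lam).sum() \
             - 0.5 * (q * np.log(2 * np.pi) + np.log(lam).sum())
    # Residual-subspace factor of Eq. (3):
    log_out = -0.5 * eps2 / rho - 0.5 * (d - q) * np.log(2 * np.pi * rho)
    return log_in + log_out

# Toy check in d = 10 with a q = 3 principal subspace.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 10)) * np.array([3.0, 2.5, 2.0] + [1.0] * 7)
evals, evecs = np.linalg.eigh(np.cov(data, rowvar=False))
evals, evecs = evals[::-1], evecs[:, ::-1]
print(ps_log_density(data[0] - data.mean(axis=0), evals, evecs, q=3))
```

In the recognition rule (2), `x` would be a difference image $z - y_m$ and the eigenpairs would come from the IPS covariance.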
3. KERNEL PRINCIPAL COMPONENT ANALYSIS

3.1. PCA in the feature space

Suppose that $\{x_1, x_2, \ldots, x_N\}$ are the given training samples in the original data space $R^d$. The KPCA operates in a higher-dimensional feature space $R^f$ induced by a nonlinear mapping function $\phi: R^d \to R^f$, where $f > d$ and could even be infinite. The training samples in $R^f$ are denoted by $\Phi_{f \times N} = [\phi_1, \phi_2, \ldots, \phi_N]$, where $\phi_n := \phi(x_n) \in R^f$. Denote the sample mean in the feature space by

  $\phi_0 := \frac{1}{N} \sum_{n=1}^{N} \phi(x_n) = \Phi s, \quad s := N^{-1} \mathbf{1}_{N \times 1}.$   (5)

The covariance matrix in the feature space, denoted by $C_{f \times f}$, is given as

  $C := \frac{1}{N} \sum_{n=1}^{N} (\phi_n - \phi_0)(\phi_n - \phi_0)^T = \Phi J J^T \Phi^T = \Psi \Psi^T,$   (6)

where

  $J := N^{-1/2} (I_N - s \mathbf{1}^T), \quad \Psi := \Phi J.$   (7)

The KPCA performs an eigen-decomposition of the covariance matrix $C$ in the feature space. Due to the high dimensionality of the feature space, we commonly possess an insufficient number of samples, i.e., the rank of the $C$ matrix is maximally $N$ instead of $f$. However, computing the eigensystem is still possible using the method presented in [18]. Before that, we first show how to avoid explicit knowledge of the nonlinear feature mapping $\phi$.

3.2. Kernel trick

Define

  $\tilde{K} := \Psi^T \Psi = J^T \Phi^T \Phi J = J^T K J,$   (8)

where $K := \Phi^T \Phi$ is the grand matrix, or dot product matrix, which can be evaluated using the 'kernel trick'; thus explicit knowledge of the mapping function $\phi$ is avoided. Given a kernel function $k$ satisfying

  $k(x, y) = \phi(x)^T \phi(y), \quad x, y \in R^d,$   (9)

the $(i,j)$-th entry of the grand matrix $K$ can be calculated as

  $K_{ij} = \phi(x_i)^T \phi(x_j) = k(x_i, x_j).$   (10)

The existence of such kernel functions is guaranteed by Mercer's Theorem [10]. One example is the Gaussian kernel (or RBF kernel), which has been widely studied in the literature and is the focus of this paper. It is defined as

  $k(x, y) = \exp\{-(2\sigma^2)^{-1} \|x - y\|^2\}, \quad x, y \in R^d,$   (11)

where $\sigma$ controls the kernel width. In this case we have $f = \infty$.

The use of the 'kernel trick' (or kernel embedding) [15] captures higher-order statistical information, since the mapping $\phi$ arising from the nonlinear kernel function is nonlinear. We also note that, as long as the computations of interest can be cast in terms of dot products, we can safely use the 'kernel trick' to embed our operations into the feature space. This is the essence of all kernel methods, including this work.

3.3. Computing the eigensystem of the C matrix

As shown in [9, 18], the eigensystem for $C$ can be derived from $\tilde{K}$. Suppose that the eigenpairs of $\tilde{K}$ are $\{(\lambda_n, v_n)\}_{n=1}^{N}$, where the $\lambda_n$'s are sorted in non-increasing order. We then have

  $\tilde{K} v_n = \Psi^T \Psi v_n = \lambda_n v_n, \quad n = 1, \ldots, N.$   (12)

Pre-multiplying (12) by $\Psi$ gives rise to

  $\Psi \Psi^T (\Psi v_n) = C (\Psi v_n) = \lambda_n (\Psi v_n), \quad n = 1, \ldots, N.$   (13)

Hence $\lambda_n$ is the desired eigenvalue of $C$, with corresponding eigenvector $\Psi v_n$. To get the normalized eigenvector $u_n$ of $C$, we only need to normalize $\Psi v_n$:

  $(\Psi v_n)^T (\Psi v_n) = v_n^T \tilde{K} v_n = \lambda_n.$   (14)

So,

  $u_n = \lambda_n^{-1/2} \Psi v_n, \quad n = 1, \ldots, N.$   (15)

In matrix form (if only the top $q$ eigenvectors are retained),

  $U_q := [u_1, \ldots, u_q] = \Psi V_q \Lambda_q^{-1/2},$   (16)

where $V_q := [v_1, \ldots, v_q]$ and $\Lambda_q := D[\lambda_1, \ldots, \lambda_q]$.

It is clear that we are not operating in the full feature space, but in a low-dimensional subspace of it spanned by the training samples. It might seem that the modeling capacity is limited by this subspace dimensionality, i.e., by the number of samples. In reality, it turns out that even in this subspace the smallest eigenvalues are very close to zero, which means that the feature space can be further captured by a subspace with an even lower dimensionality. This motivates the use of the latent model.
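Under these definitions, a compact numpy sketch of Eqs. (5)-(16) follows: it forms the grand matrix for the Gaussian kernel (11), centers it via $J$, and returns the top-$q$ eigenpairs of $\tilde{K}$ together with the expansion coefficients of the normalized eigenvectors $u_n = \lambda_n^{-1/2} \Psi v_n$. This is a generic KPCA computation under those equations, not the authors' code; `kpca` is an illustrative name.

```python
import numpy as np

def kpca(X, sigma, q):
    """Top-q kernel principal components via the grand matrix, Eqs. (8)-(16).

    X: (N, d) training samples. Returns (lam, V, alpha), where the
    columns of alpha are the coefficients expressing the normalized
    feature-space eigenvectors u_n = Psi v_n / sqrt(lam_n) in terms
    of Psi = Phi J (Eqs. (15)-(16)); q should be well below N.
    """
    N = X.shape[0]
    sq = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma**2))              # grand matrix, Eqs. (10)-(11)
    s = np.full(N, 1.0 / N)
    J = (np.eye(N) - np.outer(s, np.ones(N))) / np.sqrt(N)   # centering, Eq. (7)
    K_tilde = J.T @ K @ J                           # Eq. (8)
    lam, V = np.linalg.eigh(K_tilde)                # ascending eigenpairs
    lam, V = lam[::-1][:q], V[:, ::-1][:, :q]       # sort non-increasing, Eq. (12)
    alpha = V / np.sqrt(lam)                        # Eq. (15): u_n = lam_n^{-1/2} Psi v_n
    return lam, V, alpha

# Example: 50 random samples in R^8, top q = 5 components.
X = np.random.default_rng(0).normal(size=(50, 8))
lam, V, alpha = kpca(X, sigma=2.0, q=5)
print(lam)
```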
4. PROBABILISTIC ANALYSIS OF KERNEL PRINCIPAL COMPONENTS

In this section, we present the theory of probabilistic analysis of kernel principal components, which unifies the PPCA and the KPCA in one treatment. We call this analysis probabilistic kernel principal component analysis (PKPCA). We then present how to compute the Mahalanobis distance and study its limiting behavior.

4.1. Theory of PKPCA

Probabilistic analysis assumes that the data in the feature space follow a special factor analysis model which relates the $f$-dimensional data $\phi(x)$ to a latent $q$-dimensional variable $z$ as

  $\phi(x) = \mu + W z + \epsilon,$   (17)

where $z \sim N(0, I_q)$, $\epsilon \sim N(0, \rho I_f)$, and $W$ is an $f \times q$ loading matrix. Therefore $\phi(x) \sim N(\mu, \Sigma)$, where $\Sigma = W W^T + \rho I_f$. Typically, we have $q \ll N \ll f$.

The maximum likelihood estimates (MLEs) for $\mu$ and $W$ are given by

  $\mu = \phi_0 = \frac{1}{N} \sum_{n=1}^{N} \phi(x_n) = \Phi s, \quad W = U_q (\Lambda_q - \rho I_q)^{1/2} R,$   (18)

where $R$ is any $q \times q$ orthogonal matrix, and $U_q$ and $\Lambda_q$ contain the top $q$ eigenvectors and eigenvalues of the $C$ matrix. Substituting (16) into (18), we obtain

  $W = \Psi V_q \Lambda_q^{-1/2} (\Lambda_q - \rho I_q)^{1/2} R = \Psi Q = \Phi J Q,$   (19)

where the $N \times q$ matrix $Q$ is defined as

  $Q := V_q (I_q - \rho \Lambda_q^{-1})^{1/2} R.$   (20)

Since the matrix $(I_q - \rho \Lambda_q^{-1})$ in $Q$ is diagonal, additional savings in computing its square root are realized. Without loss of generality, we assume that $R = I$.

The MLE for $\rho$ is given as

  $\rho = (f - q)^{-1} \{\mathrm{tr}(C) - \mathrm{tr}(\Lambda_q)\} \approx (f - q)^{-1} \{\mathrm{tr}(\tilde{K}) - \mathrm{tr}(\Lambda_q)\}.$   (21)

In (21), the approximation needs the assumption that the remaining eigenvalues are zero. This is a reasonable assumption, supported by empirical evidence, only when $f$ is finite. When $f$ is infinite, it is doubtful, since it always gives $\rho = 0$. In such a case, we temporarily set a manual choice $\rho > 0$; later we show that we can actually let $\rho$ be zero as a limiting case. However, even if a fixed $\rho$ is used, the optimal estimate for $W$ is still the same as in (20). It is interesting that (21) is the same as (4).

Now, the covariance matrix $\Sigma$ is given by

  $\Sigma = \Phi J Q Q^T J^T \Phi^T + \rho I_f = \Phi A \Phi^T + \rho I_f,$   (22)

where $A$ is an $N \times N$ matrix given by

  $A := J Q Q^T J^T = J V_q (I_q - \rho \Lambda_q^{-1}) V_q^T J^T.$   (23)

This offers a regularized approximation to the covariance matrix $C = \Phi J J^T \Phi^T$. In particular, the top $q$ eigenvalues/eigenvectors of the $\Sigma$ and $C$ matrices are equivalent. (In fact, the remaining eigenvectors are also unchanged, though their eigenvalues are changed.) Another approximation often seen in the literature is $\Sigma = C + \rho I_f$; however, this approximation changes the eigenvalues while leaving the eigenvectors unchanged. It is interesting to note that Tipping [16] used a similar technique to approximate the covariance matrix $C$ as $\Sigma = \Phi J D J^T \Phi^T + \rho I_f$, where $D$ is a diagonal matrix with many diagonal entries being zero, i.e., $D$ is rank deficient. This can be interpreted in our approach, since in our computation $D = Q Q^T$ is also rank deficient; however, we do not enforce $D$ to be a diagonal matrix. Also, Tipping's approximation might change both the eigenvalues and the eigenvectors.

A useful $q \times q$ matrix $M$, which can be thought of as a 'reciprocal' matrix for $\Sigma$, is defined as

  $M := \rho I_q + W^T W = \rho I_q + Q^T \tilde{K} Q.$   (24)

If (20) is substituted into (24), it is easy to show (refer to the Appendix) that $M = \Lambda_q$.

4.2. Mahalanobis distance

Given a vector $y \in R^d$, we are often interested in computing the Mahalanobis distance (see Sec. 5) $L(y) := (\phi(y) - \phi_0)^T \Sigma^{-1} (\phi(y) - \phi_0)$. Firstly, $\Sigma^{-1}$ is computed as

  $\Sigma^{-1} = (\rho I_f + W W^T)^{-1} = \rho^{-1} (I_f - W M^{-1} W^T) = \rho^{-1} (I_f - \Phi J Q M^{-1} Q^T J^T \Phi^T) = \rho^{-1} (I_f - \Phi B \Phi^T),$   (25)

where $B$ is an $N \times N$ matrix given by (refer to the Appendix)

  $B := J Q M^{-1} Q^T J^T = J V_q (\Lambda_q^{-1} - \rho \Lambda_q^{-2}) V_q^T J^T.$   (26)

Then, the Mahalanobis distance is calculated as follows:

  $L(y) = (\phi(y) - \phi_0)^T \Sigma^{-1} (\phi(y) - \phi_0) = \rho^{-1} \{\kappa_y - h_y^T B h_y\},$   (27)

where $\kappa_y$ and $h_y$ are defined by

  $\kappa_y := (\phi(y) - \phi_0)^T (\phi(y) - \phi_0) = k(y, y) - 2 k_y^T s + s^T K s,$   (28)

  $h_y := \Phi^T (\phi(y) - \phi_0) = k_y - K s,$   (29)

  $k_y := \Phi^T \phi(y) = [k(x_1, y), \ldots, k(x_N, y)]^T.$   (30)

We now observe that, as $\rho$ approaches zero, the quantity $\rho L(y)$ has a limit $\hat{L}(y)$ given by

  $\hat{L}(y) = \kappa_y - h_y^T \hat{B} h_y,$   (31)

where

  $\hat{B} := \lim_{\rho \to 0} J Q M^{-1} Q^T J^T = J V_q \Lambda_q^{-1} V_q^T J^T.$   (32)

Notice that this limiting Mahalanobis distance does not depend on the choice of $\rho$. Thus, we use this limiting Mahalanobis distance in the follow-up experiments. This also closes the loop on using a zero $\rho$.
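A hedged numpy sketch of the limiting distance (31) follows. It recomputes the kernel quantities internally for clarity (in practice $K$, $J$, $V_q$, $\Lambda_q$ would be cached from training), the function name and test values are ours, and it uses the fact that $k(y, y) = 1$ for the Gaussian kernel (11).

```python
import numpy as np

def limiting_mahalanobis(y, X, sigma, q):
    """Limiting Mahalanobis distance L-hat(y), Eqs. (28)-(32).

    y: (d,) query vector (in Section 5, a difference image z - y_m).
    X: (N, d) training samples spanning the kernel subspace.
    """
    N = X.shape[0]
    K = np.exp(-np.sum((X[:, None] - X[None, :])**2, -1) / (2 * sigma**2))
    s = np.full(N, 1.0 / N)
    J = (np.eye(N) - np.outer(s, np.ones(N))) / np.sqrt(N)   # Eq. (7)
    lam, V = np.linalg.eigh(J.T @ K @ J)
    lam, V = lam[::-1][:q], V[:, ::-1][:, :q]                # top-q eigenpairs of K-tilde
    k_y = np.exp(-np.sum((X - y)**2, 1) / (2 * sigma**2))    # Eq. (30)
    kappa = 1.0 - 2.0 * (k_y @ s) + s @ K @ s                # Eq. (28); k(y, y) = 1 for RBF
    h = k_y - K @ s                                          # Eq. (29)
    B_hat = J @ V @ np.diag(1.0 / lam) @ V.T @ J.T           # Eq. (32)
    return kappa - h @ B_hat @ h                             # Eq. (31)

# Toy usage (sigma and q are arbitrary choices, not the paper's).
rng = np.random.default_rng(0)
Xd = rng.normal(size=(30, 5))
print(limiting_mahalanobis(0.5 * Xd[0], Xd, sigma=2.0, q=4))
```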
5. EXPERIMENTAL RESULTS ON FACE RECOGNITION

We perform face recognition using a subset of the FERET database [14] with 200 subjects only. Each subject has 3 images: (a) one taken under a controlled lighting condition with a neutral expression; (b) one taken under the same lighting condition but with a different facial expression (mostly smiling); and (c) one taken under a different lighting condition and mostly with a neutral expression. Fig. 1 shows some face examples from this database. All images are pre-processed using a zero-mean-unit-variance operation and manually registered using the eye positions.

[Fig. 1. Top row: neutral faces. Middle row: faces with facial expression. Bottom row: faces under different illumination. Image size is 24 by 21 pixels.]

We randomly divide the 200 subjects into two sets, one for training and the other for testing. We focus on the effects of two different variations: facial expression and illumination. For one particular variation, say illumination, we use the 200 images belonging to the first 100 subjects as the training set for learning and the remaining 200 images as the gallery and probe sets for testing, with the images in category (a) as the gallery set and those in category (c) as the probe set. This random division is repeated 20 times and we take the averages as the final result.

We perform our probabilistic analysis of kernel principal components on the IPS. This actually derives the intra-personal kernel subspace presented in Section 4. It turns out that (2) is equivalent to

  $\hat{m} = \arg\min_{m=1,\ldots,M} \hat{L}(z - y_m),$   (33)

where $\hat{L}(\cdot)$ is the limiting Mahalanobis distance defined in (31).
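A short sketch of this decision rule, building on the `limiting_mahalanobis` function sketched after Section 4 (the function name and interface are ours, not the paper's):

```python
import numpy as np

def identify(z, gallery, ips_train, sigma, q):
    """ML identification, Eqs. (2)/(33): return the index m-hat of the
    gallery identity whose difference image from the probe z has the
    smallest limiting Mahalanobis distance under the IPS model.

    z: (d,) probe image; gallery: (M, d), one image per identity;
    ips_train: (N, d) intra-personal difference images (the training
    sample of Delta); sigma, q: kernel width and subspace dimension.
    """
    dists = [limiting_mahalanobis(z - y_m, ips_train, sigma, q)
             for y_m in gallery]
    return int(np.argmin(dists))
```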
For comparison, we have implemented the following eight methods: the PKPCA and the PPCA [17] with the IPS modeling, the KFDA [2] and the FDA [6], the KPCA [15] and the PCA [18], and the kernel ICA (KICA) [1] and the ICA [8]. For the PKPCA/IPS and the PPCA/IPS, the IPS is constructed from the training set and the PKPCA/PPCA density is fitted on top of it. For the KPCA, the PCA, the KICA and the ICA, all 200 training images are regarded as lying in one face space (FS) and the learning algorithms are applied to that FS. For the KFDA and the FDA, the identity information of the training set is employed.

Table 1 lists the recognition rates, averaged over the 20 simulations, using the top-1 match.

Table 1. The recognition rates.

                PKPCA/IPS  PPCA/IPS  KFDA  FDA  KPCA  PCA  KICA  ICA
  Expression       79%        78%    73%   72%   64%  68%   61%  53%
  Illumination     84%        82%    65%   75%   52%  73%   61%  57%

The PKPCA/IPS algorithm attains the best performance, since it combines the discriminative power of the IPS modeling and the merit of the PKPCA. As mentioned earlier, using the PS density with the IPS modeling produces the same results as PPCA/IPS. Also, using the dual IPS/EPS modeling does not further improve the results. Compared to the PPCA/IPS, the improvement is not significant, indicating that second-order statistics might be enough after the IPS modeling for the face recognition problem. However, using the PKPCA may be more effective, since it also takes into account the higher-order statistics besides the second-order ones. Another observation is that variations in illumination are easier to model than facial expression using subspace methods.

6. CONCLUSION

In this paper, we illustrated the importance of the intra-personal space for a recognition problem. We then proposed a probabilistic analysis of kernel principal components and computed the Mahalanobis distance and its limiting form. Finally, we applied the proposed probabilistic approach with IPS modeling to a face dataset and highlighted its advantages. A final note is that our analysis is quite general and applicable to other learning and recognition tasks.

7. REFERENCES

[1] F. Bach and M. I. Jordan. Kernel independent component analysis. Technical Report CSD-01-1166, Computer Science Division, University of California, Berkeley, 2001.
[2] G. Baudat and F. Anouar. Generalized discriminant analysis using a kernel approach. Neural Computation, 12(10):2385–2404, 2000.
[3] P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman. Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Trans. PAMI, 19, 1997.
[4] R. Chellappa, C. L. Wilson, and S. Sirohey. Human and machine recognition of faces: a survey. Proceedings of the IEEE, 83:705–740, 1995.
[5] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. J. Roy. Statist. Soc. B, 1977.
[6] R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. Wiley-Interscience, 2001.
[7] K. Etemad and R. Chellappa. Discriminant analysis for recognition of human face images. Journal of the Optical Society of America A, pages 1724–1733, 1997.
[8] A. Hyvärinen. Survey on independent component analysis. Neural Computing Surveys, 2:94–128, 1999.
[9] M. Kirby and L. Sirovich. Application of the Karhunen-Loève procedure for the characterization of human faces. IEEE Trans. PAMI, 12(1):103–108, 1990.
[10] J. Mercer. Functions of positive and negative type and their connection with the theory of integral equations. Philos. Trans. Roy. Soc. London, A 209:415–446, 1909.
[11] B. Moghaddam. Principal manifolds and probabilistic subspaces for visual recognition. IEEE Trans. PAMI, 24(6):780–788, 2002.
[12] B. Moghaddam, T. Jebara, and A. Pentland. Bayesian modeling of facial similarity. Advances in Neural Information Processing Systems, 1998.
[13] B. Moghaddam and A. Pentland. Probabilistic visual learning for object representation. IEEE Trans. PAMI, 19(7):696–710, 1997.
[14] P. J. Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss. The FERET evaluation methodology for face-recognition algorithms. IEEE Trans. PAMI, 22:1090–1104, 2000.
[15] B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.
[16] M. Tipping. Sparse kernel principal component analysis. NIPS, 2001.
[17] M. Tipping and C. Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B, 61(3):611–622, 1999.
[18] M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3:72–86, 1991.
[19] M.-H. Yang. Face recognition using kernel methods. NIPS, 2001.
[20] M.-H. Yang. Kernel Eigenfaces vs. kernel Fisherfaces: Face recognition using kernel methods. Proc. of Intl. Conf. on Face and Gesture Recognition, 2002.

8. APPENDIX – SOME USEFUL COMPUTATIONS

8.1. Computation related to M

We first compute $Q^T \tilde{K} Q$ and then $M$:

  $Q^T \tilde{K} Q = (I_q - \rho \Lambda_q^{-1})^{1/2} V_q^T \tilde{K} V_q (I_q - \rho \Lambda_q^{-1})^{1/2} = (I_q - \rho \Lambda_q^{-1})^{1/2} \Lambda_q (I_q - \rho \Lambda_q^{-1})^{1/2} = \Lambda_q - \rho I_q,$   (34)

where the fact that $V_q^T \tilde{K} V_q = V_q^T J^T K J V_q = \Lambda_q$ is used. Therefore,

  $M = \rho I_q + Q^T \tilde{K} Q = \rho I_q + (\Lambda_q - \rho I_q) = \Lambda_q.$   (35)

  $|M| = |\Lambda_q| = \prod_{i=1}^{q} \lambda_i, \quad M^{-1} = \Lambda_q^{-1}.$   (36)

8.2. Computation related to A and B

  $A = J Q Q^T J^T = J V_q (I_q - \rho \Lambda_q^{-1})^{1/2} (I_q - \rho \Lambda_q^{-1})^{1/2} V_q^T J^T = J V_q (I_q - \rho \Lambda_q^{-1}) V_q^T J^T,$   (37)

  $B = J Q M^{-1} Q^T J^T = J V_q (I_q - \rho \Lambda_q^{-1})^{1/2} \Lambda_q^{-1} (I_q - \rho \Lambda_q^{-1})^{1/2} V_q^T J^T = J V_q (\Lambda_q^{-1} - \rho \Lambda_q^{-2}) V_q^T J^T.$   (38)
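As a sanity check on the identities above, the following snippet verifies (34)-(38) numerically on random data. All names and constants are ours, and it assumes $\rho$ is smaller than the smallest retained eigenvalue so that the square root in (20) is real.

```python
import numpy as np

# Numerical check of the appendix identities (34)-(38) on random data.
rng = np.random.default_rng(1)
N, d, q, rho, sigma = 40, 8, 5, 1e-3, 2.0

X = rng.normal(size=(N, d))
K = np.exp(-np.sum((X[:, None] - X[None, :])**2, -1) / (2 * sigma**2))
s = np.full(N, 1.0 / N)
J = (np.eye(N) - np.outer(s, np.ones(N))) / np.sqrt(N)   # Eq. (7)
Kt = J.T @ K @ J                                         # Eq. (8)
lam, V = np.linalg.eigh(Kt)
lam, V = lam[::-1][:q], V[:, ::-1][:, :q]                # top-q eigenpairs

Q = V @ np.diag(np.sqrt(1.0 - rho / lam))        # Eq. (20) with R = I
M = rho * np.eye(q) + Q.T @ Kt @ Q               # Eq. (24)
print(np.allclose(M, np.diag(lam)))              # Eq. (35): M = Lambda_q -> True

A = J @ Q @ Q.T @ J.T                            # Eq. (23)
B = J @ Q @ np.linalg.inv(M) @ Q.T @ J.T         # Eq. (26)
A_ref = J @ V @ np.diag(1.0 - rho / lam) @ V.T @ J.T           # Eq. (37)
B_ref = J @ V @ np.diag(1.0 / lam - rho / lam**2) @ V.T @ J.T  # Eq. (38)
print(np.allclose(A, A_ref), np.allclose(B, B_ref))            # -> True True
```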