CMSC Spring Learning Theory Lecture: Mistake Bound Model, Halving Algorithm, Linear Classifiers

Presentation Transcript

Note that we are ignoring efficiency issues here. We have not said anything about the amount of computation $A$ has to do in each round in order to update its hypothesis from $h_t$ to $h_{t+1}$. Setting this issue aside for a moment, we have a remarkably simple algorithm, HALVING($C$), that has a mistake bound of $\lg(|C|)$ for any finite concept class $C$. For a finite set $H$ of hypotheses, define the hypothesis $\mathrm{majority}(H)$ as follows:
$$\mathrm{majority}(H)(x) := \begin{cases} +1 & \text{if } |\{h \in H \mid h(x) = +1\}| \ge |H|/2, \\ -1 & \text{otherwise.} \end{cases}$$

Algorithm 1 HALVING($C$)
  $C_1 \leftarrow C$
  $h_1 \leftarrow \mathrm{majority}(C_1)$
  for $t = 1$ to $T$ do
    Receive $x_t$
    Predict $h_t(x_t)$
    Receive $y_t$
    $C_{t+1} \leftarrow \{f \in C_t \mid f(x_t) = y_t\}$
    $h_{t+1} \leftarrow \mathrm{majority}(C_{t+1})$
  end for

Theorem 2.2. For any finite concept class $C$, we have
$$\mathrm{mistake}(\mathrm{HALVING}(C), C) \le \lg |C|.$$

Proof. The key idea is that if the algorithm makes a mistake, then at least half of the hypotheses in $C_t$ are eliminated. Formally,
$$h_t(x_t) \ne y_t \implies |C_{t+1}| \le |C_t|/2.$$
Therefore, denoting the number of mistakes up to time $t$ by
$$M_t := \sum_{s=1}^{t} \mathbf{1}[h_s(x_s) \ne y_s],$$
we have
$$|C_{t+1}| \le \frac{|C_1|}{2^{M_t}} = \frac{|C|}{2^{M_t}}. \tag{1}$$
Since there is an $f \in C$ which perfectly classifies all $x_t$, we also have
$$1 \le |C_{t+1}|. \tag{2}$$
Combining (1) and (2), we have
$$1 \le \frac{|C|}{2^{M_t}},$$
which gives $M_t \le \lg(|C|)$. $\square$

3 Linear Classifiers and Margin

Let us now look at a concrete example of a concept class. Suppose $X = \mathbb{R}^d$ and we have a vector $w \in \mathbb{R}^d$. We define the hypothesis
$$h_w(x) = \mathrm{sgn}(w \cdot x).$$

[...]

This bound is nice because even though we had an uncountable concept class to begin with, the margin assumption allowed us to work with a finite subset of the concept class, and we were able to derive a mistake bound. However, the result is unsatisfactory because running the halving algorithm on $C_{\mathrm{lin}}$ is extremely inefficient. One might wonder whether one can use the special structure of the space of linear classifiers to implement the halving algorithm more efficiently. Indeed, it is possible to implement a variant of the halving algorithm efficiently using the ellipsoid method developed for the linear programming feasibility problem.

Note that the mistake bound depends explicitly on the dimension $d$ of the problem. We would also like to be able to give a dimension-independent mistake bound. Indeed, a classical algorithm called PERCEPTRON has such a mistake bound.
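As a concrete companion to Algorithm 1 above, here is a minimal Python sketch of HALVING($C$), assuming each hypothesis is represented as a callable mapping an instance to {+1, -1}. The function names majority and halving, and the toy class of threshold classifiers in the usage example, are illustrative choices and not part of the lecture notes.

import math

def majority(H):
    # Majority-vote hypothesis over a finite set H of hypotheses:
    # predict +1 iff at least half of the hypotheses in H predict +1.
    def h(x):
        plus_votes = sum(1 for f in H if f(x) == +1)
        return +1 if plus_votes >= len(H) / 2 else -1
    return h

def halving(C, stream):
    # Run HALVING(C) on a stream of (x_t, y_t) pairs and return the number
    # of mistakes. C is a finite list of hypotheses (callables x -> {+1, -1}),
    # one of which is assumed to classify every example correctly.
    C_t = list(C)
    mistakes = 0
    for x_t, y_t in stream:
        h_t = majority(C_t)                      # current majority-vote predictor
        if h_t(x_t) != y_t:                      # a mistake removes >= half of C_t
            mistakes += 1
        C_t = [f for f in C_t if f(x_t) == y_t]  # keep only consistent hypotheses
    return mistakes

# Toy usage (hypothetical example): 1-d threshold classifiers h_a(x) = +1 iff x >= a.
C = [lambda x, a=a: +1 if x >= a else -1 for a in range(8)]
target = C[5]                                    # realizable: the target lies in C
stream = [(x, target(x)) for x in [0, 7, 3, 5, 4, 6, 2]]
assert halving(C, stream) <= math.log2(len(C))   # mistake bound: lg|C| = 3

With $|C| = 8$, Theorem 2.2 guarantees at most $\lg 8 = 3$ mistakes regardless of how the examples are ordered; on this particular stream the sketch happens to make a single mistake.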