Sparse Recovery Using Sparse Matrices

Anna Gilbert, Piotr Indyk

Abstract: We survey algorithms for sparse recovery that use sparse measurement matrices. Such matrices have several attractive properties: they support algorithms with low computational complexity, and they make it easy to perform incremental updates to signals. We discuss applications to several areas, including compressive sensing and data stream computing.


[...] vector, and therefore is captured in the linear sketching model(1). See, e.g., [SAZ09], [ESAZ09], [KBG+10] for further discussion and recent developments in the area.

In each of these applications, it is useful (and often crucial) that the measurement matrix $A$ be a sparse matrix, i.e., contain very few non-zero elements per column. In data stream processing, the time needed to update the sketch $Ax$ under the update $i$ is proportional to the number of non-zero elements in the vector $A_i$, which is equal to the number of non-zeros in the $i$-th column of $A$. In experiment pooling, the design matrix $A$ is a binary matrix that captures which compounds are pooled together, and the measurements $Ax$ reflect the activity levels of the pooled compounds. In many chemical and biological applications, the assumption that compound activity is a linear function of the concentration holds only when there are not many compounds mixed together in a single pool; thus, the design matrix $A$ should be not only binary but also sparse. In other applications, sparsity can be useful for computational reasons: one can compute the matrix-vector product $Ax$ very quickly(2).

A. Definitions, and classification of the results

Formally, we define the sparse recovery problem as follows. Let $Err^k_q = Err^k_q(x)$ be the smallest possible $\ell_q$ approximation error $\|x - x'\|_q$, where $x'$ ranges over all $k$-sparse vectors (i.e., vectors that have at most $k$ non-zero entries). Our goal is, given $Ax$, to find a vector $\hat{x}$ such that the $\ell_p$ approximation error(3) $\|x - \hat{x}\|_p$ is at most $c > 0$ times $Err^k_q(x)$, i.e.,

    $\|\hat{x} - x\|_p \le c \cdot Err^k_q(x)$    (1)

Note that for any value of $p$, the error $\|x - \hat{x}\|_p$ is minimized when $\hat{x}$ consists of the $k$ largest (in magnitude) coefficients of $x$. We refer to such an $\hat{x}$ as the "head" of the signal $x$, while $x - \hat{x}$ will be called the "tail" of $x$.

As mentioned earlier, we aim to design sparse recovery schemes that achieve short sketches, have low algorithmic complexity, and provide "good" recovery guarantees. In addition, the schemes described in this survey can be classified based on other characteristics, such as (i) whether the schemes are randomized or deterministic, and (ii) how general the class of signals $x$ supported by the schemes is. In the following we elaborate on both issues.

Randomization: we distinguish between two classes of schemes: for-each and for-all. The latter describes a scheme in which one matrix $A$ works for all signals $x$. In the former case, the matrix $A$ is chosen at random from some distribution, and for each signal $x$ the recovery algorithm works "with high probability" (at least $1 - 1/n$)(4). Naturally, schemes with the for-all property are preferable to those with the for-each guarantee (if all other parameters are the same). We note that "for-all" does not mean that the matrix is constructed in an "explicit" or efficient manner. In fact, most of the constructions presented here use the probabilistic method. Although it is possible to construct recovery schemes explicitly [DeV07], [Mut06], [BGI+08], such schemes tend to require more measurements.

Generality of supported signals: ideally, the recovery schemes should support arbitrary signals $x$. In this survey, we focus on describing such schemes. However, there has been plenty of work on algorithms supporting more restrictive classes of signals. In particular, there have been several schemes based on sparse matrices that work for (almost) exactly $k$-sparse signals [SBB06b], [SBB06a], [XH07], [JXHC08], [SBB08], [WWR08], [KDXH08], [LMP+08]. Although we do not cover them in detail, we point out relevant connections and references whenever possible.

Footnotes:
(1) In fact, we can assume an even more general setting, where we allow a general vector $x \in R^n$, and our goal is to identify the top $k$ most significant coefficients from the set of linear measurements. This is applicable in a setting where the entries in $x$ represent the activity levels of $n$ compounds, or a genetic response in a biological sample.
(2) Specifically, the matrix-vector product can be computed in time $O(ns)$, where $s$ is the column sparsity of $A$. As we will see in Section III, in many settings one can achieve $s = O(\log(n/k))$, which leads to a running time of $O(n \log(n/k))$. This compares favorably to the $O(nm)$ time achievable for random Gaussian matrices, or to the $O(n \log n)$ time achievable for random Fourier-like matrices.
(3) It is natural to consider $p = q$. However, as we will see later, other guarantees are also possible.
(4) We adopt here the terminology frequently used in computer science. Note that one could require weaker probability bounds, e.g., $1 - o(1)$. However, all algorithms presented in this survey naturally achieve the stronger probability bound without changing the (asymptotic) bound on the number of measurements.
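To make the streaming update cost discussed above (and in footnote 2) concrete, here is a minimal Python sketch. This is our illustration, not code from the paper; the representation and parameter values are assumptions. Storing, for each column $i$, the list of rows in which it is non-zero lets us process the stream update $x_i \mathrel{+}= \Delta$ in $O(s)$ time rather than $O(m)$:

import numpy as np

def sparse_columns(n, m, s, seed=0):
    """cols[i] = the s rows in which column i of the 0-1 matrix A is non-zero."""
    rng = np.random.default_rng(seed)
    return [rng.choice(m, size=s, replace=False) for _ in range(n)]

def update_sketch(sketch, cols, i, delta):
    """Process the stream update x[i] += delta in O(s) time:
    only the s entries of Ax touched by column i change."""
    sketch[cols[i]] += delta

n, m, s = 10_000, 200, 8
cols = sparse_columns(n, m, s)
sketch = np.zeros(m)                     # the sketch Ax, for x initially 0
update_sketch(sketch, cols, 42, +3.0)    # x[42] += 3
update_sketch(sketch, cols, 42, -1.0)    # x[42] -= 1

(For matrices with random-sign entries, as in Count-Sketch below, the update would additionally multiply delta by the stored sign of each non-zero.)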
B. Survey summary

We present an overview of the algorithms for sparse recovery that utilize sparse measurement matrices. The description is divided into two sections: for-each algorithms are covered in Section II, while for-all algorithms are described in Section III. Historically, most of the for-each schemes were developed in the data stream community during the period 2001-2004. In contrast, most of the algorithms with for-all guarantees have been discovered after 2004, during the process of unifying the ideas of compressive sensing and data stream algorithms. We present the algorithms in the same chronological order.

Almost all schemes described here offer sketch length bounds of $O(k \log n)$ or less, which matches or is close to the lower bound of $\Omega(k \log(n/k))$ shown in [BIPW10]. They are supported by efficient algorithms, with running times ranging from polynomial in $n$ to near-linear in $n$. They offer a variety of approximation guarantees, starting from the "plain vanilla" guarantee of Equation 1 with $p = q = 1$ (the $\ell_1/\ell_1$ guarantee) to more complex (but often stronger) ones. The exact sketch length bounds, approximation guarantees, and algorithm running times are stated in Theorems 1 to 10.

[...]

Theorem 1: There exists a distribution over $m \times n$ matrices $A$, $m = O(k/\epsilon \cdot \log n)$, such that for any signal $x$, given $Ax$, we can recover $\hat{x} = x_{med}$ such that $\|\hat{x} - x\|_\infty \le \epsilon/k \cdot Err^k_1$ with high probability. The column sparsity of $A$ is $O(\log n)$, and the time needed to recover $\hat{x}$ from $Ax$ is $O(n \log n)$.

We conclude by observing that the approximation guarantee in the above theorem implies a weaker but perhaps more intuitive guarantee about the $\ell_1$ approximation error. Consider the vector $\hat{x}$ consisting of the $k$ largest (in magnitude) elements of $x_{med}$. Then we have

    $\|x - \hat{x}\|_1 \le (1 + 3\epsilon) Err^k_1$

To show this, let $S$ be the set of the $k$ largest (in magnitude) coordinates of $x$, and let $\hat{S}$ be the support of $\hat{x}$. Note that $\|\hat{x}_S\|_1 \le \|\hat{x}_{\hat{S}}\|_1$. We have

    $\|x - \hat{x}\|_1 \le \|x\|_1 - \|x_{\hat{S}}\|_1 + \|x_{\hat{S}} - \hat{x}_{\hat{S}}\|_1$
                      $\le \|x\|_1 - \|\hat{x}_{\hat{S}}\|_1 + 2\epsilon \, Err^k_1$
                      $\le \|x\|_1 - \|\hat{x}_S\|_1 + 2\epsilon \, Err^k_1$
                      $\le \|x\|_1 - \|x_S\|_1 + 3\epsilon \, Err^k_1$
                      $\le (1 + 3\epsilon) Err^k_1$

For more detailed descriptions of the algorithms, see [CM04], [CCFC02], [EV03].

B. Count-Sketch

The next(6) algorithm, called Count-Sketch [CCFC02], provides error guarantees that are a function of $Err^k_2$, as opposed to $Err^k_1$. This is accomplished by using a distribution over matrices $A$ very similar to those used by Count-Min, with one difference: each non-zero entry is chosen independently and uniformly at random from $\{-1, 1\}$ (instead of just being equal to 1). Formally, let $r_{i,l}$ be independent random variables with values chosen uniformly at random from $\{-1, 1\}$, and let the functions $h_1 \ldots h_d$ be defined as in the previous section. Then the matrix $A$ is a vertical concatenation of the matrices $A(h_1), \ldots, A(h_d)$, where $(A(h_l))_{j,i}$ is equal to $r_{i,l}$ if $j = h_l(i)$, and is equal to 0 otherwise. To estimate the coordinate $x_i$, one then uses the median estimator

    $(x_{med'})_i = \mathrm{median}_l \; r_{i,l} (A(h_l) x)_{h_l(i)}$

The analysis of Count-Sketch relies on the observation that

    $\Pr[(r_{i,l} (A(h_l) x)_{h_l(i)} - x_i)^2 \ge C/w \cdot (Err^k_2)^2] \le 1/4$

for some absolute constant $C > 0$ (here $w$ denotes the range of the hash functions $h_l$). The final guarantee is captured by the following theorem:

Theorem 2: There exists a distribution over $m \times n$ matrices $A$, $m = O(k/\epsilon \cdot \log n)$, such that for any signal $x$, given $Ax$, we can recover $\hat{x}$ such that $\|\hat{x} - x\|^2_\infty \le \epsilon/k \, (Err^k_2)^2$ with high probability. The column sparsity of $A$ is $O(\log n)$, and the time needed to recover $\hat{x}$ from $Ax$ is $O(n \log n)$.

Footnotes:
(6) Chronologically, the Count-Sketch algorithm was invented before Count-Min. It is easier, however, to describe the ideas in the reverse order.
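The construction and estimator above are simple enough to state in a few lines of code. The following is an illustrative Python transcription (our sketch, not the authors' code): fully random maps stand in for the limited-independence functions $h_l$, and the parameter choices are illustrative only.

import numpy as np

rng = np.random.default_rng(1)
n, k = 1000, 10
d, w = 9, 4 * k                      # d ~ O(log n) tables, w ~ O(k/eps) buckets

h = rng.integers(0, w, size=(d, n))  # h[l, i] = bucket of coordinate i in table l
r = rng.choice([-1, 1], size=(d, n)) # r[l, i] = random sign

# A test signal: k large spikes ("head") plus small noise ("tail").
x = rng.normal(0, 0.01, n)
x[rng.choice(n, k, replace=False)] = rng.choice([-1, 1], k) * 10.0

# Measure: (A(h_l) x)_j = sum over {i : h_l(i) = j} of r[l, i] * x[i].
tables = np.zeros((d, w))
for l in range(d):
    np.add.at(tables[l], h[l], r[l] * x)

# Recover: (x_med')_i = median over l of r[l, i] * (A(h_l) x)_{h_l(i)}.
x_med = np.median(r * tables[np.arange(d)[:, None], h], axis=0)

print("largest estimation error:", np.max(np.abs(x_med - x)))

Taking the median over $d = O(\log n)$ tables is what drives the "with high probability" clause of Theorems 1 and 2, and it is also where the $O(\log n)$ column sparsity and the $O(n \log n)$ recovery time come from.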
As before, the approximation guarantee in the theorem implies a weaker but more intuitive guarantee, this time about the $\ell_2$ approximation error. Consider the vector $\hat{x}$ consisting of the $k$ largest (in magnitude) elements of $x_{med'}$. Then we have [CM06]:

    $\|x - \hat{x}\|^2_2 \le (1 + 9\sqrt{\epsilon})(Err^k_2)^2$

The proof proceeds as follows. Let $E = Err^k_2$. Let $S$ be the set of the $k$ largest (in magnitude) coordinates of $x$, and let $\hat{S}$ be the support of $\hat{x}$. Moreover, for any set $P$, let $-P$ denote the complement of $P$. We have

    $\|x - \hat{x}\|^2_2 \le \|(x - \hat{x})_{\hat{S}}\|^2_2 + \|x_{S - \hat{S}}\|^2_2 + \|x_{-(S \cup \hat{S})}\|^2_2$    (2)

The first term is bounded by $k \cdot \epsilon/k \cdot E^2 = \epsilon E^2$. To bound the second term, we proceed as follows. Consider any $i \in S - \hat{S}$ and $j \in \hat{S} - S$. We have

    $|x_i| - |x_j| \le |\hat{x}_i| - |\hat{x}_j| + 2\sqrt{\epsilon/k}\,E \le 2\sqrt{\epsilon/k}\,E$    (3)

Let $a = \max_{i \in S - \hat{S}} |x_i|$ and $b = \min_{j \in \hat{S} - S} |x_j|$. From Equation 3 we have $a \le b + 2\sqrt{\epsilon/k}\,E$. Thus

    $\|x_{S - \hat{S}}\|^2_2 \le a^2 |S - \hat{S}| \le (b + 2\sqrt{\epsilon/k}\,E)^2 |S - \hat{S}|$

Since $\|x_{\hat{S} - S}\|^2_2 \ge b^2 |\hat{S} - S| = b^2 |S - \hat{S}|$, we continue (using $|S - \hat{S}| \le k$ and $\|x_{\hat{S} - S}\|_2 \le \|x_{-S}\|_2 = E$):

    $\|x_{S - \hat{S}}\|^2_2 \le (\|x_{\hat{S} - S}\|_2 / \sqrt{|S - \hat{S}|} + 2\sqrt{\epsilon/k}\,E)^2 |S - \hat{S}|$
                            $\le (\|x_{\hat{S} - S}\|_2 + 2\sqrt{\epsilon}\,E)^2$
                            $\le \|x_{\hat{S} - S}\|^2_2 + 4\|x_{\hat{S} - S}\|_2 \sqrt{\epsilon}\,E + 4\epsilon E^2$
                            $\le \|x_{\hat{S} - S}\|^2_2 + 4\sqrt{\epsilon}\,E^2 + 4\epsilon E^2$
                            $\le \|x_{\hat{S} - S}\|^2_2 + 8\sqrt{\epsilon}\,E^2$

Plugging into Equation 2 we get (since $\epsilon \le \sqrt{\epsilon}$, and $(\hat{S} - S) \cup (-(S \cup \hat{S})) = -S$):

    $\|x - \hat{x}\|^2_2 \le \epsilon E^2 + \|x_{\hat{S} - S}\|^2_2 + 8\sqrt{\epsilon}\,E^2 + \|x_{-(S \cup \hat{S})}\|^2_2 \le 9\sqrt{\epsilon}\,E^2 + \|x_{-S}\|^2_2 = (1 + 9\sqrt{\epsilon})E^2$

C. Sublinear algorithms

The above algorithms all run in time at least linear in the signal size, as they entail estimating a value for each coordinate in the signal, even those that are insignificant. If our goal is just to report the $k$ non-zero terms of a $k$-sparse approximation, then it suffices to find (or approximate) only the top $k$ values in order to achieve similar error guarantees. Sublinear algorithms aim to do just that, and to do so in time [...]

[...] $d = O(\log(n/s)/\epsilon)$ and $m = |V| = O(s \log(n/s)/\epsilon^2)$. Since our constructions require $s = O(k)$ and $\epsilon$ strictly bounded away from zero, the resulting matrices will have $O(k \log(n/k))$ rows.

For many applications one often needs an explicit expander, i.e., an expander for which we can efficiently compute the neighbor set of a given left vertex. No explicit constructions with the aforementioned parameters are known. However, it is known [GUV07] how to explicitly construct expanders with left degree $d = O((\log |U|)(\log s)/\epsilon)^{1 + 1/\alpha}$ and right set size $O(d^2 s^{1+\alpha})$, for any fixed $\alpha > 0$. For simplicity, in the remainder of this paper, we will assume expanders with the optimal parameters.
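Since the optimal constructions are probabilistic, a concrete instance is easy to generate: give each left vertex $d$ random right neighbors. The sketch below (ours, for illustration; the constants are ours and no expansion property is actually verified) produces the $m \times n$ 0-1 adjacency matrix used throughout this section.

import numpy as np

def random_expander_matrix(n, m, d, seed=0):
    """Each left vertex (column) gets d distinct random right neighbors (rows).
    By the probabilistic method, such a graph is an unbalanced expander with
    high probability for suitable d, m; nothing is checked here."""
    rng = np.random.default_rng(seed)
    A = np.zeros((m, n), dtype=np.int8)
    for i in range(n):
        A[rng.choice(m, size=d, replace=False), i] = 1
    return A

n, k, eps = 2000, 20, 0.5
d = int(np.ceil(np.log(n / k) / eps))           # d = O(log(n/k)/eps)
m = int(np.ceil(k * np.log(n / k) / eps**2))    # m = O(k log(n/k)/eps^2)
A = random_expander_matrix(n, m, d)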
Unlike in the for-each case(8), the algorithms in this section are known to be resilient to measurement noise. That is, we could assume that we are given a noisy sketch vector $b = Ax + \mu$, where $\mu$ is the "measurement noise" vector. In that case, the error bounds in the approximation guarantees would have an additional term depending on $\eta = \|\mu\|_1/d$. However, for the sake of consistency, we will focus the description on the noise-free case where $b = Ax$. The reader is referred to the original papers for the bounds for the noise-resilient variants of the algorithms.

A. RIP(1) and $\ell_1$ minimization

In this section we give an overview of the "geometric" approach to sparse recovery using sparse matrices, introduced in [BGI+08]. The approach uses the $\ell_1$ minimization algorithm that had earlier been shown to work for random dense matrices [CRT06], [Don06]. In the noiseless case $b = Ax$, the algorithm proceeds by finding $\hat{x}$ such that $A\hat{x} = b$ and $\|\hat{x}\|_1$ is minimized.

To understand when the above algorithm performs an accurate recovery, we need the following generalized definition of the Restricted Isometry Property.

Definition 2: An $m \times n$ matrix $A$ is said to satisfy RIP($p, k, \delta$) if, for any $k$-sparse vector $x$, we have

    $(1 - \delta)\|x\|_p \le \|Ax\|_p \le \|x\|_p$.

For the case of $p = 2$, the notion was introduced(9) in [CRT06], which also showed that if a matrix $A$ satisfies this property, then the $\ell_1$ minimization procedure produces an accurate solution. Since then there has been a tremendous amount of study of the properties and constructions of RIP($2, k, \delta$) (or RIP(2), for short) matrices. Unfortunately, sparse matrices cannot satisfy the RIP(2) property unless their number of rows is "large" [Cha08]. In particular, sparse 0-1 matrices must have at least $\Omega(k^2)$ rows. However, it was shown [BGI+08] that such matrices can satisfy RIP($p$) for $p$ equal (or very close) to 1. In particular, the adjacency matrices of expander graphs do have this property(10). By the earlier arguments, such matrices have $O(k \log(n/k))$ rows, which translates into an $O(k \log(n/k))$ sketch length bound.

Lemma 4: Consider any $m \times n$ matrix $A$ that is the adjacency matrix of a $(k, d, \epsilon)$-unbalanced expander $G = (U, V, E)$. Then the scaled matrix $A/d$ satisfies the RIP($1, k, \delta$) property for $\delta = 2\epsilon$.

Proof: Let $x \in R^n$ be a $k$-sparse vector. Without loss of generality, we assume that the coordinates of $x$ are ordered such that $|x_1| \ge \ldots \ge |x_n|$. We order the edge set $E = \{e_t = (i_t, j_t)\}$, $t = 1 \ldots dn$, of $G$ in a lexicographic manner. It is helpful to imagine that the edges $e_1, e_2, \ldots$ of $E$ are being added to the (initially empty) graph. An edge $e_t = (i_t, j_t)$ causes a collision if there exists an earlier edge $e_s = (i_s, j_s)$, $s < t$, such that $j_t = j_s$. We define $E'$ to be the set of edges which do not cause collisions, and $E'' = E - E'$.

Claim 5: We have

    $\sum_{(i,j) \in E''} |x_i| \le \epsilon d \|x\|_1$

Proof: For each $t = 1 \ldots dn$, we use an indicator variable $r_t \in \{0, 1\}$ such that $r_t = 1$ iff $e_t \in E''$. Define a vector $z \in R^{dn}$ such that $z_t = |x_{i_t}|$. Observe that

    $\sum_{(i,j) \in E''} |x_i| = \sum_{e_t = (i_t, j_t) \in E} r_t |x_{i_t}| = r \cdot z$

To upper bound the latter quantity, observe that the vectors satisfy the following constraints:

- The vector $z$ is non-negative.
- The coordinates of $z$ are monotonically non-increasing, and at most $kd$ of them are non-zero.
- For each prefix set $P_i = \{1 \ldots di\}$, $i \le k$, we have $\|r|_{P_i}\|_1 \le \epsilon d i$; this follows from the expansion properties of the graph $G$.
- $r|_{P_1} = 0$, since the graph is simple.

It follows that for any $r, z$ satisfying the above constraints, we have $r \cdot z \le \epsilon \|z\|_1$. Since $\|z\|_1 = d\|x\|_1$, the claim follows.

Since

    $\|Ax\|_1 \ge \sum_{e_t = (i_t, j_t) \in E'} |x_{i_t}| - \sum_{e_t = (i_t, j_t) \in E''} |x_{i_t}|$,

Claim 5 immediately implies that $\|Ax\|_1 \ge d\|x\|_1(1 - 2\epsilon)$. Since for any $x$ we have $\|Ax\|_1 \le d\|x\|_1$, it follows that $A/d$ satisfies the RIP($1, k, 2\epsilon$) property.

Footnotes:
(8) It should be noted that, although the for-each algorithms have typically not been analyzed for the case of noisy sketches, the algorithms themselves could very well be quite resilient to various forms of noise.
(9) The original paper [CRT06] employed a slightly different notation using "double-sided error", i.e., requiring that $(1 - \delta')\|x\|_2 \le \|Ax\|_2 \le (1 + \delta')\|x\|_2$. The two definitions can be seen to be equivalent by scaling $A$ and setting $(1 + \delta) = (1 + \delta')/(1 - \delta')$.
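The $\ell_1$ minimization step described at the start of this subsection is a linear program, so any LP solver applies. Below is a minimal sketch using the standard reformulation with auxiliary variables $t \ge |x|$; the use of scipy and of dense matrices is our simplification for illustration, not part of the scheme itself.

import numpy as np
from scipy.optimize import linprog

def l1_min(A, b):
    """Solve min ||x||_1 subject to A x = b (the noiseless case) as the LP:
    minimize sum(t) subject to A x = b, -t <= x <= t, t >= 0."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # variables z = [x, t]
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0, -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # A x = b
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=[(None, None)] * n + [(0, None)] * n,
                  method="highs")
    return res.x[:n]

With $b = Ax$ for a matrix satisfying RIP(1), the minimizer recovers $x$ up to the guarantees discussed in this section.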
(10) In fact, for some range of parameters, the opposite holds, i.e., 0-1 matrices that satisfy RIP(1) are adjacency matrices of expander graphs. See [Cha08], [BGI+08] for more details.

[...] decoding $\hat{x}_I$ from $b_{\Gamma(I)}$. This is accomplished during the second phase, which proceeds in a sequence of iterations. In each iteration, the algorithm identifies coordinates $i \in I$ such that most of the elements of $\Gamma(i)$ do not have any other neighbors in $\Gamma(I)$. The algorithm then estimates the values $\hat{x}_i$ of such coordinates (using the median estimator), eliminates them from $I$, and subtracts their contribution from the sketch. The process continues until the set $I$ becomes empty. Since each coordinate of the approximation is estimated only once, and is never revised again, the EMP algorithm is very efficient: it runs in time proportional to the number of edges in the graph $G$, which is $O(n \log(n/k))$. The recovered vector $\hat{x}$ provides an approximation in the $\ell_1$ norm; i.e., we have:

Theorem 9: There exists an $m \times n$ (expander) matrix $A$, $m = O(k \log(n/k)/\epsilon^2)$, such that for any signal $x$, given $Ax$, we can recover $\hat{x}$ such that

    $\|x - \hat{x}\|_1 \le (1 + \epsilon) Err^k_1$

The column sparsity of $A$ is $O(\log(n)/\epsilon^2)$, and the recovery algorithm (EMP) has $O(n \log(n/k)/\epsilon^2)$ running time.

Although EMP offers excellent asymptotic guarantees, its empirical performance is not so great. Specifically, the number of measurements required by the algorithm to achieve correct recovery is suboptimal. For example, our recovery experiments on random signed $k$-sparse signals of length $n$, for $k = 50$ and $n = 20000$, show that one typically needs at least 5000 measurements to recover the signal correctly using the EMP algorithm. In comparison, the linear-programming-based recovery algorithm for sparse matrices described earlier requires only about 450 measurements to perform the same task(12).

2) SMP: SMP borrows some of the ideas present in EMP, but it has also been influenced by the recent iterative algorithms for sparse recovery using dense matrices, such as [NT08]. The running time of the new algorithm is slightly higher (by a logarithmic factor) than that of EMP. However, empirically, the algorithm performs successful recovery from a significantly smaller number of measurements. In particular, for the instances described above, SMP typically needs about 2000 measurements. The asymptotic bound on the number of required measurements is still $O(k \log(n/k))$.

The recovery algorithm is iterative, in the spirit of Matching Pursuit [TG05]. In each iteration, the algorithm estimates the difference between the current approximation $\hat{x}^j$ and the signal $x$ from the sketch $A\hat{x}^j - b$. The estimate, denoted by $u$, is obtained by using the median estimator as in EMP. The approximation $\hat{x}^j$ is updated by $u$, and the process is repeated. Let $H_l[y]$ be a "thresholding operator", which zeros out all but the $l$ largest-in-magnitude coefficients of the argument $y$. Also, let $C > 0$ be some constant. The details of the algorithm, together with remarks about the properties used in the analysis, are depicted in Figure 1:

    1) Let $j = 0$
    2) Let $\hat{x}^j = 0$
    3) Repeat $T$ times:
       a) Let $j = j + 1$
       b) Let $b = b - A\hat{x}^{j-1}$
          Remark: $b = A(x' - \hat{x}^{j-1}) + \mu'$
       c) Let $u = E_{med}(b)$
       d) Let $u^j = H_{2k}[u]$
          Remark: $\|u^j - (x' - \hat{x}^{j-1})\|_1 \le \|x' - \hat{x}^{j-1}\|_1/4 + C\eta'$
       e) Let $\hat{x}^j = \hat{x}^{j-1} + u^j$
          Remark: $\|x' - \hat{x}^j\|_1 \le \|x' - \hat{x}^{j-1}\|_1/4 + C\eta'$
       f) Let $\hat{x}^j = H_k[\hat{x}^j]$
          Remark: $\|x' - \hat{x}^j\|_1 \le \|x' - \hat{x}^{j-1}\|_1/2 + 2C\eta'$

    Fig. 1. The Sparse Matching Pursuit algorithm: pseudocode and remarks on the analysis.

The remarks rely on the following trick, borrowed from [NT08]: we can decompose the input signal $x$ into the "head" $x'$ (containing the $k$ most significant components of $x$) and the "tail" $x - x'$. Then, we can interpret the "sketch of the tail" term $A(x - x')$ as measurement noise. That is, we can assume that the sketch $b$ is equal to $Ax' + \mu'$, where $\mu' = A(x - x')$ and $x'$ is $k$-sparse. Note that the RIP(1) property of $A$ implies that $\|A(x - x')\|_1 \le d\|x - x'\|_1 = d \, Err^k_1$. We define $\eta' = \|\mu'\|_1/d \le Err^k_1$.

From the remarks in the algorithm description we conclude that for any $j = 1, 2, \ldots, T$, we have

    $\|\hat{x}^j - x'\|_1 \le \|x'\|_1/2^j + O(\eta')$

Thus, setting the number of iterations to $T = \log(\|x'\|_1/\eta')$ guarantees that

    $\|\hat{x}^T - x'\|_1 = O(\eta') = O(Err^k_1)$

The following theorem summarizes the discussion.

Theorem 10: There exists an $m \times n$ (expander) matrix $A$, $m = O(k \log(n/k))$, such that for any signal $x$, given $Ax$, we can recover $\hat{x}$ such that

    $\|x - \hat{x}\|_1 \le c \, Err^k_1$

for an absolute constant $c > 0$. The column sparsity of $A$ is $O(\log n)$, and the recovery algorithm (SMP) has $O(n \log(n/k) \cdot T)$ running time, for $T$ defined as above.

Footnotes:
(12) For both algorithms we used randomly generated 0-1 matrices with column sparsity equal to 20.
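For concreteness, here is an illustrative Python transcription of the pseudocode in Figure 1 (our code, not the authors'). We read $E_{med}(b)_i$ as the median of the sketch entries $b_j$ over the neighbors $j \in \Gamma(i)$, which is our reading of the median estimator referenced above; the 0-1 matrix is represented by its neighbor lists, as in the earlier sketches.

import numpy as np

def H(y, l):
    """Thresholding operator H_l: zero out all but the l largest-magnitude entries."""
    out = np.zeros_like(y)
    keep = np.argsort(np.abs(y))[-l:]
    out[keep] = y[keep]
    return out

def smp(cols, b, k, T):
    """Sparse Matching Pursuit. cols[i] = Gamma(i), the rows with a 1 in column i."""
    n = len(cols)
    x_hat = np.zeros(n)
    for _ in range(T):
        # Step b): residual sketch b - A x_hat, using only non-zero coordinates.
        resid = b.copy()
        for i in np.flatnonzero(x_hat):
            resid[cols[i]] -= x_hat[i]
        # Step c): u = E_med(resid), the median over each coordinate's neighbors.
        u = np.array([np.median(resid[cols[i]]) for i in range(n)])
        # Steps d)-f): keep 2k largest terms of u, update, re-sparsify to k terms.
        x_hat = H(x_hat + H(u, 2 * k), k)
    return x_hat

Each iteration touches every edge of $G$ a constant number of times, i.e., $O(nd) = O(n \log(n/k))$ time, matching the $O(n \log(n/k) \cdot T)$ bound of Theorem 10.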
3) Connections to message-passing algorithms: The SMP algorithm described above, as well as the aforementioned algorithms from [XH07], [Ind08], [JXHC08], can be interpreted in a general framework of message-passing algorithms. Such algorithms structure their operations based on the bipartite graph $G$ underlying the matrix $A$. Specifically, each node of the graph can be viewed as a separate [...]

REFERENCES

[Ind08] P. Indyk. Explicit constructions for compressed sensing of sparse signals. SODA, 2008.
[IR08] P. Indyk and M. Ruzic. Near-optimal sparse recovery in the l1 norm. FOCS, 2008.
[JXHC08] S. Jafarpour, W. Xu, B. Hassibi, and A. R. Calderbank. Efficient and robust compressed sensing using high-quality expander graphs. Manuscript, 2008.
[KBG+10] R. Kainkaryam, A. Bruex, A. Gilbert, P. Woolf, and J. Schiefelbein. poolMC: Smart pooling of mRNA samples in microarray experiments. Manuscript, 2010.
[KDXH08] M. A. Khajehnejad, A. G. Dimakis, W. Xu, and B. Hassibi. Sparse recovery of positive signals with minimal expansion. Manuscript, 2008.
[KSZC03] B. Krishnamurthy, S. Sen, Y. Zhang, and Y. Chen. Sketch-based change detection. SIGCOMM Conference on Internet Measurement, 2003.
[LKM+06] J. Laska, S. Kirolos, Y. Massoud, R. Baraniuk, A. Gilbert, M. Iwen, and M. Strauss. Random sampling for analog-to-information conversion of wideband signals. IEEE Dallas Circuits and Systems Workshop (DCAS), 2006.
[LMP+08] Y. Lu, A. Montanari, B. Prabhakar, S. Dharmapurikar, and A. Kabbani. Counter braids: A novel counter architecture for per-flow measurement. Sigmetrics, 2008.
[Mut03] S. Muthukrishnan. Data streams: Algorithms and applications (invited talk at SODA'03). Available at http://athos.rutgers.edu/muthu/stream-1-1.ps, 2003.
[Mut06] S. Muthukrishnan. Some algorithmic problems and results in compressed sensing. Allerton, 2006.
[NT08] D. Needell and J. A. Tropp. CoSaMP: Iterative signal recovery from incomplete and inaccurate samples. Appl. Comp. Harmonic Anal., 2008. To appear.
[SAZ09] N. Shental, A. Amir, and O. Zuk. Rare-allele detection using compressed se(que)nsing. arXiv:0909.0400, 2009.
[SBB06a] S. Sarvotham, D. Baron, and R. G. Baraniuk. Compressed sensing reconstruction via belief propagation. Technical Report ECE-0601, Electrical and Computer Engineering Department, Rice University, 2006.
[SBB06b] S. Sarvotham, D. Baron, and R. G. Baraniuk. Sudocodes - fast measurement and reconstruction of sparse signals. IEEE International Symposium on Information Theory, 2006.
[SBB08] S. Sarvotham, D. Baron, and R. G. Baraniuk. Bayesian compressive sensing via belief propagation. Manuscript, 2008.
[TG05] J. A. Tropp and A. C. Gilbert. Signal recovery from partial information via Orthogonal Matching Pursuit. Submitted to IEEE Trans. Inform. Theory, April 2005.
[TLD+09] J. Tropp, M. Laska, M. Duarte, J. Romberg, and R. Baraniuk. Beyond Nyquist: Efficient sampling of sparse bandlimited signals. IEEE Trans. Info. Theory, 2009.
[TLW+06] D. Takhar, J. Laska, M. B. Wakin, M. F. Duarte, D. Baron, S. Sarvotham, K. Kelly, and R. G. Baraniuk. A new compressive imaging camera architecture using optical-domain compression. In Proc. IS&T/SPIE Symposium on Electronic Imaging, 2006.
[WWR08] W. Wang, M. J. Wainwright, and K. Ramchandran. Information-theoretic limits on sparse signal recovery: Dense versus sparse measurement matrices. Manuscript, 2008.
[XH07] W. Xu and B. Hassibi. Efficient compressive sensing with deterministic guarantees using expander graphs. IEEE Information Theory Workshop, 2007.