
PEWA: Patch-based Exponentially Weighted Aggregation for image denoising

Charles Kervrann
Inria Rennes - Bretagne Atlantique, Serpico Project-Team
Campus Universitaire de Beaulieu, 35042 Rennes Cedex, France
charles.kervrann@inria.fr

Abstract

Patch-based methods have been widely used for noise reduction in recent years. In this paper, we propose a general statistical aggregation method which combines image patches denoised with several commonly-used algorithms. We show that weakly denoised versions of the input image obtained with standard methods can serve to compute an efficient patch-based aggregated estimator. In our approach, we evaluate the Stein's Unbiased Risk Estimator (SURE) of each denoised candidate image patch and use this information to compute the exponentially weighted aggregation (EWA) estimator. The aggregation method is flexible enough to combine any standard denoising algorithm and has an interpretation with Gibbs distribution. The denoising algorithm (PEWA) is based on a MCMC sampling and is able to produce results that are comparable to the current state of the art.

1 Introduction

Several methods have been proposed to solve the image denoising problem, including anisotropic diffusion [15], frequency-based methods [26], Bayesian and Markov Random Field methods [20], locally adaptive kernel-based methods [17] and sparse representation [10]. The objective is to estimate a clean image generally assumed to be corrupted with additive white Gaussian (AWG) noise. In recent years, state-of-the-art results have been considerably improved and the theoretical limits of denoising algorithms are currently discussed in the literature [4,14]. The most competitive methods are mostly patch-based methods, such as BM3D [6], LSSC [16], EPLL [28], NL-Bayes [12], inspired from the N(on)L(ocal)-means [2]. In the NL-means method, each patch is replaced by a weighted mean of the most similar patches found in the noisy input image. BM3D combines clustering of noisy patches, a DCT-based transform and a shrinkage operation to achieve the current state-of-the-art results [6]. PLOW [5], S-PLE [24] and NL-Bayes [12], falling in the same category of so-called internal methods, are able to produce very comparable results. Unlike BM3D, covariance matrices of clustered noisy patches are empirically estimated to compute a Maximum A Posteriori (MAP) or a Minimum Mean Squared Error (MMSE) estimate. The aforementioned algorithms need two iterations [6,12,18] and the performances are surprisingly very close to the state of the art on average, while the motivations and the modeling frameworks are quite different. In this paper, the proposed Patch-based Exponentially Weighted Aggregation (PEWA) algorithm, requiring no patch clustering, also achieves state-of-the-art results.

A second category of patch-based external methods (e.g. FoE [20], EPLL [28], MLP [3]) has also been investigated. The principle is to approximate the noisy patches using a set of patches of an external learned dictionary. The statistics of a noise-free training set of image patches serve as priors for denoising. EPLL computes a prior from a mixture of Gaussians trained with a database of clean image patches [28]; denoising is then performed by maximizing the so-called Expected Patch Log Likelihood (EPLL) criterion using an optimization algorithm. In this line of work, a multi-layer perceptron (MLP) procedure exploiting a training set of noisy and noise-free patches was able to achieve state-of-the-art performance [3].

Nevertheless, the training procedure is dedicated to handling a fixed noise level and the denoising method is not flexible enough, especially for real applications when the signal-to-noise ratio is not known. Recently, the similarity of patch pairs extracted from the input noisy image and from a clean patch dataset has been studied in [27]. The authors observed that more repetitions are found in the same noisy image than in a clean image patch database of natural images; also, it is not necessary to examine patches far from the current patch to find good matches. While the external methods are attractive, computation is not always feasible since a very large collection of clean patches is required to denoise all patches in the input image. Other authors have previously proposed to learn a dictionary on the noisy image [10] or to combine internal and external information (LSSC) [16].

In this paper, we focus on internal methods since they are more flexible for real applications than external methods. They are less computationally demanding and remain the most competitive. Our approach consists in estimating an image patch from "weakly" denoised image patches in the input image. We consider the general problem of combining multiple basic estimators to achieve an estimation accuracy not much worse than that of the "best" single estimator in some sense. This problem is important for practical applications because single estimators often do not perform as well as their combinations. The most important and widely studied aggregation method that achieves the optimal average risk is the Exponentially Weighted Aggregation (EWA) algorithm [13,7,19]. Salmon & Le Pennec have already interpreted the NL-means as a special case of the EWA procedure, but the results of the extended version described in [21] were similar to [2]. Our estimator combination is then achieved through a two-step procedure, where multiple estimators are first computed and are then combined in a second separate computing step. We shall see that the proposed method can be thought of as a boosting procedure [22] since the performance of the pre-computed estimators involved in the first step is rather poor, both visually and in terms of peak signal-to-noise ratio (PSNR). Our contributions are the following ones:

1. We show that "weakly" denoised versions of the input noisy image can be combined to get a boosted estimator.
2. A spatial Bayesian prior and a Gibbs energy enable the selection of good candidate patches.
3. We propose a dedicated Markov Chain Monte Carlo (MCMC) sampling procedure to compute the PEWA estimator efficiently.

The experimental results are comparable to BM3D [6] and the method is implemented efficiently since all patches can be processed independently.

2 Patch-based image representation and SURE estimation

Formally, we represent an n-dimensional image patch at location x ∈ X ⊂ R² as a vector f(x) ∈ R^n. We define the observation patch v(x) ∈ R^n as v(x) = f(x) + ε(x), where ε(x) ~ N(0, σ² I_{n×n}) represents the errors. We are interested in an estimator f̂(x) of f(x), assumed to be independent of v(x), that achieves a small L2 risk. We consider the Stein's Unbiased Risk Estimator

  R(f̂(x)) = ‖v(x) − f̂(x)‖²_n − nσ²

in the Mean Square Error sense, such that E[R(f̂(x))] = E[‖f(x) − f̂(x)‖²_n] (E denotes the mathematical expectation). SURE has already been investigated for image denoising using NL-means [23,9,22,24] and for image deconvolution in [25].
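Since the candidate estimators considered later are frozen (independent of the observation v), the risk estimator above reduces to the directly computable quantity ‖v − f̂‖² − nσ². A minimal numpy sketch, assuming σ is known (all names are illustrative, not from a reference implementation):

```python
import numpy as np

def sure_risk(v, f_hat, sigma):
    """Unbiased risk of a frozen candidate patch f_hat for the noisy patch v.

    For an estimator independent of the observation, E[||v - f_hat||^2] =
    E[||f - f_hat||^2] + n*sigma^2, so R = ||v - f_hat||^2 - n*sigma^2 is an
    unbiased estimate of the true squared error.
    """
    v = np.asarray(v, dtype=float).ravel()
    f_hat = np.asarray(f_hat, dtype=float).ravel()
    return np.sum((v - f_hat) ** 2) - v.size * sigma ** 2

# Sanity check on synthetic data: the risk averaged over many noise draws
# approaches the true squared error ||f - f_hat||^2 of the fixed candidate.
rng = np.random.default_rng(0)
sigma = 0.1
f = rng.random(49)            # clean 7x7 patch (flattened)
f_hat = f + 0.05              # a fixed, slightly biased candidate estimator
risks = [sure_risk(f + sigma * rng.standard_normal(49), f_hat, sigma)
         for _ in range(20000)]
```

On this synthetic check, np.mean(risks) is close to np.sum((f - f_hat)**2), illustrating the unbiasedness used throughout the paper.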

3 Aggregation by exponential weights

Assume a family {f_θ(x); θ ∈ Θ} of functions such that the mapping θ → f_θ(x) is measurable, with Θ = {1, ..., M}. The functions f_θ(x) can be viewed as pre-computed estimators of f(x) or "weak" denoisers, independent of the observations v(x), and considered as frozen in the following. The set of M estimators is assumed to be very large, that is, composed of several hundreds of thousands of candidates. In this paper, we consider aggregates that are weighted averages of the functions in the set {f_θ(x); θ ∈ Θ} with some data-dependent weights:

  f̂(x) = Σ_{θ=1..M} w_θ(x) f_θ(x),  such that w_θ(x) ≥ 0 and Σ_{θ=1..M} w_θ(x) = 1.   (1)

As suggested in [19], we can associate two probability measures w(x) = {w_1(x), ..., w_M(x)} and π(x) = {π_1(x), ..., π_M(x)} on {1, ..., M}, and we define the Kullback-Leibler divergence as:

  D_KL(w(x), π(x)) = Σ_{θ=1..M} w_θ(x) log( w_θ(x) / π_θ(x) ).   (2)

The exponential weights are obtained as the solution of the following optimization problem:

  ŵ(x) = argmin_{w(x) ∈ R^M} { Σ_{θ=1..M} w_θ(x) φ(R(f_θ(x))) + β D_KL(w(x), π(x)) }  subject to (1),   (3)

where β > 0 and φ(z) is a function of the form φ(z) = |z|. From the Karush-Kuhn-Tucker conditions, the unique closed-form solution is

  w_θ(x) = exp(−φ(R(f_θ(x)))/β) π_θ(x) / Σ_{θ'=1..M} exp(−φ(R(f_θ'(x)))/β) π_θ'(x),   (4)

where β can be interpreted as a "temperature" parameter. This estimator satisfies oracle inequalities of the following form [7]:

  E[R(f̂(x))] ≤ min_{w(x) ∈ R^M} { Σ_{θ=1..M} w_θ(x) φ(R(f_θ(x))) + β D_KL(w(x), π(x)) }.   (5)

The role of the distribution π is to put a prior weight on the functions in the set. When there is no preference, the uniform prior is a common choice, but other choices are possible (see [7]). In the proposed approach, we define the set of estimators as the set of patches taken in denoised versions of the input image v. The next question is to develop a method to efficiently compute the sum in (1) since the collection can be very large. For a typical image of N = 512×512 pixels, we could potentially consider M = L·N pre-computed estimators if we apply L denoisers to the input image v.

4 PEWA: Patch-based EWA estimator

Suppose that we are given a large collection of M competing estimators. These basic estimators can be chosen arbitrarily among the researcher's favorite denoising algorithms: Gaussian, Bilateral, Wiener, Discrete Cosine Transform or other transform-based filterings. Let us emphasize here that the number of basic estimators M is not expected to grow and is typically very large (M is chosen on the order of several hundreds of thousands). In addition, the essential idea is that these basic estimators only slightly improve the PSNR values, by a few dBs. Let us consider u_ℓ, ℓ = 1, ..., L, denoised versions of v. A given pre-computed patch estimator f_θ(x) is then an n-dimensional patch taken in the denoised image u_ℓ at any location y ∈ X, in the spirit of the NL-means algorithm, which considers only the noisy input patches for denoising. The proposed estimator is more general since a set of denoised patches at a given location is used.
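The closed-form weights (4) are, in effect, a prior-weighted softmax of −φ(R(f_θ(x)))/β. A minimal numpy sketch (function and variable names are ours, not from a reference implementation):

```python
import numpy as np

def ewa_weights(risks, prior, beta):
    """Exponential weights of Eq. (4): a prior-weighted softmax of -phi(R)/beta
    with phi(z) = |z|. `risks` holds R(f_theta(x)) for each candidate, `prior`
    the pi_theta(x), and beta the temperature (beta = 4*sigma^2 in the paper).
    """
    risks = np.asarray(risks, dtype=float)
    prior = np.asarray(prior, dtype=float)
    logits = -np.abs(risks) / beta + np.log(prior)
    logits -= logits.max()              # shift by the max for numerical stability
    w = np.exp(logits)
    return w / w.sum()

# Candidates with a lower estimated risk receive a larger weight.
w = ewa_weights(risks=[0.9, 0.1, 2.0], prior=[1/3, 1/3, 1/3], beta=0.5)
# Eq. (1): the aggregate is the weighted average of the candidate patches.
aggregate = w @ np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
```

In this toy run the second candidate (risk 0.1) dominates the weights, so the aggregate is pulled toward its patch values.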

Our estimator is then of the following form if we choose φ(z) = |z|:

  f̂(x) = (1/Z(x)) Σ_{ℓ=1..L} Σ_{y ∈ X} e^{−|R(u_ℓ(y))|/β} π_ℓ(y) u_ℓ(y),   Z(x) = Σ_{ℓ'=1..L} Σ_{y' ∈ X} e^{−|R(u_ℓ'(y'))|/β} π_ℓ'(y'),   (6)

where Z(x) is a normalization constant. Instead of considering a uniform prior over the set of denoised patches taken in the whole image, it is appropriate to encourage patches located in the neighborhood of x [27]. This can be achieved by introducing a spatial Gaussian prior G_λ(z) ∝ e^{−z²/(2λ²)} in the definition as

  f̂_PEWA(x) = (1/Z(x)) Σ_{ℓ=1..L} Σ_{y ∈ X} e^{−|R(u_ℓ(y))|/β} G_λ(x − y) u_ℓ(y).   (7)

The Gaussian prior has a significant impact on the performance of the EWA estimator. Moreover, the practical performance of the estimator strongly relies on an appropriate choice of β. This important question has been thoroughly discussed in [13], and β = 4σ² is motivated by the authors. Finally, our patch-based EWA (PEWA) estimator can be written in terms of energies and Gibbs distributions as:

  f̂_PEWA(x) = (1/Z(x)) Σ_{ℓ=1..L} Σ_{y ∈ X} e^{−E(u_ℓ(y))} u_ℓ(y),   Z(x) = Σ_{ℓ'=1..L} Σ_{y' ∈ X} e^{−E(u_ℓ'(y'))},   (8)

  E(u_ℓ(y)) = | ‖v(x) − u_ℓ(y)‖²_n − nσ² | / (4σ²) + ‖x − y‖²_2 / (2λ²).

The sums in (8) cannot be computed directly, especially when we consider a large collection of estimators. In that sense, our method differs from the NL-means methods [2,11,23,9], which exploit patches generally taken in a neighborhood of fixed size. Instead, we propose a Monte Carlo sampling method to approximately compute such an EWA when the number of aggregated estimators is large [1,19].

4.1 Monte Carlo simulations for computation

Because of the high dimensionality of the problem, we need efficient computational algorithms, and we therefore suggest a stochastic approach to compute the PEWA estimator. Let us consider a random process (F_n(x))_{n≥0} starting from the initial noisy patch F_0(x) = v(x). The proposed Monte Carlo procedure recommended to compute the estimator is based on the following Metropolis-Hastings algorithm:

- Draw a patch by considering a two-stage drawing procedure:
  - draw uniformly a value ℓ in the set {1, 2, ..., L};
  - draw a pixel y = y_c + δ, y ∈ X, with δ ~ N(0, λ² I_{2×2}), where y_c is the position of the current patch (at the initialization, y_c = x).
- Define F_{n+1}(x) as:

  F_{n+1}(x) = u_ℓ(y) if α ≤ e^{−ΔE(u_ℓ(y), F_n(x))}, and F_{n+1}(x) = F_n(x) otherwise,   (9)

where α is a random variable, α ~ U[0,1], and ΔE(u_ℓ(y), F_n(x)) := E(u_ℓ(y)) − E(F_n(x)). If we assume the Markov chain to be ergodic, homogeneous, irreducible, reversible and stationary, for any F_0(x) we have almost surely

  lim_{T→+∞} (1/(T − T_b)) Σ_{n=T_b..T} F_n(x) = f̂_PEWA(x),   (10)

where T is the maximum number of samples of the Monte Carlo procedure. It is also recommended to introduce a burn-in phase to get a more satisfying estimator; hence, the first T_b samples are discarded in the average. The Metropolis-Hastings rule allows reversibility and then stationarity of the Markov chain. The chain is irreducible since it is possible to reach any patch in the set of possible considered patches. The convergence is ensured when T tends to infinity. In practice, T is assumed to be high enough to get a reasonable approximation of f̂_PEWA(x). In our implementation, we set T = 1000 and T_b = 250 to produce fast and satisfying results. To improve convergence speed, we can use several chains instead of only one [21]. In the Metropolis-Hastings dynamics, some patches are more frequently selected than others at a given location; the number of occurrences of a particular candidate patch can then be evaluated. In constant image areas, there is probably no preference for any one patch over any other, and a low number of candidate patches is expected along image contours and discontinuities.
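The two-stage proposal and the acceptance rule (9) can be sketched as follows. This is a simplified single-chain version with single-pixel "patches" (n = 1) and illustrative parameter names; a real implementation compares n-pixel patches and averages overlapping estimates:

```python
import numpy as np

def pewa_pixel(x, v_val, denoised, sigma, lam=7.0, T=1000, Tb=250, rng=None):
    """Single-chain Metropolis-Hastings sketch of the per-patch PEWA estimator.

    `denoised` is a list of L weakly denoised images u_l; `v_val` is the noisy
    value at x. Patches are reduced to single pixels to keep the sketch short.
    """
    rng = rng or np.random.default_rng()
    L, (H, W) = len(denoised), denoised[0].shape

    def energy(l, y):
        # Eq. (8) with n = 1: |SURE|/(4 sigma^2) + prior ||x - y||^2/(2 lam^2)
        r = (v_val - denoised[l][y]) ** 2 - sigma ** 2
        return (abs(r) / (4 * sigma ** 2)
                + ((y[0] - x[0]) ** 2 + (y[1] - x[1]) ** 2) / (2 * lam ** 2))

    yc, e_cur, cur = x, energy(0, x), denoised[0][x]
    total, count = 0.0, 0
    for n in range(T):
        l_new = int(rng.integers(L))                    # stage 1: pick a denoiser
        y_new = (int(round(yc[0] + lam * rng.standard_normal())),
                 int(round(yc[1] + lam * rng.standard_normal())))  # stage 2: jitter
        if 0 <= y_new[0] < H and 0 <= y_new[1] < W:     # reject out-of-image moves
            e_new = energy(l_new, y_new)
            if np.log(rng.uniform()) <= e_cur - e_new:  # Eq. (9): alpha <= exp(-dE)
                yc, e_cur, cur = y_new, e_new, denoised[l_new][y_new]
        if n >= Tb:                                     # discard the burn-in samples
            total, count = total + cur, count + 1
    return total / count

# Demo: constant image; one perfect and one identity ("noisy") weak denoiser.
rng = np.random.default_rng(1)
clean = np.full((16, 16), 0.5)
noisy = clean + 0.1 * rng.standard_normal((16, 16))
est = pewa_pixel((8, 8), noisy[8, 8], [clean, noisy], sigma=0.1, rng=rng)
```

The burn-in average mirrors Eq. (10); running several independent chains and averaging their outputs, as the text suggests, amounts to calling this function several times with different seeds.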

4.2 Patch overlapping and iterations

The next step is to extend the PEWA procedure to every position of the entire image. To avoid block effects at the patch boundaries, we overlap the patches. As a result, for the pixels lying in the overlapping regions, we obtain multiple EWA estimates. These competing estimates must be fused or aggregated into a single final estimate. The final aggregation can be performed by a weighted average of the multiple EWA estimates, as suggested in [21,5,22]. The simplest method of aggregating such multiple estimates is to average them using equal weights. Such uniform averaging provided the best results in our experiments and amounts to fusing n independent Markov chains.

The proposed implementation proceeds in two identical iterations. At the first iteration, the estimation is performed using several denoised versions of the noisy image. At the second iteration, the first estimator is used as an additional denoised image in the procedure to improve the estimation locally, as in [6,12]. The second iteration improves the PSNR values in the range of 0.2 to 0.5 dB, as demonstrated by the experiments presented in the next section. Note that the first iteration is able to produce very satisfying results for low and medium levels of noise. In practical imaging, we use the method described in [11] to estimate the noise variance σ² for real-world noisy images.

5 Experimental results

We evaluated the PEWA algorithm on 25 natural images showing natural, man-made, indoor and outdoor scenes (see Fig. 1). Each original image was corrupted with white Gaussian noise with zero mean and variance σ². In our experiments, the best results are obtained with n = 7×7 patches and L = 4 images u_ℓ denoised with the DCT-based transform [26]; we consider three different DCT shrinkage thresholds (1.25σ, 1.5σ and 1.75σ), which improve the PSNR by 1 to 6 dB at most, depending on σ and on the images (see Figs. 2-3). The fourth image is the noisy input image itself. We evaluated the algorithm with a larger number L of denoised images, and the quality drops by 0.1 dB to 0.3 dB, which is visually imperceptible. Increasing L also suggests considering more than 1000 samples since the space of candidate patches is larger. The prior neighborhood size corresponds to a disk of radius λ = 7 pixels, but it can be smaller.

Performances of PEWA and other methods are quantified in terms of PSNR values for several noise levels (see Tables 1-3). Table 1 reports the results obtained with PEWA on each individual image for different values of the noise standard deviation σ. Table 2 compares the average PSNR values on these 25 images obtained by PEWA (after 1 and 2 iterations) and two state-of-the-art denoising methods [6,12]. We used the implementations provided by the authors: BM3D (http://www.cs.tut.fi/~foi/GCF-BM3D/) and NL-Bayes (www.ipol.im). The best PSNR values are in bold, and the results are quantitatively quite comparable except for very high levels of noise. We compared PEWA to the baseline NL-means [2] and DCT [26] (using the implementation of www.ipol.im) since they form the core of PEWA. The PSNR values increase by 1.5 dB and 1.35 dB on average over NL-means and DCT respectively. Finally, we compared the results to the recent S-PLE method, which uses SURE to guide the probabilistic patch-based filtering described in [24].
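The DCT-based weak denoisers used here shrink transform coefficients at a threshold proportional to σ. A minimal numpy sketch of a hard-thresholding variant, operating on the whole image rather than on sliding blocks as in [26] (the simplification and all names are ours):

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix (n x n), rows = frequencies, cols = pixels."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0] /= np.sqrt(2)          # DC row rescaled so that C @ C.T = I
    return C

def dct_shrink(v, sigma, thr=1.5):
    """Weak denoiser sketch: hard-threshold the 2-D DCT coefficients of the
    image at thr*sigma (DC term kept). The thresholds 1.25*sigma, 1.5*sigma
    and 1.75*sigma mirror the three settings used in the experiments.
    """
    H, W = v.shape
    Ch, Cw = dct_matrix(H), dct_matrix(W)
    coef = Ch @ v @ Cw.T                  # forward 2-D DCT
    mask = np.abs(coef) >= thr * sigma    # kill small (mostly noise) coefficients
    mask[0, 0] = True                     # always keep the mean (DC) coefficient
    return Ch.T @ (coef * mask) @ Cw      # inverse 2-D DCT

# Demo: a smooth ramp image corrupted with AWG noise.
rng = np.random.default_rng(2)
clean = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
noisy = clean + 0.2 * rng.standard_normal((32, 32))
denoised = dct_shrink(noisy, sigma=0.2)
```

Because the transform is orthonormal, noise coefficients keep standard deviation σ, which is what makes a threshold of the form k·σ natural; with thr=0 the function reconstructs the input exactly.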

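The other family of pre-computed estimators evaluated below (Table 3) is a spatial Wiener filtering; its footnote defines u_ℓ(x) = mean(v(x)) + max(0, (var(v(x)) − a_ℓσ²)/var(v(x))) (v(x) − mean(v(x))). A minimal per-patch sketch of that formula (function name and demo are ours):

```python
import numpy as np

def wiener_patch(v_patch, sigma, a=0.20):
    """Spatial Wiener-type weak denoiser of the Table 3 footnote (sketch):
    u = mean(v) + max(0, (var(v) - a*sigma^2)/var(v)) * (v - mean(v)),
    with a in {0.15, 0.20, 0.25} for the three pre-computed estimators.
    """
    v_patch = np.asarray(v_patch, dtype=float)
    m, var = v_patch.mean(), v_patch.var()
    if var == 0.0:
        return np.full_like(v_patch, m)   # flat patch: return its mean
    gain = max(0.0, (var - a * sigma ** 2) / var)
    return m + gain * (v_patch - m)

# Demo: a flat noisy patch is shrunk toward its mean ("weak" denoising),
# while a high-variance edge patch is left almost untouched.
rng = np.random.default_rng(3)
flat = 0.5 + 0.1 * rng.standard_normal((7, 7))
flat_out = wiener_patch(flat, sigma=0.1, a=0.25)
edge = np.concatenate([np.zeros(25), np.ones(24)]).reshape(7, 7)
edge_out = wiener_patch(edge, sigma=0.1, a=0.25)
```

Since a_ℓ < 1, only a fraction of the noise variance is removed: exactly the "weakly denoised" behavior the aggregation step relies on.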
Figures 2 and 3 show the denoising results on the noisy Valldemossa (σ=15), Man (σ=20) and Castle (σ=25) images denoised with BM3D, NL-Bayes and PEWA. The visual quality of the methods is comparable. Table 3 presents the denoising results with PEWA when the pre-computed estimators are obtained with a Wiener filtering (in the spatial domain¹) and a DCT-based transform [26]. The results of PEWA with 5×5 or 7×7 patches are also given in Table 3, for one and two iterations. Note that NL-means can be considered as a special case of the proposed method in which the original noisy patches constitute the set of "weak" estimators. The MCMC-based procedure can then be considered as an alternative to the usual implementation of NL-means to accelerate the summation. Accordingly, in Table 3 we added a fair comparison (7×7 patches) with the implementation of the NL-means algorithm (IPOL (ipol.im)), which restricts the search for similar patches to a neighborhood of 21×21 pixels. In these experiments, "PEWA basic" (1 iteration) produced better results, especially for σ ≥ 10. Finally, we compared these results with the most popular and competitive methods on the same images; the PSNR values are selected from the publications cited in the literature. LSSC and BM3D are the most performant, but PEWA is able to produce better results on several piecewise smooth images, while BM3D is more appropriate for textured images.

In terms of computational complexity, denoising a 512×512 grayscale image with an unoptimized implementation of our method in C++ takes about 2 minutes (Intel Core i7 64-bit CPU, 2.4 GHz). Recently, PEWA has been implemented in parallel, since every patch can be processed independently, and the computation times become a few seconds.

¹ u_ℓ(x) = mean(v(x)) + max(0, (var(v(x)) − a_ℓ σ²) / var(v(x))) (v(x) − mean(v(x))), where ℓ ∈ {1, 2, 3} and a_1 = 0.15, a_2 = 0.20, a_3 = 0.25.

Figure 1: Set of 25 tested images: cameraman (256×256), peppers (256×256), house (256×256), Lena (512×512), barbara (512×512), boat, man, couple, hill (all 512×512), alley (192×128), computer, dice, flowers, girl, traffic (all 704×469), trees (192×128), valldemossa (769×338), maya, asia, castle, young man (all 313×473), aircraft, panther, tiger, man on wall picture (all 473×313). Top left: images from the BM3D website (cs.tut.fi/~foi/GCF-BM3D/); Bottom left: images from IPOL (ipol.im); Right: images from the Berkeley segmentation database (eecs.berkeley.edu/Research/Projects/CS/vision/bsds/).

6 Conclusion

We presented a new general two-step denoising algorithm based on non-local image statistics and patch repetition that combines ideas from the popular NL-means [2] and BM3D [6] algorithms with theoretical results from the statistical literature on Exponentially Weighted Aggregation [7,21]. The first step of PEWA involves the computation of denoised images obtained with a separate collection of multiple denoisers (Wiener, DCT, ...) applied to the input image. In the second step, the set of denoised image patches is selectively exploited to compute an aggregated estimator. We showed that the estimator can be computed in reasonable time using a Markov Chain Monte Carlo (MCMC) sampling procedure.

If we consider the DCT-based transform [26] in the first step, the results are comparable on average to the state-of-the-art results. The PEWA method generalizes the NL-means algorithm in some sense, but also shares common features with BM3D (e.g. DCT transform, two-stage collaborative filtering); unlike NL-Bayes and BM3D, it requires no clustering of patches. For future work, wavelet-based transforms, multiple image patch sizes, robust statistics and sparse priors will be investigated to improve the results of the flexible PEWA method.

Figure 2: Comparison of algorithms. Valldemossa image corrupted with white Gaussian noise (σ=15): noisy (PSNR=24.61), PEWA (PSNR=29.25), BM3D [6] (PSNR=29.19), NL-Bayes [12] (PSNR=29.22). The PSNR values of the three images denoised with the DCT-based transform [26] and combined with PEWA are 27.78, 27.04 and 26.26.

Figure 3: Comparison of algorithms. First row: Castle image corrupted with white Gaussian noise (σ=25): noisy (PSNR=20.18), PEWA (PSNR=29.49), BM3D [6] (PSNR=29.36), NL-Bayes [12] (PSNR=29.48); the PSNR values of the three images denoised with the DCT-based transform [26] and combined with PEWA are 25.77, 24.26 and 22.85. Second row: Man image corrupted with white Gaussian noise (σ=20): noisy (PSNR=22.11), PEWA (PSNR=30.50), BM3D [6] (PSNR=30.59), NL-Bayes [12] (PSNR=30.60); the PSNR values of the three images denoised with the DCT-based transform [26] and combined with PEWA are 27.42, 26.00 and 24.67.

Image         σ=5    σ=10   σ=15   σ=20   σ=25   σ=50   σ=100
Cameraman     38.20  34.23  31.98  30.60  29.48  26.25  22.81
Peppers       38.00  34.68  32.75  31.40  30.30  26.69  22.84
House         39.56  36.40  34.86  33.72  32.77  29.29  25.35
Lena          38.57  35.78  34.12  32.90  31.89  28.83  25.65
Barbara       38.09  34.73  32.86  31.43  30.28  26.58  22.95
Boat          37.12  33.75  31.94  30.64  29.65  26.64  23.63
Man           37.68  33.93  31.93  30.50  29.50  26.67  24.15
Couple        37.35  33.91  31.98  30.57  29.48  26.02  23.27
Hill          37.01  33.52  31.69  30.50  29.56  26.92  24.49
Alley         36.29  32.20  29.98  28.54  27.46  24.13  21.37
Computer      39.04  35.13  32.81  31.23  30.01  26.38  23.27
Dice          46.82  43.87  42.05  40.58  39.36  35.33  30.82
Flowers       43.48  39.67  37.47  35.90  34.55  30.81  27.53
Girl          43.95  41.22  39.52  38.27  37.33  34.14  30.50
Traffic       37.85  33.54  31.13  29.58  28.48  25.50  22.90
Trees         34.88  29.93  27.49  25.86  24.69  21.78  20.03
Valldemossa   36.65  31.79  29.25  27.59  26.37  23.18  20.71
Aircraft      37.59  34.62  33.00  31.75  30.72  27.68  24.99
Asia          38.67  34.46  32.25  30.73  29.60  26.63  24.32
Castle        38.06  34.13  32.02  30.56  29.49  26.15  23.09
ManPicture    37.78  33.58  31.27  29.73  28.44  24.65  21.50
Maya          34.72  29.64  27.17  25.42  24.28  22.85  18.17
Panther       38.53  33.91  31.56  30.02  28.83  25.59  22.75
Tiger         36.92  32.85  30.63  29.13  27.99  24.63  21.90
Youngman      40.79  37.36  35.58  34.30  33.25  29.59  25.20
Average       38.54  34.75  32.67  31.26  30.15  26.95  23.76

Table 1: Denoising results on the 25 tested images for several values of σ. The PSNR values are averaged over 3 experiments corresponding to 3 different noise realizations.

Method          σ=5    σ=10   σ=15   σ=20   σ=25   σ=50   σ=100
PEWA1           38.27  34.39  32.26  30.76  29.62  26.00  22.35
PEWA2           38.54  34.75  32.67  31.26  30.15  26.95  23.76
BM3D [6]        38.64  34.78  32.68  31.25  30.19  26.97  24.08
NL-Bayes [12]   38.60  34.75  32.48  31.22  30.12  26.90  23.65
S-PLE [24]      38.17  34.38  32.35  30.67  29.77  26.46  23.21
NL-means [2]    37.44  33.35  31.00  30.16  28.96  25.53  22.29
DCT [26]        37.81  33.57  31.87  29.95  28.97  25.91  23.08

Table 2: Average denoising results over the 25 tested images for several values of σ. The experiments with NL-Bayes [12], S-PLE [24], NL-means [2] and DCT [26] have been performed using the implementation of IPOL (ipol.im). The best PSNR values are in bold.

Method              Peppers (256×256)        House (256×256)          Lena (512×512)           Barbara (512×512)
                    σ=5/15/25/50             σ=5/15/25/50             σ=5/15/25/50             σ=5/15/25/50
PEWA1 (W) (5×5)     36.69/30.58/27.50/22.85  37.89/31.88/28.55/23.49  37.27/31.43/28.30/23.45  36.39/30.18/29.31/22.71
PEWA2 (W) (5×5)     37.45/32.20/29.72/26.09  38.98/34.27/32.13/28.35  38.05/33.40/31.11/27.80  37.13/31.94/29.47/25.58
PEWA1 (W) (7×7)     36.72/30.60/27.60/22.82  37.90/31.90/28.59/23.52  37.26/31.45/28.33/23.45  36.40/30.18/27.32/22.71
PEWA2 (W) (7×7)     37.34/32.34/30.11/26.53  39.00/34.57/32.51/29.04  38.00/33.65/31.56/28.40  37.00/32.10/30.00/26.20
PEWA1 (D) (5×5)     37.70/32.45/29.83/26.01  39.28/34.23/31.79/27.72  38.46/33.72/31.33/27.59  37.71/32.20/29.55/25.58
PEWA2 (D) (5×5)     37.95/32.80/30.20/26.66  39.46/34.74/31.67/29.15  38.57/33.96/31.81/28.43  38.03/32.70/30.03/26.01
PEWA1 (D) (7×7)     37.71/32.43/29.87/26.00  39.27/34.26/31.79/27.71  38.45/33.72/31.25/27.62  37.70/32.30/29.84/26.20
PEWA2 (D) (7×7)     38.00/32.75/30.30/26.69  39.56/34.83/32.77/29.29  38.58/34.12/31.89/28.83  38.09/32.86/30.28/26.58
PEWA basic (7×7)    36.88/31.34/29.47/26.02  37.88/34.13/32.14/28.25  37.39/33.26/31.20/27.92  36.80/31.89/29.76/25.83
NL-means [2] (7×7)  36.77/30.93/28.76/24.24  37.75/32.36/31.11/27.54  36.65/32.00/30.45/27.32  36.79/30.65/28.99/25.63
BM3D [6]            38.12/32.70/30.16/26.68  39.83/34.94/32.86/29.69  38.72/34.27/32.08/29.05  38.31/33.11/30.72/27.23
NL-Bayes [12]       38.09/32.26/29.79/26.10  39.39/33.77/31.36/27.62  38.75/33.51/31.16/27.62  38.38/32.47/30.02/26.45
ND-SAFIR [11]       37.34/32.13/29.73/25.29  37.62/34.08/32.22/28.67  37.91/33.70/31.73/28.38  37.12/31.80/29.24/24.09
K-SVD [10]          37.80/32.23/29.81/26.24  39.33/34.19/31.97/28.01  38.63/33.76/31.35/27.85  38.08/32.33/29.54/25.43
LSSC [16]           38.18/32.82/30.21/26.62  39.93/35.35/33.15/30.04  38.69/34.15/31.87/28.87  38.48/33.00/30.47/27.06
PLOW [5]            37.69/31.82/29.53/26.32  39.52/34.72/32.70/29.08  38.66/33.90/31.92/28.32  37.98/21.17/30.20/26.29
SOP [18]            37.63/32.40/30.01/26.75  38.76/34.35/32.54/29.64  38.31/33.84/31.80/28.96  37.74/32.65/30.37/27.35

Table 3: Comparison of several versions of PEWA (W: Wiener, D: DCT, Basic) and competitive methods on a few standard images corrupted with white Gaussian noise; for each image, the four PSNR values correspond to σ = 5, 15, 25 and 50. The best PSNR values are in bold (PSNR values from the publications cited in the literature).

References

[1] Alquier, P. & Lounici, K. (2011) PAC-Bayesian bounds for sparse regression estimation with exponential weights. Electronic Journal of Statistics, 5:127-145.
[2] Buades, A., Coll, B. & Morel, J.-M. (2005) A review of image denoising algorithms, with a new one. SIAM J. Multiscale Modeling & Simulation, 4(2):490-530.
[3] Burger, H., Schuler, C. & Harmeling, S. (2012) Image denoising: can plain neural networks compete with BM3D? In IEEE Conf. Comp. Vis. Patt. Recogn. (CVPR'12), pp. 2392-2399, Providence, Rhode Island.
[4] Chatterjee, P. & Milanfar, P. (2010) Is denoising dead? IEEE Transactions on Image Processing, 19(4):895-911.
[5] Chatterjee, P. & Milanfar, P. (2012) Patch-based near-optimal image denoising. IEEE Transactions on Image Processing, 21(4):1635-1649.
[6] Dabov, K., Foi, A., Katkovnik, V. & Egiazarian, K. (2007) Image denoising by sparse 3D transform-domain collaborative filtering. IEEE Transactions on Image Processing, 16(8):2080-2095.

[7] Dalalyan, A.S. & Tsybakov, A.B. (2008) Aggregation by exponential weighting, sharp PAC-Bayesian bounds and sparsity. Machine Learning, 72:39-61.
[8] Dalalyan, A.S. & Tsybakov, A.B. (2009) Sparse regression learning by aggregation and Langevin Monte-Carlo. Available at arXiv:0903.1223.
[9] Deledalle, C.-A., Duval, V. & Salmon, J. (2012) Non-local methods with shape-adaptive patches (NLM-SAP). J. Mathematical Imaging and Vision, 43:103-120.
[10] Elad, M. & Aharon, M. (2006) Image denoising via sparse and redundant representations over learned dictionaries. IEEE Transactions on Image Processing, 15(12):3736-3745.
[11] Kervrann, C. & Boulanger, J. (2006) Optimal spatial adaptation for patch-based image denoising. IEEE Transactions on Image Processing, 15(10):2866-2878.
[12] Lebrun, M., Buades, A. & Morel, J.-M. (2013) Implementation of the "Non-Local Bayes" (NL-Bayes) image denoising algorithm. Image Processing On Line, 3:1-42. http://dx.doi.org/10.5201/ipol.2013.16
[13] Leung, G. & Barron, A.R. (2006) Information theory and mixing least-squares regressions. IEEE Transactions on Information Theory, 52:3396-3410.
[14] Levin, A., Nadler, B., Durand, F. & Freeman, W.T. (2012) Patch complexity, finite pixel correlations and optimal denoising. In Europ. Conf. Comp. Vis. (ECCV'12), pp. 73-86, Firenze, Italy.
[15] Louchet, C. & Moisan, L. (2011) Total Variation as a local filter. SIAM J. Imaging Sciences, 4(2):651-694.
[16] Mairal, J., Bach, F., Ponce, J., Sapiro, G. & Zisserman, A. (2009) Non-local sparse models for image restoration. In IEEE Int. Conf. Comp. Vis. (ICCV'09), pp. 2272-2279, Tokyo, Japan.
[17] Milanfar, P. (2013) A tour of modern image filtering. IEEE Signal Processing Magazine, 30(1):106-128.
[18] Ram, I., Elad, M. & Cohen, I. (2013) Image processing using smooth ordering of its patches. IEEE Transactions on Image Processing, 22(7):2764-2774.
[19] Rigollet, P. & Tsybakov, A.B. (2012) Sparse estimation by exponential weighting. Statistical Science, 27(4):558-575.
[20] Roth, S. & Black, M.J. (2005) Fields of experts: A framework for learning image priors. In IEEE Conf. Comp. Vis. Patt. Recogn. (CVPR'05), vol. 2, pp. 860-867, San Diego, CA.
[21] Salmon, J. & Le Pennec, E. (2009) NL-Means and aggregation procedures. In IEEE Int. Conf. Image Process. (ICIP'09), pp. 2977-2980, Cairo, Egypt.
[22] Talebi, H., Zhu, X. & Milanfar, P. (2013) How to SAIF-ly boost denoising performance. IEEE Transactions on Image Processing, 22(4):1470-1485.
[23] Van De Ville, D. & Kocher, M. (2009) SURE-based non-local means. IEEE Signal Processing Letters, 16(11):973-976.
[24] Wang, Y.-Q. & Morel, J.-M. (2013) SURE guided Gaussian mixture image denoising. SIAM J. Imaging Sciences, 6(2):999-1034.
[25] Xue, F., Luisier, F. & Blu, T. (2013) Multi-Wiener SURE-LET deconvolution. IEEE Transactions on Image Processing, 22(5):1954-1968.
[26] Yu, G. & Sapiro, G. (2011) DCT image denoising: a simple and effective image denoising algorithm. Image Processing On Line (http://dx.doi.org/10.5201/ipol.2011.ys-dct).
[27] Zontak, M. & Irani, M. (2011) Internal statistics of a single natural image. In IEEE Comp. Vis. Patt. Recogn. (CVPR'11), pp. 977-984, Colorado Springs, CO.
[28] Zoran, D. & Weiss, Y. (2011) From learning models of natural image patches to whole image restoration. In IEEE Int. Conf. Comp. Vis. (ICCV'11), pp. 479-486, Barcelona, Spain.
t.Conf.ImageProcess.(ICIP'09),pp.2977-2980,Cairo,Egypt.[22]Talebi,H.,Xhu,X.&Milanfar,P.(2013)HowtoSAIF-lyboostdenoisingperformance.InIEEETrans-actionsonImageProcessing22(4):1470-1485.[23]VanDeVille,D.&Kocher,M.(2009)SUREbasednon-localmeans.IEEESignalProcessingLetters,16(11):973-976,2009.[24]Wang,Y.-Q.&Morel,J.-M.(2013).SUREguidedGaussianmixtureimagedenoising.SIAMJ.ImagingSciences,6(2):999-1034.[25]Xue,F.,Luisier,F.&BluT.(2013)Multi-wienerSURE-LETdeconvolution.IEEETransactionsonImageProcessing,22(5):1954-1968.[26]YuG.&SapiroG.(2011).DCTimagedenoising:asimpleandeffectiveimagedenoisingalgorithm.ImageProcessingOnLine(http://dx.doi.org/10.5201/ipol.2011.ys-dct).[27]Zontak,M.&Irani,M.(2011)Internalstatisticsofasinglenaturalimage.InIEEEComp.Vis.Patt.Recogn.(CVPR'11),pp.977-984,ColoradoSprings,CO.[28]Zoran,D.&Weiss,Y.(2011)Fromlearningmodelsofnaturalimagepatchestowholeimagerestoration.InIEEEInt.Conf.Comp.Vis.(ICCV'11),pp.479-486,Barcelona,Spain