


Confused, Timid, and Unstable: Picking a Video Streaming Rate is Hard

Te-Yuan Huang, Nikhil Handigol, Brandon Heller, Nick McKeown, Ramesh Johari
Stanford University
{huangty, nikhilh, brandonh, nickm, ramesh.johari}@stanford.edu

ABSTRACT

Today's commercial video streaming services use dynamic rate selection to provide a high-quality user experience. Most services host content on standard HTTP servers in CDNs, so rate selection must occur at the client. We measure three popular video streaming services (Hulu, Netflix, and Vudu) and find that accurate client-side bandwidth estimation above the HTTP layer is hard. As a result, rate selection based on inaccurate estimates can trigger a feedback loop, leading to undesirably variable and low-quality video. We call this phenomenon the downward spiral effect; we measure it on all three services, present insights into its root causes, and validate initial solutions to prevent it.

Categories and Subject Descriptors

C.2.0 [Computer Systems Organization]: Computer-Communication Networks, General; C.4 [Performance of Systems]: Measurement techniques

General Terms

Measurement

Keywords

HTTP-based Video Streaming, Video Rate Adaptation

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. November 14-16, 2012, Boston, Massachusetts, USA. Copyright 2012 ACM 978-1-4503-1705-4/12/11 ...$15.00.

1. INTRODUCTION

Video streaming is a huge and growing fraction of Internet traffic, with Netflix and YouTube alone accounting for over 50% of the peak download traffic in the US [18]. Several big video streaming services run over HTTP and TCP (e.g., Hulu, Netflix, Vudu, YouTube) and stream data to the client from one or more third-party commercial CDNs (e.g., Akamai, Level3 or Limelight). Streaming over HTTP has several benefits: it is standardized across CDNs (allowing a portable video streaming service), and it is well-established, which means the CDNs have already made sure service can
reach through NATs to end-hosts, and cheap (the service is simple, commoditized, and the CDNs compete on price). These benefits have made possible the huge growth in affordable, high-quality movie and TV streaming, for our viewing delight.

Figure 1: (Service A) A video starts streaming at 1.75 Mb/s over a 5 Mb/s network. After 395 seconds, a second flow starts (from the same server). The video could stream at 1.75 Mb/s (given its fair share of 2.5 Mb/s), but instead drops down to 235 kb/s.

When video is streamed over HTTP, the video service provider relies on TCP to find the available bandwidth and choose a video rate accordingly. For example, if a client estimates that there is 1.5 Mb/s available in the network, it might request the server to stream video compressed to 1.3 Mb/s (or the highest video rate available at or below 1.5 Mb/s). The video streaming service provider must walk a tightrope: if they pick a video rate that is too high, the viewer will experience annoying rebuffering events; if they pick a streaming rate that is too low, the viewer will experience poor video quality. In both cases, the experience degrades, and users may take their viewing elsewhere [9]. It is therefore important for a video streaming service to select the highest safe video rate.

This paper describes a measurement study of three popular HTTP-based video streaming services (Hulu, Netflix, and Vudu) to see how well they pick the video rate. According to the latest Consumer Reports [21], Netflix is the most popular video streaming provider in the United States.

Figure 4(a): Service A. Network bottleneck set to 5 Mb/s.
Figure 4(b): Service B. Network bottleneck set to 5 Mb/s.
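The selection rule described in the introduction (request the highest encoded rate at or below the bandwidth estimate) can be sketched as follows. The rate ladder is Service A's (visible on the y-axis of Figure 1); the function itself is purely illustrative, not any service's actual code.

```python
# Illustrative client-side rate selection: pick the highest encoded
# rate at or below the client's bandwidth estimate. The rate ladder
# below is Service A's; the function is a hypothetical sketch.
RATES_KBPS = [235, 375, 560, 750, 1050, 1400, 1750]

def select_rate(estimated_kbps, rates=RATES_KBPS):
    """Highest available rate not exceeding the estimate, falling back
    to the lowest rate when the estimate is below all of them."""
    eligible = [r for r in rates if r <= estimated_kbps]
    return max(eligible) if eligible else min(rates)

print(select_rate(1500))  # -> 1400
```

With an accurate estimate this rule is safe; the rest of the paper shows how badly it behaves when the estimate itself is wrong.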
Figure 4(c): Service C HD. Network bottleneck set to 22 Mb/s.
Figure 4(d): Service C SD. Network bottleneck set to 5 Mb/s.
Figure 4: The downward spiral effect is visible in all three services.

The video rates available from each of the three services are summarized in Table 2; some playback rates may not be available for some videos.

3.3 The Competing Flows

The competing flow is a TCP flow doing a long file download. To eliminate any unfairness due to variations in network path properties, we ensure that the competing flow is served by the same CDN, and usually, by the same server. For Service A and Service C, the competing flow is generated by an open-ended byte range request to the file with the highest rate. Further, we use the DNS cache to make sure that the competing flow comes from the same termination point (the server or the load-balancer) as the video flow. For Service B, since the files are stored as small segments, an open-ended request only creates short-lived flows. Instead, we generate the competing flow by requesting the Flash version of the same video stored in the same CDN, using rtmpdump [20] over TCP.

4. THE DOWNWARD SPIRAL EFFECT

All three services suffer from what we call the "downward spiral effect": a dramatic, anomalous drop in the video playback rate in the presence of a competing TCP flow. The problem is starkly visible in Figure 4. In all four graphs, the video stream starts out alone and then competes with another TCP flow. As soon as the competing flow starts up, the client mysteriously picks a video playback rate that is far below the available bandwidth. Our goal is to understand why this happens.

To gain a first inkling into what is going on, we calculate the upper bound of what the client might believe the instantaneous available bandwidth to be, by measuring the arrival rate of the last video segment. Specifically, we calculate the throughput upper bound as the size of a received video segment divided by the time it took to arrive (the time from when the first byte arrived until the last byte arrived), which excludes the initial server response time. In all of the graphs, the video playback rate chosen by the client is quite strongly correlated with the calculated throughput. As we will see, herein lies the problem: if the client is selecting the video rate based on some function of the throughput it perceived, and the throughput is so different from the actual available bandwidth, then it is not surprising the client does such a poor job. Let's now see what goes wrong for each service in turn. For ease of discussion, we will use "video throughput" to refer to the throughput a client perceived by downloading a video segment.

4.1 Service A

Figure 4(a) shows the playback rate of a Service A video session along with the client's video throughput over time. Starting out, the video stream is the only flow and the client requests the highest video rate (1750 kb/s). The competing flow begins after 400 seconds; the video rate steadily drops until it reaches the lowest rate (235 kb/s), and it stays there most of the time until the competing flow stops. In theory, both flows should be able to stream at 2.5 Mb/s (their fair share of the link) and the client should continue to stream at 1750 kb/s.

We repeated the experiment 76 times over four days. In 69 cases (91%) the downward spiral happens, and the client picks either the lowest rate, or bounces between the two or three lowest rates. In just seven cases (9%) was the client able to maintain a playback rate above 1400 kb/s. To ensure accuracy and eliminate problems introduced by competing flows with different characteristics (e.g., TCP flows with different RTTs), we make the competing flow request the same video file (encoded at 1750 kb/s) from the same CDN. Unlike the video flow, the competing flow is just a simple TCP file download; its download speed is dictated only by the TCP congestion control algorithm and is not capped by the video client.

Why does the throughput of the video flow drop so far below the available fair-share bandwidth? Is it an inherent characteristic of streaming video over HTTP, or is the client simply picking the wrong video rate? We first confirm that the available bandwidth really is available for streaming video. We do this using a feature provided by the Service A client that allows users to manually select a video rate and disable the client's automatic rate selection algorithm. We repeat the above experiment, but with a slight modification: as soon as the client picks a lower rate, we manually force the video to play at 1750 kb/s. Figure 5 shows the results. Interestingly, the client maintains a playback rate of 1750 kb/s without causing rebuffering events, and the throughput also increases. This suggests that the downward spiral effect is caused by underestimation of the available bandwidth in the client's rate selection algorithm. The bandwidth is available, but the client needs to go grab it.

4.2 Service B

Figure 4(b) shows the same downward spiral effect in Service B. As before, the bottleneck bandwidth is 5 Mb/s and the RTT is around 20 ms. We start a video streaming session first, allow it to settle at its highest rate (3200 kb/s), and then start a competing flow after 337 seconds by reading the same video file from the same server. To eliminate variation caused by congestion at the server, we verified that the same problem occurs if we download the competing video file from a different server at the same CDN.
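The throughput upper bound defined in Section 4 (segment size divided by first-byte-to-last-byte time) amounts to the following small sketch; the function name and units are our own.

```python
def segment_throughput_kbps(segment_bytes, t_first_byte_s, t_last_byte_s):
    """Per-segment throughput upper bound (Section 4): received bytes
    divided by the time from first byte to last byte, which excludes
    the initial server response time. Returns kb/s."""
    duration = t_last_byte_s - t_first_byte_s
    if duration <= 0:
        raise ValueError("last byte must arrive after first byte")
    return segment_bytes * 8 / 1000 / duration

# e.g. a 4-second segment encoded at 1750 kb/s is ~875 kB; if it
# arrives in 2 s, the client's upper-bound estimate is 3500 kb/s.
```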
Figure 5: (Service A) The client manages to maintain the highest playback rate if we disable automatic rate selection.

The client should drop the video rate to 2500 kb/s (its fair share of the available bandwidth). Instead, it steps all the way down to the lowest rate offered by Service B, 650 kb/s, and occasionally to 1000 kb/s. The throughput plummets too.

4.3 Service C

We observe the downward spiral effect in Service C as well. Since Service C does not automatically switch between its HD and SD bitrates, we do two separate experiments. In the HD experiment, shown in Figure 4(c), we set the bottleneck bandwidth to 22 Mb/s. To start with, the client picks the highest HD video rate (9 Mb/s). When the client's playback buffer is full, the video flow is limited by the receive window, and the throughput converges to the same value as the playback rate. We start the competing flow at 100 seconds, and it downloads the same video file (9 Mb/s video rate) from the same CDN. Each flow has 11 Mb/s available to it, plenty for the client to continue playing at 9 Mb/s. But instead, the client resets the connection and switches to 4.5 Mb/s and then 3 Mb/s, before bouncing around several rates.

SD is similar. We set the bottleneck bandwidth to 5 Mb/s, and the client correctly picks the highest rate (2000 kb/s) to start with, as shown in Figure 4(d). When we start the competing flow, the video client drops down to 1000 kb/s even though its share is 2.5 Mb/s. Since Service C only offers three SD rates, we focus on its HD service in the rest of the paper.

5. WALKING THE DOWNWARD SPIRAL

To understand how the downward spiral happens, we examine each service in turn. Although each service enters the downward spiral for a slightly different reason, there is enough commonality for us to focus first on Service A (and Figure 4(a)) and then describe how the other two services differ.

Figure 6: (Service A) Before and after the playback buffer fills at 185 seconds. (a) TCP throughput before and after the buffer fills. (b) Request interval before and after the buffer fills.

Figure 7: (Service A) Throughput at the HTTP layer with and without a competing flow. (a) Service A with no competing flow. (b) Service A with one competing flow.

5.1 Initial Condition: No Competing Flow

In the absence of a competing flow (first 400 seconds), the Service A client correctly chooses the highest playback rate. Because the available network bandwidth (5 Mb/s) is much higher than the playback rate (1750 kb/s), the client busily fills up its playback buffer and the bottleneck link is kept fully occupied. Eventually the playback buffer fills (after 185 seconds) and the client pauses to let it drain a little before issuing new requests. Figure 6(a) shows how the TCP throughput varies before and after the playback buffer fills up. After the buffer is full, the client enters a periodic ON-OFF sequence. As we will see shortly, the ON-OFF sequence is a part of the problem (but only one part). Before the buffer fills, the client requests a new 4-second segment of video every 1.5 seconds on average (because it is filling the buffer). Figure 6(b) confirms that after the buffer is full, the client requests a new 4-second segment every 4 seconds, on average.

The problem is that during the 4-second OFF period, the TCP congestion window (cwnd) times out (due to inactivity longer than 200 ms) and resets cwnd to its initial value of 10 packets [5, 6]. Even though the client is using an existing persistent TCP connection, the cwnd needs to ramp up from slow start for each new segment download.

It is natural to ask if the repeated dropping back to slow start reduces the client's video throughput, causing it to switch to a lower rate. With no competing flow, it appears the answer is "no". We verify this by measuring the video throughput for many requests. We set the bottleneck link rate to 2.5 Mb/s, use traces collected from actual sessions to replay the requests over a persistent connection to the same server, and pause the requests at the same intervals as the pauses in the trace. Figure 7(a) shows the CDF of the client's video throughput for requests corresponding to various playback rates. Except for some minor variation, the video throughput accurately reflects the available bandwidth, which explains why the client picks the correct rate.

Figure 8: (Service A) The evolution of cwnd for different segment sizes. (a) A 235 kbps segment. (b) Five contiguous 235 kbps segments concatenated into one.
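To see why a short download under-reports bandwidth, it helps to count how many bytes slow start can actually move: after each pause longer than 200 ms, cwnd restarts at roughly 10 packets [5, 6] and at best doubles each RTT. The following back-of-the-envelope sketch is loss-free and assumes a typical MSS of 1460 bytes; it is an illustration, not a TCP model.

```python
# Rough, loss-free sketch: bytes deliverable in the first few RTTs of
# slow start, with cwnd restarting at 10 packets after an idle period.
MSS = 1460  # bytes; a typical maximum segment size (assumption)

def bytes_in_slow_start(rtts, init_cwnd=10):
    """Total bytes deliverable in the first `rtts` round trips of
    slow start (cwnd doubles each RTT, no loss)."""
    cwnd, total = init_cwnd, 0
    for _ in range(rtts):
        total += cwnd * MSS
        cwnd *= 2
    return total

# A 4-second segment at 235 kb/s is ~118 kB; slow start covers that in
# about 4 RTTs, long before cwnd approaches a steady-state fair share.
```

This is why Figure 8(b)'s five-segment download measures bandwidth so much better: the transfer simply lasts long enough for cwnd to climb.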
Figure 9: (Service A) The client picks a video rate depending on the available bandwidth. The horizontal gray lines are the available rates.

5.2 The Trigger: With a Competing Flow

Things go wrong when the competing flow starts (after 400 seconds). Figure 7(b) shows that the client's video throughputs are mostly too low when there is a competing flow. If we look at the progression of cwnd for the video flow after it resumes from a pause, we can tell how the server opens up the window differently when there is a competing flow. Because we do not control the server (it belongs to the CDN), we instead use our local proxy to serve both the video traffic and the competing flow, and use the tcp_probe kernel module to log the cwnd values. (In Figure 7(b), the bottleneck bandwidth is set to 5 Mb/s so that the available fair share of bandwidth, 2.5 Mb/s, is the same as in Figure 7(a). Note that some segment downloads are able to get more than their fair share; in these cases, the competing flow has experienced losses and has not yet ramped up to its fair share. This is why some of the CDF curves do not end at 100% at 2.5 Mb/s in Figure 7(b).)

The video traffic here is generated by requesting a 235 kbps video segment. Figure 8(a) shows how cwnd evolves, starting from the initial value of 10 at 1.5 seconds, then repeatedly being beaten down by the competing wget flow. The competing wget flow has already filled the buffer during the OFF period, so the video flow sees very high packet loss. Worse still, the segment finishes before cwnd climbs up again, and we re-enter the OFF period. The process repeats for every ON-OFF period, and the throughput is held artificially low. For comparison, and to understand the problem better, Figure 8(b) shows the result of the same experiment with a segment size five times larger. With a larger segment size, the cwnd has longer to climb up from the initial value, and has a much greater likelihood of reaching the correct steady-state value.

Now that we know the video throughput tends to be low (because of TCP), we would like to better understand how the client reacts to the low throughputs. We can track the client's behavior as we steadily reduce the available bandwidth, as shown in Figure 9. We start with a bottleneck link rate of 5 Mb/s (and no competing flow), drop it to 2.5 Mb/s (to mimic a competing flow), and then keep dropping it by 100 kb/s every 3 minutes. The dashed line shows the available bandwidth, while the solid line shows the video rate picked by the client. The client chooses the video rate conservatively; when available bandwidth drops from 5 Mb/s to 2.5 Mb/s, the video rate goes down to 1400 kb/s, and so on.

We can now put the two pieces together. Consider a client streaming at a playback rate of 1750 kb/s; the median video throughput it perceives is 1787 kb/s, as shown in Figure 7(b). According to Figure 9, with a video throughput of 1787 kb/s, the client reduces its playback rate to 1050 kb/s. Thus, 50% of the time the playback rate will go down to 1050 kb/s once the competing flow starts. It is interesting to observe that the Service A client is behaving quite rationally given the throughput it perceives. The problem is that because Service A observes the throughput above TCP, it is not aware that TCP itself is having trouble.

Figure 16: (Service B) The TCP throughput changes in the presence of a competing flow. (a) Service B with no competing flow. (b) Service B with one competing flow.
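Putting the two pieces together, the conservative mapping observed in Figure 9 can be approximated as below. The roughly 40% margin and the rate ladder come from our measurements; the function itself is only an illustrative reconstruction, not Service A's code.

```python
# Illustrative reconstruction of Figure 9's conservative mapping:
# request the highest rate no greater than
# (1 - conservatism) * perceived throughput.
RATES_KBPS = [235, 375, 560, 750, 1050, 1400, 1750]

def conservative_pick(perceived_kbps, conservatism=0.4, rates=RATES_KBPS):
    """Highest rate within the conservative budget, or the lowest rate
    if even that is out of reach."""
    budget = (1 - conservatism) * perceived_kbps
    eligible = [r for r in rates if r <= budget]
    return max(eligible) if eligible else min(rates)

print(conservative_pick(1787))  # budget ~1072 kb/s -> picks 1050 kb/s
```

This reproduces both data points from the text: a perceived 1787 kb/s maps to 1050 kb/s, and a perceived 2.5 Mb/s maps to 1400 kb/s.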
Figure 14: (Service B) Almost all the OFF periods in a single video session are greater than the RTO (200 ms).

video throughput is higher; hence the Service A client picks a higher rate (1050 kb/s). For comparison, we asked 10 volunteers to rerun this experiment with Service A in their home networks, connected to different ISPs such as AT&T DSL, Comcast, Verizon and university residences. Even though there was sufficient available bandwidth for the highest video rate in the presence of a competing flow, seven people reported a rate of only 235 kb/s to 560 kb/s.

5.5 Service B

Service B also exhibits ON-OFF behavior, but at the TCP level rather than the HTTP level; i.e., the pause can happen while downloading a video segment. When its video playback buffer is full, the client stops taking data from the TCP socket buffer. Eventually, the TCP socket buffer also fills and triggers TCP flow control, which pauses the server by sending a zero window advertisement. In Figure 4(b), each zero window advertisement is marked by a hexagon.

Figure 15: (Service B) When the video stream is receiver-limited, the client does not request many bytes during an ON period.

The client starts issuing zero window advertisements at around 100 s and continues to do so until a few seconds after the competing flow starts. Figure 14 shows the CDF of the duration of the OFF periods. Almost all the pauses are longer than 200 ms, so cwnd is reset to its initial value. Thus, Service B effectively exhibits ON-OFF behavior similar to that of Service A. Worse still, during an ON period, Service B does not request many bytes; Figure 15 shows that over half of the time, it reads only 800 kbytes, which is not enough for the cwnd to climb up to its steady state before the next OFF period. Figure 4(b) and Figure 16(b) show the result: the TCP throughput is only around 1 Mbps to 1.5 Mbps, causing Service B to pick a video rate of 1000 kb/s, or even 650 kb/s. As we saw earlier, when competing with another flow, the smaller the request, the higher the likelihood of perceiving a low throughput.

Figure 20: A custom client similar to Service A (equally conservative, with a 10-sample moving-average filter) displays the downward spiral.

a reasonable baseline. For Service A, Figure 9 indicates the bandwidth below which the client picks a lower video rate. Assume that Service A estimates bandwidth by simply dividing the download size by the download time and passing it through a fixed-size moving-average filter. We can estimate the size of the filter by measuring how long it takes from when the bandwidth drops until the client picks a new rate. A number of traces from Service A suggest a filter with 10 samples, though the true algorithm is probably more nuanced.

To closely mimic the Service A client, our custom client requests the video segments with the same sizes from the same locations in the CDN: we capture the segment map given to the client after authentication, which locates the video segments for each supported playback rate. Hence, our custom client will experience the same segment-size variation over the course of the movie, and when it shifts playback rate, the segment size will change as well. Since our custom client uses tokens from an earlier playback, the CDN cannot tell the difference between our custom client and the real Service A client. To further match Service A, the playback buffer is set to 240 seconds, the client uses a single persistent connection to the server, and it pauses when the buffer is full. We first validate the client, then consider three changes: (1) being less conservative, (2) changing the filtering method, and (3) aggregating segments.

6.2 Validating our Custom Client

Figure 20 shows the custom client in action. After downloading each segment, the custom client selects the playback rate based on Service A's conservative rate selection algorithm, observed in Figure 9. Once the playback buffer is full, we introduce a competing flow. Like the real client, the playback rate drops suddenly when the competing flow starts, then fluctuates over the course of the movie. The downward spiral does not bottom out, which we suspect is due to some subtle differences between Service A's algorithm and ours.
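Our assumption above, that Service A divides download size by download time and smooths the result with a 10-sample moving average, can be sketched as follows; this is a guess at the structure, not Service A's actual code.

```python
from collections import deque

class MovingAverageEstimator:
    """Fixed-size moving-average bandwidth filter, our assumed
    approximation of Service A's smoothing (Section 6); the true
    algorithm is probably more nuanced."""

    def __init__(self, window=10):
        self.samples = deque(maxlen=window)  # oldest sample falls out

    def update(self, segment_bytes, download_seconds):
        """Record one per-segment estimate (size/time) and return the
        smoothed bandwidth in kb/s."""
        self.samples.append(segment_bytes * 8 / 1000 / download_seconds)
        return sum(self.samples) / len(self.samples)
```

A 10-sample window at one segment every 4 seconds reacts on a scale of tens of seconds, consistent with how we sized the filter from the traces.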
Figure 21: Custom client with 10% conservatism, but with a 10-sample moving-average filter.

Figure 22: Custom client with 10% conservatism, and with an 80th-percentile filter.

6.3 Less Conservative

Bandwidth estimates based on download sizes and durations tend to under-report the available bandwidth, especially in the presence of a competing flow. If the algorithm is conservative, it exacerbates the problem. We try a less conservative algorithm, with a conservatism of 10% instead of 40%. Conservatism of 40% means the client requests a video rate of at most 1.2 Mb/s when it perceives 2.0 Mb/s, while 10% means it requests at most 1.8 Mb/s when perceiving 2.0 Mb/s. According to Figure 9, Service A requests video rates with a conservatism of approximately 40%. Figure 21 shows that the video rate is higher, even though the playback buffer stays full. The result is higher quality video, high playback buffer occupancy (i.e., resilience against rebuffering) and four minutes of buffering to respond to changes in bandwidth. Note that even though the algorithm is less conservative, the underlying TCP ensures the algorithm stays a "good citizen" and only gets its fair share of the available bandwidth.
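The percentile filtering explored next (Section 6.4) swaps the mean for a high percentile of the recent samples, so one low outlier cannot drag the estimate down. A minimal nearest-rank sketch, with names of our own choosing:

```python
import math

def percentile_estimate(samples, pct=80):
    """Nearest-rank percentile of recent per-segment throughput
    samples (kb/s). A high percentile ignores a low outlier (e.g. a
    tiny segment of movie credits) that would pull a mean down."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(len(ordered) * pct / 100))
    return ordered[rank - 1]

# Ten samples with one low outlier: the mean drops to ~1595 kb/s,
# while the 80th percentile stays at 1750 kb/s.
window = [1750] * 9 + [200]
```

The mean of `window` would suggest stepping down a rate; the percentile estimate does not, which matches the reduced variation seen in Figure 22.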
Figure 23: Custom client with increased segment size (5x).

6.4 Better Filtering

Averaging filters provide a more stable estimate of bandwidth, but a single outlier can confuse the algorithm. For example, a few seconds of low-information movie credits reduces the segment size, and the algorithm might drop the rate. In place of averages, we consider medians and quantiles to reduce the vulnerability to outliers. Figure 22 shows what happens if we use the 80th percentile of the measured rates of the past ten segment downloads. Variation is greatly reduced, and the majority of the movie plays at the highest available rate. The playback buffer has small fluctuations, but it stays far from a rebuffer event.

6.5 Bigger Segments

As noted earlier, bigger segments provide better estimates of the available bandwidth, allowing TCP to escape slow start. Figure 23 shows what happens if our client aggregates five requests into one. With the larger segment size, the video throughput is more stable, and both the playback rate and buffer size are more stable.

In summary, larger segments let TCP reach its fair share and improve the video throughput. Picking higher rates less conservatively and filtering measurements more carefully can improve video quality. But we should note that these improvements are for one movie on one service. Given the prevalence of the downward spiral effect, they should not be interpreted as hard recommendations, merely as added detail to our understanding of the problem.

7. RELATED WORK

The related work largely considers three overlapping areas: systems for video streaming, measurements to understand their performance, and the design and analysis of rate selection algorithms.

Video Streaming Services. The first category covers video streaming approaches using HTTP, such as the commercial ones from Adobe, Apple, and Microsoft described in [22], which differ in their alignment of video switching rates, whether A/V streams are combined, and whether requests are issued as byte ranges or for pre-specified segments. A more recent technique is MPEG DASH (Dynamic Adaptive Streaming over HTTP) [7], which standardizes the formatting of video content and leaves open the specific client player algorithm. These techniques underpin major commercial services like YouTube, Netflix, and Hulu.

Video Streaming Measurement. The second category measures the performance of video streaming clients, from individual clients experiencing local traffic conditions ("in the lab") all the way to distributed measurement systems that compare the performance of thousands of clients ("in the wild"). The work most similar to ours is [3], where the authors also parse HTTP messages to determine playback rates and use a bandwidth limiter to test clients under varying network conditions. However, [3] considers a significantly different scenario: it focuses on the unfairness problem between two video players competing with each other, while in this work we focus on the unfairness problem between a video player and a long-lived TCP flow. In that context, they observe similar pathologies: poor bandwidth estimation, leading to instability. However, they explain their observations entirely in terms of the application-layer ON-OFF behavior of video clients; even if one video client perfectly obtained its fair share when ON, it can fail to correctly estimate the available bandwidth (depending on the amount of overlap with the ON periods of the other client). By contrast, our paper demonstrates that this is only a symptom of a more general problem: inaccurate bandwidth estimation occurs even when the competing flow does not exhibit ON-OFF behavior. As we show in this paper, the problem arises because it is hard to estimate bandwidth above TCP. Others have identified the same problem but not explained its causes or validated potential fixes [4, 16].

Measuring the CDN servers rather than the clients provides different insights. In [1], the authors examine the CDN selection strategy of Hulu, while in [2], the authors look at Netflix. Both papers find a predisposition for clients to stay with the original CDN, despite variation between CDNs and over time. In [9], the authors describe lessons learned from a distributed commercial measurement system to understand the effects of Quality-of-Experience (QoE) metrics on viewer engagement and retention. Rebuffer rates and average video quality are QoE metrics with measurable impacts on viewer engagement, which underscores the importance of getting rate measurement and selection right in the presence of competing flows. With the measurement-driven insights from the same system, [14] proposes a global video control plane to dynamically assign clients a choice of video rate and CDN that optimizes viewers' experience.

Other work looks at network characteristics of video streaming traffic, rather than focusing on the client or viewer experience [11, 19, 24]. In particular, the authors in [19] show ON-OFF cycle behavior for YouTube and Netflix and use a model to study aggregates of video clients and their effects on the network. Neither the CDN papers nor the network traffic papers consider local effects on measured bandwidth or their effects on rate stability.

Rate Selection Algorithms. The third category is work on rate selection algorithms. This work complements ours, as a control system always benefits from more accurate measurements. In [8], the authors propose an algorithm to maintain the playout buffer at a target level. In [17], the authors implement a different buffer-aware rate selection algorithm and experimentally measure user preferences for gradual and infrequent playback rate changes. In [23], the authors model the rate selection problem as a Markov Decision Process and use a dynamic programming technique to choose a streaming strategy that improves QoE. In [13], the authors use simulations to show how parallel HTTP sessions can improve playback quality. Server-side pacing is another approach to selecting the rate, used by YouTube, as described in [10, 12].

8. CONCLUSION

Despite some differences in specific service implementations, all three services we study display degraded performance in the presence of competing traffic, well below the video quality possible if the client used its fair share of bandwidth. At a high level, our measurement analysis and experiments suggest that the root cause of this failure is a lack of information: the HTTP layer is simply not privy to continuous, high-fidelity feedback about the fair share at the bottleneck link.

There are two ways to interpret our observations. On one hand, determining the fair share of bandwidth available at the bottleneck is precisely the role of TCP. Thus, one path forward might be to design the client to improve information flow from TCP to the HTTP layer. In particular, we should ensure that TCP has a chance to reach its steady-state fair share; for example, increasing the segment size enables this effect.

However, we believe there may be a more radical solution: do not attempt to estimate bandwidth at all! The video streaming client has two competing goals: attain the highest bitrate possible while avoiding buffer underruns. Thus the objective is not to ensure the buffer stays full; the objective is to ensure the buffer does not go empty. Since the buffer holds several minutes of video, this shift in perspective suggests that if the buffer is full, then the client has picked a rate that is too low. Rather, the client should increase the bitrate when the buffer is high and decrease it when the buffer falls low. Though this sounds aggressive, note that it is exactly the correct layer separation: it hands off to TCP the objective of obtaining the fair share of bandwidth, and tries to always ensure the client picks the highest rate possible. This suggests an intriguing path forward for future research: design video-streaming clients that deliver high performance by eliminating bandwidth estimation altogether.

Acknowledgment

We are grateful to the anonymous reviewers and our shepherd, Nina Taft, for their valuable comments and feedback, which helped improve the final version. The authors would also like to thank Kok-Kiong Yap, Masayoshi Kobayashi, Vimalkumar Jeyakumar, Yiannis Yiakoumis and Netflix engineers for helpful discussions that shaped the paper. This work was supported by the Mr. and Mrs. Chun Chiu Stanford Graduate Fellowship, the Hewlett-Packard Fellowship, the Stanford Clean Slate Program, and the National Science Foundation under grants CNS-0904609, CNS-0644114, and CNS-0832820.

9. REFERENCES

[1] V. Adhikari, Y. Guo, F. Hao, V. Hilt, and Z.-L. Zhang. A Tale of Three CDNs: An Active Measurement Study of Hulu and its CDNs. In Proceedings of the IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), pages 7-12, March 2012.
[2] V. K. Adhikari, Y. Guo, F. Hao, M. Varvello, V. Hilt, M. Steiner, and Z.-L. Zhang. Unreeling Netflix: Understanding and Improving Multi-CDN Movie Delivery. In Proceedings of IEEE INFOCOM 2012, Orlando, FL, USA, pages 1620-1628, March 2012.
[3] S. Akhshabi, L. Anantakrishnan, C. Dovrolis, and A. Begen. What Happens When HTTP Adaptive Streaming Players Compete for Bandwidth? In Proceedings of the ACM Workshop on Network and Operating Systems Support for Digital Audio and Video (NOSSDAV), June 2012.
[4] S. Akhshabi, C. Dovrolis, and A. Begen. An Experimental Evaluation of Rate Adaptation Algorithms in Adaptive Streaming over HTTP. In Proceedings of the ACM Multimedia Systems Conference (MMSys), San Jose, CA, USA, February 2011.
[5] M. Allman, V. Paxson, and E. Blanton. TCP Congestion Control. RFC 5681 (Draft Standard), Sept. 2009.
[6] M. Allman, V. Paxson, and W. Stevens. TCP Congestion Control. RFC 2581 (Proposed Standard), Apr. 1999. Obsoleted by RFC 5681, updated by RFC 3390.
[7] MPEG DASH specification (ISO/IEC DIS 23009-1.2), 2011.
[8] L. De Cicco, S. Mascolo, and V. Palmisano. Feedback Control for Adaptive Live Video Streaming. In Proceedings of the ACM Multimedia Systems Conference (MMSys), February 2011.
[9] F. Dobrian, A. Awan, D. Joseph, A. Ganjam, J. Zhan, V. Sekar, I. Stoica, and H. Zhang. Understanding the Impact of Video Quality on User Engagement. In Proceedings of ACM SIGCOMM, Toronto, Canada, August 2011.
[10] M. Ghobadi, Y. Cheng, A. Jain, and M. Mathis. Trickle: Rate Limiting YouTube Video Streaming. In Proceedings of the USENIX Annual Technical Conference (ATC), page 6, 2012.
[11] P. Gill, M. Arlitt, Z. Li, and A. Mahanti. YouTube Traffic Characterization: A View From the Edge. In Proceedings of the ACM SIGCOMM Conference on Internet Measurement (IMC), pages 15-28, 2007.
[12] L. Kontothanassis. Content Delivery Considerations for Different Types of Internet Video. In Proceedings of the ACM Multimedia Systems Conference (MMSys), Keynote, Chapel Hill, NC, USA, February 2012.
[13] C. Liu, I. Bouazizi, and M. Gabbouj. Parallel Adaptive HTTP Media Streaming. In Proceedings of the IEEE International Conference on Computer Communications and Networks (ICCCN), pages 1-6, 2011.
[14] X. Liu, F. Dobrian, H. Milner, J. Jiang, V. Sekar, I. Stoica, and H. Zhang. A Case for a Coordinated Internet Video Control Plane. In Proceedings of ACM SIGCOMM, Helsinki, Finland, August 2012.
[15] J. W. Lockwood, N. McKeown, G. Watson, G. Gibb, P. Hartke, J. Naous, R. Raghuraman, and J. Luo. NetFPGA: An Open Platform for Gigabit-Rate Network Switching and Routing. In MSE '07: Proceedings of the 2007 IEEE International Conference on Microelectronic Systems Education, pages 160-161, 2007.
[16] K. Miller, E. Quacchio, G. Gennari, and A. Wolisz. Adaptation Algorithm for Adaptive Streaming over HTTP. In Proceedings of the IEEE International Packet Video Workshop (PV), pages 173-178, May 2012.
[17] R. Mok, X. Luo, E. Chan, and R. Chang. QDASH: A QoE-aware DASH System. In Proceedings of the ACM Multimedia Systems Conference (MMSys), pages 11-22, February 2012.
[18] Sandvine: Global Internet Phenomena Report. http://www.sandvine.com/news/pr_detail.asp?ID=312
[19] A. Rao, A. Legout, Y. Lim, D. Towsley, C. Barakat, and W. Dabbous. Network Characteristics of Video Streaming Traffic. In Proceedings of the ACM Conference on emerging Networking EXperiments and Technologies (CoNEXT), page 25. ACM, 2011.
[20] RTMPDump. http://rtmpdump.mplayerhq.hu/
[21] Consumer Reports: Streaming Video Services Rating. http://www.consumerreports.org/cro/magazine/2012/09/best-streaming-video-services/
[22] M. Watson. HTTP Adaptive Streaming in Practice. In Proceedings of the ACM Multimedia Systems Conference (MMSys), Keynote, San Jose, CA, USA, February 2011.
[23] S. Xiang, L. Cai, and J. Pan. Adaptive Scalable Video Streaming in Wireless Networks. In Proceedings of the ACM Multimedia Systems Conference (MMSys), pages 167-172, February 2012.
[24] M. Zink, K. Suh, Y. Gu, and J. Kurose. Characteristics of YouTube Network Traffic at a Campus Network: Measurements, Models, and Implications. Computer Networks, Volume 53, pages 501-514. Elsevier, 2009.