Confused, Timid, and Unstable: Picking a Video Streaming Rate is Hard

Te-Yuan Huang, Nikhil Handigol, Brandon Heller, Nick McKeown, Ramesh Johari
Stanford University
{huangty, nikhilh, brandonh, nickm, ramesh.johari}@stanford.edu

ABSTRACT
Today's commercial video streaming services use dynamic rate selection to provide a high-quality user experience. Most services host content on standard HTTP servers in CDNs, so rate selection must occur at the client. We measure three popular video streaming services (Hulu, Netflix, and Vudu) and find that accurate client-side bandwidth estimation above the HTTP layer is hard. As a result, rate selection based on inaccurate estimates can trigger a feedback loop, leading to undesirably variable and low-quality video. We call this phenomenon the downward spiral effect, and we measure it on all three services, present insights into its root causes, and validate initial solutions to prevent it.

Categories and Subject Descriptors
C.2.0 [Computer Systems Organization]: Computer-Communication Networks: General; C.4 [Performance of Systems]: Measurement techniques

General Terms
Measurement

Keywords
HTTP-based Video Streaming, Video Rate Adaptation

1. INTRODUCTION
Video streaming is a huge and growing fraction of Internet traffic, with Netflix and YouTube alone accounting for over 50% of the peak download traffic in the US [18]. Several big video streaming services run over HTTP and TCP (e.g., Hulu, Netflix, Vudu, YouTube) and stream data to the client from one or more third-party commercial CDNs (e.g., Akamai, Level3 or Limelight). Streaming over HTTP has several benefits: it is standardized across CDNs (allowing a portable video streaming service), it is well-established (which means the CDNs have already made sure service can

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
November 14-16, 2012, Boston, Massachusetts, USA.
Copyright 2012 ACM 978-1-4503-1705-4/12/11 ...$15.00.
Figure 1: (Service A) A video starts streaming at 1.75 Mb/s over a 5 Mb/s network. After 395 seconds, a second flow starts (from the same server). The video could stream at 1.75 Mb/s (given its fair share of 2.5 Mb/s), but instead drops down to 235 kb/s.

reach through NATs to end-hosts), and cheap (the service is simple, commoditized, and the CDNs compete on price). These benefits have made possible the huge growth in affordable, high-quality movie and TV streaming, for our viewing delight.

When video is streamed over HTTP, the video service provider relies on TCP to find the available bandwidth and choose a video rate accordingly. For example, if a client estimates that there is 1.5 Mb/s available in the network, it might request the server to stream video compressed to 1.3 Mb/s (or the highest video rate available at or below 1.5 Mb/s). The video streaming service provider must walk a tightrope: if they pick a video rate that is too high, the viewer will experience annoying rebuffering events; if they pick a streaming rate that is too low, the viewer will experience poor video quality. In both cases, the experience degrades, and users may take their viewing elsewhere [9]. It is therefore important for a video streaming service to select the highest safe video rate.

This paper describes a measurement study of three popular HTTP-based video streaming services (Hulu, Netflix, and Vudu) to see how well they pick the video rate. According to the latest Consumer Reports [21], Netflix is the most popular video streaming provider in the United States, while

(a) Service A. Network bottleneck set to 5 Mb/s.
(b) Service B. Network bottleneck set to 5 Mb/s.
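The selection rule just described (request the highest encoded rate at or below the bandwidth estimate) can be sketched as follows. The rate ladder is Service A's, from Figure 1; the function name and the fallback to the lowest rate are our own assumptions, not any service's actual code:

```python
# Service A's rate ladder (kb/s), from Figure 1.
SERVICE_A_RATES = [235, 375, 560, 750, 1050, 1400, 1750]

def pick_rate(estimate_kbps, rates=SERVICE_A_RATES):
    """Pick the highest available video rate at or below the
    bandwidth estimate; fall back to the lowest rate otherwise."""
    candidates = [r for r in rates if r <= estimate_kbps]
    return max(candidates) if candidates else min(rates)

# A 1500 kb/s estimate selects the 1400 kb/s encoding.
assert pick_rate(1500) == 1400
# An estimate below every encoding falls back to 235 kb/s.
assert pick_rate(100) == 235
```

The whole downward spiral turns on the quality of `estimate_kbps`: the selection step itself is trivial, so an estimate far below the true fair share directly becomes a too-low rate.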
(c) Service C HD. Network bottleneck set to 22 Mb/s.
(d) Service C SD. Network bottleneck set to 5 Mb/s.
Figure 4: The downward spiral effect is visible in all three services. The video rates available from each of the three services are summarized in Table 2; some playback rates may not be available for some videos.

3.3 The Competing Flows
The competing flow is a TCP flow doing a long file download. To eliminate any unfairness due to variations in network path properties, we ensure that the competing flow is served by the same CDN, and usually, by the same server. For Service A and Service C, the competing flow is generated by an open-ended byte-range request to the file with the highest rate. Further, we use the DNS cache to make sure that the competing flow comes from the same termination point (the server or the load-balancer) as the video flow. For Service B, since the files are stored as small segments, an open-ended request only creates short-lived flows. Instead, we generate the competing flow by requesting the Flash version of the same video stored in the same CDN, using rtmpdump [20] over TCP.

4. THE DOWNWARD SPIRAL EFFECT
All three services suffer from what we call the "downward spiral effect": a dramatic, anomalous drop in the video playback rate in the presence of a competing TCP flow. The problem is starkly visible in Figure 4. In all four graphs, the video stream starts out alone and then competes with another TCP flow. As soon as the competing flow starts up, the client mysteriously picks a video playback rate that is far below the available bandwidth. Our goal is to understand why this happens.

To gain a first inkling into what is going on, we calculate the upper bound of what the client might believe the instantaneous available bandwidth to be, by measuring the arrival rate of the last video segment. Specifically, we calculate the throughput upper bound as the size of a received video segment divided by the time it took to arrive (the time from when the first byte arrived until the last byte arrived), which excludes the initial server response time. In all of the graphs, the video playback rate chosen by the client is quite strongly correlated with the calculated throughput. As we will see, herein lies the problem: if the client is selecting the video rate based on some function of the throughput it perceived, and the throughput is so different from the actual available bandwidth, then it is not surprising the client does such a poor job. Let's now see what goes wrong for each service in turn. For ease of discussion, we will use "video throughput" to refer to the throughput a client perceived by downloading a video segment.

4.1 Service A
Figure 4(a) shows the playback rate of a Service A video session along with the client's video throughput over time. Starting out, the video stream is the only flow and the client requests the highest video rate (1750 kb/s). The competing flow begins after 400 seconds; the video rate steadily drops until it reaches the lowest rate (235 kb/s), and it stays there most of the time until the competing flow stops. In theory, both flows should be able to stream at 2.5 Mb/s (their fair share of the link) and the client should continue to stream at 1750 kb/s.

We repeated the experiment 76 times over four days. In 69 cases (91%) the downward spiral happens, and the client picks either the lowest rate, or bounces between the two or three lowest rates. In just seven cases (9%) was the client able to maintain a playback rate above 1400 kb/s.

To ensure accuracy and eliminate problems introduced by competing flows with different characteristics (e.g. TCP flows with different RTTs), we make the competing flow request the same video file (encoded at 1750 kb/s) from the same CDN. Unlike the video flow, the competing flow is just a simple TCP file download, and its download speed is dictated only by the TCP congestion control algorithm and not capped by the video client.

Why does the throughput of the video flow drop so far below the available fair-share bandwidth? Is it an inherent characteristic of streaming video over HTTP, or is the client simply picking the wrong video rate?

We first confirm that the available bandwidth really is available for streaming video. We do this using a feature provided by the Service A client that allows users to manually select a video rate and disable the client's automatic rate selection algorithm. We repeat the above experiment, but with a slight modification. As soon as the client picks a lower rate, we manually force the video to
play at 1750 kb/s. Figure 5 shows the results. Interestingly, the client maintains a playback rate of 1750 kb/s without causing rebuffering events, and the throughput also increases. This suggests that the downward spiral effect is caused by underestimation of the available bandwidth in the client's rate selection algorithm. The bandwidth is available, but the client needs to go grab it.

4.2 Service B
Figure 4(b) shows the same downward spiral effect in Service B. As before, the bottleneck bandwidth is 5 Mb/s and the RTT is around 20 ms. We start a video streaming session first, allow it to settle at its highest rate (3200 kb/s) and then start a competing flow after 337 seconds, by reading the same video file from the same server. To eliminate variation caused by congestion at the server, we verified that the same problem occurs if we download the competing video file from a different server at the same CDN.

Figure 5: (Service A) The client manages to maintain the highest playback rate if we disable automatic rate selection.

The client should drop the video rate to 2500 kb/s (its fair share of the available bandwidth). Instead, it steps all the way to the lowest rate offered by Service B, 650 kb/s, and occasionally to 1000 kb/s. The throughput plummets too.

4.3 Service C
We observe the downward spiral effect in Service C as well. Since Service C does not automatically switch between its HD and SD bitrates, we do two separate experiments. In the HD experiment, as shown in Figure 4(c), we set the bottleneck bandwidth to 22 Mb/s. To start with, the client picks the highest HD video rate (9 Mb/s). When the client's playback buffer is full, the video flow is limited by the receive window, and the throughput converges to the same value as the playback rate. We start the competing flow at 100 seconds, and it downloads the same video file (9 Mb/s video rate) from the same CDN. Each flow has 11 Mb/s available to it, plenty for the client to continue playing at 9 Mb/s. But instead, the client resets the connection and switches to 4.5 Mb/s and then 3 Mb/s, before bouncing around several rates.

SD is similar. We set the bottleneck bandwidth to 5 Mb/s, and the client correctly picks the highest rate (2000 kb/s) to start with, as shown in Figure 4(d). When we start the competing flow, the video client drops down to 1000 kb/s even though its share is 2.5 Mb/s. Since Service C only offers three SD rates, we focus on its HD service in the rest of the paper.

5. WALKING THE DOWNWARD SPIRAL
To understand how the downward spiral happens, we examine each service in turn. Although each service enters the downward spiral for a slightly different reason, there is enough commonality for us to focus first on Service A (and Figure 4(a)) and then describe how the other two services differ.

(a) TCP throughput before and after the buffer fills.
(b) Request interval before and after the buffer fills.
Figure 6: (Service A) Before and after the playback buffer fills at 185 seconds.

(a) Service A with no competing flow.
(b) Service A with one competing flow.
Figure 7: (Service A) Throughput at the HTTP layer with and without a competing flow.

5.1 Initial Condition: No Competing Flow
In the absence of a competing flow (first 400 seconds), the Service A client correctly chooses the highest playback rate. Because the available network bandwidth (5 Mb/s) is much higher than the playback rate (1750 kb/s), the client busily fills up its playback buffer and the bottleneck link is kept fully occupied. Eventually the playback buffer fills (after 185 seconds) and the client pauses to let it drain a little before issuing new requests. Figure 6(a) shows how the TCP throughput varies before and after the playback buffer fills up. After the buffer is full, the client enters a periodic ON-OFF sequence. As we will see shortly, the ON-OFF sequence is a part of the problem (but only one part). Before the buffer fills, the client requests a new 4-second segment of video every 1.5 seconds on average (because it is filling the buffer). Figure 6(b) confirms that after the buffer is full, the client requests a new 4-second segment every 4 seconds, on average.

The problem is that during the 4-second OFF period, the TCP congestion window (cwnd) times out, due to inactivity longer than 200 ms, and resets cwnd to its initial value of 10 packets [5, 6]. Even though the client is using an existing persistent TCP connection, the cwnd needs to ramp up from slow start for each new segment download.

It is natural to ask if the repeated dropping back to slow-start reduces the client's video throughput, causing it to switch to a lower rate. With no competing flow, it appears the answer is "no". We verify this by measuring the video throughput for many requests. We set the bottleneck link rate to 2.5 Mb/s, use traces collected from actual sessions to replay the requests over a persistent connection to the same server, and pause the requests at the same intervals as the pauses in the trace. Figure 7(a) shows the CDF of the client's video throughput for requests corresponding to various playback rates. The video throughput is pretty accurate. Except for some minor variation, the video throughput accurately reflects the available bandwidth, and explains why the client picks the correct rate.
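The per-segment throughput measurement used throughout this section (segment size divided by the first-byte-to-last-byte transfer time, excluding server response time) can be sketched as below; the function name and timestamps are illustrative, not taken from any client:

```python
def video_throughput_kbps(segment_bytes, first_byte_s, last_byte_s):
    """Per-segment throughput estimate as described in Section 4:
    the size of the received segment divided by the time from the
    arrival of its first byte to the arrival of its last byte,
    which excludes the initial server response time."""
    transfer_s = last_byte_s - first_byte_s
    if transfer_s <= 0:
        raise ValueError("need a positive transfer duration")
    return (segment_bytes * 8 / 1000) / transfer_s  # bytes -> kilobits

# A 4-second segment encoded at 1750 kb/s is roughly 875 kB; if it
# arrives in 2 seconds, the client perceives 3500 kb/s.
assert video_throughput_kbps(875_000, 10.0, 12.0) == 3500.0
```

Note that this is an upper bound on what the client could believe: during an OFF period the flow transfers nothing, so the estimate only reflects conditions while the transfer is active.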
(a) A 235 kb/s segment.
(b) Five contiguous 235 kb/s segments concatenated into one.
Figure 8: (Service A) The evolution of cwnd for different segment sizes.

Figure 9: (Service A) The client picks a video rate depending on the available bandwidth. The horizontal gray lines are the available rates.

5.2 The Trigger: With a Competing Flow
Things go wrong when the competing flow starts (after 400 seconds). Figure 7(b) shows that the client's video throughputs are mostly too low when there is a competing flow. (In Figure 7(b), the bottleneck bandwidth is set to 5 Mb/s so that the available fair share of bandwidth, 2.5 Mb/s, is the same as in Figure 7(a). Note that some segment downloads are able to get more than their fair share; in these cases, the competing flow has experienced losses and has not yet ramped up to its fair share. This is why some of the CDF curves do not end at 100% at 2.5 Mb/s in Figure 7(b).)

If we look at the progression of cwnd for the video flow after it resumes from a pause, we can tell how the server opens up the window differently when there is a competing flow. Because we don't control the server (it belongs to the CDN), we instead use our local proxy to serve both the video traffic and the competing flow, and use the tcp_probe kernel module to log the cwnd values. The video traffic here is generated by requesting a 235 kb/s video segment. Figure 8(a) shows how cwnd evolves, starting from the initial value of 10 at 1.5 seconds, then repeatedly being beaten down by the competing wget flow. The competing wget flow has already filled the buffer during the OFF period, and so the video flow sees very high packet loss. Worse still, the segment finishes before cwnd climbs up again, and we re-enter the OFF period. The process repeats for every ON-OFF period, and the throughput is held artificially low. For comparison, and to understand the problem better, Figure 8(b) shows the result of the same experiment with a segment size five times larger. With a larger segment size, the cwnd has longer to climb up from its initial value, and has a much greater likelihood of reaching the correct steady-state value.

Now that we know the video throughput tends to be low (because of TCP), we would like to better understand how the client reacts to the low throughputs. We can track the client's behavior as we steadily reduce the available bandwidth, as shown in Figure 9. We start with a bottleneck link rate of 5 Mb/s (and no competing flow), drop it to 2.5 Mb/s (to mimic a competing flow), and then keep dropping it by 100 kb/s every 3 minutes. The dashed line shows the available bandwidth, while the solid line shows the video rate picked by the client. The client chooses the video rate conservatively; when available bandwidth drops from 5 Mb/s to 2.5 Mb/s, the video rate goes down to 1400 kb/s, and so on.

We can now put the two pieces together. Consider a client streaming at a playback rate of 1750 kb/s; the median video throughput it perceives is 1787 kb/s, as shown in Figure 7(b). According to Figure 9, with a video throughput of 1787 kb/s, the client reduces its playback rate to 1050 kb/s. Thus, 50% of the time the playback rate will go down to 1050 kb/s once the competing flow starts. It is interesting to observe that the Service A client is behaving quite rationally given the throughput it perceives. The problem is that because Service A observes the throughput above TCP, it is not aware that TCP itself is having

(a) Service B with no competing flow.
(b) Service B with one competing flow.
Figure 16: (Service B) The TCP throughput changes in the presence of a competing flow.

Figure 14: (Service B) Almost all the OFF periods in a single video session are greater than the RTO (200 ms).

video throughput is higher; hence the Service A client picks a higher rate (1050 kb/s). For comparison, we asked 10 volunteers to rerun this experiment with Service A in their home networks, connected to different ISPs, such as AT&T DSL, Comcast, Verizon and university residences. Even though there was sufficient available bandwidth for the highest video rate in the presence of a competing flow, seven people reported a rate of only 235 kb/s-560 kb/s.

5.5 Service B
Service B also exhibits ON-OFF behavior, but at the TCP level and not the HTTP level, i.e., the pause can happen while downloading a video segment. When its video playback buffer is full, the client stops taking data from the TCP socket buffer. Eventually, the TCP socket buffer also fills and triggers TCP flow control to pause the server by sending a zero window advertisement. In Figure 4(b), each
zero window advertisement is marked by a hexagon. The client starts issuing zero window advertisements at around 100 s and continues to do so until a few seconds after the competing flow starts.

Figure 15: (Service B) When the video stream is receiver-limited, the client does not request many bytes during an ON period.

Figure 14 shows the CDF of the duration of the OFF periods. Almost all the pauses are longer than 200 ms, so cwnd is reset to its initial value. Thus, Service B effectively exhibits an ON-OFF behavior similar to that of Service A.

Worse still, during an ON period, Service B does not request many bytes; Figure 15 shows that over half of the time, it reads only 800 kbytes, which is not enough for the cwnd to climb up to its steady state before the next OFF period. Figure 4(b) and Figure 16(b) show the result: the TCP throughput is only around 1 Mb/s to 1.5 Mb/s, causing Service B to pick a video rate of 1000 kb/s, or even 650 kb/s. As we saw earlier, when competing with another flow, the smaller the request, the higher the likelihood of perceiving
Figure 20: Custom client, similar to Service A (equally conservative, with a 10-sample moving-average filter), displays the downward spiral.

a reasonable baseline. For Service A, Figure 9 indicates the bandwidth below which the client picks a lower video rate. Assume that Service A estimates bandwidth by simply dividing the download size by the download time and passing it through a fixed-size moving-average filter. We can estimate the size of the filter by measuring how long it takes from when the bandwidth drops until the client picks a new rate. A number of traces from Service A suggest a filter with 10 samples, though the true algorithm is probably more nuanced.

To closely mimic the Service A client, our custom client requests the video segments with the same sizes from the same locations in the CDN: we capture the segment map given to the client after authentication, which locates the video segments for each supported playback rate. Hence, our custom client will experience the same segment-size variation over the course of the movie, and when it shifts playback rate, the segment size will change as well. Since our custom client uses tokens from an earlier playback, the CDN cannot tell the difference between our custom client and the real Service A client. To further match Service A, the playback buffer is set to 240 seconds, the client uses a single persistent connection to the server, and it pauses when the buffer is full. We first validate the client, then consider three changes: (1) being less conservative, (2) changing the filtering method, and (3) aggregating segments.

6.2 Validating our Custom Client
Figure 20 shows the custom client in action. After downloading each segment, the custom client selects the playback rate based on Service A's conservative rate selection algorithm, observed in Figure 9. Once the playback buffer is full, we introduce a competing flow. Like the real client, the playback rate drops suddenly when the competing flow starts, then fluctuates over the course of the movie. The downward spiral does not bottom out, which we suspect is due to some subtle differences between Service A's algorithm and ours.
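Putting these pieces together, the custom client's selection loop can be sketched as below, assuming size-over-time throughput samples, a 10-sample moving average, and the roughly 40% conservatism inferred from Figure 9. The class and method names are ours; the real Service A algorithm is, as noted, probably more nuanced:

```python
from collections import deque

RATES_KBPS = [235, 375, 560, 750, 1050, 1400, 1750]  # Service A's ladder

class RateSelector:
    """Sketch of the custom client: a fixed-size moving-average filter
    over per-segment throughput samples, with a conservatism margin
    applied before picking a rate from the ladder."""

    def __init__(self, window=10, conservatism=0.4):
        self.samples = deque(maxlen=window)  # last N throughput samples
        self.conservatism = conservatism

    def observe(self, throughput_kbps):
        self.samples.append(throughput_kbps)

    def pick_rate(self):
        estimate = sum(self.samples) / len(self.samples)  # moving average
        target = estimate * (1 - self.conservatism)       # back off
        eligible = [r for r in RATES_KBPS if r <= target]
        return max(eligible) if eligible else min(RATES_KBPS)

sel = RateSelector()
for t in [2500] * 10:        # ten samples at the 2.5 Mb/s fair share
    sel.observe(t)
# 2500 * 0.6 = 1500 kb/s target, so the client requests 1400 kb/s,
# matching the step down seen in Figure 9.
assert sel.pick_rate() == 1400
```

Because the deque has a fixed `maxlen`, a sustained bandwidth drop takes about ten segment downloads to fully propagate into the estimate, which matches how we inferred the filter size from the traces.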
Figure 21: Custom client with 10% conservatism, but with a 10-sample moving-average filter.

Figure 22: Custom client with 10% conservatism, and with an 80th-percentile filter.

6.3 Less Conservative
Bandwidth estimates based on download sizes and durations tend to under-report the available bandwidth, especially in the presence of a competing flow. If the algorithm is conservative, it exacerbates the problem. We try a less conservative algorithm, with a conservatism of 10% instead of 40%. Conservatism of 40% means the client requests a video rate of at most 1.2 Mb/s when it perceives 2.0 Mb/s, while 10% means it requests at most 1.8 Mb/s when perceiving 2.0 Mb/s. According to Figure 9, Service A requests video rates with a conservatism of approximately 40%. Figure 21 shows that the video rate is higher, even though the playback buffer stays full. The result is higher quality video, high playback buffer occupancy (i.e. resilience against rebuffering) and four minutes of buffering to respond to changes in bandwidth. Note that even though the algorithm is less conservative, the underlying TCP ensures the algorithm stays a "good citizen" and only gets its fair share of the available bandwidth.
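The 80th-percentile filter used in Figure 22 replaces the mean of the last ten samples with a high quantile, so a few OFF-period undershoots no longer drag the estimate down. A minimal sketch, with illustrative sample values (the nearest-rank method is one of several reasonable percentile definitions):

```python
import math

def percentile(samples, pct):
    """pct-th percentile by the nearest-rank method; a high quantile
    ignores a few low outliers that would drag a mean down."""
    ordered = sorted(samples)
    rank = math.ceil(pct * len(ordered) / 100)
    return ordered[rank - 1]

# Last ten throughput samples (kb/s): mostly near the 2.5 Mb/s fair
# share, with three OFF-period collapses (values are illustrative).
samples = [2400, 2450, 700, 2500, 900, 2480, 800, 2350, 2420, 2390]

assert percentile(samples, 80) == 2450          # robust estimate
assert sum(samples) / len(samples) < 2000       # mean dragged below 2 Mb/s
```

With the mean, the three collapsed samples pull the estimate under 2 Mb/s and the client would step down; the 80th percentile stays near the fair share, which is the stabilizing effect visible in Figure 22.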
Figure 23: Custom client with increased segment size (5x).

6.4 Better Filtering
Averaging filters provide a more stable estimate of bandwidth, but a single outlier can confuse the algorithm. For example, a few seconds of low-information movie credits reduces the segment size, and the algorithm might drop the rate. In place of averages, we consider medians and quantiles to reduce the vulnerability to outliers. Figure 22 shows what happens if we use the 80th percentile of the measured rates of the past ten segment downloads. Variation is greatly reduced, and the majority of the movie plays at the highest available rate. The playback buffer has small fluctuations, but it is still far from a rebuffer event.

6.5 Bigger Segments
As noted earlier, bigger segments provide better estimates of the available bandwidth, allowing TCP to escape slow-start. Figure 23 shows what happens if our client aggregates five requests into one. With the larger segment size, the video throughput is more stable, and both the playback rate and buffer size are more stable.

In summary, larger segments let TCP reach its fair share and improve the video throughput. Picking higher rates less conservatively and filtering measurements more carefully can improve video quality. But we should note that these improvements are for one movie on one service. Given the prevalence of the downward spiral effect, these should not be interpreted as hard recommendations, merely as added detail to our understanding of the problem.

7. RELATED WORK
The related work largely considers three overlapping areas: systems for video streaming, measurements to understand their performance, and the design and analysis of rate selection algorithms.

Video Streaming Services. The first category covers video streaming approaches using HTTP, such as the commercial ones from Adobe, Apple, and Microsoft described in [22], which differ in their alignment of video switching rates, whether A/V streams are combined, and whether requests are issued as byte ranges or for pre-specified segments. A more recent technique is MPEG DASH (Dynamic Adaptive Streaming over HTTP) [7], which standardizes the formatting of video content and leaves open the specific client player algorithm. These techniques underpin the major commercial services like YouTube, Netflix, and Hulu.

Video Streaming Measurement. The second category ranges from measuring the performance of individual video streaming clients experiencing local traffic conditions ("in the lab") to distributed measurement systems that compare the performance of thousands of clients ("in the wild"). The work most similar to ours is [3], where the authors also parse HTTP messages to determine playback rates and use a bandwidth limiter to test clients under varying network conditions. However, [3] focuses on the unfairness problem between two video players, while in this work we focus on the unfairness problem between a video player and a long-lived TCP flow. That paper considers a significantly different scenario: it focuses on a video client competing against another video client. In this context, they observe similar pathologies: poor bandwidth estimation, leading to instability. However, they explain their observations entirely in terms of the application-layer ON-OFF behavior of video clients; even if one video client perfectly obtained its fair share when ON, it can fail to correctly estimate available bandwidth (depending on the amount of overlap with the ON periods of the other client). By contrast, our paper demonstrates that this is only a symptom of a more general problem: inaccurate bandwidth estimation occurs even when the competing flow does not exhibit ON-OFF behavior. As we show in this paper, the problem arises because it is hard to estimate bandwidth above TCP. Others have identified the same problem but not explained its causes or validated potential fixes [4, 16].

Measuring the CDN servers rather than the clients provides different insights. In [1], the authors examine the CDN selection strategy of Hulu, while in [2], the authors look at Netflix. Both papers find a predisposition for clients to stay with the original CDN, despite variation between CDNs and over time. In [9], the authors describe lessons learned from a distributed commercial measurement system to understand the effects of Quality-of-Experience (QoE) metrics on viewer engagement and retention. Rebuffer rates and average video quality are QoE metrics with measurable impacts on viewer engagement, which underscores the importance of getting rate measurement and selection right in the presence of competing flows. With the measurement-driven insights from the same system, [14] proposes a global video control plane to dynamically assign clients a choice of video rate and CDN that optimizes viewers' experience.

Other work looks at network characteristics of video streaming traffic, rather than focusing on the client or viewer experiences [11, 19, 24]. In particular, the authors in [19] show ON-OFF cycle behavior for YouTube and Netflix and use a model to study aggregates of video clients and their effects on the network. Both the CDN and the network traffic papers do not consider local effects on measured bandwidth or their effects on rate stability.

Rate Selection Algorithms. The third category is work on rate selection algorithms. This work complements ours, as a control system always benefits from more accurate measurements. In [8], the authors propose an algorithm to maintain the playout buffer at a target level. In [17], the authors implement a different buffer-aware rate selection algorithm and experimentally measure user preferences for gradual and infrequent playback rate changes. In [23], the authors model the rate selection problem as a Markov Decision Process and use a dynamic programming technique to choose a streaming strategy that improves QoE. In [13], the authors use simulations to show how parallel HTTP sessions can improve playback quality. Server-side pacing is another approach to selecting the rate, used by YouTube, as described in [10, 12].

8. CONCLUSION
Despite some differences in specific service implementations, all three services we study display degraded performance in the presence of competing traffic, well below the video quality possible if the client used its fair share of bandwidth. At a high level, our measurement analysis and experiments suggest that the root cause of this failure is a lack of information. The HTTP layer is simply not privy to continuous high-fidelity feedback about the fair share at the bottleneck link.

There are two ways to interpret our observations. On one hand, we observe that determining the fair share of bandwidth available at the bottleneck is precisely the role of TCP. Thus, one path forward might be to suggest that we should design the client to improve information flow from TCP to the HTTP layer. In particular, we should ensure that TCP has a
chance to reach its steady-state fair share; for example, increasing the segment size enables this effect.

However, we believe there may be a more radical solution: do not attempt to estimate bandwidth at all! The video streaming client has two competing goals: attain the highest bitrate possible while avoiding buffer underruns. Thus the objective is not to ensure the buffer stays full; the objective is to ensure the buffer does not go empty. Since the buffer holds several minutes of video, this shift in perspective suggests that if the buffer is full then the client has picked a rate that is too low. Rather, the client should increase the bitrate when the buffer is high and decrease it when the buffer falls low. Though this sounds aggressive, note that it is exactly the correct layer separation: it hands off to TCP the objective of obtaining the fair share of bandwidth, and tries to always ensure the client picks the highest rate possible. This suggests an intriguing path forward for future research: design video-streaming clients that deliver high performance by eliminating bandwidth estimation altogether.

Acknowledgment
We are grateful to the anonymous reviewers and our shepherd Nina Taft for their valuable comments and feedback, which helped improve the final version. The authors would also like to thank Kok-Kiong Yap, Masayoshi Kobayashi, Vimalkumar Jeyakumar, Yiannis Yiakoumis and Netflix engineers for helpful discussions that shaped the paper. This work was supported by the Mr. and Mrs. Chun Chiu Stanford Graduate Fellowship, the Hewlett-Packard Fellowship, the Stanford Clean Slate Program, and the National Science Foundation under grants CNS-0904609, CNS-0644114, and CNS-0832820.

9. REFERENCES
[1] V. Adhikari, Y. Guo, F. Hao, V. Hilt, and Z.-L. Zhang. A Tale of Three CDNs: An Active Measurement Study of Hulu and its CDNs. In Proceedings of the IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), pages 7-12, March 2012.
[2] V. K. Adhikari, Y. Guo, F. Hao, M. Varvello, V. Hilt, M. Steiner, and Z.-L. Zhang. Unreeling Netflix: Understanding and Improving Multi-CDN Movie Delivery. In Proceedings of IEEE INFOCOM 2012, Orlando, FL, USA, pages 1620-1628, March 2012.
[3] S. Akhshabi, L. Anantakrishnan, C. Dovrolis, and A. Begen. What Happens When HTTP Adaptive Streaming Players Compete for Bandwidth? In Proceedings of the ACM Workshop on Network and Operating Systems Support for Digital Audio and Video (NOSSDAV), June 2012.
[4] S. Akhshabi, C. Dovrolis, and A. Begen. An Experimental Evaluation of Rate Adaptation Algorithms in Adaptive Streaming over HTTP. In Proceedings of the ACM Multimedia Systems Conference (MMSys), San Jose, CA, USA, February 2011.
[5] M. Allman, V. Paxson, and E. Blanton. TCP Congestion Control. RFC 5681 (Draft Standard), Sept. 2009.
[6] M. Allman, V. Paxson, and W. Stevens. TCP Congestion Control. RFC 2581 (Proposed Standard), Apr. 1999. Obsoleted by RFC 5681, updated by RFC 3390.
[7] MPEG DASH specification (ISO/IEC DIS 23009-1.2), 2011.
[8] L. De Cicco, S. Mascolo, and V. Palmisano. Feedback Control for Adaptive Live Video Streaming. In Proceedings of the ACM Multimedia Systems Conference (MMSys), February 2011.
[9] F. Dobrian, A. Awan, D. Joseph, A. Ganjam, J. Zhan, V. Sekar, I. Stoica, and H. Zhang. Understanding the Impact of Video Quality on User Engagement. In Proceedings of ACM SIGCOMM, Toronto, Canada, August 2011.
[10] M. Ghobadi, Y. Cheng, A. Jain, and M. Mathis. Trickle: Rate Limiting YouTube Video Streaming. In Proceedings of the USENIX Annual Technical Conference (ATC), page 6, 2012.
[11] P. Gill, M. Arlitt, Z. Li, and A. Mahanti. YouTube Traffic Characterization: A View From the Edge. In Proceedings of the ACM SIGCOMM Conference on Internet Measurement (IMC), pages 15-28, 2007.
[12] L. Kontothanassis. Content Delivery Considerations for Different Types of Internet Video. In Proceedings of the ACM Multimedia Systems Conference (MMSys), Keynote, Chapel Hill, NC, USA, February 2012.
[13] C. Liu, I. Bouazizi, and M. Gabbouj. Parallel Adaptive HTTP Media Streaming. In Proceedings of the IEEE International Conference on Computer Communications and Networks (ICCCN), pages 1-6, 2011.
[14] X. Liu, F. Dobrian, H. Milner, J. Jiang, V. Sekar, I. Stoica, and H. Zhang. A Case for a Coordinated Internet Video Control Plane. In Proceedings of ACM SIGCOMM, Helsinki, Finland, August 2012.
[15] J. W. Lockwood, N. McKeown, G. Watson, G. Gibb, P. Hartke, J. Naous, R. Raghuraman, and J. Luo. NetFPGA: An Open Platform for Gigabit-Rate Network Switching and Routing. In MSE '07:
Proceedings of the 2007 IEEE International Conference on Microelectronic Systems Education, pages 160-161, 2007.
[16] K. Miller, E. Quacchio, G. Gennari, and A. Wolisz. Adaptation Algorithm for Adaptive Streaming over HTTP. In Proceedings of the IEEE International Packet Video Workshop (PV), pages 173-178, May 2012.
[17] R. Mok, X. Luo, E. Chan, and R. Chang. QDASH: A QoE-aware DASH System. In Proceedings of the ACM Multimedia Systems Conference (MMSys), pages 11-22, February 2012.
[18] Sandvine: Global Internet Phenomena Report. http://www.sandvine.com/news/pr_detail.asp?ID=312
[19] A. Rao, A. Legout, Y. Lim, D. Towsley, C. Barakat, and W. Dabbous. Network Characteristics of Video Streaming Traffic. In Proceedings of the ACM Conference on emerging Networking EXperiments and Technologies (CoNEXT), page 25. ACM, 2011.
[20] RTMPDump. http://rtmpdump.mplayerhq.hu/
[21] Consumer Reports: Streaming Video Services Rating. http://www.consumerreports.org/cro/magazine/2012/09/best-streaming-video-services/
[22] M. Watson. HTTP Adaptive Streaming in Practice. In Proceedings of the ACM Multimedia Systems Conference (MMSys), Keynote, San Jose, CA, USA, February 2011.
[23] S. Xiang, L. Cai, and J. Pan. Adaptive Scalable Video Streaming in Wireless Networks. In Proceedings of the ACM Multimedia Systems Conference (MMSys), pages 167-172, February 2012.
[24] M. Zink, K. Suh, Y. Gu, and J. Kurose. Characteristics of YouTube Network Traffic at a Campus Network: Measurements, Models, and Implications. Computer Networks, Volume 53, pages 501-514. Elsevier, 2009.