[Figure 1: Our robot searching for an object. Panels: (a) receive command, (b) go to location, (c) ask for "coffee", (d) get "coffee", (e) deliver object. In (a) the system gets a query to find a "coffee" and take it to room 7001. In (b) it goes to the nearest kitchen. In (c), it asks a person to place a coffee on it. In (d), it gets the coffee and the person says that the robot has the object. In (e), the robot delivers the object to its destination.]

Searching for objects has received considerable interest in the robotics community. Much research has focused on visual object search that does not leverage the Web (Sjöö et al. 2009; Aydemir et al. 2011; Velez et al. 2011; Joho, Senk, and Burgard 2011). Sjöö et al. (2009) present a method for search and localization of objects by using an attention mechanism as a primary step in the recognition process. Using a combination of view planning and visual search, the authors use existing computer vision algorithms to efficiently detect and localize different objects. Aydemir et al. (2011) build on this by using spatial relations to perform large-scale visual search for objects. Joho et al. (2011) focus on the problem of finding an object with a mobile robot in an initially unknown, structured environment. While their primary focus is not on vision, they present two methods for object search. The first is a reactive search technique based on objects in the robot's immediate vicinity. The second is a global, inference-based approach that uses the object arrangements of example environments. Finally, Velez et al. (2011) consider the task of autonomously navigating through the environment while mapping the locations of objects. The authors describe an online, any-time framework where vantage points provide the most informative view of an object given a noisy object detector. Unlike these approaches, our work uses help from humans to detect and manipulate objects.

Researchers have begun to think about how robots might be integrated with the Web. Meger et al. (2008) describe an integrated robotic platform that uses web-based training data to train a visual object detector and then performs exploration, mapping, and active attention. Most similar to this work is Kollar and Roy (2009), who used the co-occurrences in the labels from the Flickr photo-sharing website as a prior over where objects are located in the physical environment. Posner et al. (2010) demonstrate a system that queries the Web to help read the visible text in a scene. Tenorth et al. (2011) describe how information on the World Wide Web that is intended for human use might be useful for robots.

Finally, there has been much work focused on using the Web to extract information. Many approaches use pointwise mutual information (PMI), which is able to measure the semantic similarity of different words (Turney 2001; Turney and Littman 2003; Soderland et al. 2004; Magnini et al. 2002). The performance of all of these techniques depends on the accuracy of search engine hit counts. To address the fact that search engine hit counts change daily and are otherwise unreliable, Downey et al. (2005) developed a combinatorial "balls-and-urns" model (the Urns model) that uses the redundancy of the same extraction from different sources to compute the probability of its correctness. Finally, the Never-Ending Language Learner (NELL) actively reads the Web, learning structured information from unstructured web pages (Carlson et al. 2010).

ObjectEval

Our approach, called ObjectEval, enables a robot with limited sensing to search for an object. By using symbiotic autonomy, the robot is able to ask people to help it perform tasks, including manipulation and object detection (Rosenthal, Biswas, and Veloso 2010; Biswas and Veloso 2012). To find an object, the robot must therefore (1) receive a command to find an object (e.g., "coffee") and take it to a destination (e.g., room 7001), (2) compute a sequence of locations to visit by maximizing long-term utility, (3) visit a location, (4) ask a person to retrieve the object, and finally (5) if the object is there, deliver it to the destination or, if not, go to the next location to look for the object. An example of our robot finding a "coffee" can be seen in Figure 1.

Model

ObjectEval takes as input an object name (e.g., papers) and a destination room (e.g., room 8120), and returns a plan consisting of locations that the robot should visit. Finding objects requires trading off different objectives, including the number of interactions with people, the distance traveled, the existence of objects at previously visited locations, and the probability of finding an object in a location. ObjectEval combines these objectives into a utility function that, when maximized, generates a plan that the robot can execute to find an object effectively. If O is an object name (e.g., "papers") and U is the utility function, then the problem can be formulated as finding the plan that maximizes the utility:

    \arg\max_{plan} U(plan \mid O)    (1)

The plan is broken down into a sequence of steps (plan_i), each of which visits a location and asks for an object from a person. The robot receives a reward R when it executes the i-th step of the plan. The current step in the plan is successful with probability p(plan_i | O):

    U(plan \mid O) = \sum_{i=1}^{N} p(plan_i \mid O) \, R(plan_i, O)    (2)

In order to capture the objective of finding objects quickly, the reward at each step is broken down into three components:

    R(plan_i, O) = D(plan_i) \, I(plan_i) \, F(plan_i, O)    (3)

Since the robot should consider plans that travel as little as possible, we include the reward D, which depends on the distance the robot travels. D is computed by subtracting the distance traveled from the maximum distance the robot could travel. Since people are used as a part of the search process to find and manipulate objects, we include the reward I, which depends on the number of interactions that the robot has with a person. I is computed by subtracting the number of interactions required to search a location for an object from the maximum number of interactions the robot will need to search any location. Finally, in order to take advantage of feedback from people, we include the reward F, which uses previous searches to help search for objects. The value of F is 1 if the query object has been seen at the search location, 0.5 if the location has not been explored, and 0 if the object is known not to exist there. Although F is fixed in this paper, learning a dynamic model of how objects move would enable ObjectEval to handle cases where the query object moves between different locations in the environment.
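As an editorial illustration of Equations 1-3, the sketch below scores a candidate plan and selects the best of a given set of candidates. It is a minimal sketch, not the authors' implementation: the step representation (a dict with "distance", "interactions", and "history" fields), the function names, and the candidate-plan generator are assumptions, and success_probs stands in for the probability of Equation 4 described next.

```python
def step_reward(step, max_distance, max_interactions):
    """R(plan_i, O) = D(plan_i) * I(plan_i) * F(plan_i, O)  (Equation 3)."""
    D = max_distance - step["distance"]            # reward short travel
    I = max_interactions - step["interactions"]    # reward few questions asked of people
    # F encodes feedback from previous searches at this location.
    F = {"seen": 1.0, "unexplored": 0.5, "absent": 0.0}[step["history"]]
    return D * I * F


def plan_utility(plan, success_probs, max_distance, max_interactions):
    """U(plan | O) = sum_i p(plan_i | O) * R(plan_i, O)  (Equation 2).

    success_probs(plan) is assumed to return the per-step success
    probabilities of Equation 4.
    """
    return sum(p * step_reward(step, max_distance, max_interactions)
               for step, p in zip(plan, success_probs(plan)))


def best_plan(candidate_plans, success_probs, max_distance, max_interactions):
    """argmax_plan U(plan | O)  (Equation 1), over a given set of candidate plans."""
    return max(candidate_plans,
               key=lambda plan: plan_utility(plan, success_probs,
                                             max_distance, max_interactions))
```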
The second component of Equation 2 requires us to compute the probability of a part of the plan. As a proxy for the probability of the plan, we use the probability that the location at the i-th step of the plan contains the object, given that the object was not seen at the previously visited locations in the plan. If l_j is a multinomial over location types (e.g., "office," "printer room," "bathroom") and O is the query object, then we can compute this probability as:

    p(plan_i \mid O) = \left[ \prod_{j=1}^{i-1} \big( 1 - p(l_j \mid O) \big) \right] p(l_i \mid O)    (4)

In order to find the plan with maximum utility, the robot must be able to compute p(l_i | O). This term connects a query object O (e.g., "papers") to a location type in the environment (e.g., "printer room"). Connecting a query word for an object to a place where the robot can find the object is challenging because there are thousands of different object names people might use. We calculate the probability p(l_i | O) by querying the Web for the validity of the predicate locationHasObject(l, O) over all location types l. For example:

    p(l_j = \text{kitchen} \mid O = \text{coffee}) \triangleq p(\text{locationHasObject}(\text{kitchen}, \text{coffee}))    (5)

In the next section we describe how ObjectEval obtains the probability of instances of the predicate locationHasObject.
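As a rough illustration of Equation 4, the sketch below converts per-location probabilities p(l | O) into the per-step success probabilities used in Equation 2: a step succeeds only if the object was not found at the earlier locations in the plan. The function and variable names are assumptions; the numbers in the example are the "coffee" probabilities reported later in Table 1.

```python
def step_success_probs(plan_location_types, p_location_given_object):
    """p(plan_i | O) = [prod_{j<i} (1 - p(l_j | O))] * p(l_i | O)  (Equation 4)."""
    probs = []
    not_found_yet = 1.0
    for location_type in plan_location_types:
        p_here = p_location_given_object[location_type]
        probs.append(not_found_yet * p_here)   # succeed here only if all earlier steps failed
        not_found_yet *= (1.0 - p_here)
    return probs


# Example using the "coffee" row of Table 1:
p_coffee = {"bathroom": 0.08, "printer room": 0.02, "kitchen": 0.72, "office": 0.18}
step_success_probs(["kitchen", "office", "printer room"], p_coffee)
# -> [0.72, 0.28 * 0.18 = 0.0504, 0.28 * 0.82 * 0.02 ≈ 0.0046]
```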
Querying the Web

The World Wide Web contains an enormous amount of semantic information that might be useful for robots. In this paper, we investigate the use of the semantic information on the Web to predict the location of objects in real-world environments. We expect that objects physically present in a location will also be found together with that location frequently on the Web. For example, one of the top search results for the object "paper" and the location "printer room" is, "There is no more paper in the printer room, where can I find some more?" For objects unrelated to the location, such as "papers" and "elevator," there are fewer pages, and they often describe less sensical events such as, "Call for Papers, The International Space Elevator Consortium (ISEC) invites you to join us in Washington State." Therefore, we expect that the word patterns for related terms will be predictive, while unrelated terms will be less predictive. Figure 2 shows examples of text snippets that are found on the Web for the object "papers" and the locations "printer room" and "bathroom."

ObjectEval computes the probability from Equation 5 by converting predicate instances in first-order logic, such as locationHasObject(papers, printer room), into a search query such as {"papers", "printer room"}. These search queries can return hundreds or thousands of the most relevant web pages that relate these terms. The search query includes both the name of the location type and the name of the query object in order to retrieve highly relevant web pages. In contrast, a search query such as "papers" alone will return both relevant and irrelevant web pages for determining whether "papers" can be found in a "printer room."

The text on a web page that is most relevant to a predicate instance will be near the search terms. We therefore extract text snippets from each of the web pages that include up to 10 words before, after, and in between the query object and location words. If there are multiple text snippets extracted from the same web page, we merge them into a single text snippet. Each of the text snippets is then transformed into a feature vector, where each element corresponds to the frequency of a distinct word in the text snippet. The dimension of the vector is equal to the total number of distinct words that exist in the training data. All the stop words have been deleted, since we expect these features to only add noise.
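A minimal sketch of this querying-and-snippet step follows, under several assumptions: the helper names are invented, the stop-word list is a toy, matching is plain single-token string matching, and actually issuing the search query and fetching the pages is left out because the paper does not specify a search API.

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "in", "of", "to", "and", "for"}   # toy list


def predicate_to_query(location, obj):
    """locationHasObject(printer room, papers) -> search terms {'printer room', 'papers'}."""
    return {location, obj}


def extract_snippet(page_text, obj, location, window=10):
    """Keep up to `window` words before the first hit and after the last hit of either term.

    Simplification: terms are matched as single lowercase tokens, and everything
    between the two hits is kept rather than capping the in-between span at 10 words.
    """
    words = re.findall(r"[a-z']+", page_text.lower())
    hits = [i for i, w in enumerate(words) if w in (obj, location)]
    if not hits:
        return []
    start, end = min(hits), max(hits)
    return words[max(0, start - window): end + window + 1]


def snippet_features(snippets):
    """Merge a page's snippets and count word frequencies, dropping stop words."""
    merged = [w for snippet in snippets for w in snippet]
    return Counter(w for w in merged if w not in STOP_WORDS)
```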
[Figure 3: In (a) is the precision/recall curve for the 45 test predicate instances. In (b) is the F1-score for the 45 test predicate instances when training on a subset of the training dataset.]

[…] that tend to reside there. The data is split by randomly choosing 68% of the data for training and 32% for testing. ObjectEval is trained and tested using the first 20 web pages that are returned by the search engine. Table 1 shows the results for a subset of the test objects. ObjectEval is able to correctly determine the most likely location for most objects. It incorrectly classifies "whiteout" as being found in the "bathroom." ObjectEval also chooses "bathroom" as the most likely location for "cup." Although this is correct in some environments (e.g., hotels), we generally expect the robot to find a "cup" in either a "kitchen" or an "office." The results show that by requesting a more specific query such as "coffee cup," ObjectEval changes its classification to the "kitchen."

Object       Bathroom   Printer Room   Kitchen   Office
coffee       0.08       0.02           0.72*     0.18
marker       0.33       0.53*          0.08      0.06
pen          0.15       0.27           0.23      0.35*
toner        0.05       0.87*          0.02      0.06
scissors     0.26       0.01           0.61*     0.12
whiteout     0.66*      0.02           0.24      0.08
laptop       0.10       0.48*          0.08      0.34
papers       0.00       0.17           0.13      0.70*
cup          0.42*      0.10           0.36      0.12
coffee cup   0.00       0.01           0.73*     0.27
speakers     0.34       0.06           0.25      0.35*

Table 1: The probability that ObjectEval assigns to different test objects for each location type. The location type with the maximum probability for each object is marked with an asterisk.

ObjectEval was then evaluated using precision, recall, and F1 (which combines precision and recall) over this dataset. The ESP baseline replaces web search with a search over tag documents that contain the search terms (von Ahn and Dabbish 2004) in order to provide a comparison to (Kollar and Roy 2009). Figure 3(a) shows that the model trained on ESP performs worse than ObjectEval, which likely happens because few locations are tagged in the ESP dataset.

Finally, the speed at which ObjectEval learns was evaluated. Figure 3(b) shows the F1-score of ObjectEval as the number of predicate instances used for training increases. The results are obtained by training on a subset of the training instances and evaluating on all of the test instances. The result, somewhat surprisingly, shows that ObjectEval achieves a high F1 value even when it uses only a few training examples. For example, it achieves an F1 score of about 60% when it uses only 6 training examples. ObjectEval learns quickly because a single training instance can return thousands or millions of web pages. For example, the number of documents referencing "papers" and "printer room" is 61,200 according to Google. This result indicates that ObjectEval might be trained even with only a few predicate instances.

Simulated Experiments

We have created a large simulated environment to evaluate how ObjectEval searches for objects. Since the simulator uses exactly the same procedures as the physical robot, the number of interactions (I) is exactly the same as on the real robot. In general, when the robot asks for an object, a person must answer two questions, and when it is moving between floors (using the elevator) a person must answer five questions. To simulate the objects present in the building, we have created a semantic map of 290 spaces over three floors of an office building that contains names for the objects and locations present in each space. This was done by asking subjects on Amazon's Mechanical Turk to label images of 46 rooms with the location and the objects present. These labels were transferred to spaces for which we were not able to acquire images by sampling from the data collected from Mechanical Turk. To test the ability of ObjectEval to search for objects, we selected 80 object types that were not a part of the training set. ObjectEval was given only the location types (e.g., "kitchen" or "printer room") and a map of the environment. For each query object, a random location is chosen as the object delivery destination.

We evaluate ObjectEval in two scenarios: offline mode and interactive mode. In offline mode, ObjectEval learns the probability from Equation 4 by using a small dataset of predicate instances consisting of objects and a place where each object can generally be found. In interactive mode, the robot starts performing the find-and-deliver task in an unknown environment without this training data. By interacting with people, ObjectEval acquires examples of objects and the corresponding places where the objects were found. These are then used to learn a model of p(l_i | O) in Equation 4. When the robot finds an object in a location, it adds this to the current set of training instances. ObjectEval then searches the Web and uses the resulting web pages as additional training examples that relate the object to the observed location.
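The interactive mode just described can be summarized as a small update loop: each time a person confirms that the query object was found at a location, the (object, location type) pair is added to the training set, the Web is queried for additional examples relating the two terms, and the location model is re-estimated. The class below is a sketch under those assumptions; the hooks query_web_snippets and retrain_location_model are placeholders, not the authors' API.

```python
class InteractiveObjectEval:
    """Interactive mode: start with no training data and learn p(l | O) during searches."""

    def __init__(self, query_web_snippets, retrain_location_model):
        # Both callables are placeholders for the web-querying and training
        # machinery sketched earlier; their exact form is an assumption.
        self.query_web_snippets = query_web_snippets
        self.retrain_location_model = retrain_location_model
        self.training_instances = []        # list of (object_name, location_type) pairs
        self.p_location_given_object = {}   # object_name -> {location_type: probability}

    def observe_found(self, object_name, location_type):
        """Called when a person confirms the object was found at this location."""
        self.training_instances.append((object_name, location_type))
        # A single confirmed pair is expanded into many web-page training examples.
        extra_snippets = self.query_web_snippets(object_name, location_type)
        self.p_location_given_object = self.retrain_location_model(
            self.training_instances, extra_snippets)
```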
Table 2 shows the results of the different approaches that have been used to find objects. The baseline only uses the distance and interaction terms of Equation 2 to greedily generate the next location to visit and uses no semantic information about the environment. ObjectEval maximizes the expected utility of Equation 2 in both offline and interactive modes.

Approach                   Visited locations      Distance               Interactions
                           Mean    Std. Error     Mean    Std. Error     Mean    Std. Error
Baseline                   35.8    6.1            69.6    7.2            71.5    12.3
ObjectEval (offline)       14.3    4.3            33.9    4.6            28.7    8.6
ObjectEval (interactive)   10.2    3.8            32.5    4.4            20.5    7.7

Table 2: Average and standard error of the number of visited locations, distance, and number of interactions for the different approaches. The baseline uses only the interaction term I and distance term D from Equation 2. ObjectEval (offline) uses batch training, and ObjectEval (interactive) is given no training data, but instead uses the presence of objects in locations to update the probability of a location given the object as it performs a search (as in Equation 2).

There is a clear downward trend in the number of visited locations and the number of interactions for ObjectEval when compared with this baseline, indicating that the system is learning about the physical environment. Surprisingly, the interactive mode of ObjectEval achieves better results than the offline version. Since the training data from Mechanical Turk can differ from the objects and locations that are found by the robot, the interactive version of ObjectEval may have an advantage, since it learns the locations of objects directly in the test environment. The offline version starts with a biased set of data (obtained from Mechanical Turk) that may not accurately reflect the real world. For example, people from Mechanical Turk have annotated "cup" or "glasses" as examples of objects that can be found in a bathroom. However, in our office environment these objects are expected to be found in offices. By training on these examples, the offline version of ObjectEval would be biased toward finding these objects in a bathroom, whereas the interactive version does not have this problem because it only uses training data about objects in the environment.
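The baseline in Table 2 can be read as the following greedy rule: pick the next location using only the distance and interaction rewards, ignoring the Web-derived probability p(l | O). This is a paraphrase of the one-sentence description above, not the authors' code; the function and parameter names are assumptions.

```python
def greedy_baseline_next_location(current, unvisited, distance_to, interactions_needed,
                                  max_distance, max_interactions):
    """Greedy baseline: score candidate locations with only the D and I terms of
    Equation 2 (no semantic information), and visit the best-scoring one next."""
    def score(location):
        D = max_distance - distance_to(current, location)
        I = max_interactions - interactions_needed(location)
        return D * I
    return max(unvisited, key=score)
```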
Although the number of visited locations in Table 2 may seem high, the interactive version of ObjectEval finds 80% of the objects within five locations or fewer, whereas the baseline finds only 39% within the same five locations. One reason that this number is high is the high penalty for choosing the wrong location. For example, if the robot incorrectly classifies "soap" as being in an "office," it will have to search an order of magnitude more locations, because the environment contains hundreds of offices whereas it contains only a few bathrooms.

Finally, we have profiled the number of locations visited before finding an object. Figure 4 shows the result when a search for 20 objects is repeated 5 times, starting from different initial locations, to obtain 100 runs. The figure shows that ObjectEval, after having gathered only a few facts, has quickly learned to execute efficient plans to find objects when compared with the baseline approach.

[Figure 4: The number of locations visited by the robot before finding the query object for the interactive mode of ObjectEval (red line) and the baseline (green line). The data is sorted by the number of visited locations per simulation run.]

Robot Experiments

We have demonstrated the ability of ObjectEval to find and deliver an object on our mobile office assistant robot. We queried ObjectEval for a "coffee" and asked it to deliver the object to office "7001." The robot drove to the nearest kitchen and asked for a coffee. When a person came by, they placed a coffee on the robot and the robot returned to office 7001 with the coffee. This search can be seen in Figure 1.

Conclusion

In this paper, we have presented an approach, called ObjectEval, which is able to find and deliver objects in real-world environments. We have shown that our system learns to query the Web to evaluate the probability of physical background knowledge that relates objects and locations. In addition, we have presented an approach for finding and delivering objects that integrates information about object locations from the Web and interactively learns about the physical environment by getting feedback from humans. We have shown promising results over a baseline approach and have demonstrated our system on a mobile robot navigating in an indoor environment.

Acknowledgments

This research was partly supported by the National Science Foundation under award number NSF IIS-1012733. The views and conclusions contained in this document are those of the authors only.

References

Aydemir, A.; Sjöö, K.; Folkesson, J.; Pronobis, A.; and Jensfelt, P. 2011. Search in the real world: Active visual object search based on spatial relations. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA '11).

Biswas, J., and Veloso, M. 2012. Depth camera based indoor mobile robot localization and navigation. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12).

Carlson, A.; Betteridge, J.; Wang, R. C.; Hruschka Jr., E. R.; and Mitchell, T. M. 2010. Coupled semi-supervised learning for information extraction. In Proceedings of the Third ACM International Conference on Web Search and Data Mining (WSDM 2010).

Downey, D.; Etzioni, O.; and Soderland, S. 2005. A probabilistic model of redundancy in information extraction. In IJCAI, 1034-1041.

Joho, D.; Senk, M.; and Burgard, W. 2011. Learning search heuristics for finding objects in structured environments. Robotics and Autonomous Systems 59(5):319-328.

Kollar, T., and Roy, N. 2009. Utilizing object-object and object-scene context when planning to find things. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '09), 4116-4121. Piscataway, NJ, USA: IEEE Press.

Kollar, T.; Samadi, M.; and Veloso, M. 2012. Enabling robots to find and fetch objects by querying the web. In Proceedings of the Eleventh International Conference on Autonomous Agents and Multiagent Systems (AAMAS).

Magnini, B.; Negri, M.; Prevete, R.; and Tanev, H. 2002. Is it the right answer? Exploiting web redundancy for answer validation. In Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, 425-432.

Makarenko, A.; Williams, S.; Bourgault, F.; and Durrant-Whyte, H. 2002. An experiment in integrated exploration. In IEEE/RSJ International Conference on Intelligent Robots and Systems, volume 1, 534-539. IEEE.

Meger, D.; Forssén, P.-E.; Lai, K.; Helmer, S.; McCann, S.; Southey, T.; Baumann, M.; Little, J. J.; and Lowe, D. G. 2008. Curious George: an attentive semantic robot. Robotics and Autonomous Systems 56:503-511.

Posner, I.; Corke, P.; and Newman, P. 2010. Using text-spotting to query the world. In Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS).

Rosenthal, S.; Biswas, J.; and Veloso, M. 2010. An effective personal mobile robot agent through symbiotic human-robot interaction. In Proceedings of the 9th International Conference on Autonomous Agents and Multiagent Systems (AAMAS '10).

Samadi, M.; Veloso, M.; and Blum, M. 2011. Evaluating correctness of propositions using the web. In Proceedings of the Workshop on Learning by Reading and its Applications in Intelligent Question-Answering, IJCAI '11.

Sjöö, K.; López, D. G.; Paul, C.; Jensfelt, P.; and Kragic, D. 2009. Object search and localization for an indoor mobile robot. Journal of Computing and Information Technology 17(1):67-80. doi:10.2498/cit.1001182.

Soderland, S.; Etzioni, O.; Shaked, T.; and Weld, D. S. 2004. The use of web-based statistics to validate information extraction. In AAAI Workshop on Adaptive Text Extraction and Mining.

Stachniss, C.; Grisetti, G.; and Burgard, W. 2005. Information gain-based exploration using Rao-Blackwellized particle filters. In Proc. of Robotics: Science and Systems (RSS), 65-72.
Tenorth, M.; Klank, U.; Pangercic, D.; and Beetz, M. 2011. Web-enabled robots. IEEE Robotics & Automation Magazine 18(2):58-68.

Turney, P. D., and Littman, M. L. 2003. Measuring praise and criticism: Inference of semantic orientation from association. ACM Transactions on Information Systems 21:315-346.

Turney, P. 2001. Mining the web for synonyms: PMI-IR versus LSA on TOEFL.

Velez, J.; Hemann, G.; Huang, A.; Posner, I.; and Roy, N. 2011. Planning to perceive: Exploiting mobility for robust object detection. In Proceedings of the International Conference on Automated Planning and Scheduling.

von Ahn, L., and Dabbish, L. 2004. Labeling images with a computer game. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '04), 319-326. New York, NY, USA: ACM.

Yamauchi, B. 1997. A frontier-based approach for autonomous exploration. In Proceedings of the 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA '97), 146-151. IEEE.
