TABLE I
SUMMARY OF COMMON FAILURES AND THE SYMBOLIC GROUNDINGS USED TO FORM A HELP REQUEST.

Failed symbolic condition               Symbolic request
Part is not visible to the robot.       locate part(robot, part)
Robot is not holding the part.          give part(robot, part)
Leg is not aligned with the hole.       align with hole(leg, top, hole)
Leg is not attached to the hole.        screw in leg(leg, top, hole)
Table top is not upside down.           flip(top)
Legacy software is in infinite loop.    not detectable
Risk of hardware damage.                not detectable

A. Detecting Failures

To detect failures, the system compares the expected state of the world to the actual state, as sensed by the perceptual system (line 6 of the executive function). We represent the state, q, as a vector of values for logical predicates. Elements of the state for the IKEA LACK table include whether the robot is holding each table leg, whether the table is face-up or face-down, and whether each leg is attached to the table. In the furniture assembly domain, we compute the state using the tracked pose of every rigid body known to the VICON system, including each furniture part, each robot chassis and hand, and each human. The system recomputes q frequently, since it may change independently of any deliberate robot action, such as by human intervention or from an unintended side-effect.

Prior to executing each action, the assembly executive verifies the action's preconditions against q. Likewise, following each action, the postconditions are verified. Any unsatisfied condition indicates a failure and triggers the assembly executive to pause the assembly process and initiate error recovery. For example, the robot must be grasping a table leg before screwing it into the hole. If it tries and fails to pick up a leg, then the postcondition for the pickup action will not be satisfied in q, which indicates a failure.

B. Recovery Strategy

When a failure occurs, its description takes the form of an unsatisfied condition. The system then asks the human for help to address the problem. The robot first computes actions that, if performed by the human, would resolve the failure and enable the robotic team to continue assembling the piece autonomously. The system computes these actions using a pre-specified model of physical actions a person could take to rectify failed preconditions. Remedy requests are expressed in a simple symbolic language, shown in Table I. This symbolic request, a, specifies the action that the robot would like the person to take to help it recover from failures. However, these symbolic forms are not appropriate for speaking to an untrained user. In the following section, we explore a series of approaches that take as input the symbolic request for help and generate a language expression asking a human for assistance.

IV. ASKING FOR HELP FROM A HUMAN PARTNER

Once the system computes a symbolic representation of the desired action, a, it searches for words, Λ, which effectively communicate this action to a person in the particular environmental context, M, on line 5 of the conditions_satisfied function. This section describes various approaches to the generate_help_request function which carries out this inference. Formally, we define a function h to score candidate sentences:

    argmax_Λ h(Λ, a, M)    (1)

S  → VB NP
S  → VB NP PP
PP → TO NP
VB → flip | give | pick up | place
NP → the white leg | the black leg | me | the white table | the black table
TO → above | by | near | under | with

Fig. 3. Part of the context-free grammar defining the linguistic search space.

The specific function h used in Equation 1 will greatly affect the results. We define three increasingly complex approaches for h, which lead to more targeted natural language requests for help by modeling the ability of the listener to understand them. The contribution of this paper is a definition for h using inverse semantics. Forward semantics is the problem of mapping from words in language to aspects of the external world; the canonical problem is enabling a robot to follow a person's natural language commands [14, 12, 22, 16]. Inverse semantics is the reverse: mapping from specific aspects of the external world (in this case, an action that the robot would like the human to take) to words in language. To apply this approach we use the G3 model of natural language semantics. We build on the work of Tellex et al. [22], who used the G3 framework to endow the robot with the ability to follow natural language commands given by people. In this paper, instead, we invert the model, to endow the robot with the ability to create natural language requests, which will be understood by people.

The inference process in Equation 1 is a search over possible sentences. We define a space of sentences using a context-free grammar (CFG), shown in Figure 3. The inference procedure creates a grounding graph for each candidate sentence using the parse structure derived from the CFG and then scores it according to the function h. This search space is quite large, and we use greedy search to expand promising nodes first.

A. Speaking by Reflex

The simplest approach from the assembly executive's perspective is to delegate diagnosis and solution of the problem to the human with the simple fixed request, Λ = "Help me." This algorithm takes into account neither the environment nor the listener when choosing what to say. We refer to this algorithm as S0.

B. Speaking by Template

As a second baseline, we implemented a template-based algorithm, following traditional approaches to generating language [6, 17]. This approach uses a lookup table to map symbolic help conditions to natural language requests. These generic requests take the following form:

    Place part 2 where I can see it.
    Hand me part 2.
    Attach part 2 at location 1 on part 5. (i.e., screw in a table leg)

Note that the use of first person in these expressions refers to the robot. Since VICON does not possess any semantic qualities of the parts, they are referred to generically by part identifier numbers. Such templates can be effective in simple situations, where the human can infer the part from the context, but they do not model how words map to the environment, and thus do not reflect the mapping between words and perceptual data. In constrained interaction scenarios, the programmer could hard-code object names for each part, but this approach becomes impractical as the scope of interaction increases, especially for referring expressions such as "the part on the table."

C. Modeling Word Meanings

This section briefly describes how the G3 framework models word meanings; it has previously been used to understand language [22]. When understanding language, the G3 framework imposes a distribution over groundings in the external world, γ1 ... γN, given a natural language sentence Λ. Groundings are the specific physical concepts that are referred to by the language and can be objects (e.g., a table leg or a robot), places (e.g., a particular location in the world), paths (e.g., a trajectory through the environment), or events (e.g., a sequence of actions taken by the robot). Each grounding corresponds to a particular constituent λi ∈ Λ, defined by the CFG parse tree for the sentence. For example, for a sentence such as "Pick up the table leg," the grounding for the phrase "the table leg" corresponds to an actual table leg in the external world, and the grounding for the entire sentence corresponds to the actions of a person as they follow the request. Understanding a sentence in the G3 framework amounts to the following inference problem:

    argmax_{γ1...γN} p(γ1 ... γN | Λ, M)    (2)

The environment model M consists of the robot's location along with the locations and geometries of objects in the external world. The computed environment model defines a space of possible values for the grounding variables, γ1 ... γN. A robot computes the environment model using sensor input; in the domain of furniture assembly, the system creates the environment model using input from VICON.

To factor the model, we introduce a correspondence vector, Φ, as do Tellex et al. [22]. Each entry φi ∈ Φ corresponds to whether linguistic constituent λi ∈ Λ corresponds to the groundings associated with that constituent. For example, the correspondence variable would be True for the phrase "the white table leg" and a grounding of a white leg, and False if the grounding was a different object, such as a black table top. We assume that γ1 ... γN are independent of Λ unless Φ is known. Introducing Φ enables factorization according to the structure of language with local normalization at each factor over a space of just the two possible values for φi. The optimization then becomes:

    argmax_{γ1...γN} p(γ1 ... γN | Φ, Λ, M)    (3)

After factoring using Bayes' rule and ignoring constant terms we have:

    argmax_{γ1...γN} p(Φ | Λ, γ1 ... γN, M)    (4)

We factor the expression according to the compositional syntactic structure of the language, defined by the parse tree:

    argmax_{γ1...γN} ∏i p(φi | λi, γi1 ... γik, M)    (5)

Fig. 4. Grounding graph for the request, "Pick up the table leg." Random variables and edges are created in the graphical model for each constituent in the parse tree. The λ variables correspond to language; the γ variables correspond to groundings in the external world. Edges in the graph are created according to the parse structure of the sentence.

This factorization can be represented as a directed graphical model where random variables and edges in the model are created according to the structure of the language. We refer to one of these graphical models as a grounding graph. Figure 4 shows an example graphical model; the details of the factorization are described by Tellex et al. [22]. The system factorizes the distribution according to the well-known hierarchical parse structure of language. When evaluating the model for specific sentences generated by our CFG, we use the parse structure defined by the CFG to factor the distribution. Each factor corresponds to an individual log-linear model for predicting whether the particular node of the CFG corresponds to a particular grounding in the external world. Training model parameters requires an aligned parallel corpus of language paired with groundings; we describe the training procedure used for our furniture assembly domain in Section IV-F.

D. Speaking by Modeling the Environment

Next, we describe a more complex model for speaking, one that takes into account a model of the environment, but not the listener.
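The precondition/postcondition check that drives failure detection (Section IV-A above) can be sketched in a few lines. This is a minimal illustration, not the actual executive: the predicate names (`leg_visible`, `robot_holding_leg`) and the dict representation of the state q are hypothetical stand-ins for the paper's vector of logical predicates recomputed from VICON tracking.

```python
def unsatisfied_conditions(conditions, q):
    """Return the subset of conditions that do NOT hold in the sensed state q."""
    return {c for c in conditions if not q.get(c, False)}

def run_action(action, q, act):
    """Check preconditions, perform the action, then check postconditions.

    Returns the set of unsatisfied conditions; a non-empty set indicates a
    failure and would pause assembly to generate a help request.
    """
    failed = unsatisfied_conditions(action["pre"], q)
    if failed:
        return failed
    act(q)  # execute the action; perception then refreshes q
    return unsatisfied_conditions(action["post"], q)

# Hypothetical pick-up action: the leg must be visible beforehand,
# and the robot must be holding it afterward.
pick_up = {"pre": {"leg_visible"}, "post": {"robot_holding_leg"}}

q = {"leg_visible": True, "robot_holding_leg": False}

def failed_grasp(q):  # simulated grasp that slips, leaving the leg unheld
    q["robot_holding_leg"] = False

unsatisfied = run_action(pick_up, q, failed_grasp)
# unsatisfied == {"robot_holding_leg"}: the failed pickup from the text
```

The returned set is exactly the "unsatisfied condition" that Section IV-B turns into a symbolic help request via Table I.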
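The inference in Equation 1 amounts to generating candidate sentences from the grammar of Figure 3 and taking an argmax over a scoring function. The sketch below enumerates the full language of that CFG and selects the best-scoring sentence. It is a simplification under stated assumptions: the real system scores each candidate with the factored G3 model over its grounding graph and expands promising nodes greedily rather than enumerating all sentences, and the `score` argument here is only a stand-in for h(Λ, a, M).

```python
import itertools

# CFG from Fig. 3 (terminals are whole phrase strings).
GRAMMAR = {
    "S":  [["VB", "NP"], ["VB", "NP", "PP"]],
    "PP": [["TO", "NP"]],
    "VB": [["flip"], ["give"], ["pick up"], ["place"]],
    "NP": [["the white leg"], ["the black leg"], ["me"],
           ["the white table"], ["the black table"]],
    "TO": [["above"], ["by"], ["near"], ["under"], ["with"]],
}

def expand(symbol):
    """Yield every terminal sentence derivable from `symbol`."""
    if symbol not in GRAMMAR:  # terminal phrase
        yield symbol
        return
    for production in GRAMMAR[symbol]:
        for parts in itertools.product(*(expand(s) for s in production)):
            yield " ".join(parts)

def generate_help_request(score):
    """argmax over candidate sentences of a scoring function score(sentence)."""
    return max(expand("S"), key=score)

# Toy stand-in for h: reward mentioning the desired action and object,
# with a small length penalty so the shortest adequate request wins.
toy_score = lambda s: ("give" in s) + ("the white leg" in s) - 0.01 * len(s)
best = generate_help_request(toy_score)  # -> "give the white leg"
```

Even this small grammar yields 520 sentences, which illustrates why the paper resorts to greedy search as the grammar and environment grow.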
Help me (S0):   Help me.
Templates:      Please hand me part 2.
G3 S1:          Give me the white leg.
G3 S2:          Give me the white leg that is on the black table.
Hand-written:   Take the table leg that is on the table and place it in the robot's hand.

Fig. 5. Scene from our dataset and the requests generated by each approach.

F. Training

We trained the model for understanding language following the same procedure as Tellex et al. [22]. We collected a new dataset of natural language requests given by a human to another human in the furniture assembly domain. We created twenty-one videos of a person executing a task involved in assembling a piece of furniture. For example, one video shows a person screwing a table leg into a table, and another shows a person handing a table leg to a second person. Each video has an associated context consisting of the locations, geometries, and trajectories of the people and objects, produced with VICON. We asked annotators on Amazon Mechanical Turk to view the videos and write a natural language request they would give to ask one of the people to carry out the action depicted in the video. Then we annotated requests in the video with associated groundings in the VICON data. The corpus contains 326 requests with a total of 3279 words. In addition, we generated additional positive and negative examples for the specific words in our context-free grammar.

V. EVALUATION

The goal of our evaluation was to assess whether our algorithms increase the effectiveness of a person's help, or in other words, to enable them to more quickly and accurately provide help to the robot. To evaluate whether our algorithms enable a human to accurately provide help compared to baselines, we use an online corpus-based evaluation. We conducted a real-world user study to assess whether our leading algorithm improves the speed and accuracy of a person's help to a team of autonomous robots engaged in a real-world assembly task.

A. Corpus-Based Evaluation

Our online evaluation used Amazon Mechanical Turk (AMT) to measure whether people could use generated help requests to infer the action that the robot was asking them to perform. We presented a worker on AMT with a picture of a scene, showing a robot, a person, and various pieces of furniture, together with the text of the robot's request for help. Figure 5 shows an example initial scene, with several different requests for help generated by different algorithms, all asking the human to carry out the same action. Next, we showed the worker five videos of a human taking various actions in the scene in response to the requests. We asked them to choose the video that best matched the request for help. We chose the filmed actions to correspond to recoveries from typical failures that the robots might encounter. A trial consists of a worker viewing an initial scene paired with a request for help and then choosing a corresponding video.

TABLE II
FRACTION OF CORRECTLY FOLLOWED REQUESTS

Metric                         % Success    95% Confidence
Chance                         20.0         –
"Help me" Baseline (S0)        21.0         ±8.0
Template Baseline              47.0         ±5.7
G3 Inverse Semantics with S1   52.3         ±5.7
G3 Inverse Semantics with S2   64.3         ±5.4
Hand-Written Requests          94.0         ±4.7

We created a dataset consisting of twenty trials by constructing four different initial scenes and filming an actor taking five different actions in each scene. We present results for the four automatic methods described in Section IV, as well as a baseline consisting of hand-written requests which we created to be clear and unambiguous. Figure 6 shows the four initial scenes paired with hand-written help requests. For the "help me" and hand-written baselines, we issued each of the twenty generated requests to five subjects, for a total of 100 trials. We issued each request in the template and G3 approaches to fifteen users for a total of 300 trials. We assumed the robot had accurate perceptual access to the objects in the environment and their locations using the VICON system.

Results appear in Table II. Our results show that the "Help me" baseline performs at chance, whereas the template baseline and the G3 inverse semantics model both improved performance significantly. The S1 model may have improved performance over the template baseline, but these results do not rise to the level of statistical significance. The S2 model, however, realizes a significant improvement, p = 0.002 by Student's t-test, due to its more specific requests, which model the uncertainty of the listener. These results demonstrate that our model successfully generates help requests for many conditions. Most failures occurred due to ambiguity in the language, even in sentences generated by the S2 model. For example, many people confused "the white leg that is near the black table" with "the white leg that is under the black table." Adding more prepositions, such as "next to," would address this issue by enabling the algorithm to generate more specific referring expressions that more accurately match people's expectations.

Take the table leg that is on the table and place it in the robot's hand.
Take the table leg that is under the table and place it in the robot's hand.
Take the table leg that is next to the table and place it in the robot's hand.
Pick up the table leg that is on the table and hold it.
Take the table leg that is on the table and place it on the floor in front of the robot.

Screw the white table leg into the hole in the table top.
Screw the black table leg into the hole in the table top.
Take the white table leg and insert it in the hole, but do not screw it in.
Move the white table leg over near the table top.
Take the table top and place it near the white table leg on the floor.

Take the white table leg that is next to the table and put it in front of the robot.
Take the black table leg that is next to the table and put it in front of the robot.
Take the black table leg that is far away from the table and put it in front of the robot.
Take the white table leg that is on top of the table and place it in the robot's hand.
Pick up the white table leg next to the table and hold it.

Take the white table, flip it over, and set it down in place.
Take the black table, flip it over, and set it down in place.
Take the white table and move it near the robot, keeping it upside-down.
Pick up the white table and hold it.
Take the white table, flip it over, and put it in the robot's hand.

Fig. 6. The four initial scenes from the evaluation dataset, together with the hand-written help requests used in our evaluation.

B. User Study

In our experiment, humans and robots collaborated to assemble IKEA furniture. The study split participants into two conditions using a between-subjects design, with 8 subjects in each condition. In the baseline condition, robots requested help with the S0 approach, using only the words "Please help me." In the test condition, robots requested help using the S2 inverse semantics metric. The robots autonomously planned and executed the assembly on two real robots, and all detected failures were real. Our goal was to assess the effect of the choice of help request, made to a user with limited situational awareness, within an end-to-end system. We chose approach S0 as a baseline to evaluate the magnitude of this effect. The accompanying video is online at http://youtu.be/2Ts0W4SiOfs.

We measure effectiveness by a combination of objective and subjective measures. We report two objective measures: efficiency, the elapsed time per help request, and accuracy, the number of error-free user interventions. Taken together, these measures show how effectively the human's time is being used by the robots. We also report three subjective measures derived from a post-trial survey, as well as subjects' own written feedback about the system, to gain an understanding of their view of the strengths and weaknesses of our approach.

1) Procedure: Subjects in each condition were gender-balanced and had no significant difference in experience with robots or furniture assembly. To familiarize users with the robot's capabilities, we gave them a list of actions that might help the robots. During preliminary trials, subjects had problems when handing parts to the robot (called a hand-off), so we demonstrated this task and gave each user the opportunity to practice. The entire instruction period lasted less than five minutes, including the demonstration.

During the experiment, we instructed users to focus on a different assembly task and only help the robots when requested. For each subject, the robot team started from the same initial conditions, shown in Figure 7. Some failures were inevitable given the initial conditions (e.g., a table top turned upside down; a part on a table out of the robots' reach). Other failures happened naturally (e.g., a table leg that slipped out of a robot's gripper). When a failure occurred during assembly, the failing robot addressed the person by saying, "Excuse me," and generated and spoke a request for help through an on-board speaker, distinguishing itself by color if necessary. We projected all dialogue on a large screen to remove dependence on understanding synthesized speech. The human then intervened in the way they felt was appropriate.

Fig. 7. Initial configuration for the user study. The user is seated behind the whiteboard in the background.

After communicating a help request, the robots waited up to 60 seconds for the user to provide help. If the environment changed in a way that satisfied the request, the robot said, "Thank you, I'll take it from here," and we counted the person's intervention as successful. If the allotted time elapsed, the robot instead said, "Never mind, I'll take it from here," and moved on to a different part of the assembly process. These instances were recorded as failed interventions. For each intervention, we recorded the time elapsed and the number of actions the human took in attempting to solve the problem. Each trial ran for fifteen minutes.

Although we tried to limit experimenter intervention, there were several problems with the robotic assembly system that required expert assistance. Experimenters intervened when either of two situations arose: potential damage to the hardware (19 times), or an infinite loop in legacy software (15 times). In addition, software running on the robots crashed and needed to be restarted 5 times. In the future, we plan to address these issues using methods for directing requests to the person most likely to satisfy them, rather than only targeting requests at untrained users.

2) Results and Discussion: Over the course of the study, the robots made 102 help requests, of which 76 were satisfied.
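The paper does not state how the 95% confidence values in Table II were computed, but they are consistent with a normal-approximation binomial interval over the trial counts given in Section V-A (100 trials for the "help me" and hand-written conditions, 300 for the template and G3 conditions). A sketch that approximately reproduces them:

```python
import math

def binomial_ci95(p_hat, n):
    """Half-width of a 95% normal-approximation confidence interval
    for a proportion estimated from n Bernoulli trials."""
    return 1.96 * math.sqrt(p_hat * (1.0 - p_hat) / n)

# (success rate, number of trials) per condition, from Table II / Section V-A
conditions = [
    ("Help me baseline (S0)",     0.210, 100),
    ("Template baseline",         0.470, 300),
    ("G3 inverse semantics, S1",  0.523, 300),
    ("G3 inverse semantics, S2",  0.643, 300),
    ("Hand-written requests",     0.940, 100),
]
for name, p, n in conditions:
    print(f"{name}: {100 * p:.1f}% +/- {100 * binomial_ci95(p, n):.1f}")
```

The computed half-widths agree with Table II's ±8.0, ±5.7, ±5.7, ±5.4, and ±4.7 to within 0.1, which is the rounding resolution of the table.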
REFERENCES

[1] M. Bollini, S. Tellex, T. Thompson, N. Roy, and D. Rus. Interpreting and executing recipes with a cooking robot. In 13th International Symposium on Experimental Robotics, 2012.
[2] David L. Chen and Raymond J. Mooney. Learning to interpret natural language navigation instructions from observations. In Proc. AAAI, 2011.
[3] G. Dorais, R. Banasso, D. Kortenkamp, P. Pell, and D. Schreckenghost. Adjustable autonomy for human-centered autonomous systems on Mars, 1998.
[4] Anca Dragan and Siddhartha Srinivasa. Generating legible motion. In Robotics: Science and Systems, June 2013.
[5] J. Dzifcak, M. Scheutz, C. Baral, and P. Schermerhorn. What to do and how to do it: Translating natural language directives into temporal and dynamic logic representation for goal management and action execution. In Proc. IEEE Int'l Conf. on Robotics and Automation, pages 4163-4168, 2009.
[6] T. Fong, C. Thorpe, and C. Baur. Robot, asker of questions. Journal of Robotics and Autonomous Systems, 42:235-243, 2003.
[7] K. Garoufi and A. Koller. Combining symbolic and corpus-based approaches for the generation of successful referring expressions. In Proceedings of the 13th European Workshop on Natural Language Generation, pages 121-131. Association for Computational Linguistics, 2011.
[8] D. Golland, P. Liang, and D. Klein. A game-theoretic approach to generating spatial descriptions. In Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing, pages 410-419. Association for Computational Linguistics, 2010.
[9] Noah D. Goodman and Andreas Stuhlmüller. Knowledge and implicature: Modeling language understanding as social cognition. Topics in Cognitive Science, 5(1):173-184, 2013.
[10] Daniel Jurafsky and James H. Martin. Speech and Language Processing. Pearson Prentice Hall, 2nd edition, May 2008. ISBN 0131873210.
[11] R. A. Knepper, T. Layton, J. Romanishin, and D. Rus. IkeaBot: An autonomous multi-robot coordinated furniture assembly system. In Proc. IEEE Int'l Conf. on Robotics and Automation, Karlsruhe, Germany, May 2013.
[12] Thomas Kollar, Stefanie Tellex, Deb Roy, and Nicholas Roy. Toward understanding natural language directions. In Proc. ACM/IEEE Int'l Conf. on Human-Robot Interaction, pages 259-266, 2010.
[13] Emiel Krahmer and Kees Van Deemter. Computational generation of referring expressions: A survey. Computational Linguistics, 38(1):173-218, 2012.
[14] Matt MacMahon, Brian Stankiewicz, and Benjamin Kuipers. Walk the talk: Connecting language, knowledge, and action in route instructions. In Proc. Nat'l Conf. on Artificial Intelligence (AAAI), pages 1475-1482, 2006.
[15] J. Maitin-Shepard, J. Lei, M. Cusumano-Towner, and P. Abbeel. Cloth grasp point detection based on multiple-view geometric cues with application to robotic towel folding. In Proc. IEEE Int'l Conf. on Robotics and Automation, Anchorage, Alaska, USA, May 2010.
[16] C. Matuszek, N. FitzGerald, L. Zettlemoyer, L. Bo, and D. Fox. A joint model of language and perception for grounded attribute learning. arXiv preprint arXiv:1206.6423, 2012.
[17] Ehud Reiter and Robert Dale. Building Natural Language Generation Systems. Cambridge University Press, January 2000. ISBN 9780521620369.
[18] Stephanie Rosenthal, Manuela Veloso, and Anind K. Dey. Learning accuracy and availability of humans who help mobile robots. In Proc. AAAI, 2011.
[19] D. Roy. A trainable visually-grounded spoken language generation system. In Proceedings of the International Conference of Spoken Language Processing, 2002.
[20] R. Simmons, S. Singh, F. Heger, L. M. Hiatt, S. C. Koterba, N. Melchior, and B. P. Sellner. Human-robot teams for large-scale assembly. In Proceedings of the NASA Science Technology Conference, May 2007.
[21] K. Striegnitz, A. Denis, A. Gargett, K. Garoufi, A. Koller, and M. Theune. Report on the second second challenge on generating instructions in virtual environments (GIVE-2.5). In Proceedings of the 13th European Workshop on Natural Language Generation, pages 270-279. Association for Computational Linguistics, 2011.
[22] S. Tellex, T. Kollar, S. Dickerson, M. R. Walter, A. Banerjee, S. Teller, and N. Roy. Understanding natural language commands for robotic navigation and mobile manipulation. In Proc. AAAI, 2011.
[23] Adam Vogel, Max Bodoia, Christopher Potts, and Dan Jurafsky. Emergence of Gricean maxims from multi-agent decision theory. In Proceedings of NAACL 2013, 2013.
[24] Adam Vogel, Christopher Potts, and Dan Jurafsky. Implicatures and nested beliefs in approximate Decentralized-POMDPs. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, Sofia, Bulgaria, August 2013. Association for Computational Linguistics.
[25] R. H. Wilson. Minimizing user queries in interactive assembly planning. IEEE Transactions on Robotics and Automation, 11(2), April 1995.