Expert-Novice Interaction: Annotation and Analysis

Soumia Dermouche, Catherine Pelachaud
CNRS UMR 7222, ISIR, Sorbonne Universités
75005 Paris, France
{soumia.dermouche, catherine.pelachaud}@upmc.fr

Abstract
In this demonstration, we present the NoXi corpus of expert-novice interactions, together with our annotations and analysis. To analyze the data, we apply HCApriori, a temporal sequence mining algorithm, to extract relevant behavior sequences for both expert and novice. NoXi provides over 25 hours of dyadic interactions recorded in different languages, mainly English, French, and German. The annotation tool, NOVA, developed by Baur et al. (2015), allows annotating data using discrete and continuous schemas. We use NOVA to manually annotate non-verbal behaviors (discrete annotation) and engagement levels (continuous annotation).

Keywords: Non-verbal behavior; Engagement; Sequence Mining; Virtual Agent

1. Introduction
This work is part of the H2020 project ARIA-VALUSPA (Artificial Retrieval of Information Assistants - Virtual Agents with Linguistic Understanding, Social skills and Personalized Aspects). In this project, a corpus of dyadic interactions, named NoXi, has been collected (Cafaro et al., 2017). NoXi is available to the research community from the website https://noxi.aria-agent.eu/. During the interaction, participants exchanged through a large screen placed in different rooms. One participant assumes the role of an expert on a given topic, and the other the role of a novice on this topic. NoXi is composed of 84 sessions recorded in three different countries (France, Germany and the UK) and covering 58 topics such as video games, sports and cooking. In the following sections, we describe our coding scheme for the NoXi annotation and how we use the NOVA tool. We have manually annotated several non-verbal behaviors as well as the engagement levels of both expert and novice. Sequence mining then allowed us to discover relevant patterns for the different engagement levels.

2. Annotation
NOVA (https://github.com/hcmlab/nova) is an open-source annotation tool developed by Baur et al. (2015) that we use to annotate the NoXi corpus. NOVA overcomes the limitations of existing annotation tools by handling richer data, such as face streams or skeleton data, and by offering two annotation schemas at a time: discrete and continuous. Moreover, NOVA is a collaborative platform in which annotators from different sites can combine and share their annotations. A discrete annotation schema can be used to label behaviors that can be classified into a set of categories (e.g. gaze direction). Discrete annotation characterizes the starting and ending times of behaviors. On the other hand, a continuous scale can be more appropriate for describing continuous dimensions, such as engagement, which is expressed all along the interaction. Figure 1 shows one session of NoXi viewed with NOVA. Audio-visual as well as skeleton and face streams of both expert and novice are opened. Using NOVA, continuous and discrete annotations can be visualized at the same time.

In this work, we use NOVA to annotate the French part of the NoXi database, which is composed of 21 sessions. The total duration of all these sessions is 7 hours and 25 minutes. We use a discrete annotation schema to label six non-verbal behavior types: head direction and movement, smile, eyebrow movement, gesture and hand rest positions. A continuous scale is adopted for the engagement annotation. In order to avoid content biases from the verbal stream and prosody when annotating engagement, we filtered them out for both expert and novice. Following Yannakakis et al. (2017), who recommend ordinal annotation for affect modeling, we annotate engagement over five levels: strongly disengaged, partially disengaged, neutral, partially engaged and strongly engaged. One evaluator was asked to rate the engagement level of expert and novice over these levels. Table 1 summarizes the manual annotations realized so far. These annotations have been realized by three evaluators: one for the engagement annotation, one for the gesture annotation, and the last one for the remaining annotations. For each modality, we indicate the label of the annotated signals, the number of annotated sessions, their duration, and the number of annotations for expert and novice.
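As a rough illustration of the two annotation types described above, discrete tracks carry labeled segments with explicit start/end times, while continuous tracks carry a densely sampled score. The sketch below uses our own illustrative data structures and the five ordinal engagement levels mapped to -2..+2; it is not NOVA's actual file format.

```python
from dataclasses import dataclass

@dataclass
class DiscreteAnnotation:
    """A labeled behavior segment with explicit start/end times (seconds)."""
    label: str      # e.g. "Smile", "Nod"
    start: float
    end: float

@dataclass
class ContinuousAnnotation:
    """A sampled score along the interaction, e.g. engagement."""
    times: list     # sample timestamps (seconds)
    values: list    # one score per timestamp, here -2 (strongly
                    # disengaged) .. +2 (strongly engaged)

# Hypothetical example: a smile from 3.2 s to 5.0 s, and engagement
# sampled once per second.
smile = DiscreteAnnotation("Smile", 3.2, 5.0)
engagement = ContinuousAnnotation(times=[0, 1, 2, 3, 4],
                                  values=[0, 1, 1, 2, 2])

def level_at(track: ContinuousAnnotation, t: float) -> int:
    """Return the most recent engagement sample at or before time t."""
    current = track.values[0]
    for ts, v in zip(track.times, track.values):
        if ts <= t:
            current = v
    return current

print(level_at(engagement, 3.5))  # engagement level while the smile occurs
```

Aligning discrete behavior segments with the continuous engagement track in this way is the basis for relating non-verbal behaviors to engagement levels in the analysis of Section 3.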
3. HCApriori Algorithm
Human behaviors are naturally multimodal. Human states, attitudes, engagement levels, etc. can be displayed through sequences of behaviors (Burgoon and Dunbar, 2006). In order to extract meaningful multimodal sequences from NoXi, we rely on HCApriori, a temporal sequence mining algorithm (Dermouche and Pelachaud, 2016). This algorithm aims at finding frequent patterns (frequent sub-sequences) hidden in a set of sequences. HCApriori takes as input: the sequence dataset; a minimum frequency threshold (fmin), such that only patterns whose frequency reaches this threshold are considered frequent; a dissimilarity measure, such as City Block; and a dissimilarity threshold. HCApriori operates in two steps: (1) hierarchical clustering, in which signals are grouped into the same cluster if and only if their temporal distance is below the dissimilarity threshold. The temporal distance between two signals is evaluated using a dissimilarity measure such as City Block. At the end of this step, each cluster centroid represents a pattern of length one.

Figure 1: A screenshot of the NOVA interface: videos of expert and novice, the expert's skeleton and the novice's face tracking (top). Discrete and continuous annotation tracks are shown (bottom).

Table 1: Number of manual annotations of each non-verbal modality for expert and novice.
Modality                     | Label of annotated signals                                                      | Expert | Novice
Head direction and movements | Nod, Shake, Forward, Back, Up, Down, Side, Tilt                                 |     72 |    337
Smiles                       | Smile                                                                           |    153 |    157
Eyebrow movements            | Frown and Raised                                                                |    147 |     44
Gestures                     | Iconics, Metaphorics, Deictics, Beats, and Adaptors                             |   1223 |    293
Hand rest positions          | Arms crossed, Hands together, Hands in pockets, Hands behind back, and Akimbo   |   1317 |    612
Engagement                   | Strongly disengaged, Partially disengaged, Neutral, Partially engaged, Strongly engaged | 1481 | 1679

(2) Taking the results of the clustering step as input, an Apriori-like procedure is applied to generate longer temporal patterns.

For the NoXi analysis using HCApriori, we can, for example, explore the relationships between non-verbal behavior and perceived engagement. For this purpose, we prepared the input dataset of HCApriori by collecting all the sequences of non-verbal behaviors that appear during a given engagement level. Table 2 presents the number of sequences we obtained for each engagement level, for expert and for novice. We then applied HCApriori to extract temporal patterns of non-verbal signals expressing the five engagement levels.

Our demo will consist of a presentation of the data collection and experimental setup of NoXi, as well as of the annotation tool used for the manual annotation of the various behaviors. It will also present, based on HCApriori, the data analysis and the investigation of the sequential behaviors of both expert and novice.
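The two HCApriori steps can be illustrated with a much-simplified toy sketch (our own code, not the authors' implementation): a greedy 1-D grouping of signal onset times stands in for the hierarchical clustering of step (1), using the City Block (absolute-difference) distance against a threshold eps, and step (2) extends surviving one-signal patterns Apriori-style, keeping a candidate only if it occurs in at least fmin sequences. All function names and the toy data are ours.

```python
from itertools import product

def cluster_onsets(onsets, eps):
    """Step (1), simplified: group onset times whose City Block (absolute)
    distance to the running cluster centroid is below eps; each resulting
    centroid stands for a frequent pattern of length one."""
    clusters = []
    for t in sorted(onsets):
        for c in clusters:
            centroid = sum(c) / len(c)
            if abs(t - centroid) < eps:
                c.append(t)
                break
        else:
            clusters.append([t])
    return [sum(c) / len(c) for c in clusters]

def frequent_patterns(sequences, fmin, max_len=2):
    """Step (2), simplified: Apriori-style extension over symbolic
    sequences, keeping a candidate sub-sequence only if it occurs in at
    least fmin of the input sequences."""
    def support(pat):
        def occurs(seq):
            i = 0
            for s in seq:           # order-preserving sub-sequence match
                if i < len(pat) and s == pat[i]:
                    i += 1
            return i == len(pat)
        return sum(occurs(seq) for seq in sequences)

    symbols = sorted({s for seq in sequences for s in seq})
    patterns = [(s,) for s in symbols if support((s,)) >= fmin]
    result = list(patterns)
    while patterns and len(patterns[0]) < max_len:
        patterns = [p + (s,) for p, s in product(patterns, symbols)
                    if support(p + (s,)) >= fmin]
        result.extend(patterns)
    return result

# Toy data: each sequence lists behaviors observed during one engagement
# segment, e.g. as collected when preparing the HCApriori input dataset.
seqs = [["Smile", "Nod"], ["Smile", "Nod", "Frown"], ["Smile"]]
print(frequent_patterns(seqs, fmin=2))
```

Note that this sketch ignores the temporal extent of the signals; the actual algorithm clusters signals by their temporal distance before the Apriori-like extension, so the mined patterns carry timing information as well as order.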
Table 2: Number of sequences of each engagement level for both expert and novice.

       | Level 1 | Level 2 | Level 3 | Level 4 | Level 5 | Total
Expert |      48 |     373 |     373 |     561 |     126 |  1481
Novice |     116 |     432 |     509 |     558 |      64 |  1679

4. Acknowledgements
Funded by the European Union's Horizon 2020 research and innovation programme, grant agreement No 645378.

5. Bibliographical References
Baur, T., Mehlmann, G., Damian, I., Lingenfelser, F., Wagner, J., Lugrin, B., André, E., and Gebhard, P. (2015). Context-Aware Automated Analysis and Annotation of Social Human-Agent Interactions. ACM Transactions on Interactive Intelligent Systems, 5(2):1-33.
Burgoon, J. K. and Dunbar, N. E. (2006). Nonverbal expressions of dominance and power in human relationships. The SAGE Handbook of Nonverbal Communication, pages 279-298.
Cafaro, A., Wagner, J., Baur, T., Dermouche, S., Torres, M. T., Pelachaud, C., André, E., and Valstar, M. (2017). The NoXi Database: Multimodal Recordings of Mediated Novice-Expert Interactions. In ICMI '17, pages 350-359, Glasgow, Scotland. ACM.
Dermouche, S. and Pelachaud, C. (2016). Sequence-based multimodal behavior modeling for social agents. In Proceedings of the 18th ACM International Conference on Multimodal Interaction - ICMI 2016, pages 29-36, Tokyo, Japan. ACM.
Yannakakis, G. N., Cowie, R., and Busso, C. (2017). The Ordinal Nature of Emotions. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pages 248-255.