     Sound | Meaning | Examples etc.
a1 | schwa | neutral | uu, un
a2 | /a/ | reticence, stalling | aa, sa, ma; minimal pair: aa-so vs. so
a3 | /e/ | agreement, sympathy, orientation to emotional content | ee, ne, he
a4 | /o/ | receipt of new information (Heritage 1984), orientation to factual content | oo, hoh, ooun
b1 | nasalization | agreement | un, haan, eeen, hhnh, huun
b2 | /m/ | contemplation | um, hm, maa
b3 | lip rounding | weightiness of information received | ow
b4 | /j/ | incipient desire to take a turn, desire to discourage the other from talking (Drummond & Hopper 1993) | turn-initial ya, ja-nai-ya of self-correction, topic-closing maa-ii-ya
c1 | breathiness and /h/ | deference, politeness | uh, huh, hee, hmm, ah
c2 | vocal fry | boredom, lack of involvement |
d1 | pitch height | degree of interest (Shinozaki & Abe 1997) |
d2 | rising pitch | incomplete understanding; invitations to continue |
d3 | falling pitch | complete understanding; closure (Kawamori et al. 1995) |
e1 | duration | amount of thought |
e2 | number of syllables | lack of anything to add (Gardner 1997) | ununununun, sososo
e3 | loudness | self-confidence, importance of utterance |
e4 | abrupt end (sharp energy drop) | coldness, formality, haste |
f1 | timing governed by speaker's cues | passivity |
f2 | delayed timing | reticence, thinking |
f3 | frequency of grunts | interest, attentiveness |

Table 2: Some hypothesized correspondences between the component sounds of Japanese grunts and their meanings. Spellings of examples are traditional rather than phonetically accurate, especially in the cases of u, h and w.
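The compositional reading implied by Table 2 can be encoded as a small lookup. The sketch below is purely illustrative: the component names and meaning glosses follow Table 2, but the input strength values and the proportional-weighting rule (anticipating Hypothesis D, that meaning strength tracks acoustic strength) are hypothetical, not part of any implemented system.

```python
# Sketch: reading off grunt meanings from estimated component strengths.
# Glosses follow Table 2; input strengths (0..1) are invented for illustration.

TABLE2 = {
    "schwa":        "neutral",
    "/a/":          "reticence, stalling",
    "/e/":          "agreement, sympathy, orientation to emotional content",
    "/o/":          "receipt of new information, orientation to factual content",
    "nasalization": "agreement",
    "/m/":          "contemplation",
    "lip rounding": "weightiness of information received",
    "breathiness":  "deference, politeness",
    "vocal fry":    "boredom, lack of involvement",
}

def interpret(strengths):
    """Map acoustic-component strengths (0..1) to weighted meaning
    components, strongest first, weighting each meaning in proportion
    to its acoustic strength (Hypothesis D)."""
    readings = [(TABLE2[c], s) for c, s in strengths.items()
                if c in TABLE2 and s > 0.0]
    return sorted(readings, key=lambda r: r[1], reverse=True)

# A hypothetical "un" with strong nasalization over a neutral vowel:
print(interpret({"nasalization": 0.8, "schwa": 0.4}))
# -> [('agreement', 0.8), ('neutral', 0.4)]
```

A real system would of course have to estimate the strength values from the signal itself, and, as discussed below, combine many superimposed components rather than pick a single winner.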
     Sound | Meaning | Examples etc.
a5 | /i/ | other-directedness (vs. talking to oneself) | minimal pairs: hai vs. ha, turn-initial iya vs. ya, dai vs. da
b5 | /s/ | independent judgement (vs. passively believing the other) | minimal pairs: saa vs. aa, soo vs. oo
b6 | /r/ | immediate action required, or orientation to an extra-linguistic, real-world happening | ara, kora, are of surprise
b7 | /t/ | willingness to control the conversation | minimal pairs: eeto vs. ee, hon-too vs. hon

Table 3: Hypothesized non-productive correspondences between the component sounds of Japanese near-grunts and their meanings.

meanings of each component (Kawamori et al. 1995; Takubo & Kinsui 1997). This means that grunts are `iconic', or, in other words, involve `sound symbolism'. Sound symbolism in grunts appears to be distinct from the onomatopoeic and mimetic systems of sound symbolism.

Hypothesis D: The strength of an acoustic component in a grunt corresponds directly to the strength of the corresponding component of meaning.

commands. Recently there have been several proposals for systems that listen to how the user responds and adjust the playback rate or content accordingly (Iwase 1998). These systems use prosodic information or standard phoneme-based word recognition. By extracting the meaning of back-channel grunts from the user, using the correspondences outlined above, it should be possible to better understand and adapt output to the user's needs.

As noted above, grunts are acoustically different from words in showing less temporal variation. This means that it may be possible to recognize grunts even when they overlap with playback by the system, even over telephone channels without perfect echo cancellation. This also means that grunt understanding should probably be done with spectral features computed over wider analysis windows than those used for word recognition, perhaps 50 or 100 milliseconds. The problem of grunt understanding differs from that of word recognition also in that there is no sequence of phonemes to recognize; rather, the various strengths of many superimposed acoustic components must be computed. We are currently working on this problem.

The mapping from sound to meaning will not be invariant, but will depend on context. For example, if the previous utterance was sugoi `wonderful', an un `mm' will seem lukewarm, but if the previous `utterance' was a silent stare, the same un can sound warm and friendly. Even in limited domains, where variations of context are not extreme, compensation for inter-speaker differences in the frequency of use of the various components of grunts will need to be taken into account.

Once the meaning of a grunt has been understood, the question of how the system should respond to it remains. One problem may be the integration of interaction at grunt-based (interpersonal and attitudinal) levels with interaction at the meaning level. It is possible to imagine implementing these as semi-independent response pathways, integrated with a subsumption architecture (Ward 1997).

References

Drummond, Kent & Robert Hopper (1993). Back Channels Revisited: Acknowledgment Tokens and Speakership Incipiency. Research on Language and Social Interaction, 26:157-177.

Gardner, Rod (1997). The Conversation Object Mm: A Weak and Variable Acknowledging Token. Research on Language and Social Interaction, 30:131-156.

Heritage, John (1984). A Change-of-State Token and Aspects of its Sequential Placement. In J. Maxwell Atkinson & John Heritage, editors, Structures of Social Action: Studies in Conversation Analysis, pp. 299-345. Cambridge University Press.

Iwase, Tatsuya (1998). Yuza ni awaseta Taiwa Peesu no Chosetsu (Adjusting the Pace of Conversation to Suit the User). In Proceedings of the 4th Annual Meeting of the (Japanese) Association for Natural Language Processing, pp. 472-475.

Kawamori, Masahito, Takeshi Kawabata, & Akira Shimazu (1995). A Phonological Study on Japanese Discourse Markers. In 9th Spoken Language Processing Workshop Notes (SIG-SLP-9), pp. 13-20. Information Processing Society of Japan.

Okada, Misao (1996). How the Length and Pitch of Aizuti `Back-channel Utterances' and the Nature of the Speech Activity Determine Preference Structure in Japanese. In Berkeley Linguistics Society, Proceedings of the Twenty-Second Annual Meeting, pp. 279-289.

Shinozaki, Tubasa & Masanobu Abe (1997). Kisoku Gosei Onsei de Yakudokan o Jitsugen suru Horyaku ni tsuite (A Strategy for Realizing Live Interaction with Synthesized Speech). In 17th Spoken Language Processing Workshop Notes (SIG-SLP-17), pp. 81-88. Information Processing Society of Japan.

Takubo, Yukinori & Satoshi Kinsui (1997). Otoshi, Kandoshi no Danwateki Kino (The Conversational Functions of Responses and Exclamations). In Bunpo to Onsei (Speech and Grammar), pp. 257-279. Kuroshio, Tokyo.

Tsukahara, Wataru (1998). Purosodi oyobi Bunmyaku Joho o Mochiita Oto no Sentaku/Chosetsu no Kokoromi (Selecting and Adapting Confirmations in Response to Prosodic Indications and Contextual Factors). In Proceedings of the 4th Annual Meeting of the (Japanese) Association for Natural Language Processing, pp. 468-471.

Ward, Nigel (1996). Using Prosodic Clues to Decide When to Produce Back-channel Utterances. In International Conference on Spoken Language Processing, pp. 1728-1731.

Ward, Nigel (1997). Responsiveness in Dialog and Priorities for Language Research. Systems and Cybernetics, 28(6):521-533.

Ward, Nigel & Wataru Tsukahara (1998). Prosodic Features which Cue Back-Channel Responses in English and Japanese. Manuscript.

White, Sheida (1989). Backchannels across cultures: A study of Americans and Japanese. Language in Society, 18:59-76.