Embodied Construction Grammar



Presentation Transcript

Slide 1

Embodied Construction Grammar (ECG): Formalizing Cognitive Linguistics

Community Grammar and Core Concepts

Deep Grammatical Analysis

Computational Implementation

Test Grammars

Applied Projects – Question Answering

Map to Connectionist Models, Brain

Models of Grammar Acquisition

Slide 2

Simulation specification

The analysis process produces a simulation specification that:

includes image-schematic, motor control and conceptual structures

provides parameters for a mental simulation

Slide 3

Summary: ECG

Linguistic constructions are tied to a model of simulated action and perception

Embedded in a theory of language processing

Constrains theory to be usable

Basis for models of grammar learning

Precise, computationally usable formalism

Practical computational applications, like MT and NLU

Testing of functionality, e.g. language learning

A shared theory and formalism for different cognitive mechanisms

Constructions, metaphor, mental spaces, etc.

Reduction to Connectionist and Neural levels

Slide 4

Constrained Best Fit in Nature

inanimate:

physics: lowest energy state

chemistry: molecular fit

animate:

biology: fitness, MEU

neuroeconomics

vision: threats, friends

language: errors, NTL

society, politics: framing, compromise

Slide 5

Competition-based analyzer

An analysis is made up of:

A constructional tree

A semantic specification

A set of resolutions

Example: "Bill gave Mary the book"

Constructional tree: A-GIVE-B-X with constituents subj (Ref-Exp "Bill"), v (Give), obj1 (Ref-Exp "Mary"), obj2 (Ref-Exp "the book")

Semantic specification: Give-Action schema with giver bound to @Man (Bill), recipient to @Woman (Mary), theme to @Book (the book)

Johno Bryant

Slide 6

Combined score determines best-fit

Syntactic Fit:

Constituency relations

Combine with preferences on non-local elements

Conditioned on syntactic context

Antecedent Fit:

Ability to find referents in the context

Conditioned on syntax match, feature agreement

Semantic Fit:

Semantic bindings for frame roles

Frame roles' fillers are scored
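To make the combination concrete, here is a minimal Python sketch of how the three fit scores might be combined into one best-fit score. It is only an illustration, not the actual analyzer; the class and function names are hypothetical.

```python
# Hypothetical sketch: combining syntactic, antecedent, and semantic fit
# into a single best-fit score for a candidate analysis.

from dataclasses import dataclass

@dataclass
class Analysis:
    syntactic_fit: float    # constituency relations and local syntactic context
    antecedent_fit: float   # how well omitted/pronominal arguments resolve in context
    semantic_fit: float     # how well role fillers satisfy frame expectations

def combined_score(a: Analysis) -> float:
    # Treating the three fits as probabilities and multiplying them means
    # one very poor component can rule out an otherwise plausible analysis.
    return a.syntactic_fit * a.antecedent_fit * a.semantic_fit

def best_fit(candidates: list[Analysis]) -> Analysis:
    # Competition: the analysis with the highest combined score wins.
    return max(candidates, key=combined_score)
```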

Slide 7

Example parse: 0 Eve 1 walked 2 into 3 the 4 house 5

Constructs:
NPVP[0] (0,5)
Eve[3] (0,1)
ActiveSelfMotionPath[2] (1,5)
WalkedVerb[57] (1,2)
SpatialPP[56] (2,5)
Into[174] (2,3)
DetNoun[173] (3,5)
The[204] (3,4)
House[205] (4,5)

Schema Instances:
SelfMotionPathEvent[1]
HouseSchema[66]
WalkAction[60]
Person[4]
SPG[58]
RD[177] ~ house
RD[5] ~ Eve

Slide 8

Unification chains and their fillers

SelfMotionPathEvent[1].mover, SPG[58].trajector, WalkAction[60].walker, RD[5].resolved-ref, RD[5].category
Filler: Person[4]

SpatialPP[56].m, Into[174].m, SelfMotionPathEvent[1].spg
Filler: SPG[58]

SelfMotionPathEvent[1].landmark, House[205].m, RD[177].category, SPG[58].landmark
Filler: HouseSchema[66]

WalkedVerb[57].m, WalkAction[60].routine, WalkAction[60].gait, SelfMotionPathEvent[1].motion
Filler: WalkAction[60]
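One way to picture these chains: each chain is a set of role slots that must share a single filler, which is what a union-find structure captures. The sketch below is only an illustration under that assumption (the slides do not show the analyzer's actual data structures); slot and filler names follow the example above.

```python
# Illustrative sketch: a unification chain groups role slots from different
# schema instances that must share one filler.

class UnificationChains:
    def __init__(self):
        self.parent = {}   # slot -> representative slot
        self.filler = {}   # representative slot -> filler value

    def _find(self, slot):
        self.parent.setdefault(slot, slot)
        while self.parent[slot] != slot:
            self.parent[slot] = self.parent[self.parent[slot]]
            slot = self.parent[slot]
        return slot

    def unify(self, a, b):
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self.parent[rb] = ra

    def set_filler(self, slot, value):
        self.filler[self._find(slot)] = value

    def get_filler(self, slot):
        return self.filler.get(self._find(slot))

chains = UnificationChains()
chains.unify(("SelfMotionPathEvent[1]", "mover"), ("SPG[58]", "trajector"))
chains.unify(("SPG[58]", "trajector"), ("WalkAction[60]", "walker"))
chains.set_filler(("WalkAction[60]", "walker"), "Person[4]")
assert chains.get_filler(("SelfMotionPathEvent[1]", "mover")) == "Person[4]"
```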

Slide 9

Productive Argument Omission (Mandarin)

Johno Bryant & Eva Mok

CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)

1. ma1+ma gei3 ni3 zhei4+ge
   mother give 2PS this+CL
   Mother (I) give you this (a toy).

2. ni3 gei3 yi2
   2PS give auntie
   You give auntie [the peach].

3. ao ni3 gei3 ya
   EMP 2PS give EMP
   Oh (go on)! You give [auntie] [that].

4. gei3
   give
   [I] give [you] [some peach].

Slide 10

Arguments are omitted with different probabilities

All args omitted: 30.6%

No args omitted: 6.1%

Slide 11

Analyzing ni3 gei3 yi2 (You give auntie)

Syntactic Fit:

P(Theme omitted | ditransitive cxn) = 0.65

P(Recipient omitted | ditransitive cxn) = 0.42

Two of the competing analyses:

Analysis 1: ni3 -> Giver, gei3 -> Transfer, yi2 -> Recipient, Theme omitted
Score: (1-0.78) * (1-0.42) * 0.65 = 0.08

Analysis 2: ni3 -> Giver, gei3 -> Transfer, Recipient omitted, yi2 -> Theme
Score: (1-0.78) * (1-0.65) * 0.42 = 0.03
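The two scores can be reproduced directly from the quoted omission probabilities. The sketch below assumes the Giver value of 0.78 shown on a later slide and treats the constituents as independent; the function and variable names are illustrative, not part of the actual analyzer.

```python
# Hypothetical sketch of the syntactic-fit contribution from argument omission.
P_OMITTED = {"Giver": 0.78, "Recipient": 0.42, "Theme": 0.65}

def omission_score(omitted_roles: set[str]) -> float:
    """Multiply P(omitted) for absent roles and 1 - P(omitted) for expressed ones."""
    score = 1.0
    for role, p in P_OMITTED.items():
        score *= p if role in omitted_roles else (1.0 - p)
    return score

# Analysis 1: yi2 fills Recipient, Theme omitted
print(round(omission_score({"Theme"}), 2))       # 0.08
# Analysis 2: yi2 fills Theme, Recipient omitted
print(round(omission_score({"Recipient"}), 2))   # 0.03
```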

Slide 12

Using frame and lexical information to restrict type of reference

Lexical Unit gei3:
Giver (DNI)
Recipient (DNI)
Theme (DNI)

The Transfer Frame:
Giver
Recipient
Theme
Manner
Means
Place
Purpose
Reason
Time
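A minimal sketch of how this frame and its DNI-able core roles might be represented, assuming hypothetical class names (this is not FrameNet's or the ECG system's actual API):

```python
# Illustrative sketch: a frame with core roles that the lexical unit gei3
# allows to be definitely null instantiated (DNI), plus peripheral roles.

from dataclasses import dataclass, field

@dataclass
class Frame:
    name: str
    core_roles: list[str]
    peripheral_roles: list[str] = field(default_factory=list)

TRANSFER = Frame(
    name="Transfer",
    core_roles=["Giver", "Recipient", "Theme"],
    peripheral_roles=["Manner", "Means", "Place", "Purpose", "Reason", "Time"],
)

# For gei3, each unexpressed core role is marked DNI: the analyzer must
# recover a definite referent for it from the discourse/situational context.
GEI3_NULL_INSTANTIATION = {role: "DNI" for role in TRANSFER.core_roles}
```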

Slide 13

Can the omitted argument be recovered from context?

Antecedent Fit:

Competing analyses: ni3 -> Giver, gei3 -> Transfer, and either yi2 -> Recipient (Theme omitted) or yi2 -> Theme (Recipient omitted)

Discourse & Situational Context: child, mother, peach, auntie, table

Which contextual entity can fill the omitted role?

Slide 14

How good of a theme is a peach? How about an aunt?

The Transfer Frame

Giver (usually animate)

Recipient (usually animate)

Theme (usually inanimate)

Semantic Fit: the fillers of each competing analysis are scored against these role expectations: ni3 -> Giver, gei3 -> Transfer, with either yi2 -> Recipient (Theme omitted) or yi2 -> Theme (Recipient omitted).

Slide 15

The argument omission patterns shown earlier can be covered with just ONE construction

Each construction is annotated with probabilities of omission (see the sketch below)

Language-specific default probability can be set

Subj -> Giver, P(omitted|cxn) = 0.78
Verb -> Transfer
Obj1 -> Recipient, P(omitted|cxn) = 0.42
Obj2 -> Theme, P(omitted|cxn) = 0.65
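A sketch of what such an annotated construction could look like as data, with a language-specific default for constituents that carry no explicit probability. The names are hypothetical; the probabilities are the ones on this slide.

```python
# Hypothetical representation of a ditransitive construction whose
# constituents carry omission probabilities.

from dataclasses import dataclass

LANGUAGE_DEFAULT_P_OMITTED = 0.0   # e.g. near 0 for English, higher for Mandarin

@dataclass
class Constituent:
    form: str             # e.g. "Subj"
    role: str             # frame role it fills, e.g. "Giver"
    p_omitted: float = LANGUAGE_DEFAULT_P_OMITTED

DITRANSITIVE_GEI3 = [
    Constituent("Subj", "Giver", 0.78),
    Constituent("Verb", "Transfer"),       # the verb itself is not omissible here
    Constituent("Obj1", "Recipient", 0.42),
    Constituent("Obj2", "Theme", 0.65),
]
```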

Slide 16

Leverage processing to simplify representation

The processing model is complementary to the theory of grammar

By using a competition-based analysis process, we can:

Find the best-fit analysis with respect to constituency structure, context, and semantics

Eliminate the need to enumerate allowable patterns of argument omission in grammar

This is currently being applied in models of language understanding and grammar learning.

Slide 17

Best-fit example with theme omitted

Example 2: ni3 gei3 yi2 (2PS give auntie), "You give auntie [the peach]."

For each constituent the analyzer asks: local? omitted?

Subj -> Giver: local (ni3)
Verb -> Transfer: local (gei3)
Obj1 -> Recipient: local (yi2)
Obj2 -> Theme: omitted

Slide 18

Lexical Unit gei3: Giver, Recipient, Theme

How to recover the omitted argument, in this case the peach?

The Transfer Frame: Giver (DNI), Recipient (DNI), Theme (DNI); Manner, Means, Place, Purpose, Reason, Time

Discourse & Situational Context: child, mother, auntie, peach, table

Omitted Obj2 -> Theme
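A rough sketch of recovering a DNI role from the situational context by semantic fit, assuming a toy animacy-based preference (the real model scores fillers more richly; all names here are illustrative):

```python
# Illustrative sketch: resolve a DNI role by picking the accessible context
# entity that best fits the role's expectations, e.g. the Theme of Transfer
# prefers inanimates.

CONTEXT = {          # entity -> animacy, from the situational context model
    "child": "animate", "mother": "animate", "auntie": "animate",
    "peach": "inanimate", "table": "inanimate",
}

ROLE_PREFERS = {"Giver": "animate", "Recipient": "animate", "Theme": "inanimate"}

def resolve_dni(role: str, already_bound: set[str]) -> str:
    candidates = [e for e in CONTEXT if e not in already_bound]
    # Prefer entities whose animacy matches the role; recency and salience
    # would also matter in a fuller model but are ignored in this sketch.
    return max(candidates, key=lambda e: CONTEXT[e] == ROLE_PREFERS[role])

print(resolve_dni("Theme", already_bound={"child", "auntie"}))   # "peach"
```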

Slide 19

Best-fit example with recipient and theme omitted

Example 3: ao ni3 gei3 ya (EMP 2PS give EMP), "Oh (go on)! You give [auntie] [that]."

Subj -> Giver: local (ni3)
Verb -> Transfer: local (gei3)
Obj1 -> Recipient: omitted
Obj2 -> Theme: omitted

Slide 20

Lexical Unit gei3: Giver, Recipient, Theme

How to recover the omitted arguments, in this case the aunt and the peach?

The Transfer Frame: Giver (DNI), Recipient (DNI), Theme (DNI); Manner, Means, Place, Purpose, Reason, Time

Discourse & Situational Context: child, mother, auntie, peach, table

Omitted Obj1 -> Recipient; omitted Obj2 -> Theme

Slide 21

Modeling context for language understanding and learning

Linguistic structure reflects experiential structure

Discourse participants and entities

Embodied schemas: action, perception, emotion, attention, perspective

Semantic and pragmatic relations: spatial, social, ontological, causal

'Contextual bootstrapping' for grammar learning

Slide 22

The context model tracks accessible entities, events, and utterances

Discourse & Situational Context

Discourse01
participants: Eve, Mother
objects: Hands, ...
discourse-history: DS01
situational-history: Wash-Action

Slide 23

Each of the items in the context model has rich internal structure

Situational History: Wash-Action
  washer: Eve
  washee: Hands

Discourse History: DS01
  speaker: Mother
  addressee: Eve
  attentional-focus: Hands
  content: {"are they clean yet?"}
  speech-act: question

Participants:
  Eve: category: child, gender: female, name: Eve, age: 2
  Mother: category: parent, gender: female, name: Eve, age: 33

Objects:
  Hands: category: BodyPart, part-of: Eve, number: plural, accessibility: accessible
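As an illustration only, the same example could be carried by data structures along these lines; the class and field names are hypothetical rather than the system's actual representation:

```python
# Illustrative sketch of the context model's internal structure.

from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    category: str
    attributes: dict = field(default_factory=dict)   # e.g. gender, age, part-of

@dataclass
class Event:
    kind: str      # e.g. "Wash-Action"
    roles: dict    # e.g. {"washer": "Eve", "washee": "Hands"}

@dataclass
class DiscourseSegment:
    speaker: str
    addressee: str
    attentional_focus: str
    content: str
    speech_act: str

@dataclass
class ContextModel:
    participants: list[Entity]
    objects: list[Entity]
    discourse_history: list[DiscourseSegment]
    situational_history: list[Event]

context = ContextModel(
    participants=[Entity("Eve", "child", {"gender": "female", "age": 2}),
                  Entity("Mother", "parent", {"gender": "female", "age": 33})],
    objects=[Entity("Hands", "BodyPart", {"part-of": "Eve", "number": "plural"})],
    discourse_history=[DiscourseSegment("Mother", "Eve", "Hands",
                                        "are they clean yet?", "question")],
    situational_history=[Event("Wash-Action", {"washer": "Eve", "washee": "Hands"})],
)
```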

Slide 24

Analysis produces a semantic specification

Utterance + Linguistic Knowledge + Discourse & Situational Context + World Knowledge -> Analysis -> Semantic Specification

"You washed them" -> WASH-ACTION, washer: Eve, washee: Hands

Slide 25

How Can Children Be So Good At Learning Language?

Gold’s Theorem:

No superfinite class of languages is identifiable in the limit from positive data only

Principles & Parameters

Babies are born as blank slates but acquire language quickly (with noisy input and little correction) → Language must be innate:

Universal Grammar + parameter setting

But babies aren’t born as blank slates!

And they do not learn language in a vacuum!

Slide 26

Key ideas for a Neural Theory of language acquisition

Nancy Chang and Eva Mok

Embodied Construction Grammar

Opulence of the Substrate

Prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge

Basic Scenes

Simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995, Slobin 1985)

Verb Island Hypothesis

Children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Verb Island Hypothesis, Tomasello 1992)

Slide 27

Embodiment and Grammar Learning

Paradigm problem for Nature vs. Nurture

The poverty of the stimulus

The opulence of the substrate

Intricate interplay of genetic and environmental, including social, factors.

Slide 28

Two perspectives on grammar learning

Computational models

Grammatical induction

language identification

context-free grammars, unification grammars

statistical NLP (parsing, etc.)

Word learning models

semantic representations

logical forms

discrete representations

continuous representations

statistical models

Developmental evidence

Prior knowledge: primitive concepts, event-based knowledge, social cognition, lexical items

Data-driven learning: basic scenes, lexically specific patterns, usage-based learning

Slide 29

Key assumptions for language acquisition

Significant prior conceptual/embodied knowledge

rich sensorimotor/social substrate

Incremental learning based on experience

Lexically specific constructions are learned first.

Language learning tied to language use

Acquisition interacts with comprehension, production; reflects communication and experience in the world.

Statistical properties of data affect learning

Slide 30

Analysis draws on constructions and context

Form: "you" before "washed", "washed" before "them"

Meaning: you -> Addressee, washed -> Wash-Action (washer), them -> ContextElement (washee)

Context: Wash-Action with washer Eve and washee Hands; Discourse Segment with addressee and attentional-focus

Slide 31

Learning updates linguistic knowledge based on input utterances

Utterance + Linguistic Knowledge + Discourse & Situational Context + World Knowledge -> Analysis -> Partial SemSpec -> Learning -> updated Linguistic Knowledge

Slide 32

Context aids understanding: incomplete grammars yield a partial SemSpec

Partial form-meaning mapping: you -> Addressee, washed -> Wash-Action (washer, washee), them -> ContextElement

Context: Wash-Action with washer Eve and washee Hands; Discourse Segment with addressee and attentional-focus

Slide 33

Context bootstraps learning: a new construction maps form to meaning

Form: "you" before "washed", "washed" before "them"

Meaning: you -> Addressee, washed -> Wash-Action, them -> ContextElement; context supplies washer (Eve) and washee (Hands)

Slide 34

Context bootstraps learning: new construction maps form to meaning

Form: you, washed, them (you before washed, washed before them)

Meaning: Addressee, Wash-Action (washer, washee), ContextElement

Learned construction:

YOU-WASHED-THEM
  constituents: YOU, WASHED, THEM
  form: YOU before WASHED, WASHED before THEM
  meaning: WASH-ACTION
    washer: addressee
    washee: ContextElement
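For concreteness, the learned construction could be represented roughly as the following data object. The real ECG formalism has its own notation; the dictionary layout here is just an illustration of the form relations and meaning bindings listed above.

```python
# Rough sketch of the learned construction as a data object.

LEARNED_CXN = {
    "name": "YOU-WASHED-THEM",
    "constituents": ["YOU", "WASHED", "THEM"],
    "form": [("YOU", "before", "WASHED"),
             ("WASHED", "before", "THEM")],
    "meaning": {
        "schema": "WASH-ACTION",
        "washer": "addressee",        # bound to the discourse addressee
        "washee": "ContextElement",   # resolved against situational context
    },
}
```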

Slide 35

Grammar learning: suggesting new CxNs and reorganizing existing ones

Operations on linguistic knowledge:

hypothesize: map form to meaning, learn contextual constraints

reorganize: merge, join, split

reinforcement

Loop: Utterance + Linguistic Knowledge + Discourse & Situational Context + World Knowledge -> Analysis -> Partial SemSpec -> hypothesize / reorganize / reinforce
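A heavily simplified, hypothetical sketch of that loop: analyze an utterance with the current constructicon, hypothesize a new construction when the analysis is only partial (pairing the form with the contextually salient meaning), and reinforce constructions that were used. None of these names come from the actual implementation.

```python
# Toy sketch of the hypothesize / reinforce loop; reorganization is a stub.

constructicon = {}   # (form, meaning) -> usage count

def analyze(utterance, context):
    """Return (construction, complete?) for the best-fitting known cxn."""
    for (form, meaning), _count in constructicon.items():
        if form == utterance:
            return (form, meaning), True
    return None, False

def learn_from(utterance, context):
    cxn, complete = analyze(utterance, context)
    if complete:
        constructicon[cxn] += 1                  # reinforcement (usage)
    else:
        # Contextual bootstrapping: pair the unanalyzed form with the
        # meaning suggested by the situational context.
        meaning = context["salient_event"]
        constructicon[(utterance, meaning)] = 1  # hypothesize a new cxn
    # reorganize (merge / join / split) would go here in a fuller model.

learn_from("you washed them", {"salient_event": "Wash-Action(washer=Eve, washee=Hands)"})
learn_from("you washed them", {"salient_event": "Wash-Action(washer=Eve, washee=Hands)"})
print(constructicon)   # the hypothesized construction, reinforced once
```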

Slide 36

Challenge: How far up to generalize?

Eat rice / Eat apple / Eat watermelon

Want rice

Want apple

Want chair

Type hierarchy:

Inanimate Object
  Manipulable Objects
    Food
      Fruit: apple, watermelon
      Savory: rice
  Unmovable Objects
    Furniture: Chair, Sofa
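The generalization question can be illustrated with a lowest-common-ancestor computation over this hierarchy: the observed fillers of a slot suggest the most specific type that covers them all. This is only a sketch of the intuition, not the learning model's actual criterion.

```python
# Illustrative sketch: how far up should "eat X" / "want X" generalize?
# Take the lowest common ancestor of the observed fillers in the hierarchy.

PARENT = {
    "apple": "Fruit", "watermelon": "Fruit", "rice": "Savory",
    "Fruit": "Food", "Savory": "Food", "Food": "Manipulable Objects",
    "Chair": "Furniture", "Sofa": "Furniture", "Furniture": "Unmovable Objects",
    "Manipulable Objects": "Inanimate Object",
    "Unmovable Objects": "Inanimate Object",
}

def ancestors(t):
    chain = [t]
    while t in PARENT:
        t = PARENT[t]
        chain.append(t)
    return chain

def lowest_common_ancestor(types):
    common = set(ancestors(types[0]))
    for t in types[1:]:
        common &= set(ancestors(t))
    # pick the most specific shared type (the deepest one in the hierarchy)
    return max(common, key=lambda t: len(ancestors(t)))

print(lowest_common_ancestor(["rice", "apple", "watermelon"]))   # Food
print(lowest_common_ancestor(["rice", "apple", "Chair"]))        # Inanimate Object
```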

Slide 37

Challenge: Omissible constituents

In Mandarin, almost anything available in context can be omitted, and often is in child-directed speech.

Intuition:

Same context, two expressions that differ by one constituent

→ a general construction with the constituent being omissible

May require verbatim memory traces of utterances + "relevant" context

Slide 38

When does the learning stop?

Most likely grammar given utterances and context

The grammar prior includes a preference for the "kind" of grammar

In practice, take the log and minimize cost

→ Minimum Description Length (MDL)

Bayesian Learning Framework

Schemas + Constructions -> Analysis + Resolution -> SemSpec -> Context Fitting -> reorganize / hypothesize / reinforcement

Slide 39

Intuition for MDL

Grammar 1 (3 rules):
S -> Give me NP
NP -> the book
NP -> a book

Grammar 2 (4 rules):
S -> Give me NP
NP -> DET book
DET -> the
DET -> a

Suppose that the prior is inversely proportional to the size of the grammar (e.g. number of rules).

It's not worthwhile to make this generalization.

Slide 40

Intuition for MDL

Grammar 1 (9 rules):
S -> Give me NP
NP -> the book
NP -> a book
NP -> the pen
NP -> a pen
NP -> the pencil
NP -> a pencil
NP -> the marker
NP -> a marker

Grammar 2 (8 rules):
S -> Give me NP
NP -> DET N
DET -> the
DET -> a
N -> book
N -> pen
N -> pencil
N -> marker

With more noun phrases to cover, the generalization now pays off: the compositional grammar is smaller.
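A crude sketch of the trade-off on these two slides, assuming the prior simply penalizes the number of rules and that both grammars cover the same utterances (so the data term cancels); the grammars and names below just restate the examples above.

```python
# Stand-in for -log P(G): one unit of cost per rule. A real MDL model would
# also add -log P(data | G); here both grammars cover the same utterances.

def grammar_cost(rules: list[str]) -> int:
    return len(rules)

specific_small = ["S -> Give me NP", "NP -> the book", "NP -> a book"]
general_small  = ["S -> Give me NP", "NP -> DET book", "DET -> the", "DET -> a"]

specific_large = ["S -> Give me NP"] + [
    f"NP -> {det} {n}" for n in ("book", "pen", "pencil", "marker")
    for det in ("the", "a")]
general_large = ["S -> Give me NP", "NP -> DET N", "DET -> the", "DET -> a"] + [
    f"N -> {n}" for n in ("book", "pen", "pencil", "marker")]

print(grammar_cost(specific_small), grammar_cost(general_small))   # 3 4: don't generalize
print(grammar_cost(specific_large), grammar_cost(general_large))   # 9 8: generalize
```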

Slide 41

Usage-based learning: comprehension and production

Shared resources: constructicon, world knowledge, discourse & situational context

Comprehension: utterance -> analyze & resolve -> simulation

Production: communicative intent -> generate -> utterance / response

Learning: hypothesize constructions & reorganize; reinforcement from usage and from correction