

Presentation Transcript

Slide1

A Confidence Model for Syntactically-Motivated Entailment Proofs

Asher Stern & Ido Dagan
ISCOL, June 2011, Israel

Slide2
Recognizing Textual Entailment (RTE)

Given a text, T, and a hypothesis, H: does T entail H?

Example:

T: An explosion caused by gas took place at a Taba hotel.
H: A blast occurred at a hotel in Taba.

Slide3
Proof Over Parse Trees

T = T0 → T1 → T2 → ... → Tn = H

Slide4
Bar-Ilan Proof System - Entailment Rules

Example rule: explosion → blast

Rule types: Lexical, Lexical Syntactic, Generic Syntactic

Slide5

Bar-Ilan Proof System

T: An explosion caused by gas took place at a Taba hotel
→ A blast caused by gas took place at a Taba hotel (lexical: explosion → blast)
→ A blast took place at a Taba hotel (syntactic)
→ A blast occurred at a Taba hotel (lexical syntactic)
→ A blast occurred at a hotel in Taba (syntactic)

H: A blast occurred at a hotel in Taba.
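The chain above can be sketched in code. This is a toy illustration only (not the Bar-Ilan system, which transforms dependency parse trees): each entailment rule is simplified to a plain string rewrite over the sentence.

```python
# Toy sketch of a proof T = T0 -> T1 -> ... -> Tn = H.
# Real entailment rules rewrite parse-tree fragments; here each rule
# is simplified to a substring rewrite over the sentence.

def apply_rule(sentence, lhs, rhs):
    """Apply a rewrite rule (stand-in for a tree transformation)."""
    return sentence.replace(lhs, rhs)

T = "An explosion caused by gas took place at a Taba hotel"
H = "A blast occurred at a hotel in Taba"

steps = [
    ("An explosion", "A blast"),          # lexical: explosion -> blast
    (" caused by gas", ""),               # syntactic: drop the modifier
    ("took place at", "occurred at"),     # lexical syntactic
    ("a Taba hotel", "a hotel in Taba"),  # syntactic: noun compound -> PP
]

proof = [T]
for lhs, rhs in steps:
    proof.append(apply_rule(proof[-1], lhs, rhs))

assert proof[-1] == H  # the proof ends at the hypothesis
```

Each intermediate sentence in `proof` corresponds to one Ti in the chain from text to hypothesis.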

Slide6
Tree-Edit-Distance

Example: Insurgents attacked soldiers → Soldiers were attacked by insurgents
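To make the cost idea concrete, here is a minimal edit-distance computation for the slide's example. For brevity it works on token sequences with unit-cost insert/delete/substitute operations rather than on parse trees; true tree edit distance applies the same dynamic-programming idea to tree nodes.

```python
# Token-level edit distance as a simplified stand-in for tree edit
# distance: the cheapest sequence of insert/delete/substitute
# operations turning the text into the hypothesis.

def edit_distance(src, tgt, ins=1.0, delete=1.0, sub=1.0):
    n, m = len(src), len(tgt)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * delete          # delete all remaining source tokens
    for j in range(1, m + 1):
        d[0][j] = j * ins             # insert all remaining target tokens
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 0.0 if src[i - 1] == tgt[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + delete,   # delete a source token
                          d[i][j - 1] + ins,      # insert a target token
                          d[i - 1][j - 1] + match)
    return d[n][m]

t = "Insurgents attacked soldiers".lower().split()
h = "Soldiers were attacked by insurgents".lower().split()
# Only "attacked" can be aligned monotonically, so the cheapest edit
# script uses 2 substitutions and 2 insertions: cost 4.0.
assert edit_distance(t, h) == 4.0
```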

Slide7
Proof over parse trees

Which steps?
- Tree-Edits: regular or custom
- Entailment Rules

How to classify?
- Decide "yes" if and only if a proof was found
  - Almost always "no"
  - Cannot handle knowledge inaccuracies
- Better: estimate a confidence in the correctness of the proof

Slide8
Proof systems

TED based:
- Estimate the cost of a proof
- Complete proofs
- Arbitrary operations
- Limited knowledge

Entailment Rules based:
- Linguistically motivated
- Rich knowledge
- No estimation of proof correctness
- Incomplete proofs
- Mixed system with ad-hoc approximate-match criteria

Our System - the benefits of both worlds, and more:
- Linguistically motivated complete proofs
- Confidence model

Slide9

Our Method
- Complete proofs
- On-the-fly operations
- Cost model
- Learning model parameters

Slide10
On-the-Fly Operations
- Insert node on the fly
- Move node / move sub-tree on the fly
- Flip part of speech
- Etc.

More syntactically motivated than tree edits. These operations are not linguistically justified, but their impact on the proof's correctness can be estimated by the cost model.

Slide11
Cost Model

The idea:
- Represent the proof as a feature vector
- Use the vector in a learning algorithm

Slide12
Cost Model
- Represent a proof as a feature vector F(P) = (F1, F2, ..., FD)
- Define a weight vector w = (w1, w2, ..., wD)
- Define the proof cost as the weighted sum of the features: cost(P) = w · F(P)
- Classify a proof as "entailing" if cost(P) < b, where b is a threshold
- Learn the parameters (w, b)
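As a concrete sketch of these definitions (the feature names, weights, and threshold below are invented for illustration; in the real system w and b are learned):

```python
# cost(P) = w . F(P); classify "entailing" when the cost is below b.

def proof_cost(w, f):
    """Inner product of the weight vector and the feature vector."""
    return sum(wi * fi for wi, fi in zip(w, f))

def entails(w, b, f):
    """Decision rule: cheap proofs are trusted, expensive ones are not."""
    return proof_cost(w, f) < b

# D = 3 illustrative features: counts of
#   [lexical rule applications, on-the-fly node insertions, sub-tree moves]
w = [0.1, 1.5, 0.7]   # risky operations carry larger weights
b = 2.0               # decision threshold

easy_proof = [2, 0, 1]   # mostly knowledge-based steps -> cost 0.9
hard_proof = [1, 2, 2]   # many unjustified insertions  -> cost 4.5

assert entails(w, b, easy_proof)      # cost 0.9 < 2.0
assert not entails(w, b, hard_proof)  # cost 4.5 >= 2.0
```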

Slide13
Search Algorithm
- Need to find the "best" proof
- "Best proof" = the proof with the lowest cost, assuming a weight vector is given
- The search space is exponential → pruning
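The slides do not spell out the pruning strategy, so as one plausible illustration (an assumption, not necessarily the authors' algorithm), here is beam search: expand proof states level by level, keeping only the k cheapest partial proofs at each depth.

```python
import heapq

def beam_search(start, goal, successors, beam_width=3, max_depth=10):
    """Find a low-cost path from start to goal.
    successors(state) -> iterable of (step_cost, next_state)."""
    frontier = [(0.0, start)]                 # (accumulated cost, state)
    for _ in range(max_depth):
        next_frontier = []
        for cost, state in frontier:
            if state == goal:
                return cost                   # cheapest surviving proof
            for step_cost, nxt in successors(state):
                next_frontier.append((cost + step_cost, nxt))
        # Pruning: keep only the beam_width cheapest partial proofs.
        frontier = heapq.nsmallest(beam_width, next_frontier)
    return None                               # no proof within the depth limit

# Toy search space over integers: from n you can take a cheap +1 step
# or a pricier +2 step; the goal plays the role of the hypothesis tree.
def succ(n):
    return [(1.0, n + 1), (1.5, n + 2)]
```

The toy space also shows the trade-off pruning introduces: `beam_search(0, 4, succ)` with the default `beam_width=3` returns a proof of cost 3.5, while widening the beam to 4 recovers the cheaper cost-3.0 proof that the narrow beam pruned away.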

Slide14
Parameter Estimation
- Goal: find a good weight vector and threshold (w, b)
- Use a standard machine-learning algorithm (logistic regression or linear SVM)
- But: training samples are not given as feature vectors
  - The learning algorithm requires training samples (feature vectors)
  - Constructing the training samples requires a weight vector
  - The weight vector is produced by the learning algorithm
- This circularity is resolved by iterative learning

Slide15
Parameter Estimation


Slide16
Parameter Estimation
- Start with w0, a reasonable guess for the weight vector; i = 0
- Repeat until convergence:
  - Find the best proofs and construct feature vectors, using wi
  - Use a linear ML algorithm to find a new weight vector, wi+1
  - i = i + 1
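The loop can be sketched end-to-end. Everything below is synthetic: the "proof search" just picks the cheapest of a few hand-written feature vectors, and a simple perceptron stands in for the slides' linear learner (logistic regression or linear SVM).

```python
def proof_cost(w, f):
    return sum(wi * fi for wi, fi in zip(w, f))

def best_proof(candidates, w):
    """Stand-in for the search step: cheapest feature vector under w."""
    return min(candidates, key=lambda f: proof_cost(w, f))

def train_linear(samples, labels, w, b, epochs=20, lr=0.1):
    """Perceptron updates as a stand-in for a linear learner.
    Label +1 = entailing (should satisfy cost < b)."""
    for _ in range(epochs):
        for f, y in zip(samples, labels):
            pred = 1 if proof_cost(w, f) < b else -1
            if pred != y:
                # Wrongly cheap proofs get costlier, and vice versa.
                w = [wi - lr * y * fi for wi, fi in zip(w, f)]
                b = b + lr * y
    return w, b

# Each training pair offers several candidate proofs as feature vectors.
data = [
    ([[1, 0], [0, 3]], +1),   # entailing pair
    ([[0, 4], [1, 3]], -1),   # non-entailing pair
]

w, b = [0.0, 0.0], 0.5        # w0: an initial guess
for i in range(3):            # "repeat until convergence" (fixed here)
    samples = [best_proof(cands, w) for cands, _ in data]
    labels = [y for _, y in data]
    w, b = train_linear(samples, labels, w, b)

verdicts = [proof_cost(w, best_proof(cands, w)) < b for cands, _ in data]
assert verdicts == [True, False]   # both pairs classified correctly
```

The key point mirrors the slides: the feature vectors themselves change between iterations, because a new weight vector can make a different candidate proof the cheapest one.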

Slide17
Results (accuracy, %):

System                                                           | RTE-1 | RTE-2 | RTE-3 | RTE-5
Logical Resolution Refutation (Raina et al., 2005)               | 57.0  |       |       |
Probabilistic Calculus of Tree Transformations (Harmeling, 2009) |       | 56.39 | 57.88 |
Probabilistic Tree Edit model (Wang and Manning, 2010)           |       | 63.0  | 61.10 |
Deterministic Entailment Proofs (Bar-Haim et al., 2007)          |       |       | 61.12 | 63.80
Our System                                                       | 57.13 | 61.63 | 67.13 | 63.50

Operation statistics (ratio = average occurrences in negative vs. positive pairs):

Operation                                   | Avg. in positives | Avg. in negatives | Ratio
Insert Named Entity                         | 0.006             | 0.016             | 2.67
Insert Content Word                         | 0.038             | 0.094             | 2.44
DIRT                                        | 0.013             | 0.023             | 1.73
Change "subject" to "object" and vice versa | 0.025             | 0.040             | 1.60
Flip Part-of-speech                         | 0.098             | 0.101             | 1.03
Lin similarity                              | 0.084             | 0.072             | 0.86
WordNet                                     | 0.064             | 0.052             | 0.81

Slide18
Conclusions
- Linguistically motivated proofs
- Complete proofs
- Cost model
  - Estimation of proof correctness
  - Search for the best proof
  - Learning of parameters
- Results
  - Reasonable behavior of the learning scheme

Slide19

Thank you

Q & A