Lecture 16: Coreference Resolution

Presentation Transcript

1. Lecture 16: Coreference Resolution. Determining who is who and what is what. Harvard IACS, Chris Tanner.

2. COREFERENCE RESOLUTIONARY. "You would rather have a Lexus or justice? A dream or some substance? A Bimmer, a necklace, or freedom?" -- Dead Prez

3. ANNOUNCEMENTS. HW4 is out. HW2, Phase 2, and Quiz 5 have been graded. Research Project Phase 3 is due Oct 28 (Thurs) @ 11:59pm.

4-6. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Coreference Resolution)

7. Discourse. These systems hinge upon understanding what you're saying (discourse) and the meaning of it (semantics).

8. Discourse. Also necessary for information retrieval, question answering, document summarization, etc. ("TL;DR crypto stocks are surging.") Question answering based on semantic structures. Narayanan and Harabagiu, 2004. Event coreference for information extraction. Humphreys et al., 1997. Sub-event based multi-document summarization. Daniel et al., 2003.

10-22. In the end, a full moon succeeded where puny machines could not, wrenching the mammoth barge out of the Egyptian mud in which it became wedged six days earlier. A spring tide finally set the Ever Given and its enormous stack of 18,300 shipping containers afloat again, drawing cheers from Egyptians on the shore and a virtual world beyond. (The same passage is repeated across these slides, stepping through its mentions one at a time.)

23. Coreference Resolution: the task of determining which words all refer to the same underlying real-world thing (applied to the passage above).

24. EASY FOR HUMANS.

25-26. State-of-the-art neural model? End-to-end Neural Coreference Resolution. Lee et al. 2017.

27. HARD FOR COMPUTERS.

28. Types of referring expressions:
- Indefinite noun phrases: I saw an incredible oak tree today.
- Definite noun phrases: I read about it in the New York Times.
- Pronominal mentions: Emily aced the quiz, as she expected.
- Nominal mentions and names: The amazing marathoner, Des Linden, is a true inspiration.
- Demonstratives (this, that, these, those): These pretzels are making me thirsty.

29. Good models should be able to perform coreference resolution across multiple documents.

30. Doc 1: In the end, a full moon succeeded where puny machines could not, wrenching the mammoth barge out of the Egyptian mud in which it became wedged six days earlier. A spring tide finally set the Ever Given and its enormous stack of 18,300 shipping containers afloat again, drawing cheers from Egyptians on the shore and a virtual world beyond.
Doc 2: SUEZ, Egypt (AP) — Experts boarded the massive container ship Tuesday that had blocked Egypt’s vital Suez Canal and disrupted global trade for nearly a week, seeking answers to a single question that could have billions of dollars in legal repercussions: What went wrong?

31. And handle events.

32-33. (The same two documents as slide 30, repeated with coreferent mentions marked across documents.)

34. [Diagram: every mention is assigned to one of 13 clusters. The entity mentions mammoth barge, it, its, Ever Given, and container ship form one cluster; Experts, Suez Canal, Egyptian mud, and Egyptians, plus the event mentions wedged, blocked, set afloat, drawing cheers, disrupted, succeeded, boarded, seeking, and wrenching, fill out the rest.]

35. Takeaway #1: Coreference resolution determines which mentions all refer to the same underlying entity or event, and is ultimately a clustering task.
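Since the output is just a partition of mentions, a minimal sketch of the underlying data structure; the mention tuples and cluster assignment below are invented for illustration, not from any specific coref library:

```python
# Minimal sketch: coreference output as a clustering (partition) of mentions.
# Mentions are (doc_id, start_token, end_token, text) tuples; all values here
# are illustrative.
from collections import defaultdict

mentions = [
    (0, 7, 9, "the mammoth barge"),
    (0, 14, 14, "it"),
    (0, 24, 26, "the Ever Given"),
    (0, 27, 27, "its"),
    (1, 3, 6, "the massive container ship"),
]

# A predicted cluster assignment: mention index -> cluster id.
assignment = {0: 1, 1: 1, 2: 1, 3: 1, 4: 1}

clusters = defaultdict(list)
for mention_idx, cluster_id in assignment.items():
    clusters[cluster_id].append(mentions[mention_idx])

for cid, members in clusters.items():
    print(f"cluster{cid}: {[m[3] for m in members]}")
```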

36. Entity Coreference (2010 – present). Early research demonstrated highly effective rule-based entity coref systems. Stanford’s Multi-Pass Sieve Coreference Resolution System. Lee et al. CoNLL 2011. A Multi-Pass Sieve for Coreference Resolution. Raghunathan et al. EMNLP 2010. CoNLL F1: 58.3.

37. Entity Coreference (2010 – present). Rule 1: cluster together all entity mentions that are identical. Example: "The Ever Given cargo ship has been stuck for the past six days. While reports of Ever Given started to …" Stanford’s Multi-Pass Sieve Coreference Resolution System. Lee et al. CoNLL 2011. A Multi-Pass Sieve for Coreference Resolution. Raghunathan et al. EMNLP 2010.
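A hedged, simplified sketch of such a sieve pass, merging clusters on the first mention's surface string; the actual Stanford sieve is considerably more careful:

```python
# Simplified "Rule 1"-style sieve pass: merge clusters whose representative
# mentions have identical (case-insensitive) surface strings. Illustrative
# only; not the actual Stanford implementation.

def exact_match_pass(clusters: list[list[str]]) -> list[list[str]]:
    merged: dict[str, list[str]] = {}
    out: list[list[str]] = []
    for cluster in clusters:
        key = cluster[0].lower().strip()   # representative mention
        if key in merged:
            merged[key].extend(cluster)    # merge into the earlier cluster
        else:
            merged[key] = list(cluster)
            out.append(merged[key])
    return out

# Each mention starts in its own singleton cluster.
singletons = [["Ever Given"], ["the canal"], ["Ever Given"]]
print(exact_match_pass(singletons))
# [['Ever Given', 'Ever Given'], ['the canal']]
```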

38. Entity Coreference (2010 – present). Rule 10: cluster together all entity mentions that are aliases according to Wikipedia. Example: "Donald Glover, better known as Childish Gambino, has written and produced an incredible TV series titled Atlanta." Stanford’s Multi-Pass Sieve Coreference Resolution System. Lee et al. CoNLL 2011. A Multi-Pass Sieve for Coreference Resolution. Raghunathan et al. EMNLP 2010.

39. Entity Coreference (2011 – present). Then, many systems threw tons of manually defined features into their models. Improving Coreference Resolution by Learning Entity-Level Distributed Representations. Clark and Manning. ACL 2016. Narrowing the Modeling Gap: A Cluster-Ranking Approach to Coreference Resolution. Rahman and Ng. JAIR 2011. CoNLL F1: 65.3.

40. Takeaway #2: Research has largely relied on ML models with many manually defined features. Strong results but clear limitations.

41. Event Coreference (2014 – present).
Doc 1: "Actress Lindsay Lohan finally checked into court-mandated rehab at the Betty Ford Center late Thursday."
Doc 2: "Lindsay Lohan checked into the Betty Ford Clinic in Rancho Mirage, California on Thursday night, for what is to be a three-month stay, her rep confirms to People."
The ECB+ corpus has 982 short documents.

42. Event Coreference (2014 – present). SameLemma baseline: if two mentions have the same lemma (base form), classify them as coreferent.
Original word → lemma: running → run; ran → run.
This shouldn’t work so well, but it does.
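A minimal sketch of the SameLemma baseline using NLTK's WordNetLemmatizer; the event triggers and the verb-POS assumption are illustrative:

```python
# SameLemma baseline sketch: two event triggers are predicted coreferent
# iff they share a lemma. Requires: nltk.download("wordnet").
import itertools
from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()

def lemma(trigger: str) -> str:
    # Treat each event trigger as a verb and take its base form.
    return lemmatizer.lemmatize(trigger.lower(), pos="v")

mentions = ["running", "ran", "blocked", "blocking", "wrenching"]
for a, b in itertools.combinations(mentions, 2):
    if lemma(a) == lemma(b):
        print(f"SameLemma predicts coref: {a!r} ~ {b!r}")
# running ~ ran (both -> "run"); blocked ~ blocking (both -> "block")
```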

43-44. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Conjoined CNN)

45. Coreference resolution: corpus → ? → clusters (e.g., cluster1 = {mammoth barge, it, its, Ever Given, container ship}; cluster3 = {Experts}).

46-47. Coreference resolution pipeline: corpus → Mention Detection → Mention-Pair Model → Clustering → clusters (e.g., cluster1 = {mammoth barge, it, its, Ever Given, container ship}; cluster3 = {Experts}). First step: Mention Detection.

48. Mention Detection: determines which spans of words constitute a mention.
Doc 1: "4.6 Magnitude Quake Recorded in Sonoma County. An earthquake with a preliminary magnitude of 4.6 was recorded in the North Bay this morning, according to the U.S. Geological Survey. The quake occurred at 2:09 a.m. about 14 miles north-northeast of Healdsburg and had a depth of 1.2 miles. It was followed by a 2.9 aftershock at 2:12 a.m. and a 2.2 at 2:15 a.m… there are no reports of injuries or major damage."
Doc 2: "4.6 Magnitude Quake Rattles Sonoma County Early Thursday. An earthquake measuring 4.6 rattled Sonoma and Lake counties early Thursday, according to the U.S. Geological Survey. The quake occurred at 2:09 a.m., about 14 miles northeast of Healdsburg, on the Maacama Fault with a depth of 12 miles. A Sonoma County Sheriff’s dispatcher said around 7 a.m. that there had been no reports of damage or injuries."
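A hedged sketch of one simple way to propose candidate mentions, treating spaCy noun chunks and named entities as candidates; end-to-end systems such as Lee et al. (2017) instead score all spans up to a maximum width:

```python
# Heuristic candidate mention detector: noun chunks + named entities.
# Illustrative only; real systems enumerate and score arbitrary spans.
import spacy

nlp = spacy.load("en_core_web_sm")

def candidate_mentions(text: str) -> list[tuple[int, int, str]]:
    doc = nlp(text)
    spans = {(c.start, c.end) for c in doc.noun_chunks}   # syntactic NPs
    spans |= {(e.start, e.end) for e in doc.ents}         # named entities
    return sorted((s, e, doc[s:e].text) for s, e in spans)

text = ("An earthquake with a preliminary magnitude of 4.6 was recorded "
        "in the North Bay this morning, according to the U.S. Geological Survey.")
for start, end, span_text in candidate_mentions(text):
    print(start, end, span_text)
```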

49-50. (The same pipeline, now focusing on the Mention-Pair Model.)

51. Mention-Pair Model: calculates a coref probability for every pair of mentions. [Graph: pairwise probabilities among the event mentions rattles, recorded, shook, and according to, e.g., 0.9, 0.73, and 0.82 on likely-coreferent pairs and 0.08–0.23 on unlikely ones.]
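To make the pairwise formulation concrete, a minimal, hedged sketch of a mention-pair scorer in PyTorch; the feed-forward architecture, dimensions, and random mention embeddings are illustrative assumptions, not the lecture's exact model:

```python
# Mention-pair model sketch: score every pair of mention embeddings with a
# small feed-forward network that outputs a coref probability.
import itertools
import torch
import torch.nn as nn

class PairScorer(nn.Module):
    def __init__(self, dim: int = 300):
        super().__init__()
        self.ff = nn.Sequential(
            nn.Linear(2 * dim, 128), nn.ReLU(),
            nn.Linear(128, 1), nn.Sigmoid(),   # probability in [0, 1]
        )

    def forward(self, m1: torch.Tensor, m2: torch.Tensor) -> torch.Tensor:
        return self.ff(torch.cat([m1, m2], dim=-1))

scorer = PairScorer()
mentions = {name: torch.randn(300) for name in ["rattles", "recorded", "shook"]}
for a, b in itertools.combinations(mentions, 2):
    p = scorer(mentions[a], mentions[b]).item()
    print(f"P(coref | {a}, {b}) = {p:.2f}")
```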

52. Conjoined CNN. Input features and their dimensionalities:
Word embeddings: 300
Lemma embeddings: 300
Dependency parse embeddings: 400
Character embeddings: 100
Part-of-speech embeddings: 300

53. Conjoined CNN. [Architecture diagram: the paired networks output a similarity score.] Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.

54. Conjoined CNN: two identical networks with tied weights. Distance score: L2 norm. Loss function: contrastive loss. Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.
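A minimal sketch of this setup in PyTorch, assuming concatenated input features of dimension 1400 (the sum of the feature dimensions listed on slide 52) and the standard contrastive loss; the MLP layers stand in for the actual CNN and all sizes are illustrative:

```python
# Conjoined ("Siamese") network sketch: one shared encoder (tied weights),
# an L2 distance score, and the standard contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConjoinedNet(nn.Module):
    def __init__(self, in_dim: int = 1400, hidden: int = 128):
        super().__init__()
        # A single encoder applied to both inputs = tied weights.
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, hidden))

    def forward(self, x1, x2):
        z1, z2 = self.encoder(x1), self.encoder(x2)
        return F.pairwise_distance(z1, z2)        # L2 distance score

def contrastive_loss(dist, same, margin: float = 1.0):
    # same = 1 if the pair corefers, else 0.
    pos = same * dist.pow(2)                      # pull coref pairs together
    neg = (1 - same) * F.relu(margin - dist).pow(2)  # push others apart
    return (pos + neg).mean()

net = ConjoinedNet()
x1, x2 = torch.randn(8, 1400), torch.randn(8, 1400)
same = torch.randint(0, 2, (8,)).float()
loss = contrastive_loss(net(x1, x2), same)
loss.backward()
```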

55. Conjoined CNN. [Score distribution with a 0.5 threshold: pairs on one side are predicted as coreferent, pairs on the other side as not coreferent.]

56. Conjoined CNN. Development set results: LibSVM and FFNN received the same features as the CCNN, plus relational features (e.g., cosine similarity, dot product, WordNet). Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.

57-58. (The same pipeline, now focusing on the Clustering step.)

59-60. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Neural Clustering)

61. Clustering: the goal is to return clusters from the fully-connected graph of pairwise scores. [Graph over rattles, recorded, shook, and according to with edge probabilities 0.9, 0.73, 0.82, and several weak edges.]

62. Clustering. [The same graph, keeping only the high-probability edges: 0.9, 0.73, 0.82.]

63. Clustering. Nearly 100% of past systems simply performed agglomerative clustering. We want:
- More holistic, cluster-to-cluster predictions
- Less sensitivity to non-uniformity across topics
- No additional stopping parameter
- Prevention against an all-subsuming cluster
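For contrast, a hedged sketch of the agglomerative baseline being criticized, using SciPy; the pairwise probabilities and the 0.5 stopping threshold (the extra parameter the slide objects to) are illustrative:

```python
# Agglomerative clustering baseline: turn pairwise coref probabilities into
# distances, then merge greedily until a stopping threshold is reached.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

names = ["rattles", "recorded", "shook", "according to"]
# Illustrative symmetric pairwise coref probabilities.
P = np.array([[1.00, 0.90, 0.73, 0.20],
              [0.90, 1.00, 0.82, 0.15],
              [0.73, 0.82, 1.00, 0.12],
              [0.20, 0.15, 0.12, 1.00]])

D = 1.0 - P                                   # distance = 1 - probability
Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=0.5, criterion="distance")  # the stopping parameter
print(dict(zip(names, labels)))
# e.g., {'rattles': 1, 'recorded': 1, 'shook': 1, 'according to': 2}
```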

64. Clustering. [Diagram of the neural clustering approach.] Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.

65. (Pipeline recap: corpus → Mention Detection → Mention-Pair Model → Clustering → clusters.)

66-67. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Results)

68. Mention Detection: gold test mentions vs. predicted test mentions on the same snippet: "… as Peter Capaldi stepped into Matt Smith’s soon to be vacant …"

69. Feature ablation (full system): Lemma + Character embeddings yields the best performance.

70. Using Predicted Mentions. Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.

71. Using Gold Mentions. Cross-Document Coreference Resolution for Entities and Events. Tanner. Brown University Dissertation. 2019.

72. CCNN + Clustering. FINDINGS: state-of-the-art for event coref; contextualized representations; more holistic clustering; Char + Lemma embeddings were the only two necessary features.

73. Errors. Total mention-pairs tested: 8,669; false positives: 86; false negatives: 569.
False positives: semantics 82% (context-dependent 30%, similar meanings 38%, wide reading 14%); unclear 13%; syntax 3%; too difficult for me 2%.
False negatives: semantics 42%; unclear 20%; slang 16%; longer names 14%; pronouns 8%.

74. CCNN + Clustering: error examples.
False positive: "Friday, Obama announced …" / "Sony announced today …"
False negatives: "The casting of Smith …" / "Smith stepped into the role …" / "Smith was handed the keys to play …"
False negative: "The UN Refugee Agency on Friday strongly condemned the aerial bombing of …" / "Two of the bombs fell within the Yida Camp, including …"

75. Takeaway #3: The community needs a better corpus. Takeaway #4: Event coref is especially hard, but using deep learning with contextualized representations works well.

76-77. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Improvements)

78. Entity Coreference: end-to-end neural systems jointly handle Mention Detection, the Mention-Pair Model, and Clustering. End-to-end Neural Coreference Resolution. Lee et al. EMNLP 2017.
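The heart of Lee et al. (2017) is a pairwise span score s(i, j) = s_m(i) + s_m(j) + s_a(i, j), where s_m scores how likely a span is a mention and s_a scores span j as an antecedent of span i. A simplified, hedged sketch of that decomposition; the real model also feeds span widths, an element-wise product of the span embeddings, and other features into s_a, and the encoders here are placeholders for its LSTM + attention span representations:

```python
# Sketch of the pairwise span scoring from Lee et al. (2017), simplified.
import torch
import torch.nn as nn

class SpanPairScorer(nn.Module):
    def __init__(self, span_dim: int = 256):
        super().__init__()
        self.mention_score = nn.Linear(span_dim, 1)          # s_m
        self.antecedent_score = nn.Linear(2 * span_dim, 1)   # s_a

    def forward(self, g_i: torch.Tensor, g_j: torch.Tensor) -> torch.Tensor:
        s_m_i = self.mention_score(g_i)
        s_m_j = self.mention_score(g_j)
        s_a = self.antecedent_score(torch.cat([g_i, g_j], dim=-1))
        return s_m_i + s_m_j + s_a                           # s(i, j)

scorer = SpanPairScorer()
g_i, g_j = torch.randn(256), torch.randn(256)   # span representations
print(scorer(g_i, g_j))
```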

79. Entity Coreference: end-to-end. End-to-end Neural Coreference Resolution. Lee et al. EMNLP 2017.

80. Entity Coreference: uses several important features. End-to-end Neural Coreference Resolution. Lee et al. EMNLP 2017.

81. [Diagram: the input tokens "The Ever Given has" (x1–x4) flow through stacked layers (Encoder #1–#3, representations r1–r4) to contextualized outputs y1–y4. This is BERT, which encodes rich information very well.]

82-83. Entity Coreference: End-to-End + BERT. BERT for Coreference Resolution: Baselines and Analysis. Joshi et al. EMNLP 2019. CoNLL F1: 79.6.

84. Entity Coreference: REMAINING ISSUES
- Pronouns (especially in conversation)
- Conflating relatedness with equality (e.g., "flight attendants" with "pilots")
- World knowledge
- Mention paraphrasing (e.g., "Royals" with "Prince Charles and his wife Camilla")

85. Coref is still far from solved.

86. Takeaway #5: Pre-trained LMs (e.g., BERT) capture rich information but miss nuanced cases.

87. Takeaway #6: Until we have better data, we don't fully understand the capabilities of our existing systems, nor do we know what is possible.

88. Takeaway #1: Coreference resolution determines which mentions all refer to the same underlying entity or event, and is ultimately a clustering task.
Takeaway #2: Research has largely relied on ML models with many manually defined features. Strong results but clear limitations.
Takeaway #3: The community needs a better corpus.

89. Takeaway #4: Event coref is especially hard, but using deep learning with contextualized representations works well.
Takeaway #5: Neural pre-trained text encoders (e.g., BERT) capture rich information but miss nuanced cases.
Takeaway #6: Until we have better data, we don't fully understand the capabilities of our existing systems, or know what's possible.

90. INSIGHTS. Instead of hammering away on a problem and throwing complex models at it, pay close attention to:
- What you're trying to model (i.e., your data)
- How you're framing the problem (e.g., a clustering task via pairwise predictions)
Performance is reaching an asymptote.

91-92. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Additional Research)

93. How can we equip our coreference resolution models with common-sense knowledge? Ning Hua (Harvard, Bioinformatics MS), Vered Shwartz (UBC, Assistant Professor), Sahithya Ravi (UBC, PhD Student).

94. The SOTA coref model uses RoBERTa as its base. Before coref training, we fine-tune the RoBERTa base on ConceptNet.
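A hedged sketch of that pre-fine-tuning step using the Hugging Face transformers API; the triple verbalization templates and masking details are assumptions, not the project's actual recipe:

```python
# Sketch: continue masked-LM training of RoBERTa on verbalized ConceptNet
# triples before any coreference training. Triples and hyperparameters are
# illustrative.
import torch
from transformers import RobertaForMaskedLM, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

# Verbalize (head, relation, tail) triples as plain sentences.
triples = [("barge", "IsA", "boat"), ("canal", "UsedFor", "shipping")]
sentences = [f"{h} is a {t}." if r == "IsA" else f"{h} is used for {t}."
             for h, r, t in triples]

batch = tokenizer(sentences, return_tensors="pt", padding=True)
labels = batch["input_ids"].clone()

# Randomly mask 15% of tokens (ignoring special-token handling for brevity).
mask = torch.rand(labels.shape) < 0.15
batch["input_ids"][mask] = tokenizer.mask_token_id
labels[~mask] = -100          # compute loss only on masked positions

loss = model(**batch, labels=labels).loss
loss.backward()               # one illustrative gradient step
```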

95. We are also using graph embeddings and graph alignments to influence coref modeling.

96. Joint Entity and Event Coreference: can Bayesian clustering improve joint entity and event coreference? Xin Zeng, IACS MS Thesis.

97-98. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Leveraging Data: No Data)

99. Since labelled data is a scarce commodity, can we build a powerful unsupervised model? Alessandro Stolfo (ETH-Zurich, PhD Student), Vikram Gupta (ETH-Zurich, Research Affiliate), Mrinmaya Sachan (ETH-Zurich, Assistant Professor).

100-101. We combine the old-school, manual rule-based system (unsupervised: doesn't need training data) with the SOTA SpanBERT-based end-to-end model (supervised: needs training data). A Simple and Effective Approach for Unsupervised Coreference using Pretrained Representations and Weak Supervision from Linguistic Rules. Alessandro Stolfo et al. In preparation.

102. Let's use the rule-based system's output as synthetic "gold" labels for the BERT-based model. A Simple and Effective Approach for Unsupervised Coreference using Pretrained Representations and Weak Supervision from Linguistic Rules. Alessandro Stolfo et al. In preparation.
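A hedged sketch of that weak-supervision recipe; every function here is an illustrative stand-in, not the actual system:

```python
# Sketch: run a rule-based sieve over unlabeled documents, keep its cluster
# decisions as synthetic "gold" pairs, then use them to train a neural
# pair scorer (e.g., a SpanBERT-based model).
from typing import Iterable

def rule_based_clusters(doc: str) -> list[list[str]]:
    # Stand-in for a deterministic sieve (exact match, aliases, ...).
    return [["Ever Given", "Ever Given"], ["the canal"]]

def synthetic_pairs(docs: Iterable[str]):
    """Yield (mention_a, mention_b, label) pairs from rule-based output."""
    for doc in docs:
        clusters = rule_based_clusters(doc)
        for ci, cluster in enumerate(clusters):
            for m in cluster[1:]:
                yield cluster[0], m, 1            # positive: same cluster
            for other in clusters[ci + 1:]:
                yield cluster[0], other[0], 0     # negative: different clusters

pairs = list(synthetic_pairs(["<unlabeled news article>"]))
print(pairs)
# These (a, b, label) tuples then supervise the neural pair scorer.
```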

103. CONCERN: training with noisy (imperfect) rule-based labels would limit our BERT model to perform no better than the rule-based system. A Simple and Effective Approach for Unsupervised Coreference using Pretrained Representations and Weak Supervision from Linguistic Rules. Alessandro Stolfo et al. In preparation.

104. FINDINGS: our combined BERT model successfully uses distant supervision to outperform the rule-based system. A Simple and Effective Approach for Unsupervised Coreference using Pretrained Representations and Weak Supervision from Linguistic Rules. Alessandro Stolfo et al. In preparation.

105. A Simple and Effective Approach for Unsupervised Coreference using Pretrained Representations and Weak Supervision from Linguistic Rules. Alessandro Stolfo et al. In preparation.

106-108. Outline: Coreference Resolution; Conjoined CNN; Neural Clustering; Results; Improvements; Leveraging Data (No Data, Better Data); Additional Research. (Current section: Leveraging Data: Better Data)

109. Conclusions: Coreference resolution has had many exciting advances in the last 10 years, but it's far from solved and remains one of the most challenging and exciting NLP tasks.