File: HU-THESIS-2020.pdf (513 kB, restricted access)
Title: Pairwise embedding for event coreference resolution
Author(s): Hu, Yanda
Advisor(s): Ji, Heng
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): event coreference
Abstract: Event coreference resolution is an important task in information extraction and natural language understanding. Recently, pre-trained language models emerging in the modern Natural Language Processing (NLP) community have provided a new perspective on solving classical NLP tasks. This thesis presents a novel, extensible, and error-tolerant model for within-document event coreference resolution. The model takes events from the same document pairwise and extracts features from both events. These features include, but are not limited to, sentence embeddings, trigger embeddings, and argument type representations. The extracted features are then used to fine-tune a pre-trained language model, BERT (Bidirectional Encoder Representations from Transformers), to perform event coreference resolution. The coreference results are evaluated by different metrics, including B^3, MUC, and CEAF. Experimental results show that the proposed model outperforms baseline models, with improvements of 0.009 on B^3 and 0.035 on MUC over the state-of-the-art models.
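For context on the evaluation, the B^3 (B-cubed) metric mentioned in the abstract scores each mention by comparing its predicted cluster to its gold cluster. A minimal sketch of the standard B^3 definition follows; the function name and the example clusters are illustrative, not taken from the thesis.

```python
def b_cubed(gold_clusters, pred_clusters):
    """Compute B^3 precision, recall, and F1 over mention clusterings.

    Each argument is a list of sets; every mention must appear in
    exactly one gold cluster and one predicted cluster.
    """
    # Map each mention to the cluster that contains it.
    gold = {m: frozenset(c) for c in gold_clusters for m in c}
    pred = {m: frozenset(c) for c in pred_clusters for m in c}
    mentions = list(gold)

    # Per-mention precision: overlap of the two clusters, relative to
    # the predicted cluster; recall is relative to the gold cluster.
    precision = sum(len(pred[m] & gold[m]) / len(pred[m]) for m in mentions) / len(mentions)
    recall = sum(len(pred[m] & gold[m]) / len(gold[m]) for m in mentions) / len(mentions)
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1


# Example: gold clusters {a, b} and {c}; the system merged everything.
# Over-merging hurts precision but leaves recall perfect.
p, r, f = b_cubed([{"a", "b"}, {"c"}], [{"a", "b", "c"}])
```

In this example precision is 5/9 (each mention's predicted cluster of size 3 overlaps its gold cluster by 2, 2, and 1 members) while recall is 1.0, illustrating why B^3 penalizes the over-merged output that link-based metrics like MUC can score leniently.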
Issue Date: 2020-05-12
Rights Information: Copyright 2020 Yanda Hu
Date Available in IDEALS: 2020-08-27
Date Deposited: 2020-05
