Association knowledge in natural language learning
Yu, Pengfei
Permalink
https://hdl.handle.net/2142/127164
Description
- Title
- Association knowledge in natural language learning
- Author(s)
- Yu, Pengfei
- Issue Date
- 2024-10-28
- Director of Research (if dissertation) or Advisor (if thesis)
- Ji, Heng
- Doctoral Committee Chair(s)
- Ji, Heng
- Committee Member(s)
- Han, Jiawei
- Hoiem, Derek
- Neubig, Graham
- Yih, Scott
- Department of Study
- Siebel School of Computing and Data Science
- Discipline
- Computer Science
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Natural Language Processing
- Association Modeling
- Information Extraction
- Knowledge Editing
- Abstract
- Association is an important feature of natural language that originates from humans' cognitive ability to associate concepts. It is key to unveiling the computational mechanism of natural language and is therefore closely related to research in natural language processing (NLP). However, the currently dominant learning paradigm, based on neural models that combine distributed representations with probabilistic modeling, demonstrates insufficient capability in modeling associations in natural language. To compensate for this deficiency, we mathematically formulate the concept of Association Knowledge as the joint distribution over instances and establish a general methodology for incorporating association knowledge into the training architecture of neural models. We delve into Association Knowledge through a series of case studies across various dimensions, including associations among types of knowledge, languages, instances, and unstructured information. These case studies span both smaller neural models and large language models. Through our investigations, we demonstrate that explicitly integrating Association Knowledge into neural architectures markedly improves model performance and efficiency. This enhanced capability is evident in diverse scenarios, from improving event detection in lifelong learning settings and facilitating robust cross-lingual translations to enhancing the detection of long-tail mentions and refining knowledge updates in large language models. Collectively, our findings underscore the pivotal role of Association Knowledge in advancing the state of NLP by fostering more robust and knowledge-aware neural models.
- Graduation Semester
- 2024-12
- Type of Resource
- Thesis
- Handle URL
- https://hdl.handle.net/2142/127164
- Copyright and License Information
- Copyright 2024 Pengfei Yu
Owning Collections
Graduate Dissertations and Theses at Illinois (Primary)