UMLS-based approach for developing VoiS: Voice-Activated Conversational Agent for self-management of multiple chronic conditions
Park, Min Sook; Oh, Hyunkyoung; Luo, Jake; Ahamed, Sheikh Iqbal; Upama, Paramita Basak; Anik, Adib Ahmed; Tian, Shiyu; Rabbani, Masud
Description
- Title
- UMLS-based approach for developing VoiS: Voice-Activated Conversational Agent for self-management of multiple chronic conditions
- Author(s)
- Park, Min Sook
- Oh, Hyunkyoung
- Luo, Jake
- Ahamed, Sheikh Iqbal
- Upama, Paramita Basak
- Anik, Adib Ahmed
- Tian, Shiyu
- Rabbani, Masud
- Issue Date
- 2024-01
- Keyword(s)
- Medical Ontologies
- Health Informatics
- Conversational Agents
- mHealth
- System Design
- Mobile Systems
- Ontologies
- Abstract
- This abstract proposes a system design for an ontology-based conversational agent (CA) for the self-management of chronic conditions. The proposed system plans to integrate the largest medical ontology, the Unified Medical Language System (UMLS) (Bodenreider, 2004), aiming to narrow the vocabulary gap between health professionals and patients and to make the agent more responsive to its users. Conversational agents (CAs) such as ChatGPT, computer dialog systems that simulate human-to-human communication in natural language, have recently gained popularity in a variety of health contexts (Bin Sawad et al., 2022; Montenegro et al., 2019), including the self-management of chronic conditions (Griffin et al., 2020). Despite their potential, currently available CAs are often criticized for their limited ability to understand natural language inputs (Montenegro et al., 2019). This limitation is especially pronounced in medical settings because of the well-documented vocabulary gap between health professionals and health consumers. Knowledge-grounded dialog flows have the potential to lift this limitation and allow CAs to converse naturally with their users. In the proposed voice-activated self-monitoring support (VoiS) application, the research team plans to integrate the UMLS so that the agent can better understand lay terms from patients and map them to the corresponding medical concepts (a minimal illustrative sketch of such a mapping follows the description below). This automated process is expected to improve the user experience in two ways: a) by promoting the quality of communication between patients and health providers, and b) by making the VoiS app more responsive to user inputs rather than accepting only constrained inputs (e.g., multiple-choice utterances).
- Series/Report Name or Number
- Proceedings of the ALISE Annual Conference, 2023
- Type of Resource
- text
- Genre of Resource
- Conference Poster
- Language
- eng
- Handle URL
- https://hdl.handle.net/2142/123162
- DOI
- https://doi.org/10.21900/j.alise.2023.1251
- Copyright and License Information
- Copyright 2023 Min Sook Park, Hyunkyoung Oh, Jake Luo, Sheikh Iqbal Ahamed, Paramita Basak Upama, Adib Ahmed Anik, Shiyu Tian, Masud Rabbani
- This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License (https://creativecommons.org/licenses/by-sa/4.0/).
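The abstract above describes mapping patients' lay terms to UMLS medical concepts so that VoiS can understand unconstrained user input. As a rough illustration of that idea, the following Python sketch (an editor-added example, not the authors' code) queries the UMLS Terminology Services (UTS) REST search endpoint to retrieve candidate concepts for a lay term; the endpoint URL, parameters (string, apiKey, pageSize), and response fields follow the public UTS documentation and should be treated as assumptions here.

```python
# Illustrative sketch only (not the authors' implementation): mapping a lay term
# to candidate UMLS concepts through the UMLS Terminology Services (UTS) REST
# search endpoint. Assumes a valid UTS API key; parameter and field names follow
# the public UTS documentation but may differ in practice.
import requests

UTS_SEARCH_URL = "https://uts-ws.nlm.nih.gov/rest/search/current"

def map_lay_term_to_concepts(lay_term, api_key, max_results=5):
    """Return candidate UMLS concepts (CUI, preferred name) for a lay term."""
    response = requests.get(
        UTS_SEARCH_URL,
        params={"string": lay_term, "apiKey": api_key, "pageSize": max_results},
        timeout=10,
    )
    response.raise_for_status()
    results = response.json().get("result", {}).get("results", [])
    # Each hit carries a Concept Unique Identifier (CUI) and a preferred name,
    # e.g. the lay term "high blood pressure" maps to C0020538 "Hypertensive disease".
    return [(hit.get("ui"), hit.get("name")) for hit in results]

if __name__ == "__main__":
    # Hypothetical usage: normalize a patient's utterance term to medical concepts.
    for cui, name in map_lay_term_to_concepts("high blood pressure", api_key="YOUR_UTS_API_KEY"):
        print(cui, name)
```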
Owning Collections
Proceedings of the ALISE Annual Conference: ALISE 2023, Bridge the Gap: Teaching, Learning, Practice, and Competencies (primary)