Files in this item

3362794.pdf (application/pdf, 3 MB), restricted to U of Illinois; no description provided.

Title: Neural Correlates of Auditory-Visual Speech Perception in Noise
Author(s): Gilbert, Jaimie
Doctoral Committee Chair(s): Charissa Lansing
Department / Program: Speech and Hearing Science
Discipline: Speech and Hearing Science
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Health Sciences, Speech Pathology
Abstract: Speech perception in noise may be facilitated by presenting the concurrent optic stimulus of observable speech gestures. Objective measures such as event-related potentials (ERPs) are crucial to understanding the processes underlying facilitation of auditory-visual speech perception. Previous research has demonstrated that in quiet acoustic conditions auditory-visual speech perception occurs faster (decreased latency) and with less neural activity (decreased amplitude) than auditory-only speech perception. These empirical observations provide support for the construct of auditory-visual neural facilitation. Auditory-visual facilitation was quantified with response-time and accuracy measures and the N1/P2 ERP waveform response as a function of changes in audibility (manipulation of the acoustic environment by testing a range of signal-to-noise ratios) and content of the optic cue (manipulation of the types of cues available, e.g., speech, non-speech static, or non-speech dynamic cues). Experiment 1 (Response Time Measures) evaluated participant responses in a speeded-response task investigating effects of both audibility and type of optic cue. Results revealed better accuracy and faster response times with visible speech gestures than with any non-speech cue. Experiment 2 (Audibility) investigated the influence of audibility on auditory-visual facilitation in response-time measures and the N1/P2 response. ERP measures showed effects of reduced audibility (longer latency, decreased amplitude) for both types of facial motion, i.e., speech and non-speech dynamic facial optic cues, compared to measures in quiet conditions. Experiment 3 (Optic Cues) evaluated the influence of the type of optic cue on auditory-visual facilitation with response-time measures and the N1/P2 response. N1 latency was shorter with both types of facial motion tested in this experiment, but N1 amplitude was decreased only with concurrent presentation of auditory and visual speech.
The N1 ERP results of these experiments reveal that the effect of audibility alone does not explain auditory-visual facilitation in noise. The decreased N1 amplitude associated with the visible speech gesture and the concurrent auditory speech suggests that processing of the visible speech gesture either stimulates N1 generators or interacts with processing in N1 generators. A likely generator of the N1 response is the auditory cortex, which matures differently without auditory stimulation during a critical period. The impact of auditory-visual integration deprivation on neural development and on the ability to make use of optic cues must also be investigated. Further scientific understanding of any maturational differences, or of differences in processing due to auditory-visual integration deprivation, is needed to promote utilization of auditory-visual facilitation of speech perception for individuals with auditory impairment. Research and (re)habilitation therapies for speech perception in noise must continue to emphasize the benefit of associating and integrating auditory and visual speech cues.
Issue Date: 2009
Description: 173 p.
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2009.
Other Identifier(s): (MiAaPQ)AAI3362794
Date Available in IDEALS: 2015-09-25
Date Deposited: 2009
