Files in this item

File: OSUAGWU-DISSERTATION-2015.pdf (13MB), Restricted Access
Description: (no description provided)
Format: PDF (application/pdf)
Description

Title: Brain-machine interface coupled cognitive sensory fusion with a Kohonen and reservoir computing scheme
Author(s): Osuagwu, Onyeama E.
Director of Research: Levinson, Stephen; Goddard, Lynford L
Doctoral Committee Chair(s): Levinson, Stephen; Goddard, Lynford L
Doctoral Committee Member(s): Carney, Paul S; Hubler, Alfred; Raginsky, Maxim
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: Ph.D.
Genre: Dissertation
Subject(s): Artificial Intelligence
Robotics
Neural Networks
Natural Language Acquisition
Brain-Machine Interface
Abstract: Artificial Intelligence (AI) has been a source of great intrigue and has spawned many questions regarding the human condition and the core of what it means to be a sentient entity. The field has bifurcated into so-called “weak” and “strong” artificial intelligence. In weak artificial intelligence reside the forms of automation and data mining that we interact with on a daily basis. Strong artificial intelligence is best defined as a “synthetic” being with cognitive abilities and the capacity for presence of mind that we would normally associate with humankind. We feel that this distinction is misguided. We begin with the premise that intelligence lies on a spectrum, even in artificial systems. The fact that our systems can currently be considered weak artificial intelligence does not preclude our ability to develop an understanding that can lead us to more complex behavior. In this research, we utilized neural feedback via electroencephalogram (EEG) data to develop an emotional landscape for linguistic interaction via the android's sensory fields, which we consider to be part and parcel of embodied cognition. We have also given the iCub child android the instinct to babble the words it has learned, a skill that we leveraged for low-level linguistic acquisition in the latter part of this research: the slightly stronger artificial intelligence goal. This research is motivated by two main questions regarding intelligence. Is intelligence an emergent phenomenon? And, if so, can multi-modal sensory information, together with what we have termed “co-intelligence” (a shared sensory experience created by coupling EEG input), assist in the development of the representations in the mind that we colloquially refer to as language?
Given that it is not reasonable to program all of the activities needed to foster intelligence in artificial systems, our hope is that forays of this type will set the stage for further development of stronger artificial intelligence constructs. We have incorporated self-organizing processes, i.e., Kohonen maps and hidden Markov models, for speech, language development, and emotional information derived from neural data, to help lay the substrate for emergence. We also pay homage to the central and unique role that language plays in intellectual study. In addition, we have developed a rudimentary associative memory for the iCub, derived from the aforementioned sensory input that was collected. We formalized this process only as needed, on the assumption that mind, brain, and language can be represented using the mathematics and logic of the day without contradiction. We have some reservations regarding this assumption, but a proof is a task beyond the scope of this Ph.D. Finally, the data from the coupling of the EEG and the other sensory modes of embodied cognition is used to interact with a reservoir computing recurrent neural network in an attempt to produce simple language interaction, e.g., babbling, from the child android.
Issue Date: 2015-12-04
Type: Thesis
URI: http://hdl.handle.net/2142/89230
Rights Information: Copyright 2015 Onyeama E. Osuagwu
Date Available in IDEALS: 2016-03-02
Date Deposited: 2015-12
