Files in this item

File: HANSON-THESIS-2020.pdf (419 kB)
Description: (no description provided)
Format: PDF (application/pdf)

Description

Title: Universal approximation of input-output maps and dynamical systems by neural network architectures
Author(s): Hanson, Joshua McKinley
Advisor(s): Raginsky, Maxim
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: M.S.
Genre: Thesis
Subject(s): Input-output maps
convolutional neural nets
dynamical systems
recurrent neural nets
deep neural networks
continuous time
discrete time
universal approximation
simulation
feedback
stability
fading memory
approximately finite memory
Abstract: It is well known that feedforward neural networks can approximate any continuous function supported on a finite-dimensional compact set to arbitrary accuracy. However, many engineering applications require modeling infinite-dimensional functions, such as sequence-to-sequence transformations or input-output characteristics of systems of differential equations. For discrete-time input-output maps having limited long-term memory, we prove universal approximation guarantees for temporal convolutional nets constructed using only a finite number of computation units, which hold on an infinite-time horizon. We also provide quantitative estimates for the width and depth of the network sufficient to achieve any fixed error tolerance. Furthermore, we show that discrete-time input-output maps given by state-space realizations satisfying certain stability criteria admit such convolutional net approximations that are accurate on an infinite-time scale. For continuous-time input-output maps induced by dynamical systems that are stable in a similar sense, we prove that continuous-time recurrent neural nets are capable of reproducing the original trajectories to within arbitrarily small error tolerance over an infinite-time horizon. For a subset of these stable systems, we provide quantitative estimates on the number of neurons sufficient to guarantee the desired error bound.
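The "limited long-term memory" property underlying the abstract's first result can be illustrated with a minimal sketch (an assumption for illustration, not the construction from the thesis): a causal temporal convolution with a length-m kernel produces an output at time t that depends only on the last m input samples, so inputs that agree on that window yield the same current output regardless of how they differ in the distant past.

```python
import numpy as np

def causal_conv(u, w):
    """Causal convolution: y[t] = sum_{k=0}^{m-1} w[k] * u[t-k], u[j] = 0 for j < 0."""
    m = len(w)
    u_pad = np.concatenate([np.zeros(m - 1), u])  # zero-pad the past
    return np.array([u_pad[t:t + m][::-1] @ w for t in range(len(u))])

# Hypothetical kernel and inputs for illustration only.
w = np.array([0.5, 0.3, 0.2])                       # kernel length m = 3
u1 = np.array([9.0, -4.0, 1.0, 2.0, 3.0])
u2 = np.array([0.0,  7.0, 1.0, 2.0, 3.0])           # same last 3 samples as u1

y1, y2 = causal_conv(u1, w), causal_conv(u2, w)
# The final outputs agree even though the inputs differ early on:
# the map has a finite memory window of length m.
print(y1[-1], y2[-1])
```

Here the equality of the final outputs is exact because the memory window is finite; the thesis's setting is the approximate analogue, where the influence of the remote past decays rather than vanishing outright.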
Issue Date: 2020-07-15
Type: Thesis
URI: http://hdl.handle.net/2142/108486
Rights Information: Copyright 2020 Joshua Hanson
Date Available in IDEALS: 2020-10-07
Date Deposited: 2020-08

