Files in this item

Files | Description | Format
GAO-DISSERTATION-2019.pdf (4MB) | (no description provided) | PDF (application/pdf)

Description

Title: Information theory meets big data: Theory, algorithms and applications to deep learning
Author(s): Gao, Weihao
Director of Research: Viswanath, Pramod
Doctoral Committee Chair(s): Viswanath, Pramod
Doctoral Committee Member(s): Raginsky, Maxim; Oh, Sewoong; Kannan, Sreeram
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: Ph.D.
Genre: Dissertation
Subject(s): Information Theory; Property Estimation; Deep Learning
Abstract: As the era of big data arrives, people gain access to vast amounts of multi-view data. Measuring, discovering, and understanding the underlying relationships among different aspects of data is the core problem of information theory. However, traditional information theory research addresses this problem in an abstract, population-level way. To apply information-theoretic tools to real-world problems, it is necessary to revisit information theory at the sample level. One important bridge between traditional information theory and real-world problems is information-theoretic quantity estimators. These estimators enable the computation of traditional information-theoretic quantities from big data and the discovery of hidden relationships in data. Information-theoretic tools can also be used to improve modern machine learning techniques. This dissertation investigates several problems concerning information-theoretic quantity estimators and their applications. It covers the following topics: (1) a theoretical study of the fundamental limits of information-theoretic quantity estimators, especially k-nearest neighbor estimators of differential entropy and mutual information; (2) the design of novel differential entropy and mutual information estimators for special and challenging practical scenarios, as well as new information-theoretic measures that discover complex relationships in data which traditional measures cannot capture; (3) the application of information-theoretic tools to improve training and model compression algorithms in deep learning.
Issue Date: 2019-08-19
Type: Text
URI: http://hdl.handle.net/2142/106145
Rights Information: Copyright 2019 Weihao Gao
Date Available in IDEALS: 2020-03-02
Date Deposited: 2019-12
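
For readers unfamiliar with the k-nearest neighbor estimators named in the abstract, the sketch below illustrates the classical Kozachenko-Leonenko k-NN differential entropy estimator in Python. This is general background only, not the dissertation's own algorithm or analysis; the function name kl_entropy_knn, the choice k=3, and the Gaussian test case are illustrative assumptions.

# Minimal sketch of the Kozachenko-Leonenko k-NN differential entropy
# estimator (in nats). Background illustration only; the dissertation's
# estimators and their theoretical guarantees may differ in key details.
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def kl_entropy_knn(x, k=3):
    """Estimate differential entropy H(X) in nats from samples x of shape (N, d)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors because the nearest "neighbor" of each point is
    # the point itself at distance 0; the last column is the k-th NN distance.
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]
    # Log volume of the d-dimensional unit ball: log(pi^(d/2) / Gamma(d/2 + 1)).
    log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # KL estimator: psi(N) - psi(k) + log(c_d) + (d/N) * sum(log eps_i).
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(eps))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.standard_normal((5000, 2))
    # True entropy of a 2-D standard Gaussian is (d/2) log(2*pi*e) ~ 2.838 nats.
    print(kl_entropy_knn(samples, k=3))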

