The role of explicit regularization in overparameterized neural networks
Liang, Shiyu
Permalink
https://hdl.handle.net/2142/113893
Description
Title
The role of explicit regularization in overparameterized neural networks
Author(s)
Liang, Shiyu
Issue Date
2021-12-01
Director of Research (if dissertation) or Advisor (if thesis)
Srikant, Rayadurgam
Doctoral Committee Chair(s)
Srikant, Rayadurgam
Committee Member(s)
Viswanath, Pramod
Raginsky, Maxim
Sun, Ruoyu
Lee, Jason D.
Department of Study
Electrical & Computer Engineering
Discipline
Electrical & Computer Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Date of Ingest
2022-04-29T21:34:46Z
Keyword(s)
Engineering
Language
eng
Abstract
Recent theoretical work on over-parameterized neural networks has focused on two aspects: optimization and generalization. Many existing works that study optimization and generalization together are based on the neural tangent kernel and require a very large width. In this dissertation, we are interested in the following two questions for a binary classification problem with a two-layer, mildly over-parameterized ReLU network: (1) does every local minimum memorize and generalize well? and (2) can we find a set of parameters that achieves small test error in polynomial time?
We first show that the landscape of the loss function with explicit regularization has the following property: all local minima, as well as certain other points that are stationary only along certain directions, achieve small test error.
We then prove that, for convolutional neural nets, there is an algorithm which finds one of these points in polynomial time (in the input dimension and the number of data points). In addition, we prove that for a fully connected neural net, with an additional assumption on the data distribution, there is a polynomial-time algorithm to find one of these points.
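To make the setting concrete, the objective the abstract refers to can be sketched as a two-layer ReLU network trained with an explicit L2 (weight-decay) regularizer. The sketch below is illustrative only: the network sizes, the logistic loss, and the regularization weight `lam` are assumptions for the example, not the dissertation's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: n points in d dimensions, labels in {-1, +1}.
# m hidden units with m > n gives a mildly over-parameterized network.
n, d, m = 20, 5, 64
X = rng.normal(size=(n, d))
y = np.sign(rng.normal(size=n))

# Two-layer ReLU network: f(x) = sum_j a_j * relu(w_j . x)
W = rng.normal(size=(m, d)) / np.sqrt(d)   # first-layer weights
a = rng.normal(size=m) / np.sqrt(m)        # second-layer weights

def forward(X, W, a):
    # Hidden ReLU layer followed by a linear output layer.
    return np.maximum(X @ W.T, 0.0) @ a

def regularized_loss(X, y, W, a, lam=1e-3):
    # Logistic loss on the classification margins, plus an explicit
    # L2 penalty on all weights (the "explicit regularization").
    margins = y * forward(X, W, a)
    data_loss = np.mean(np.log1p(np.exp(-margins)))
    reg = lam * (np.sum(W ** 2) + np.sum(a ** 2))
    return data_loss + reg

print(regularized_loss(X, y, W, a))
```

The landscape results described above concern the regularized objective `regularized_loss`; the regularizer is what rules out bad local minima in their analysis.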