Advisor
Koyejo, Oluwasanmi
Department of Study
Computer Science
Discipline
Computer Science
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
Machine Learning
Optimization
Algorithms
Mirror Descent
Sparse Optimization
Language
eng
Abstract
Parsimony is a general guiding principle in science and philosophy which suggests that, when multiple theories fit the data equally well, one should choose the "simplest" theory. In machine learning and artificial intelligence, the sparsity of a model is used as a measure of parsimony. Algorithms that produce an optimal set of sparse parameters for a given model have been notoriously difficult to construct due to the non-convex and combinatorial nature of sparsity constraints. In this thesis we begin by giving an overview of popular algorithms for sparse and convex optimization. We then show how they can be combined with classical tools from the theory of approximation algorithms to compute approximate projections onto the sparsity constraints, which ultimately leads to a novel algorithm for sparse optimization.
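The projection onto a sparsity constraint mentioned in the abstract can be illustrated, in its simplest Euclidean form, by hard thresholding: keeping the k largest-magnitude entries of a vector is the exact projection onto the set {x : ||x||_0 <= k}. The sketch below is not taken from the thesis; the least-squares objective, step size, and function names are illustrative assumptions, and it shows this projection used inside a plain projected-gradient loop.

# Illustrative sketch only (not the thesis algorithm): hard thresholding as the
# exact Euclidean projection onto {x : ||x||_0 <= k}, applied inside a simple
# projected-gradient iteration for a hypothetical least-squares problem.
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x and zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]   # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

def sparse_projected_gradient(A, b, k, step=0.01, iters=200):
    """Minimize ||Ax - b||^2 subject to ||x||_0 <= k via projected gradient."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)               # gradient of the least-squares loss
        x = hard_threshold(x - step * grad, k) # gradient step, then project
    return x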