Files in this item

File: YANG-DISSERTATION-2020.pdf (3 MB), Restricted Access
Description: (no description provided)
Format: application/pdf
Description

Title: Efficient learning of temporal dynamics with first-order methods
Author(s): Yang, Yingxiang
Director of Research: Kiyavash, Negar; He, Niao
Doctoral Committee Chair(s): He, Niao
Doctoral Committee Member(s): Srikant, Rayadurgam; Raginsky, Maxim; Liu, Han
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: Ph.D.
Genre: Dissertation
Subject(s): machine learning
temporal dynamics
first-order optimization
point processes
multivariate Hawkes processes
Poisson processes
positive functions
approximate Bayesian computation
macroscopic learning
Abstract: Temporal dynamical systems are pervasively used in data science to model high-dimensional data generating processes. For instance, event data are often modeled with point processes, while time series data are often captured by autoregressive models or differential equations. In this dissertation, we design algorithms for such models that enable efficient learning on large datasets. We address several key challenges that arise from real-world applications of learning structured temporal dynamics:
• How to enable efficient nonparametric learning on large datasets?
• How to learn positive-valued intensity functions for point processes?
• How to learn from complex systems with implicit likelihoods?
• How to learn from aggregated observations of temporal dynamics?
Our main focus in this dissertation is on designing algorithms that are statistically and computationally efficient by harnessing the power of first-order optimization methods. In particular, we address the above challenges by developing the following distinct yet closely related algorithms:
• First, we introduce an online learning framework for nonparametric maximum likelihood estimation of multivariate Hawkes processes, which significantly outperforms existing nonparametric learning algorithms in run time while achieving better prediction accuracy than existing parametric algorithms.
• Second, we develop a general framework, named “pseudo mirror descent”, to address the challenge of efficiently handling the positivity constraint when learning the intensity function of a point process. This framework greatly alleviates the burden of the expensive projections required by existing nonparametric approaches without compromising convergence guarantees.
• Third, we develop a saddle-point optimization approach for efficient posterior estimation on large datasets when the likelihood is not available in closed form. The proposed framework significantly outperforms existing benchmarks in learning accuracy and allows us to efficiently learn sophisticated dynamics such as the evolution of a population in an ecological system.
• Lastly, to learn from aggregated observations, we propose a novel learning framework based on conditional stochastic optimization, together with a provably convergent algorithm based on gradient descent and random search for finding the optimal solution. Compared to the plug-in supervised learning approach, which uses only aggregated or pre-aggregated observations, the proposed framework achieves superior performance in various applications, including the prediction of Medicare data and COVID-19 infection data.
For each of the proposed algorithms and frameworks, we provide both theoretical guarantees and extensive numerical comparisons with state-of-the-art benchmarks. Our experimental results demonstrate the clear advantage of our algorithms in both computational efficiency and statistical accuracy.
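For readers unfamiliar with the models named in the abstract, the following is a minimal sketch of the standard conditional intensity of a multivariate Hawkes process, together with the classical mirror descent update that “pseudo mirror descent” builds on. The notation follows common textbook convention and is not necessarily the dissertation's own.

% Standard K-dimensional Hawkes conditional intensity (textbook
% convention; notation illustrative, not from the dissertation):
\[
  \lambda_k(t) = \mu_k + \sum_{j=1}^{K} \sum_{t_i^j < t} \phi_{kj}\bigl(t - t_i^j\bigr),
  \qquad k = 1, \dots, K,
\]
% with base intensity \mu_k > 0 and triggering kernels
% \phi_{kj} \ge 0; the kernels are the nonparametric objects being
% learned, and their positivity is the constraint that motivates
% pseudo mirror descent. For contrast, classical mirror descent
% with stochastic gradient g_t, step size \eta_t, and Bregman
% divergence D_\psi reads
\[
  x_{t+1} = \operatorname*{arg\,min}_{x}
            \bigl\{ \eta_t \langle g_t, x \rangle + D_\psi(x, x_t) \bigr\},
\]
% which enforces feasibility through the geometry of \psi rather
% than through an explicit, and potentially expensive, projection.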
Issue Date: 2020-07-17
Type: Thesis
URI: http://hdl.handle.net/2142/108701
Rights Information: Copyright 2020 Yingxiang Yang
Date Available in IDEALS: 2020-10-07
Date Deposited: 2020-08

