System identification for the bar model: Algorithms, consistency and sample complexity
Xie, Xiaotian
Permalink
https://hdl.handle.net/2142/115520
Description
Issue Date
2022-04-20
Director of Research
Beck, Carolyn L.
Srikant, Rayadurgam
Doctoral Committee Chair(s)
Beck, Carolyn L.
Srikant, Rayadurgam
Committee Member(s)
Sowers, Richard B.
Varshney, Lav R.
Katselis, Dimitrios
Department of Study
Industrial & Enterprise Systems Engineering
Discipline
Industrial Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
System identification, Network inference, Sample complexity, Concentration inequalities
Abstract
System identification has been extensively studied in the context of linear state-space models with continuous state variables. In this thesis, we focus on system identification problems in dynamical systems where the state takes values in a discrete set. If the state vector has p components and each component of the vector can take on two values, then the number of possible states is 2^p, thus leading to a combinatorial explosion in the number of parameters to be identified. To overcome this difficulty, we consider a recently introduced structured dynamical system model, called the Bernoulli Autoregressive (BAR) Model, which consists of (p^2+p) parameters. For this model, we first consider two estimators of the system parameters: a maximum likelihood (ML) estimator and a variant of the ML estimator which leads to closed-form expressions for the estimated parameters. We prove that both of these estimators are consistent in the sense that the estimates converge to the true parameter values when the number of observations goes to infinity. Then, we consider the sample complexity of the closed-form estimator. Using concentration results for random matrices and Lipschitz functions, we derive a bound on the probability that the estimates deviate from the true parameter values by a certain amount, and use the bound to show that the sample complexity is polynomial in p. Finally, building upon the tools used to study the closed-form estimator, we derive new concentration inequalities for vector-valued Lipschitz functions of Markov processes and use them to improve sample complexity results for other estimation problems beyond the BAR model.
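The BAR dynamics summarized in the abstract can be illustrated with a short simulation. This is a minimal sketch under assumed conventions: each binary state component is drawn as a Bernoulli variable whose success probability mixes the previous state (through a nonnegative row a_i) with exogenous Bernoulli noise weighted by b_i, with ||a_i||_1 + b_i <= 1 so probabilities stay valid. The specific constraint, noise bias, and parameter values here are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

# Sketch of a simplified Bernoulli Autoregressive (BAR) chain on {0,1}^p.
# Assumed transition (illustrative): X_{k+1}[i] ~ Bernoulli(a_i . X_k + b_i * W_k[i]),
# with W_k[i] ~ Bernoulli(rho) i.i.d., a_i >= 0, and ||a_i||_1 + b_i <= 1.
rng = np.random.default_rng(0)
p = 4                                    # number of state components
A = rng.uniform(0.0, 1.0, size=(p, p))   # p^2 interaction parameters
A /= 2.0 * A.sum(axis=1, keepdims=True)  # scale rows so ||a_i||_1 = 1/2
b = np.full(p, 0.5)                      # p noise-weight parameters (total: p^2 + p)
rho = 0.5                                # bias of the exogenous Bernoulli noise

def bar_step(x, rng):
    """One transition of the simplified BAR chain."""
    w = rng.random(p) < rho              # exogenous noise W_k
    prob = A @ x + b * w                 # per-component success probability in [0, 1]
    return (rng.random(p) < prob).astype(float)

x = np.zeros(p)
traj = [x]
for _ in range(1000):
    x = bar_step(x, rng)
    traj.append(x)
traj = np.asarray(traj)
print(traj.mean(axis=0))                 # empirical per-component frequencies
```

A consistency result of the kind the abstract describes would say that an estimator built from such a trajectory (e.g., a closed-form variant of maximum likelihood) recovers A and b as the trajectory length grows, with error probability controlled by concentration inequalities.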