Files in this item

File: OUYANG-DISSERTATION-2018.pdf (906 kB), Restricted Access
Description: (no description provided)
Format: PDF (application/pdf)

Description

Title: Scalable sparsity structure learning using Bayesian methods
Author(s): Ouyang, Yunbo
Director of Research: Liang, Feng
Doctoral Committee Chair(s): Liang, Feng
Doctoral Committee Member(s): Qu, Annie; Narisetty, Naveen N; Zhu, Ruoqing
Department / Program: Statistics
Discipline: Statistics
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: Ph.D.
Genre: Dissertation
Subject(s): Bayesian statistics; high-dimensional data analysis; variable selection
Abstract: Learning sparsity patterns in high dimensions is a great challenge in both implementation and theory. In this thesis we develop scalable Bayesian algorithms, based on the EM algorithm and variational inference, to learn sparsity structure in various models. Estimation consistency and selection consistency of our methods are established. First, a nonparametric Bayes estimator is proposed for the problem of estimating a sparse sequence based on Gaussian random variables. We adopt the popular two-group prior with one component being a point mass at zero and the other being a mixture of Gaussian distributions. Although the Gaussian family has been shown to be suboptimal for this problem, we find that Gaussian mixtures, with a proper choice of means and mixing weights, have the desired asymptotic behavior, e.g., the corresponding posterior concentrates on balls at the desired minimax rate. Second, the above estimator can be applied directly to high-dimensional linear classification. In theory, we not only build a bridge connecting the estimation error of the mean difference to the classification error in different scenarios, but also provide sufficient conditions for sub-optimal and optimal classifiers. Third, we study adaptive ridge regression for linear models. Adaptive ridge regression is closely related to the Bayesian variable selection problem with a Gaussian-mixture spike-and-slab prior because it resembles the EM algorithm developed in Wang et al. (2016) for that problem. The output of adaptive ridge regression can be used to construct a distribution estimator that approximates the posterior. We show that the approximate posterior has the desired concentration property and that the adaptive ridge regression estimator achieves the desired predictive error. Last, we propose a Bayesian approach to sparse principal component analysis (PCA). We show that our algorithm, which is based on variational approximation, achieves Bayesian selection consistency. Empirical studies demonstrate the competitive performance of the proposed algorithm.
Issue Date: 2018-03-28
Type: Thesis
URI: http://hdl.handle.net/2142/101264
Rights Information: Copyright 2018 Yunbo Ouyang
Date Available in IDEALS: 2018-09-04
Date Deposited: 2018-05
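
As a minimal sketch, the two-group prior described in the abstract can be written in standard spike-and-slab notation (the symbols $w$, $K$, $\pi_k$, $\mu_k$, and $\sigma_k^2$ are illustrative assumptions, not notation taken from the dissertation):

$$
X_i \mid \theta_i \sim \mathcal{N}(\theta_i, 1), \qquad
\theta_i \sim (1 - w)\,\delta_0 + w \sum_{k=1}^{K} \pi_k\, \mathcal{N}(\mu_k, \sigma_k^2),
$$

where $\delta_0$ is a point mass at zero and the mixing weights $\pi_k$ sum to one. Under a prior of this form, the abstract states that a proper choice of the means and mixing weights yields a posterior that concentrates on balls at the desired minimax rate.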
