Files in this item:

URIBEMENESES-DISSERTATION-2018.pdf (application/pdf, 5 MB)

Title: Efficient algorithms for distributed learning, optimization and belief systems over networks
Author(s): Uribe Meneses, Cesar Augusto
Director of Research: Basar, Tamer; Nedich, Angelia; Olshevsky, Alex
Doctoral Committee Chair(s): Basar, Tamer
Doctoral Committee Member(s): Srikant, Rayadurgam; Raginsky, Maxim
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Distributed Optimization
Distributed Learning
Optimal Algorithms
Belief Systems
Opinion Dynamics
Algorithm Analysis
Network Science
Optimization over Graphs
Convergence Rates
Non-asymptotic Analysis
Wasserstein Barycenters
Accelerated Algorithms
Abstract: A distributed system is composed of independent agents, machines, or processing units whose interactions are usually constrained by a network structure. In contrast to centralized approaches, where all information and computational resources are available at a single location, agents in a distributed system can only use locally available information. The flexibility afforded by a distributed structure makes it suitable for large-scale problems involving large quantities of data. Specifically, the increasing amount of data generated by inherently distributed systems such as social media, sensor networks, and cloud-based databases has brought considerable attention to distributed data-processing techniques on several fronts of applied and theoretical machine learning, robotics, and resource allocation, among many others. As a result, much effort has been put into the design of efficient distributed algorithms that take communication constraints into account and make coordinated decisions in a fully distributed manner. In this dissertation, we focus on the principled design and analysis of distributed algorithms for optimization, learning, and belief systems over networks. In particular, we are interested in the non-asymptotic analysis of various distributed algorithms and in the explicit influence of the topology of the network over which the problems are solved. Initially, we analyze a recently proposed model for opinion dynamics in belief systems with logic constraints. Opinion dynamics are a natural model for a distributed system and serve as an introductory topic for the further study of learning and optimization over networks. We assume there is an underlying structure of social relations, represented by a social network, and that people in this social group interact by exchanging opinions about a number of truth statements.
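Networked opinion exchange of this kind can be sketched with a simple simulation. The sketch below assumes a DeGroot-style averaging update in which a row-stochastic matrix W encodes social influence and a row-stochastic matrix C couples the topics; all matrices and initial opinions are illustrative placeholders, not data from the dissertation, and the model analyzed there may differ in detail:

```python
import numpy as np

# Hypothetical multidimensional DeGroot-style update with topic coupling.
# W: row-stochastic social-influence matrix (3 agents), C: row-stochastic
# matrix coupling the 2 topics under discussion. Both are illustrative.
W = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])
C = np.array([[0.8, 0.2],
              [0.1, 0.9]])

# X[i, t] = opinion of agent i on topic t, initialized arbitrarily in [0, 1]
X = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])

for _ in range(200):
    X = W @ X @ C.T   # social averaging across agents, then topic coupling

# with W and C primitive row-stochastic matrices, opinions reach consensus
```

Because W and C are primitive and row-stochastic, iterating the update drives every agent's opinion on every topic to a common limit; the convergence rate is governed by the second-largest eigenvalue moduli of W and C, which is the kind of topology-dependent quantity the dissertation makes explicit.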
We analyze this belief system from a graph-theoretic point of view when a set of logic constraints relates the opinions on the several topics being discussed. We provide novel graph-theoretic conditions for convergence, along with explicit estimates of the convergence rate and of the limiting opinions of all agents in the network, in terms of the topology of the agents' social structure and the topology induced by the set of logic constraints. We derive explicit dependencies for a number of well-known graph topologies. We then shift our attention to the distributed learning problem of cooperative inference, where a group of agents interacts over a network and seeks to estimate a joint parameter that best explains a set of network-wide observations using only local information. Again, we assume there is an underlying network that defines the communication constraints between the agents, and we derive explicit, non-asymptotic, geometric convergence rates for the concentration of beliefs on the optimal parameter. For the case of a finite number of hypotheses, we propose distributed learning algorithms for time-varying undirected graphs and time-varying directed graphs, as well as a new acceleration scheme for fixed undirected graphs. For each of these network structures, we present explicit dependencies for the worst-case network topology. Furthermore, we extend these belief-concentration results to hypothesis sets that are compact subsets of the real numbers, under a simplifying assumption of a static undirected network. Moreover, we present a generic distributed parameter estimation algorithm for observational models belonging to the exponential family of distributions. We further extend distributed mean estimation from Gaussian observations to time-varying directed networks.
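A common rule for cooperative inference with finitely many hypotheses is the log-linear ("geometric averaging") update: each agent mixes its neighbors' beliefs in log-space and then reweights by its own local likelihood. The sketch below illustrates that family under assumed Bernoulli observation models and a complete-graph mixing matrix; the network, likelihoods, and constants are hypothetical, and the dissertation's algorithms (including the time-varying and accelerated variants) may differ:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 4, 3                      # 4 agents, 3 candidate hypotheses
W = np.full((n, n), 1.0 / n)     # doubly stochastic mixing (complete graph)

# Assumed observation model: agent i sees Bernoulli signals with
# L[i, k] = P(signal = 1 | hypothesis k); signals come from hypothesis 2.
L = np.array([[0.2, 0.5, 0.8]] * n)
true_k = 2

beliefs = np.full((n, m), 1.0 / m)   # uniform initial beliefs

for _ in range(300):
    s = rng.random(n) < L[:, true_k]            # private observations
    lik = np.where(s[:, None], L, 1.0 - L)      # local likelihood of each hypothesis
    log_b = W @ np.log(beliefs) + np.log(lik)   # mix in log-space, then update
    beliefs = np.exp(log_b - log_b.max(axis=1, keepdims=True))
    beliefs /= beliefs.sum(axis=1, keepdims=True)

# beliefs of every agent concentrate on the true hypothesis
```

The concentration is geometrically fast, with a rate driven by the Kullback-Leibler divergence between the observation models of the hypotheses; the non-asymptotic bounds in the dissertation also expose the dependence on the (possibly time-varying) network topology.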
The graph-theoretic analysis of belief systems with logic constraints and distributed learning for cooperative inference are specific instances of convex optimization problems in which the objective function decomposes as a sum of convex functions. In particular, these problems assume that each of the summands is held by a node on a graph and that the agents are oblivious to the network topology. As a final object of interest, we study the optimality of first-order distributed optimization algorithms for general convex optimization problems. We focus on understanding the fundamental limits induced by the distributed, networked structure of the problem and how they compare with the hypothetical case in which centralized computation is available. We show that, for large classes of convex optimization problems, we can design optimal algorithms that can be executed over a network in a fully distributed manner while matching the lower complexity bounds of their centralized counterparts, with an additional iteration cost that depends on the network structure. We design optimal distributed algorithms for various convexity and smoothness properties that can be executed over arbitrary fixed, connected, undirected graphs. Furthermore, we explore the application of these distributed algorithms to the distributed computation of Wasserstein barycenters of finite distributions. Finally, we discuss some future directions of research for the design and analysis of distributed algorithms, from both theoretical and applied perspectives.
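As a minimal illustration of first-order distributed optimization over a fixed, connected, undirected graph, the basic (non-accelerated) decentralized gradient method for minimizing a sum of convex functions interleaves gossip averaging with local gradient steps. This sketch is the simplest member of the family, not the optimal algorithms designed in the dissertation, and the graph, objective, and step size are all hypothetical:

```python
import numpy as np

# Decentralized gradient descent for min_x sum_i f_i(x), where agent i
# privately holds f_i(x) = 0.5 * (x - b_i)^2; the global minimizer is mean(b).
W = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])   # doubly stochastic mixing, 3-agent ring
b = np.array([1.0, 4.0, 7.0])    # local data; global optimum is mean(b) = 4
x = np.zeros(3)                  # each agent's local estimate of the optimum
alpha = 0.05                     # constant step size

for _ in range(2000):
    grad = x - b                 # local gradient of f_i at the local iterate
    x = W @ x - alpha * grad     # gossip averaging + local gradient step

# all agents approach the global minimizer, up to an O(alpha) consensus bias
```

With a constant step size this method only reaches a neighborhood of the optimum whose radius depends on alpha and on the spectral gap of W; closing that gap versus centralized lower complexity bounds, at an extra per-iteration cost set by the network, is precisely the optimality question the dissertation addresses.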
Issue Date: 2018-07-09
Rights Information: Copyright 2018 Cesar A Uribe Meneses
Date Available in IDEALS: 2018-09-27
Date Deposited: 2018-08
