Files in this item

MARTINI-THESIS-2020.pdf (application/pdf, 3 MB) [Restricted Access]


Title: Massively parallel message passing on a GPU for graphical model inference
Author(s): Martini, Amr Mamoun
Advisor(s): Schwing, Alex G
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): graphical models; statistical inference; computational inference; coordinate descent
Abstract: Graphical model inference is fundamental to many problems across disciplines. However, its combinatorial nature makes it computationally challenging. For more effective inference, message passing algorithms that expose significant parallelism have been implemented to exploit graphics processing units (GPUs), albeit often tackling only specific graphical model structures such as directed acyclic graphs (DAGs), grids, uniform state spaces, and pairwise models. These implementations all emphasize the importance of load balancing irregular graphs in order to fully utilize GPU parallelism, yet they do not formalize the problem and instead offer ad hoc solutions. In contrast, we formalize load balancing of message passing for general, irregular graphs as a minimax problem and develop an algorithm to solve it efficiently. We show that our implementation permits scaling of message passing to meet the demands of current problems of interest in machine learning and computer vision, achieving significant speedups over the state of the art.
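To make the abstract's minimax objective concrete: one common way to read "load balancing as a minimax problem" is as partitioning per-message workloads across GPU thread blocks so that the maximum per-block load is minimized (a makespan objective). The sketch below is an illustrative greedy longest-processing-time (LPT) heuristic for that formulation; it is an assumption for exposition, not the thesis's actual algorithm, and the cost model (one scalar cost per message) is hypothetical.

```python
import heapq

def balance_messages(costs, num_blocks):
    """Greedy LPT partition: assign each message (with an estimated cost,
    e.g. proportional to its factor's state-space size) to the currently
    least-loaded block, approximately minimizing the maximum block load."""
    # Min-heap of (current load, block id) so the lightest block is on top.
    heap = [(0, b) for b in range(num_blocks)]
    heapq.heapify(heap)
    assignment = {}
    # Place heaviest messages first; each goes to the least-loaded block.
    for msg, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        load, b = heapq.heappop(heap)
        assignment[msg] = b
        heapq.heappush(heap, (load + cost, b))
    max_load = max(load for load, _ in heap)
    return assignment, max_load
```

For example, costs {9, 7, 5, 4, 3} split over two blocks yields block loads 13 and 15, so the minimax value reported is 15. LPT is only an approximation to the exact minimax optimum, but it illustrates why an irregular graph (messages of very different sizes) needs an explicit balancing step before GPU kernels can be launched with uniform work per block.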
Issue Date: 2020-07-22
Rights Information: Copyright 2020 Amr Martini
Date Available in IDEALS: 2020-10-07
Date Deposited: 2020-08
