Files in this item

application/pdf HUYNH-THESIS-2021.pdf (461 kB)

Title: Inference in Ising models by graph neural networks with structural features
Author(s): Huynh, Hieu Tri
Advisor(s): Lu, Yi
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Inference, Probabilistic graphical models, Graph neural networks
Abstract: Probabilistic graphical models (PGMs) are powerful frameworks for modeling interactions between random variables. The two major inference tasks on PGMs are marginal probability inference and maximum-a-posteriori (MAP) inference. Exact inference on PGMs is intractable, so approximation algorithms such as belief propagation (BP) have been proposed for practical applications. Recently, graph neural networks (GNNs) have been shown to outperform BP on small-scale loopy graphs. A GNN computes a more general function on each node using neural networks and learns the exact distribution of small loop-free and loopy graphs. Because BP is exact on loop-free graphs and graphs with exactly one loop, GNNs perform worse than BP on these graphs, but outperform BP on graphs with more loops, where BP's performance degrades. We propose a simplified GNN architecture, GNN-Mimic-BP, which outperforms GNN by orders of magnitude on loop-free graphs; the simplification enables the architecture to mimic BP exactly on loop-free graphs. We then combine the simplified architecture with enhanced information about short loops in the graph. The resulting architecture outperforms the original GNN both on classic graphs ranging from loop-free to complete and on random graphs with a wide range of edge density.
Issue Date: 2021-07-22
Rights Information: Copyright 2021 Hieu Huynh
Date Available in IDEALS: 2022-01-12
Date Deposited: 2021-08
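The abstract contrasts belief propagation with learned GNN inference on Ising models. As a concrete reference point for the baseline the thesis builds on, here is a minimal sketch of sum-product belief propagation for marginal inference on a pairwise Ising model. This is illustrative only, not code from the thesis; all function and variable names are assumptions.

```python
import math

def bp_marginals(n, edges, J, h, iters=50):
    """Sum-product belief propagation on the Ising model
    p(x) ∝ exp(Σ_ij J_ij x_i x_j + Σ_i h_i x_i), with x_i ∈ {-1, +1}.
    Exact on loop-free graphs; approximate on loopy graphs."""
    states = (-1, 1)
    nbrs = {i: [] for i in range(n)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    # Symmetric coupling lookup for both edge directions.
    Jd = {}
    for (i, j), v in J.items():
        Jd[(i, j)] = v
        Jd[(j, i)] = v
    # msg[(i, j)][s]: message from node i to node j, evaluated at x_j = s.
    msg = {d: {s: 1.0 for s in states}
           for e in edges for d in (e, e[::-1])}
    for _ in range(iters):
        new = {}
        for i, j in msg:
            out = {}
            for xj in states:
                total = 0.0
                for xi in states:
                    # Local factor times all incoming messages except from j.
                    val = math.exp(Jd[(i, j)] * xi * xj + h[i] * xi)
                    for k in nbrs[i]:
                        if k != j:
                            val *= msg[(k, i)][xi]
                    total += val
                out[xj] = total
            z = out[-1] + out[1]
            new[(i, j)] = {s: out[s] / z for s in states}
        msg = new  # synchronous update
    # Node beliefs: local field times all incoming messages, normalized.
    marg = []
    for i in range(n):
        b = {}
        for xi in states:
            val = math.exp(h[i] * xi)
            for k in nbrs[i]:
                val *= msg[(k, i)][xi]
            b[xi] = val
        z = b[-1] + b[1]
        marg.append({s: b[s] / z for s in states})
    return marg
```

On a tree, the synchronous updates reach a fixed point after a number of iterations equal to the graph diameter, and the resulting beliefs are the exact marginals; on loopy graphs they are the usual loopy-BP approximation that the thesis's GNN architectures are compared against.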
