Files in this item

CHEN-THESIS-2020.pdf (application/pdf, 5MB) — Restricted to U of Illinois
Title:Synthesizing a complete tomographic study with nodules from multiple radiograph views via deep generative models
Author(s):Chen, Andrew
Advisor(s):Koyejo, Oluwasanmi
Department / Program:Electrical & Computer Eng
Discipline:Electrical & Computer Engr
Degree Granting Institution:University of Illinois at Urbana-Champaign
Keyword(s):machine learning; generative modeling
Abstract:Chest radiography encodes 3D anatomy into a complex 2D representation. This projection creates distinctive challenges even for the most experienced radiologists, as many critical findings are superimposed, often resulting in error or further imaging. In particular, it is difficult to assess the volume and density of lung nodules in chest radiographs. A deep generative model, commonly used to synthesize realistic images, can instead be used to perform 2D-to-3D translation. In this thesis, we propose a generative model, optimized using a pixel-wise error, that synthesizes a complete tomographic study containing nodules from frontal and lateral chest radiographs. Additionally, the generated studies maintain the proper chest cavity structure.
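The abstract's core ideas can be sketched minimally: a 3D volume relates to its frontal and lateral radiographs roughly as intensity integrals along two orthogonal axes, and a pixel-wise error compares a generated volume to a target voxel by voxel. The sketch below is a hypothetical illustration, not the thesis's actual model or data; the projection-by-summation and the specific array shapes are assumptions for demonstration only.

```python
import numpy as np

def project(volume: np.ndarray, axis: int) -> np.ndarray:
    """Simulate a radiograph-like view by integrating voxel
    intensities along one axis of the volume."""
    return volume.sum(axis=axis)

def pixelwise_mse(generated: np.ndarray, target: np.ndarray) -> float:
    """Pixel-wise squared error averaged over all voxels/pixels,
    the kind of reconstruction loss the abstract refers to."""
    return float(np.mean((generated - target) ** 2))

# Stand-in tomographic study (random voxels, 64^3).
rng = np.random.default_rng(0)
ct = rng.random((64, 64, 64))

frontal = project(ct, axis=0)   # one 2D view
lateral = project(ct, axis=1)   # the orthogonal 2D view

# A perfect reconstruction has zero pixel-wise loss; any deviation
# from the target volume increases it.
assert pixelwise_mse(ct, ct) == 0.0
assert frontal.shape == (64, 64) and lateral.shape == (64, 64)
```

In practice, a learned generator would map the two 2D views back to a 3D volume and be trained by minimizing such a pixel-wise loss against ground-truth CT; the snippet only makes the loss and the 2D/3D relationship concrete.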
Issue Date:2020-07-09
Rights Information:Copyright 2020 Andrew Chen
Date Available in IDEALS:2020-10-07
Date Deposited:2020-08
