Files in this item

WU-THESIS-2020.pdf (application/pdf, 2 MB) — restricted to U of Illinois
Title: An adaptive pruning algorithm for DNN compression
Author(s): Wu, Jiaying
Advisor(s): Kindratenko, Volodymyr
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Abstract: In recent years, deep neural networks have achieved remarkable results in artificial intelligence tasks such as image recognition and machine translation. Although deep neural networks achieve state-of-the-art accuracy on these tasks, that accuracy comes with high computational complexity and a large, over-parameterized network. These costs limit the use of deep neural networks on low-power and mobile platforms. Techniques such as pruning, quantization, and binarization have been developed to reduce network parameter size. Pruning is one of the best-studied methods for addressing over-parameterization and reducing computational complexity. A typical pruning process has three stages: train the network, remove redundant weights according to some criterion, and fine-tune the reduced network. However, this three-stage method is time-consuming, and fine-tuning often cannot fully compensate for pruned neurons that were actually important. In this thesis, we propose a new algorithm that merges the removal and fine-tuning stages into the training stage. The new algorithm reduces the time complexity of the pruning process and makes incorrectly pruned neurons recoverable without significant loss of accuracy. We also explore how different pruning-criterion calculations, weight-update methods, and pruning-rate schedules affect the performance of the pruning algorithm.
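The merged train-and-prune loop described in the abstract can be sketched as follows. This is a hypothetical NumPy illustration using simple magnitude pruning, not the thesis's actual algorithm or criterion; the point it shows is that the dense weights keep receiving gradient updates, so a weight masked at one step can grow back and be recovered at a later step.

```python
import numpy as np

def magnitude_mask(w, prune_rate):
    """Binary mask that zeroes the smallest-magnitude fraction of weights."""
    k = int(prune_rate * w.size)  # number of weights to prune this step
    if k == 0:
        return np.ones_like(w)
    # k-th smallest absolute value becomes the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return (np.abs(w) > thresh).astype(w.dtype)

def pruned_training_step(w, grad, lr, prune_rate):
    """One training step with pruning folded in.

    The dense weights `w` are always updated, so a weight masked at this
    step is not lost: if gradients later make it large again, it re-enters
    the mask and is effectively 'recovered'.
    """
    w = w - lr * grad               # update the dense (unmasked) weights
    mask = magnitude_mask(w, prune_rate)
    return w, w * mask              # effective weights for the forward pass
```

Because the mask is recomputed from the current dense weights at every step, an incorrectly pruned connection can rejoin the network as soon as its magnitude grows past the threshold, which is what makes the merged approach recoverable, unlike a one-shot three-stage pipeline.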
Issue Date: 2020-05-12
Rights Information: Copyright 2020 Jiaying Wu
Date Available in IDEALS: 2020-08-26
Date Deposited: 2020-05
