Files in this item

File: SP20-ECE499-Thesis-Harisrikanth, Keshav.pdf (1MB)
Description: (no description provided)
Format: application/pdf (PDF)
Access: Restricted to U of Illinois
Description

Title: LSTM for T-DLA+: Efficient computation of quantized LSTM networks
Author(s): Harisrikanth, Keshav
Contributor(s): Chen, Deming
Subject(s): FPGA
Embedded
Accelerator architectures
Reconfigurable architectures
LSTM
Quantization
Abstract: Neural networks are computationally complex and can be extremely resource intensive. This limits their usability in contexts where only small amounts of hardware are deployed on low power budgets. One key way to significantly reduce the computational cost of a neural network is quantization, in which the values throughout the network are represented with fewer bits. A ternarized network is one in which every weight has been quantized to three values: +1, -1, and 0 (see the sketch after the metadata below). Past work has shown that, despite their simple weights, ternarized neural networks can come much closer to the accuracy of full floating-point networks than might be expected. To further extract computational efficiency from these networks, we have designed and analyzed DNN acceleration on embedded FPGAs by building a ternarized deep neural network coprocessor with a custom-designed ISA. We previously built a basic ternarized neural network accelerator capable of basic CNN computation. The key improvement of this design over past work is its support for LSTM operations and its efficient hardware dedicated to LSTM computation. This continuing work faces the significant challenge of designing an ISA that is general enough for any task, yet specific enough to execute well in hardware while keeping code density low. This thesis focuses primarily on the LSTM hardware computation units themselves.
Issue Date: 2020-05
Genre: Other
Type: Text
Language: English
URI: http://hdl.handle.net/2142/107274
Date Available in IDEALS: 2020-06-12
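
To make the ternarization idea in the abstract concrete, the following is a minimal Python sketch, assuming a simple threshold rule; the threshold value delta and the function names ternarize and ternary_matvec are illustrative assumptions, not the accelerator's actual design. It shows why ternary weights are cheap to compute: a matrix-vector product reduces to additions and subtractions, with no multiplications.

import numpy as np

def ternarize(w, delta=0.05):
    """Map each weight to -1, 0, or +1 by thresholding its magnitude.
    The threshold delta is an illustrative assumption."""
    t = np.zeros(w.shape, dtype=np.int8)
    t[w > delta] = 1
    t[w < -delta] = -1
    return t

def ternary_matvec(t, x):
    """Matrix-vector product with ternary weights.

    Each output element is the sum of inputs selected by +1 weights minus
    the sum selected by -1 weights: no multiplications are needed, which
    is what makes ternarized networks attractive for small FPGA hardware.
    """
    pos = (t == 1)
    neg = (t == -1)
    return np.array([x[pos[i]].sum() - x[neg[i]].sum() for i in range(t.shape[0])])

# Example: ternarize a random weight matrix and apply it to an input vector.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
y = ternary_matvec(ternarize(W), x)
print(y)  # matches ternarize(W) @ x computed with ordinary multiplies

The same multiplier-free reduction applies to each of the weight matrices inside an LSTM cell's gate computations, which is the setting the thesis targets.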

