Files in this item

File: Kesler_David.pdf (753kB)
Description: (no description provided)
Format: PDF (application/pdf)
Description

Title: A hardware acceleration technique for gradient descent and conjugate gradient
Author(s): Kesler, David R.
Advisor(s): Kumar, Rakesh
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: M.S.
Genre: Thesis
Subject(s): Gradient Descent
Conjugate Gradient
Hardware Acceleration
Matrix Multiplication
Abstract: Gradient descent, conjugate gradient, and other iterative algorithms are a powerful class of algorithms; however, they can take a long time to converge. Baseline accelerator designs feature insufficient coverage of operations and do not work well on the problems we target. In this thesis we present a novel hardware architecture for accelerating gradient descent and other similar algorithms. To support this architecture, we also present a sparse matrix-vector storage format, and software support for utilizing the format, so that it can be efficiently mapped onto hardware which is also well suited for dense operations. We show that the accelerator design outperforms similar designs which target only the most dominant operation of a given algorithm, providing substantial energy and performance benefits. We further show that the accelerator can be reasonably implemented on a general purpose CPU with small area overhead.
Issue Date: 2011-05-25
URI: http://hdl.handle.net/2142/24241
Rights Information: Copyright 2011 David R. Kesler
Date Available in IDEALS: 2011-05-25
Date Deposited: 2011-05
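
The abstract above identifies the sparse matrix-vector product as the dominant operation of algorithms such as conjugate gradient. As an illustrative aid only, the Python sketch below shows a conjugate gradient solver whose inner kernel is a CSR-style sparse matrix-vector product; it is not taken from the thesis (which proposes its own storage format), and the small example system at the end is hypothetical.

import numpy as np

def csr_matvec(data, indices, indptr, x):
    # Row-by-row sparse matrix-vector product y = A @ x, with A in CSR form.
    y = np.zeros(len(indptr) - 1)
    for row in range(len(indptr) - 1):
        for k in range(indptr[row], indptr[row + 1]):
            y[row] += data[k] * x[indices[k]]
    return y

def conjugate_gradient(data, indices, indptr, b, tol=1e-8, max_iter=1000):
    # Solve A x = b for a symmetric positive-definite A given in CSR form.
    x = np.zeros_like(b, dtype=float)
    r = b - csr_matvec(data, indices, indptr, x)   # initial residual
    p = r.copy()                                   # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = csr_matvec(data, indices, indptr, p)  # the dominant kernel
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Hypothetical example: the 2x2 SPD system [[4, 1], [1, 3]] x = [1, 2] in CSR form.
data    = np.array([4.0, 1.0, 1.0, 3.0])
indices = np.array([0, 1, 0, 1])
indptr  = np.array([0, 2, 4])
print(conjugate_gradient(data, indices, indptr, np.array([1.0, 2.0])))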

