Files in this item

File: LIM-THESIS-2019.pdf (3MB)
Description: (no description provided)
Format: PDF (application/pdf)

Description

Title: A hierarchical adaptively boosted in-memory classifier in 6T SRAM
Author(s): Lim, Sungmin
Advisor(s): Shanbhag, Naresh R.
Department / Program: Electrical & Computer Eng
Discipline: Electrical & Computer Engr
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: M.S.
Genre: Thesis
Subject(s): machine learning, mixed-signal accelerator, adaptive boosting, in-memory computing, energy efficiency
Abstract: Emerging machine learning applications such as Internet-of-Things and medical devices require operation on battery-powered platforms. Because machine learning algorithms involve heavy, data-intensive computations, interest in energy-efficient, low-delay machine learning accelerators is growing. Since machine learning applications exhibit a trade-off between energy and accuracy, a scalable architecture offering diverse operating points is desirable. This thesis presents a high-accuracy in-memory realization of the AdaBoost machine learning classifier. The proposed classifier is built on a deep in-memory architecture (DIMA) and employs foreground calibration to compensate for PVT variations and improve task-level accuracy. The proposed architecture switches between a high-accuracy/high-power (HA) mode and a low-power/low-accuracy (LP) mode via soft decision thresholding to provide an elegant energy-accuracy trade-off. The proposed realization achieves a 43X energy-delay product (EDP) reduction over a digital architecture at an iso-accuracy of 95% on the MNIST dataset, a 5% improvement over a previous in-memory implementation of AdaBoost.
Issue Date: 2019-01-25
Type: Text
URI: http://hdl.handle.net/2142/104741
Rights Information: Copyright 2019 Sungmin Lim
Date Available in IDEALS: 2019-08-23
Date Deposited: 2019-05
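
The mode switching described in the abstract can be pictured with a short software analogue. The sketch below is an illustrative assumption, not the thesis's 6T SRAM/DIMA implementation: it models a hierarchical AdaBoost ensemble as two groups of decision stumps and uses a hypothetical confidence threshold tau to decide when the low-power (LP) stage's soft decision is trusted and when the full high-accuracy (HA) ensemble must be evaluated. All names (tau, lp_stage, ha_stage) are invented for illustration.

# Illustrative sketch only: a software analogue of gating between a
# low-power (LP) and a high-accuracy (HA) boosted classifier via a
# soft-decision threshold. The thesis realizes this in SRAM hardware.

import numpy as np

class DecisionStump:
    """A depth-1 weak learner: polarity * sign(x[feature] - threshold)."""
    def __init__(self, feature, threshold, polarity, alpha):
        self.feature = feature      # index of the input feature
        self.threshold = threshold  # split point on that feature
        self.polarity = polarity    # +1 or -1
        self.alpha = alpha          # AdaBoost weight of this weak learner

    def predict(self, x):
        return self.polarity * np.sign(x[self.feature] - self.threshold)

def soft_decision(stumps, x):
    """Weighted sum of weak-learner votes (the AdaBoost margin)."""
    return sum(s.alpha * s.predict(x) for s in stumps)

def classify(x, lp_stage, ha_stage, tau):
    """Evaluate the small LP ensemble first; fall back to the full HA
    ensemble only when the LP soft decision is not confident enough."""
    score = soft_decision(lp_stage, x)
    if abs(score) >= tau:                # confident: stop early (low power)
        return int(np.sign(score)), "LP"
    score += soft_decision(ha_stage, x)  # otherwise spend more energy
    return int(np.sign(score)), "HA"

In this picture, raising tau routes more inputs to the HA stage (higher accuracy at higher energy), while lowering it keeps more decisions in the LP stage, mirroring the scalable operating points described in the abstract.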

