Files in this item

File:DiMO_research_Statement.pdf (117kB)
Description:(no description provided)
Format:PDF (application/pdf)

Description

Title:DiMO: Differentiable Model Optimization
Author(s):Brando Miranda
Subject(s):meta-learning
learning to learn
NAS
automl
machine learning
Abstract:I am interested in automated methods for model selection: architecture selection, hyperparameter selection, and optimizer selection. To this end, I am developing DiMO (Differentiable Model Optimization), a novel technique for 1) model optimization (i.e., model selection for a fixed dataset by improving the selected architecture, hyperparameters, and custom optimizers) and 2) meta-learning (i.e., automatic selection of model architectures for new tasks not observed during training). The main idea of DiMO is to develop a framework, flexible enough for both meta-learning and model optimization, that models the entire machine learning pipeline and minimizes that objective directly with minimal hacks and tricks. The hope is that by optimizing the objective directly, with minimal modifications, the constructed models can achieve optimal performance and the search can be efficient, thanks to its differentiable formulation (see the illustrative sketch below).
Issue Date:2020-03-01
Genre:Proposal
Type:Text
Language:English
URI:http://hdl.handle.net/2142/109086
Date Available in IDEALS:2020-12-14
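
The following is a minimal, hypothetical sketch of the differentiable formulation described in the abstract, not the proposal's actual DiMO code: a model-selection quantity (here, the learning rate of the inner optimizer) is treated as a differentiable parameter, a few SGD steps are unrolled, and the validation loss of the resulting pipeline is minimized directly by gradient descent. The toy task and all names below are illustrative assumptions.

# Hypothetical illustration only; not the author's DiMO implementation.
import torch

torch.manual_seed(0)

# Toy regression splits: the inner loop trains on (X_tr, y_tr); the
# validation loss on (X_va, y_va) is the outer objective we differentiate.
X_tr, y_tr = torch.randn(64, 3), torch.randn(64, 1)
X_va, y_va = torch.randn(64, 3), torch.randn(64, 1)

# The hyperparameter made differentiable: the log learning rate of inner SGD.
log_lr = torch.zeros(1, requires_grad=True)
outer_opt = torch.optim.Adam([log_lr], lr=1e-2)

for _ in range(100):
    # Fresh inner model each outer step; updates are kept on the autograd
    # graph so gradients flow back to log_lr through the unrolled training.
    w = torch.zeros(3, 1, requires_grad=True)
    lr = log_lr.exp()
    for _ in range(5):  # unrolled inner SGD steps
        train_loss = ((X_tr @ w - y_tr) ** 2).mean()
        (g,) = torch.autograd.grad(train_loss, w, create_graph=True)
        w = w - lr * g  # differentiable update
    val_loss = ((X_va @ w - y_va) ** 2).mean()
    outer_opt.zero_grad()
    val_loss.backward()  # d(val_loss)/d(log_lr) through the whole pipeline
    outer_opt.step()

print(f"learned inner lr: {log_lr.exp().item():.4f}, val loss: {val_loss.item():.4f}")

The same pattern extends, in principle, to architecture and optimizer choices by making them differentiable (e.g., via continuous relaxations), which is the kind of end-to-end objective the abstract describes.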

