DiMO: Differentiable Model Optimization
Brando Miranda
Description
- Title
- DiMO: Differentiable Model Optimization
- Author(s)
- Brando Miranda
- Issue Date
- 2020-03-01
- Keyword(s)
- meta-learning
- learning to learn
- NAS
- automl
- machine learning
- Abstract
- I am interested in automated methods for model selection: architecture selection, hyperparameter selection, and optimizer selection. To this end, I am developing DiMO (Differentiable Model Optimization), a novel technique for 1) model optimization (i.e., model selection for a fixed dataset by improving the selected architecture, hyperparameters, and custom optimizers) and 2) meta-learning (i.e., automatic selection of model architectures for new tasks not observed during training). The main idea of DiMO is to develop a framework, flexible enough for both meta-learning and model optimization, that models the entire machine learning pipeline and minimizes that objective directly with minimal hacks and tricks. The hope is that by optimizing the objective directly, with minimal modifications, the constructed models can perform optimally and the search can be efficient, thanks to its differentiable formulation.
- Type of Resource
- text
- Language
- en
- Permalink
- http://hdl.handle.net/2142/109086
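The abstract does not spell out DiMO's actual formulation, but the core idea of making model selection differentiable can be illustrated with a DARTS-style continuous relaxation: replace a discrete choice among candidate operations with a softmax-weighted mixture, then follow gradients on the mixture logits. Below is a minimal pure-Python sketch of that idea; the names (`ops`, `alpha`) and the two toy candidate operations are hypothetical illustrations, not the thesis's implementation.

```python
import math

# Sketch of differentiable model selection: relax a discrete choice
# between candidate operations into a softmax-weighted mixture and
# optimize the mixture logits by gradient descent.

ops = [lambda x: x, lambda x: x * x]  # two candidate "architectures" (illustrative)

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy data generated by the quadratic candidate, so the relaxation
# should learn to put nearly all the mixture weight on ops[1].
data = [(x, x * x) for x in (0.5, 1.0, 1.5, 2.0)]

alpha = [0.0, 0.0]  # architecture logits (the continuous relaxation)
lr = 0.5
for _ in range(200):
    w = softmax(alpha)
    # dL/dw_k for L = mean squared error of the mixture prediction
    grad_w = [0.0, 0.0]
    for x, y in data:
        pred = sum(wk * op(x) for wk, op in zip(w, ops))
        err = 2.0 * (pred - y) / len(data)
        for k, op in enumerate(ops):
            grad_w[k] += err * op(x)
    # Chain rule through softmax: dw_j/dalpha_k = w_j * ((j == k) - w_k)
    grad_alpha = [
        sum(grad_w[j] * w[j] * ((j == k) - w[k]) for j in range(len(alpha)))
        for k in range(len(alpha))
    ]
    alpha = [a - lr * g for a, g in zip(alpha, grad_alpha)]

weights = softmax(alpha)  # weights[1] should dominate after training
```

Because the relaxed objective is differentiable end to end, the "search" over candidates reduces to ordinary gradient descent rather than a discrete enumeration, which is the efficiency argument the abstract makes.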