MACON: memory-augmented continual learning for open-world classification
Lyu, Weijie
Permalink
https://hdl.handle.net/2142/120293
Description
Issue Date
2023-04-20
Director of Research (if dissertation) or Advisor (if thesis)
Hoiem, Derek
Department of Study
Computer Science
Discipline
Computer Science
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
Continual Learning
Open-world Recognition
Open-vocabulary Classification
Memory-augmented Neural Network
Language
eng
Abstract
Emerging concepts and rapidly changing environments call for AI models that can quickly adapt to new scenarios without losing their inherited capabilities. Large foundation models such as CLIP provide a strong zero-shot baseline in the open-vocabulary classification setting. However, the massive scale of their training data makes re-training impractical, and fine-tuning risks sacrificing zero-shot performance. We introduce Memory-Augmented CONtinual learning (MACON), a novel framework for open-world continual learning. The core idea is to augment foundation models such as CLIP with an external memory that supplies context and flexibility for better decision-making and prevents catastrophic forgetting. We propose several memory retrieval methods tailored to different continual learning scenarios. Our results show that MACON adapts quickly, forgets minimally, and generalizes robustly, making it suitable for a wide range of open-world applications.
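
The sketch below illustrates the general idea described in the abstract: pairing a frozen CLIP-style encoder's zero-shot predictions with a simple episodic memory queried by cosine similarity. The class EpisodicMemory, the blending weight alpha, and the nearest-neighbor retrieval are illustrative assumptions for exposition only, not the actual MACON retrieval methods from the thesis.

import torch
import torch.nn.functional as F

class EpisodicMemory:
    # Illustrative sketch, not the MACON implementation: stores
    # (embedding, label) pairs and retrieves neighbors by cosine similarity.

    def __init__(self, dim: int):
        self.keys = torch.empty(0, dim)   # stored image embeddings
        self.labels: list[str] = []       # class name for each stored key

    def write(self, embeddings: torch.Tensor, labels: list[str]) -> None:
        # Normalize so cosine similarity reduces to a dot product.
        self.keys = torch.cat([self.keys, F.normalize(embeddings, dim=-1)])
        self.labels.extend(labels)

    def retrieve(self, query: torch.Tensor, k: int = 5):
        # Return similarities and labels of the k most similar stored examples.
        query = F.normalize(query, dim=-1)
        sims = query @ self.keys.T                        # (batch, memory_size)
        topk = sims.topk(min(k, self.keys.shape[0]), dim=-1)
        neighbors = [[self.labels[i] for i in row] for row in topk.indices.tolist()]
        return topk.values, neighbors

def classify(query_emb, memory, zero_shot_logits, class_names, alpha=0.5, k=5):
    # Blend frozen zero-shot logits with similarity-weighted votes from memory,
    # so new classes can be learned by writing to memory without fine-tuning.
    sims, neighbors = memory.retrieve(query_emb, k=k)
    memory_logits = torch.zeros_like(zero_shot_logits)
    for b, (row_sims, row_labels) in enumerate(zip(sims, neighbors)):
        for s, label in zip(row_sims, row_labels):
            if label in class_names:
                memory_logits[b, class_names.index(label)] += s
    return alpha * zero_shot_logits + (1 - alpha) * memory_logits

In this kind of design, continual adaptation amounts to appending new examples to the memory while the foundation model's weights stay frozen, which is one way to preserve zero-shot behavior and avoid catastrophic forgetting.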