A Theoretical Basis for Model Collapse in Recursive Training
Borkar, Vivek S.
Permalink
https://hdl.handle.net/2142/130253
Description
Issue Date
2025-09-17
Keyword(s)
Generative models
Recursive training
Model collapse
Convergence of probability measures
Martingale convergence theorem
Abstract
It is known that recursive training from generative models can lead to the so-called ‘collapse’ of the simulated probability distribution. This note shows that one in fact obtains two different asymptotic behaviours, depending on whether an external source, however minor, is also contributing samples.
Publisher
Allerton Conference on Communication, Control, and Computing
Series/Report Name or Number
2025 61st Allerton Conference on Communication, Control, and Computing Proceedings
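
The contrast described in the abstract can be illustrated with a minimal simulation sketch (not taken from the paper): a Gaussian model is repeatedly re-fit to samples drawn from its own previous generation, optionally mixed with a small fraction of samples from a fixed external source. The toy Gaussian setting, the mixing fraction eps, the sample sizes, and the helper function run are all illustrative assumptions, chosen only to show the qualitative difference between the two regimes.

import numpy as np

rng = np.random.default_rng(0)
TRUE_MU, TRUE_SIGMA = 0.0, 1.0          # fixed external ("real") source
N_SAMPLES, N_GENERATIONS = 100, 1000    # illustrative choices

def run(eps):
    """Recursive re-fitting of a Gaussian; eps is the fraction of samples
    drawn from the external source at every generation (an assumption of
    this sketch, not the paper's construction)."""
    mu, sigma = TRUE_MU, TRUE_SIGMA
    for _ in range(N_GENERATIONS):
        n_real = int(eps * N_SAMPLES)
        real = rng.normal(TRUE_MU, TRUE_SIGMA, n_real)
        synthetic = rng.normal(mu, sigma, N_SAMPLES - n_real)
        data = np.concatenate([real, synthetic])
        mu, sigma = data.mean(), data.std()   # re-fit the model to the mixed sample
    return sigma

# Purely recursive training: the fitted scale tends to shrink toward zero.
print("eps = 0.00, final sigma:", run(0.00))
# Even a small external contribution tends to keep the fitted scale near TRUE_SIGMA.
print("eps = 0.05, final sigma:", run(0.05))

In this toy setting, the logarithm of the fitted scale under purely recursive fitting behaves like a random walk with downward drift, which is the collapse regime; the small external fraction supplies a restoring pull toward the true scale, giving the second asymptotic behaviour the abstract refers to.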