ISR: INVARIANT SUBSPACE RECOVERY
Si, Haozhi
Permalink
https://hdl.handle.net/2142/124978
Description
- Title
- ISR: INVARIANT SUBSPACE RECOVERY
- Author(s)
- Si, Haozhi
- Issue Date
- 2021-12-01
- Keyword(s)
- Domain generalization, Invariant Risk Minimization (IRM)
- Abstract
- Domain generalization asks that models trained on a set of training environments perform well on unseen test environments. Recently, a series of algorithms such as Invariant Risk Minimization (IRM) and its follow-up works have been proposed for domain generalization. These algorithms are empirically successful; however, this success holds only under certain assumptions. The Risks of Invariant Risk Minimization (Rosenfeld et al., 2021) shows that IRM and its alternatives cannot generalize to unseen environments with o(d_s) training environments, where d_s is the dimension of the spurious feature space. Meanwhile, these algorithms are computationally costly given their complex optimization objectives. In this paper, we propose a novel algorithm, Invariant Subspace Recovery (ISR), that achieves provable domain generalization under Gaussian data models with O(d_s) training environments. Notably, unlike IRM and its alternatives, our algorithm has a global convergence guarantee without any non-convexity issue. By making assumptions on the second-order moments of the data distribution, we further propose an algorithm that works with O(1) training environments. Our experimental results on both synthetic and real-world image data show that applying ISR to features as a post-processing step can increase the accuracy of neural models in unseen test domains.
- Type of Resource
- text
- Language
- eng
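
The abstract describes applying ISR to learned features as a post-processing step. Below is a minimal, hedged sketch of one mean-based reading of that idea, assuming binary labels and pre-extracted features per training environment: directions along which the class-conditional means vary across environments are treated as spurious, and a simple classifier is refit on the remaining (invariant) subspace. This is not the thesis implementation; the names `isr_mean_fit`, `isr_predict`, the argument `d_spurious`, and the logistic-regression head are illustrative assumptions.

```python
# Illustrative sketch only (assumptions noted above), not the thesis code.
import numpy as np
from sklearn.linear_model import LogisticRegression

def isr_mean_fit(features_by_env, labels_by_env, d_spurious):
    """Recover a candidate invariant subspace from per-environment class means.

    features_by_env: list of (n_e, d) feature arrays, one per training environment
    labels_by_env:   list of (n_e,) binary label arrays
    d_spurious:      assumed dimension d_s of the spurious feature subspace
    """
    # Stack the class-1 conditional mean of each environment: shape (n_envs, d).
    means = np.stack([
        X[y == 1].mean(axis=0) for X, y in zip(features_by_env, labels_by_env)
    ])
    # Directions along which these means vary across environments are treated
    # as spurious; the remaining right-singular vectors span the candidate
    # invariant subspace.
    centered = means - means.mean(axis=0, keepdims=True)
    _, _, Vt = np.linalg.svd(centered, full_matrices=True)
    P = Vt[d_spurious:]                      # (d - d_s, d) projection matrix
    # Refit a simple classifier on the projected (invariant) features.
    X_all = np.concatenate(features_by_env)
    y_all = np.concatenate(labels_by_env)
    clf = LogisticRegression(max_iter=1000).fit(X_all @ P.T, y_all)
    return P, clf

def isr_predict(P, clf, X_test):
    # Project test features onto the recovered subspace before predicting.
    return clf.predict(X_test @ P.T)
```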
Owning Collections
Senior Theses - Electrical and Computer Engineering (primary)
The best of ECE undergraduate research