Sub-JEPA: Subspace Gaussian Regularization for Stable End-to-End World Models
Official implementation of the paper "Sub-JEPA: Subspace Gaussian Regularization for Stable End-to-End World Models".
Overview
Joint-Embedding Predictive Architectures (JEPAs) provide a simple framework for learning world models by predicting future latent states. However, JEPA training is prone to representation collapse without sufficient structural constraints. Sub-JEPA relaxes the global constraints used in previous methods such as LeWM by applying Gaussian regularization across multiple random subspaces rather than in the original high-dimensional embedding space. This yields a better trade-off between training stability and representation quality in continuous-control environments.
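For intuition, the sketch below illustrates the kind of subspace Gaussian regularizer described above. It is not the repository's implementation: the function name, the moment-matching penalty, and the default hyperparameters (num_subspaces, subspace_dim) are assumptions chosen purely for the example.
import torch

def subspace_gaussian_reg(z, num_subspaces=8, subspace_dim=16):
    # Illustrative sketch only -- not the repository's implementation.
    # z: (batch, dim) latent embeddings. Penalize deviation from N(0, I)
    # within several random low-dimensional subspaces instead of the full space.
    batch, dim = z.shape
    penalty = z.new_zeros(())
    for _ in range(num_subspaces):
        # Random orthonormal basis spanning a subspace_dim-dimensional subspace.
        q, _ = torch.linalg.qr(torch.randn(dim, subspace_dim, device=z.device))
        p = z @ q  # (batch, subspace_dim) projected embeddings
        mean = p.mean(dim=0)
        cov = (p - mean).T @ (p - mean) / max(batch - 1, 1)
        eye = torch.eye(subspace_dim, device=z.device)
        # Match first and second moments: mean -> 0, covariance -> identity.
        penalty = penalty + mean.pow(2).sum() + (cov - eye).pow(2).sum()
    return penalty / num_subspaces
In a JEPA-style training loop such a term would be added, with a small weight, to the latent prediction loss; see the patched LeWM code for the loss actually used in the paper.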
Resources
- GitHub: intcomp/Sub-JEPA
- Paper: arXiv:2605.09241
Installation
To set up the environment, clone the repository with its submodules and apply the Sub-JEPA patch to the underlying LeWM codebase:
git clone --recursive https://github.com/intcomp/Sub-JEPA.git
cd Sub-JEPA
# Apply the Sub-JEPA patch to LeWM
git -C le-wm apply ../lewm_subjepa.patch
Please refer to the official repository for additional environment and data setup instructions.
Usage
Training
Training is configured with Hydra. To train on the tworoom environment:
PYTHONPATH=. python le-wm/train.py data=tworoom
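Additional Hydra overrides can be appended to the command line. The keys below are hypothetical examples, not taken from le-wm/config/; consult the config files for the actual schema:
# Hypothetical Hydra overrides -- key names are illustrative only.
PYTHONPATH=. python le-wm/train.py data=tworoom seed=0 trainer.max_steps=100000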
Evaluation
Evaluation configurations are located under le-wm/config/eval/:
python le-wm/eval.py --config-name=tworoom.yaml policy=tworoom/subjepa
Citation
@misc{zhao2026subjepa,
  title         = {Sub-JEPA: Subspace Gaussian Regularization for Stable End-to-End World Models},
  author        = {Zhao, Kai and Nie, Dongliang and Lin, Yuchen and Luo, Zhehan and Gu, Yixiao and Fan, Deng-Ping and Zeng, Dan},
  year          = {2026},
  eprint        = {2605.09241},
  archivePrefix = {arXiv},
  primaryClass  = {cs.LG},
  url           = {https://arxiv.org/abs/2605.09241}
}