In this project, we consider both a nonparametric, nonstationary setting and a parametric setting for the latent processes and propose two provable conditions under which temporally causal latent processes can be identified from their nonlinear mixtures. We propose LEAP, a theoretically grounded architecture that extends Variational Autoencoders (VAEs) by enforcing our conditions through proper constraints on the causal process prior. Experimental results on various datasets demonstrate that temporally causal latent processes are reliably identified from observed variables under different dependency structures and that our approach considerably outperforms baselines that do not leverage history or nonstationarity information. This is one of the first works to successfully recover time-delayed latent processes from nonlinear mixtures without using sparsity or minimality assumptions.
Learning Temporally Causal Latent Processes from General Temporal Data
Weiran Yao*,
Yuewen Sun*,
Alex Ho,
Changyin Sun,and
Kun Zhang
(*: indicates equal contribution.)
International Conference on Learning Representations (ICLR) 2022
[Paper]
[Project Page]
@article{yao2021learning,
title={Learning Temporally Causal Latent Processes from General Temporal Data},
author={Yao, Weiran and Sun, Yuewen and Ho, Alex and Sun, Changyin and Zhang, Kun},
journal={arXiv preprint arXiv:2110.05428},
year={2021}
}
Our Approach: we leverage nonstationarity in process noise, or functional and distributional forms of the temporal statistics, to identify temporally causal latent processes from observations.
Framework: LEAP consists of (A) encoders and decoders for specific data types; (B) a recurrent inference network that approximates the posteriors of latent causal variables; and (C) a causal process prior network that models nonstationary latent causal processes under independent noise (IN) constraints.
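For concreteness, this framework maps onto a sequential-VAE skeleton. Below is a minimal PyTorch sketch of the three components, not the released implementation: the module names, dimensions, Gaussian parameterizations, and the single-lag prior are illustrative assumptions.

```python
# Minimal sketch of the three LEAP components described above (illustrative only):
# (A) encoder/decoder, (B) recurrent inference network, (C) causal process prior.
import torch
import torch.nn as nn

class CausalProcessPrior(nn.Module):
    """p(z_t | z_{t-1}): each latent depends on the past with diagonal (independent) Gaussian noise."""
    def __init__(self, z_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))  # mean and log-variance

    def forward(self, z_prev):
        mu, logvar = self.net(z_prev).chunk(2, dim=-1)
        return mu, logvar

class LEAPSketch(nn.Module):
    def __init__(self, x_dim, z_dim, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())   # (A)
        self.decoder = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, x_dim))              # (A)
        self.inference_rnn = nn.GRU(hidden, hidden, batch_first=True)       # (B)
        self.post_head = nn.Linear(hidden, 2 * z_dim)
        self.prior = CausalProcessPrior(z_dim)                              # (C)

    def forward(self, x):  # x: (batch, time, x_dim)
        h, _ = self.inference_rnn(self.encoder(x))
        mu, logvar = self.post_head(h).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        x_rec = self.decoder(z)
        # KL between the recurrent posterior q(z_t | x_{<=t}) and the causal prior p(z_t | z_{t-1})
        p_mu, p_logvar = self.prior(z[:, :-1])
        kld = 0.5 * (p_logvar - logvar[:, 1:]
                     + (logvar[:, 1:].exp() + (mu[:, 1:] - p_mu) ** 2) / p_logvar.exp() - 1).sum()
        recon = ((x - x_rec) ** 2).sum()
        return recon + kld
```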
Experimental results are showcased in Jupyter notebooks in the /tests
folder. Each notebook contains the analysis and visualization scripts for one specific experiment.
Run the scripts in /leap/scripts
to generate the results for each experiment.
Further details are documented within the code.
To install LEAP, create a conda environment with Python>=3.7
and follow the instructions below. Note that the current implementation of LEAP requires a GPU.
conda create -n leap python=3.7
cd leap
pip install -e .
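Since the implementation requires a GPU, a quick post-install sanity check can confirm that PyTorch sees a CUDA device (a hedged snippet; only torch is assumed to be installed):

```python
# Quick post-install sanity check: LEAP's current implementation requires a GPU.
import torch

assert torch.cuda.is_available(), "No CUDA device visible to PyTorch."
print("Using GPU:", torch.cuda.get_device_name(0))
print("PyTorch version:", torch.__version__)
```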
- Synthetic data (a sketch for inspecting the generated files follows this list):
python leap/datasets/gen_dataset.py
- KittiMask: https://github.com/bethgelab/slow_disentanglement
- Mass-Spring system: https://yunzhuli.github.io/V-CDN/
- CMU MoCap database: http://mocap.cs.cmu.edu/
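After generating the synthetic data, one quick way to check what was produced is to load the files and print the array shapes. The snippet below is a hedged sketch: it assumes the generator writes a NumPy .npz archive, and the file path and key names are illustrative rather than taken from the repository.

```python
# Illustrative only: inspect a generated synthetic dataset, assuming gen_dataset.py
# writes NumPy .npz archives (the path and key names here are assumptions).
import numpy as np

data = np.load("leap/datasets/data.npz")  # hypothetical output path
for key in data.files:
    print(key, data[key].shape)           # e.g., observed mixtures and ground-truth latents
```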