This is the official code repository for:
Learning with Mixture of Prototypes for Out-of-Distribution Detection
Haodong Lu, Dong Gong, Shuo Wang, Jason Xue, Lina Yao, Kristen Moore
The Twelfth International Conference on Learning Representations (ICLR) 2024
All experiments were conducted with the following libraries on a single NVIDIA RTX 3090 GPU.
- Python 3.10.11
- PyTorch 2.0.1
- tqdm
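A minimal setup sketch, assuming a pip-based install in a fresh virtual environment (only the packages listed above are pinned by this README; anything else is an assumption):

```bash
# Minimal sketch: create a Python 3.10 environment and install the pinned packages.
python -m venv .venv && source .venv/bin/activate
pip install torch==2.0.1 tqdm
# torchvision is not listed above but is commonly needed for CIFAR loaders
# (assumption) -- install the matching build only if the data loaders require it:
# pip install torchvision==0.15.2
```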
The default root directory for ID and OOD datasets is `data/`.

**ID datasets**: Datasets such as CIFAR-10 and CIFAR-100 will be downloaded automatically.
**OOD datasets**: We use SVHN, Textures (dtd), Places365, LSUN-C (LSUN), and iSUN as the primary OOD datasets in our experiments. They can be downloaded via the following links (a sketch of the expected directory layout follows the list):
- SVHN: download it and place it in `data/svhn`. Then run `python util/loaders/select_svhn_data.py` to generate the test subset.
- Places365: download it and place it in `data/places365/test_subset`. We randomly sample 10,000 images from the original test set.
- LSUN: download it and place it in `data/LSUN`.
- iSUN: download it and place it in `data/iSUN`.
- Textures: download it and place it in `data/dtd`.
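For reference, a small sketch of the expected layout under `data/` (folder names taken from the list above; the SVHN subset script is the one mentioned there):

```bash
# Create the expected OOD dataset folders under data/, then drop each download
# into its folder. After SVHN is in place, build its test subset.
mkdir -p data/svhn data/places365/test_subset data/LSUN data/iSUN data/dtd
python util/loaders/select_svhn_data.py
```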
The training and evaluation scripts are provided in the `runner.sh` file.

Please check out our pretrained weights here, and configure the `save_path` argument in `runner.sh` for evaluation.
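As a rough sketch of the workflow (the exact commands and flag names live in `runner.sh`; the checkpoint directory below is only an example):

```bash
# Sketch only: edit the save_path argument inside runner.sh so it points at the
# directory holding the downloaded checkpoints, e.g. ./pretrained/palm, then
# run the script (or copy the relevant evaluation command out of it).
bash runner.sh
```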
If you find our work useful, please consider citing our paper:
@inproceedings{PALM2024,
  title={Learning with Mixture of Prototypes for Out-of-Distribution Detection},
  author={Haodong Lu and Dong Gong and Shuo Wang and Jason Xue and Lina Yao and Kristen Moore},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=uNkKaD3MCs}
}