PatchKD

Code for the ACM MM 2022 paper Patch-based Knowledge Distillation for Lifelong Person Re-Identification.

Framework

Installation

git clone https://github.com/feifeiobama/PatchKD
cd PatchKD
pip install -r requirements.txt
python setup.py develop
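
Before training, a quick sanity check that PyTorch can see a CUDA device may save time. This one-liner is a generic PyTorch check, not part of the original instructions; the results below assume a CUDA-capable GPU:

python -c "import torch; print(torch.cuda.is_available())"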

Please follow Torchreid_Datasets_Doc to download the datasets and unzip them to your data path (referred to as 'machine_dataset_path' in train_test.py). Alternatively, some of the datasets can be downloaded from light-reid and DualNorm.
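
As a minimal sketch, the data path is a plain variable assignment in train_test.py; only the name machine_dataset_path comes from this README, and the value below is a placeholder:

machine_dataset_path = '/path/to/your/data'  # unzip each downloaded dataset under this directory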

Quick Start

Training + evaluation. Make sure the visdom server is listening.
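
If no visdom server is running yet, start one in a separate terminal first (this is visdom's standard launch command; it serves on localhost:8097 by default):

python -m visdom.server

Then launch training: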

python train_test.py

Evaluation from checkpoint:

python train_test.py --mode test --resume_test_model /path/to/pretrained/model

Visualization from checkpoint:

python train_test.py --mode visualize --resume_visualize_model /path/to/pretrained/model

To experiment with a different training order:

python train_test.py --train_dataset duke msmt17 market subcuhksysu cuhk03

Results

The following results were obtained with a single NVIDIA 2080 Ti GPU:

(results figure; see the repository for the full table)

Citation

If you find this code useful for your research, please cite our paper.

@inproceedings{sun2022patch,
    author = {Sun, Zhicheng and Mu, Yadong},
    title = {Patch-based Knowledge Distillation for Lifelong Person Re-Identification},
    booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
    pages = {696--707},
    year = {2022}
}

Acknowledgement

Our code is based on the PyTorch implementation of LifelongReID.
