
Micro-Action Benchmark

[TCSVT 2024] Official implementation of the paper: Benchmarking Micro-action Recognition: Dataset, Methods, and Applications

[Paper]


News

  • [2024/12/15] 🎉 🎉 🎉 Our paper PCAN on Micro-Action Recognition was accepted by AAAI 2025.
  • [2024/7/9] 🎉 🎉 🎉 We released the MMA-52 dataset for the Multi-label Micro-Action Detection task. Report
  • [2024/4/26] We launched the Micro-Action Analysis Grand Challenge (MAC 2024), held in conjunction with ACM Multimedia 2024.
  • [2024/4/16] We released the source code of MANet.

Introduction

Micro-actions are imperceptible non-verbal behaviors characterized by low-intensity movement. They offer insights into individuals' feelings and intentions, and are important for human-oriented applications such as emotion recognition and psychological assessment. However, identifying, differentiating, and understanding micro-actions is challenging because these subtle human behaviors are hard to perceive and access in everyday life.

In this study, we collect a new micro-action dataset, Micro-action-52 (MA-52), and propose a benchmark network, the micro-action network (MANet), for the micro-action recognition (MAR) task. Uniquely, MA-52 provides a whole-body perspective, including gestures and upper- and lower-limb movements, to reveal comprehensive micro-action cues. MA-52 contains 52 micro-action categories together with seven body-part labels, and encompasses a full array of realistic and natural micro-actions: 205 participants and 22,422 video instances collected from psychological interviews.

Based on the proposed dataset, we evaluate MANet and nine other prevalent action recognition methods. MANet incorporates squeeze-and-excitation (SE) and the temporal shift module (TSM) into the ResNet architecture to model the spatiotemporal characteristics of micro-actions. A joint-embedding loss is then designed for semantic matching between videos and action labels; it helps distinguish visually similar yet distinct micro-action categories. An extended application to emotion recognition demonstrates one of the important values of the proposed dataset and method. In the future, we will explore human behavior, emotion, and psychological assessment in more depth.
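Below is a minimal, illustrative PyTorch sketch of the ingredients described above: a TSM-style temporal shift, an SE channel-attention block, and a contrastive form of video-to-label semantic matching. All names, shapes, and hyperparameters are assumptions for illustration only; the released MANet source code is the authoritative implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def temporal_shift(x, num_segments, fold_div=8):
    """Shift a fraction of channels along the temporal axis (TSM).

    x: (N*T, C, H, W) features for N clips of T frames each.
    """
    nt, c, h, w = x.shape
    n = nt // num_segments
    x = x.view(n, num_segments, c, h, w)
    fold = c // fold_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                   # shift one channel group backward in time
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]   # shift another group forward in time
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # leave the remaining channels in place
    return out.view(nt, c, h, w)

class SEBlock(nn.Module):
    """Squeeze-and-excitation: global pooling plus a small MLP that reweights channels."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )
    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))     # squeeze spatial dimensions
        return x * w[:, :, None, None]      # rescale channels

def joint_embedding_loss(video_emb, label_emb, temperature=0.07):
    """Contrastive video-to-label matching over L2-normalized embeddings.

    video_emb: (B, D) clip embeddings; label_emb: (B, D) embeddings of the
    ground-truth action-label text. This is one plausible form of the
    semantic-matching loss; the paper's exact formulation may differ.
    """
    v = F.normalize(video_emb, dim=-1)
    t = F.normalize(label_emb, dim=-1)
    logits = v @ t.T / temperature          # pairwise video/label similarities
    target = torch.arange(v.size(0), device=v.device)
    return F.cross_entropy(logits, target)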


Data

Download

The datasets may only be used for non-commercial scientific purposes. You may request access by completing the provided Google Form and the corresponding license agreement (LA) files. We will respond promptly upon receipt of your application. If you have difficulty filling out the form, we also accept applications by [email].

MA-52 Statistics

Micro-Action examples

For more micro-action samples, please refer to MA-52 Dataset Samples.zip on Hugging Face.

A1: shaking body
A2: turning around
A3: sitting straightly
B1: nodding
B2: shaking head
B3: turning head
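The category IDs above follow a two-level scheme: a letter for the body-part group (A = body and B = head, judging from the examples shown) and a number for the fine-grained action within it. A small sketch of parsing such an annotation follows; the letter range and the mapping beyond A and B are assumptions, so take the actual label list from the official annotation files.

import re

BODY_PARTS = {"A": "body", "B": "head"}  # extend from the official label list

def parse_label(label: str) -> tuple[str, int]:
    """Return (body_part, action_index) for an ID such as 'B2'."""
    m = re.fullmatch(r"([A-G])(\d+)", label)  # assumes letters A-G for the seven body parts
    if m is None:
        raise ValueError(f"unexpected label format: {label!r}")
    part, idx = m.group(1), int(m.group(2))
    return BODY_PARTS.get(part, f"body part {part}"), idx

print(parse_label("B2"))  # ('head', 2)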

Citation

Please consider citing the related papers in your publications if this work helps your research.

@article{guo2024benchmarking,
  title={Benchmarking Micro-action Recognition: Dataset, Methods, and Applications},
  author={Guo, Dan and Li, Kun and Hu, Bin and Zhang, Yan and Wang, Meng},
  journal={IEEE Transactions on Circuits and Systems for Video Technology},
  year={2024},
  volume={34},
  number={7},
  pages={6238--6252},
  publisher={IEEE},
  doi={10.1109/TCSVT.2024.3358415}
}

@article{li2024mmad,
  title={MMAD: Multi-label Micro-Action Detection in Videos},
  author={Li, Kun and Guo, Dan and Liu, Pengyu and Chen, Guoliang and Wang, Meng},
  journal={arXiv preprint arXiv:2407.05311},
  year={2024}
}

@misc{MicroAction2024,
  author       = {Guo, Dan and Li, Kun and Hu, Bin and Zhang, Yan and Wang, Meng},
  title        = {Micro-Action Benchmark},
  year         = {2024},
  howpublished = {\url{https://github.com/VUT-HFUT/Micro-Action}},
  note         = {Accessed: 2024-08-21}
}
