UAV3D is a public, large-scale benchmark for 3D perception from Unmanned Aerial Vehicle (UAV) platforms. The benchmark comprises synthetic data and 3D perception algorithms, aiming to facilitate research on both single-UAV and collaborative multi-UAV 3D perception tasks.
- (2024/9) The paper was accepted at NeurIPS 2024.
- (2024/9) UAV3D V1.0-mini (Google Drive or Baidu Netdisk) is released.
- (2024/6) Source code and pre-trained models are released.
- (2024/6) UAV3D V1.0 (Google Drive or Baidu Netdisk) is released.
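UAV3D reports nuScenes-style metrics (see the tables below), so the sketch here assumes the released data follows the nuScenes format and can be browsed with the standard nuscenes-devkit. The dataroot, version string, and channel names are placeholders, not verified against this release:

```python
# Sketch: browsing a UAV3D split with the nuscenes-devkit, assuming the
# release follows the nuScenes format. Paths and version are placeholders.
from nuscenes.nuscenes import NuScenes

nusc = NuScenes(version='v1.0-mini', dataroot='/path/to/UAV3D', verbose=True)
nusc.list_scenes()                               # list the scenes in this split

# Walk the first sample of the first scene and print its camera files.
sample = nusc.get('sample', nusc.scene[0]['first_sample_token'])
for channel, token in sample['data'].items():    # one record per UAV camera
    sd = nusc.get('sample_data', token)
    print(channel, sd['filename'])
```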
Results for single-UAV 3D object detection:

Model | Backbone | Image Size | mAP↑ | NDS↑ | mATE↓ | mASE↓ | mAOE↓ | Checkpoint | Log |
---|---|---|---|---|---|---|---|---|---|
PETR | Res-50 | 704×256 | 0.512 | 0.571 | 0.741 | 0.173 | 0.072 | link | link |
BEVFusion | Res-50 | 704×256 | 0.487 | 0.458 | 0.615 | 0.152 | 1.000 | link | link |
DETR3D | Res-50 | 704×256 | 0.430 | 0.509 | 0.791 | 0.187 | 0.100 | link | link |
PETR | Res-50 | 800×450 | 0.581 | 0.632 | 0.625 | 0.160 | 0.064 | link | link |
BEVFusion | Res-101 | 800×450 | 0.536 | 0.582 | 0.521 | 0.154 | 0.343 | link | link |
DETR3D | Res-101 | 800×450 | 0.618 | 0.671 | 0.494 | 0.158 | 0.070 | link | link |
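The NDS values above are consistent with the nuScenes Detection Score recipe restricted to the three true-positive errors reported here (mAP weighted 5×, each TP term weighted 1×, normalized by 8). A minimal sketch of that reading, checked against the DETR3D rows; this is an inference from the table, not code from this repository:

```python
# NDS-style composite from mAP and the three TP errors in the table.
# Assumption: weights follow the nuScenes NDS recipe, normalized by
# 5 + (number of TP metrics) = 8.
def nds(mAP: float, tp_errors: list[float]) -> float:
    tp_scores = [1.0 - min(1.0, err) for err in tp_errors]
    return (5.0 * mAP + sum(tp_scores)) / (5.0 + len(tp_errors))

# DETR3D, Res-50, 704x256: mAP=0.430, mATE=0.791, mASE=0.187, mAOE=0.100
print(round(nds(0.430, [0.791, 0.187, 0.100]), 3))  # 0.509, as in the table

# DETR3D, Res-101, 800x450: mAP=0.618, mATE=0.494, mASE=0.158, mAOE=0.070
print(round(nds(0.618, [0.494, 0.158, 0.070]), 3))  # 0.671, as in the table
```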
Results for single-UAV 3D object tracking:

Model | Backbone | Image Size | AMOTA↑ | AMOTP↓ | MOTA↑ | MOTP↓ | TID↓ | LGD↓ | det_result | Log |
---|---|---|---|---|---|---|---|---|---|---|
PETR | Res-50 | 704×256 | 0.199 | 1.294 | 0.195 | 0.794 | 1.280 | 2.970 | link | link |
BEVFusion | Res-50 | 704×256 | 0.566 | 1.137 | 0.501 | 0.695 | 0.790 | 1.600 | link | link |
DETR3D | Res-50 | 704×256 | 0.089 | 1.382 | 0.121 | 0.800 | 1.540 | 3.530 | link | link |
PETR | Res-50 | 800×450 | 0.291 | 1.156 | 0.256 | 0.677 | 1.090 | 2.550 | link | link |
BEVFusion | Res-101 | 800×450 | 0.606 | 1.006 | 0.540 | 0.627 | 0.700 | 1.390 | link | link |
DETR3D | Res-101 | 800×450 | 0.262 | 1.123 | 0.238 | 0.561 | 1.140 | 2.720 | link | link |
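Tracking results are produced by running a tracker over the per-frame detections linked in the det_result column. Since the metrics above (AMOTA, AMOTP, MOTA, MOTP, TID, LGD) are the standard nuScenes tracking metrics, a tracking submission can presumably be scored with the nuscenes-devkit evaluator; a sketch with placeholder paths, split, and version string:

```python
# Sketch: scoring a tracking submission with the nuScenes tracking
# evaluator. Paths, eval split, and version string are placeholders.
from nuscenes.eval.common.config import config_factory
from nuscenes.eval.tracking.evaluate import TrackingEval

cfg = config_factory('tracking_nips_2019')        # default nuScenes tracking config
ev = TrackingEval(
    config=cfg,
    result_path='results/tracking_results.json',  # tracker output, nuScenes schema
    eval_set='val',
    output_dir='results/tracking_eval',
    nusc_version='v1.0-mini',
    nusc_dataroot='/path/to/UAV3D',
)
metrics = ev.main(render_curves=False)            # prints AMOTA, AMOTP, MOTA, ...
```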
Results for collaborative 3D object detection, where Lower-bound denotes single-UAV perception without collaboration and Upper-bound denotes early fusion of raw observations across UAVs:

Model | mAP↑ | NDS↑ | mATE↓ | mASE↓ | mAOE↓ | AP@IoU=0.5↑ | AP@IoU=0.7↑ | Checkpoint | Log |
---|---|---|---|---|---|---|---|---|---|
Lower-bound | 0.544 | 0.556 | 0.540 | 0.147 | 0.578 | 0.457 | 0.140 | link | link |
When2com | 0.550 | 0.507 | 0.534 | 0.156 | 0.679 | 0.461 | 0.166 | link | link |
Who2com | 0.546 | 0.597 | 0.541 | 0.150 | 0.263 | 0.453 | 0.141 | link | link |
V2VNet | 0.647 | 0.628 | 0.508 | 0.167 | 0.533 | 0.545 | 0.141 | link | link |
DiscoNet | 0.700 | 0.689 | 0.423 | 0.143 | 0.422 | 0.649 | 0.247 | link | link |
Upper-bound | 0.720 | 0.748 | 0.391 | 0.106 | 0.117 | 0.673 | 0.316 | link | link |
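The intermediate rows (When2com, Who2com, V2VNet, DiscoNet) collaborate by exchanging and fusing BEV feature maps rather than raw observations. As a conceptual illustration of the DiscoNet-style variant, the sketch below fuses an ego UAV's map with its neighbors' maps via learned per-location weights; the module, argument names, and shapes are illustrative assumptions, not this repository's API:

```python
import torch
import torch.nn as nn

class WeightedBEVFusion(nn.Module):
    """DiscoNet-style collaboration sketch: each UAV broadcasts its BEV
    feature map; the ego UAV scores every received map at each BEV cell
    and fuses them by a softmax-weighted average. Illustrative only."""

    def __init__(self, channels: int):
        super().__init__()
        # A 1x1 conv scores the (ego, candidate) feature pair per BEV cell.
        self.scorer = nn.Conv2d(2 * channels, 1, kernel_size=1)

    def forward(self, ego: torch.Tensor, neighbors: list[torch.Tensor]) -> torch.Tensor:
        # ego and each neighbor: (B, C, H, W) BEV features, assumed already
        # warped into the ego frame (pose alignment omitted here).
        feats = [ego] + neighbors
        scores = [self.scorer(torch.cat([ego, f], dim=1)) for f in feats]
        weights = torch.softmax(torch.cat(scores, dim=1), dim=1)  # (B, N, H, W)
        stacked = torch.stack(feats, dim=1)                       # (B, N, C, H, W)
        return (weights.unsqueeze(2) * stacked).sum(dim=1)        # (B, C, H, W)

# e.g. fuse the ego map with two neighbors' maps:
fuse = WeightedBEVFusion(channels=64)
ego = torch.randn(1, 64, 128, 128)
fused = fuse(ego, [torch.randn(1, 64, 128, 128), torch.randn(1, 64, 128, 128)])
```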
Results for collaborative object tracking, with the same collaboration baselines:

Model | AMOTA↑ | AMOTP↓ | MOTA↑ | MOTP↓ | TID↓ | LGD↓ | det_result | Log |
---|---|---|---|---|---|---|---|---|
Lower-bound | 0.644 | 1.018 | 0.593 | 0.611 | 0.620 | 1.280 | link | link |
When2com | 0.646 | 1.012 | 0.595 | 0.618 | 0.590 | 1.200 | link | link |
Who2com | 0.648 | 1.012 | 0.602 | 0.623 | 0.580 | 1.200 | link | link |
V2VNet | 0.782 | 0.803 | 0.735 | 0.587 | 0.360 | 0.710 | link | link |
DiscoNet | 0.809 | 0.703 | 0.766 | 0.516 | 0.300 | 0.590 | link | link |
Upper-bound | 0.812 | 0.672 | 0.781 | 0.476 | 0.300 | 0.570 | link | link |
If you find this repository useful, please consider giving a star ⭐ and citation 📘:
@inproceedings{uav3d2024,
  title={UAV3D: A Large-scale 3D Perception Benchmark for Unmanned Aerial Vehicles},
  author={Hui Ye and Raj Sunderraman and Shihao Ji},
  booktitle={The 38th Conference on Neural Information Processing Systems (NeurIPS)},
  year={2024}
}
In collecting UAV3D, we received valuable help and suggestions from the authors of CoPerception-UAV and Where2comm.
For the 3D object detection task, our implementation is based on PETR, BEVFusion, and DETR3D.
For the collaborative 3D object detection task, our implementation is based on BEVFusion and CoPerception.
For the object tracking task, our implementation is based on CenterPoint.
The software and data were created by Georgia State University Research Foundation under Army Research Laboratory (ARL) Award Numbers W911NF-22-2-0025 and W911NF-23-2-0224. ARL, as the Federal awarding agency, reserves a royalty-free, nonexclusive and irrevocable right to reproduce, publish, or otherwise use this software for Federal purposes, and to authorize others to do so in accordance with 2 CFR 200.315(b).