0.0.7 (#27)
jamjamjon authored Jul 31, 2024
1 parent 0901ab3 commit 1d74085
Showing 28 changed files with 1,222 additions and 432 deletions.
47 changes: 0 additions & 47 deletions CHANGELOG.md

This file was deleted.

17 changes: 14 additions & 3 deletions Cargo.toml
@@ -1,13 +1,13 @@
[package]
name = "usls"
version = "0.0.6"
version = "0.0.7"
edition = "2021"
description = "A Rust library integrated with ONNXRuntime, providing a collection of ML models."
repository = "https://github.com/jamjamjon/usls"
authors = ["Jamjamjon <[email protected]>"]
license = "MIT"
readme = "README.md"
exclude = ["assets/*", "examples/*"]
exclude = ["assets/*", "examples/*", "scripts/*", "runs/*"]

[dependencies]
clap = { version = "4.2.4", features = ["derive"] }
@@ -44,4 +44,15 @@ ab_glyph = "0.2.23"
geo = "0.28.0"
prost = "0.12.4"
human_bytes = "0.4.3"
fast_image_resize = { version = "4.0.0", git = "https://github.com/jamjamjon/fast_image_resize", branch = "dev" , features = ["image"]}
fast_image_resize = { version = "4.2.1", features = ["image"]}


[dev-dependencies]
criterion = "0.5.1"

[[bench]]
name = "yolo"
harness = false

[lib]
bench = false
121 changes: 23 additions & 98 deletions README.md
@@ -1,8 +1,21 @@
# usls

[![Static Badge](https://img.shields.io/crates/v/usls.svg?style=for-the-badge&logo=rust)](https://crates.io/crates/usls) ![Static Badge](https://img.shields.io/crates/d/usls?style=for-the-badge) [![Static Badge](https://img.shields.io/badge/Documents-usls-blue?style=for-the-badge&logo=docs.rs)](https://docs.rs/usls) [![Static Badge](https://img.shields.io/badge/GitHub-black?style=for-the-badge&logo=github)](https://github.com/jamjamjon/usls)
[![Static Badge](https://img.shields.io/crates/v/usls.svg?style=for-the-badge&logo=rust)](https://crates.io/crates/usls) [![Static Badge](https://img.shields.io/badge/ONNXRuntime-v1.17.x-yellow?style=for-the-badge&logo=docs.rs)](https://github.com/microsoft/onnxruntime/releases) [![Static Badge](https://img.shields.io/badge/CUDA-11.x-green?style=for-the-badge&logo=docs.rs)](https://developer.nvidia.com/cuda-toolkit-archive) [![Static Badge](https://img.shields.io/badge/TRT-8.6.x.x-blue?style=for-the-badge&logo=docs.rs)](https://developer.nvidia.com/tensorrt)
[![Static Badge](https://img.shields.io/badge/Documents-usls-blue?style=for-the-badge&logo=docs.rs)](https://docs.rs/usls) ![Static Badge](https://img.shields.io/crates/d/usls?style=for-the-badge)



A Rust library integrated with **ONNXRuntime**, providing a collection of **Computer Vision** and **Vision-Language** models including [YOLOv5](https://github.com/ultralytics/yolov5), [YOLOv6](https://github.com/meituan/YOLOv6), [YOLOv7](https://github.com/WongKinYiu/yolov7), [YOLOv8](https://github.com/ultralytics/ultralytics), [YOLOv9](https://github.com/WongKinYiu/yolov9), [YOLOv10](https://github.com/THU-MIG/yolov10), [RTDETR](https://arxiv.org/abs/2304.08069), [SAM](https://github.com/facebookresearch/segment-anything), [MobileSAM](https://github.com/ChaoningZhang/MobileSAM), [EdgeSAM](https://github.com/chongzhou96/EdgeSAM), [SAM-HQ](https://github.com/SysCV/sam-hq), [CLIP](https://github.com/openai/CLIP), [DINOv2](https://github.com/facebookresearch/dinov2), [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM), [YOLO-World](https://github.com/AILab-CVC/YOLO-World), [BLIP](https://arxiv.org/abs/2201.12086), [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR), [Depth-Anything](https://github.com/LiheYoung/Depth-Anything), [MODNet](https://github.com/ZHKKKe/MODNet) and others.


| Segment Anything |
| :------------------------------------------------------: |
| <img src='examples/sam/demo2.png' width="800px"> |

| YOLO + SAM |
| :------------------------------------------------------: |
| <img src='examples/yolo-sam/demo.png' width="800px"> |

A Rust library integrated with **ONNXRuntime**, providing a collection of **Computer Vision** and **Vision-Language** models including [YOLOv5](https://github.com/ultralytics/yolov5), [YOLOv8](https://github.com/ultralytics/ultralytics), [YOLOv9](https://github.com/WongKinYiu/yolov9), [YOLOv10](https://github.com/THU-MIG/yolov10), [RTDETR](https://arxiv.org/abs/2304.08069), [CLIP](https://github.com/openai/CLIP), [DINOv2](https://github.com/facebookresearch/dinov2), [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM), [YOLO-World](https://github.com/AILab-CVC/YOLO-World), [BLIP](https://arxiv.org/abs/2201.12086), [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR), [Depth-Anything](https://github.com/LiheYoung/Depth-Anything), [MODNet](https://github.com/ZHKKKe/MODNet) and others.

| Monocular Depth Estimation |
| :--------------------------------------------------------------: |
@@ -13,9 +26,7 @@ A Rust library integrated with **Comp
| :----------------------------------------------------: | :------------------------------------------------: |
| <img src='examples/yolop/demo.png' width="385px"> | <img src='examples/db/demo.png' width="385x"> |

| Portrait Matting |
| :------------------------------------------------------: |
| <img src='examples/modnet/demo.png' width="800px"> |



## Supported Models
@@ -30,6 +41,10 @@ A Rust library integrated with **Comp
| [YOLOv10](https://github.com/THU-MIG/yolov10) | Object Detection | [demo](examples/yolo) |||||
| [RTDETR](https://arxiv.org/abs/2304.08069) | Object Detection | [demo](examples/yolo) |||||
| [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM) | Instance Segmentation | [demo](examples/yolo) |||||
| [SAM](https://github.com/facebookresearch/segment-anything) | Segment Anything | [demo](examples/sam) | ✅ | ✅ | | |
| [MobileSAM](https://github.com/ChaoningZhang/MobileSAM) | Segment Anything | [demo](examples/sam) | ✅ | ✅ | | |
| [EdgeSAM](https://github.com/chongzhou96/EdgeSAM) | Segment Anything | [demo](examples/sam) | ✅ | ✅ | | |
| [SAM-HQ](https://github.com/SysCV/sam-hq) | Segment Anything | [demo](examples/sam) | ✅ | ✅ | | |
| [YOLO-World](https://github.com/AILab-CVC/YOLO-World) | Object Detection | [demo](examples/yolo) |||||
| [DINOv2](https://github.com/facebookresearch/dinov2) | Vision-Self-Supervised | [demo](examples/dinov2) |||||
| [CLIP](https://github.com/openai/CLIP) | Vision-Language | [demo](examples/clip) ||| ✅ visual<br />❌ textual | ✅ visual<br />❌ textual |
@@ -64,103 +79,13 @@ cargo run -r --example yolo # blip, clip, yolop, svtr, db, ...

## Integrate into your own project

### 1. Add `usls` as a dependency to your project's `Cargo.toml`


```Shell
# Add `usls` as a dependency to your project's `Cargo.toml`
cargo add usls
```

Or you can use a specific commit

```Shell
# Or you can use specific commit
usls = { git = "https://github.com/jamjamjon/usls", rev = "???sha???"}
```
### 2. Build model
```Rust
let options = Options::default()
.with_yolo_version(YOLOVersion::V5) // YOLOVersion: V5, V6, V7, V8, V9, V10, RTDETR
.with_yolo_task(YOLOTask::Classify) // YOLOTask: Classify, Detect, Pose, Segment, Obb
.with_model("xxxx.onnx")?;
let mut model = YOLO::new(options)?;
```

- If you want to run your model with TensorRT or CoreML
```Rust
let options = Options::default()
    .with_trt(0) // TensorRT on device 0 (CUDA is used by default)
// .with_coreml(0)
```
- If your model has dynamic shapes
```Rust
let options = Options::default()
.with_i00((1, 2, 4).into()) // dynamic batch
.with_i02((416, 640, 800).into()) // dynamic height
.with_i03((416, 640, 800).into()) // dynamic width
```
- If you want to set a confidence for each category
```Rust
let options = Options::default()
.with_confs(&[0.4, 0.15]) // class_0: 0.4, others: 0.15
```
- Go check [Options](src/core/options.rs) for more model options.
### 3. Load images
- Build `DataLoader` to load images
```Rust
let dl = DataLoader::default()
.with_batch(model.batch.opt as usize)
.load("./assets/")?;

for (xs, _paths) in dl {
let _y = model.run(&xs)?;
}
```
- Or simply read one image
```Rust
let x = vec![DataLoader::try_read("./assets/bus.jpg")?];
let y = model.run(&x)?;
```
### 4. Annotate and save
```Rust
let annotator = Annotator::default().with_saveout("YOLO");
annotator.annotate(&x, &y);
```
### 5. Get results
The inference outputs of the provided models are returned as a `Vec<Y>`.
- You can get detection bboxes with `y.bboxes()`:
```Rust
let ys = model.run(&xs)?;
for y in ys {
// bboxes
if let Some(bboxes) = y.bboxes() {
for bbox in bboxes {
println!(
"Bbox: {}, {}, {}, {}, {}, {}",
bbox.xmin(),
bbox.ymin(),
bbox.xmax(),
bbox.ymax(),
bbox.confidence(),
bbox.id(),
)
}
}
}
```
- Other: [Docs](https://docs.rs/usls/latest/usls/struct.Y.html)
Binary file added assets/dog.jpg
Binary file added assets/truck.jpg
96 changes: 96 additions & 0 deletions benches/yolo.rs
@@ -0,0 +1,96 @@
use anyhow::Result;
use criterion::{black_box, criterion_group, criterion_main, Criterion};

use usls::{coco, models::YOLO, DataLoader, Options, Vision, YOLOTask, YOLOVersion};

enum Stage {
Pre,
Run,
Post,
Pipeline,
}

fn yolo_stage_bench(
model: &mut YOLO,
x: &[image::DynamicImage],
stage: Stage,
n: u64,
) -> std::time::Duration {
let mut t_pre = std::time::Duration::new(0, 0);
let mut t_run = std::time::Duration::new(0, 0);
let mut t_post = std::time::Duration::new(0, 0);
let mut t_pipeline = std::time::Duration::new(0, 0);
for _ in 0..n {
let t0 = std::time::Instant::now();
let xs = model.preprocess(x).unwrap();
t_pre += t0.elapsed();

let t = std::time::Instant::now();
let xs = model.inference(xs).unwrap();
t_run += t.elapsed();

let t = std::time::Instant::now();
let _ys = black_box(model.postprocess(xs, x).unwrap());
t_post += t.elapsed();
t_pipeline += t0.elapsed();
}
match stage {
Stage::Pre => t_pre,
Stage::Run => t_run,
Stage::Post => t_post,
Stage::Pipeline => t_pipeline,
}
}

pub fn benchmark_cuda(c: &mut Criterion, h: isize, w: isize) -> Result<()> {
let mut group = c.benchmark_group(format!("YOLO ({}-{})", w, h));
group
.significance_level(0.05)
.sample_size(80)
.measurement_time(std::time::Duration::new(20, 0));

let options = Options::default()
.with_yolo_version(YOLOVersion::V8) // YOLOVersion: V5, V6, V7, V8, V9, V10, RTDETR
.with_yolo_task(YOLOTask::Detect) // YOLOTask: Classify, Detect, Pose, Segment, Obb
.with_model("yolov8m-dyn.onnx")?
.with_cuda(0)
// .with_cpu()
.with_dry_run(0)
.with_i00((1, 1, 4).into())
.with_i02((320, h, 1280).into())
.with_i03((320, w, 1280).into())
        .with_confs(&[0.2, 0.15]) // class_0: 0.2, others: 0.15
.with_names2(&coco::KEYPOINTS_NAMES_17);
let mut model = YOLO::new(options)?;

let xs = vec![DataLoader::try_read("./assets/bus.jpg")?];

group.bench_function("pre-process", |b| {
b.iter_custom(|n| yolo_stage_bench(&mut model, &xs, Stage::Pre, n))
});

group.bench_function("run", |b| {
b.iter_custom(|n| yolo_stage_bench(&mut model, &xs, Stage::Run, n))
});

group.bench_function("post-process", |b| {
b.iter_custom(|n| yolo_stage_bench(&mut model, &xs, Stage::Post, n))
});

group.bench_function("pipeline", |b| {
b.iter_custom(|n| yolo_stage_bench(&mut model, &xs, Stage::Pipeline, n))
});

group.finish();
Ok(())
}

pub fn criterion_benchmark(c: &mut Criterion) {
// benchmark_cuda(c, 416, 416).unwrap();
benchmark_cuda(c, 640, 640).unwrap();
benchmark_cuda(c, 448, 768).unwrap();
// benchmark_cuda(c, 800, 800).unwrap();
}

criterion_group!(benches, criterion_benchmark);
criterion_main!(benches);
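For reference, the staged-timing pattern that `yolo_stage_bench` implements can be sketched with only the standard library. Note that `pre`, `run`, and `post` below are hypothetical stand-ins for the model's `preprocess`/`inference`/`postprocess` calls, not usls APIs:

```rust
use std::time::{Duration, Instant};

// Hypothetical stand-ins for the model stages; the real bench calls
// model.preprocess / model.inference / model.postprocess instead.
fn pre() -> u64 { (0..100u64).sum() }
fn run(x: u64) -> u64 { x.wrapping_mul(2) }
fn post(x: u64) -> u64 { x.wrapping_add(1) }

/// Accumulate per-stage wall time over `n` iterations, mirroring
/// `yolo_stage_bench`: each stage has its own accumulator, and the
/// pipeline timer spans all three stages of the same iteration.
fn stage_bench(n: u64) -> (Duration, Duration, Duration, Duration) {
    let (mut t_pre, mut t_run, mut t_post, mut t_pipe) =
        (Duration::ZERO, Duration::ZERO, Duration::ZERO, Duration::ZERO);
    for _ in 0..n {
        let t0 = Instant::now();
        let x = pre();
        t_pre += t0.elapsed();

        let t = Instant::now();
        let x = run(x);
        t_run += t.elapsed();

        let t = Instant::now();
        let _ = post(x);
        t_post += t.elapsed();
        t_pipe += t0.elapsed(); // pipeline time covers pre + run + post
    }
    (t_pre, t_run, t_post, t_pipe)
}

fn main() {
    let (p, r, q, total) = stage_bench(10);
    // The pipeline measurement is never shorter than any single stage.
    assert!(total >= p && total >= r && total >= q);
    println!("pre={p:?} run={r:?} post={q:?} pipeline={total:?}");
}
```

Because each stage accumulator only counts its own interval while the pipeline accumulator re-reads `t0.elapsed()` at the end of the iteration, the pipeline figure also captures any inter-stage overhead — which is why the bench feeds criterion's `iter_custom` one stage at a time.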
5 changes: 5 additions & 0 deletions build.rs
@@ -0,0 +1,5 @@
fn main() {
// Need this for CoreML. See: https://ort.pyke.io/perf/execution-providers#coreml
#[cfg(target_os = "macos")]
println!("cargo:rustc-link-arg=-fapple-link-rtlib");
}
21 changes: 21 additions & 0 deletions examples/sam/README.md
@@ -0,0 +1,21 @@
## Quick Start

```Shell

# SAM
cargo run -r --example sam

# MobileSAM
cargo run -r --example sam -- --kind mobile-sam

# EdgeSAM
cargo run -r --example sam -- --kind edge-sam

# SAM-HQ
cargo run -r --example sam -- --kind sam-hq
```


## Results

![](./demo.png)
Binary file added examples/sam/demo.png
