update doc about codebase deployment #1051

Merged
merged 39 commits on Sep 24, 2022

Commits (39)
a3f1f14
update doc about mmclassification deployment
lvhan028 Sep 15, 2022
ea6feee
update install mmdeploy part
lvhan028 Sep 15, 2022
5f609d0
correct mmcls deployment commands
lvhan028 Sep 16, 2022
1101991
reformat supported models
lvhan028 Sep 16, 2022
84942a1
add deployed model specification
lvhan028 Sep 16, 2022
5322e01
update doc about mmdetection deployment
lvhan028 Sep 16, 2022
fc15588
fix according to reviewer comments
lvhan028 Sep 16, 2022
5cfef89
fix according to reviewer comments
lvhan028 Sep 16, 2022
9e0e2d8
fix according to reviewer comments
lvhan028 Sep 16, 2022
82f54da
fix according to reviewer comments
lvhan028 Sep 18, 2022
bf1a715
fix according to reviewer comments
lvhan028 Sep 18, 2022
7e8939b
fix according to reviewer comments
lvhan028 Sep 18, 2022
f5a1c26
update doc about mmsegmentation deployment
lvhan028 Sep 18, 2022
d039eb1
update doc about mmocr deployment
lvhan028 Sep 19, 2022
ced7a06
check in cityscapes.png as an input image for converting mmsegmentati…
lvhan028 Sep 19, 2022
645144d
update mmocr deployment
lvhan028 Sep 19, 2022
a199ab8
update mmseg.md
lvhan028 Sep 19, 2022
43d1a67
update mmseg.md
lvhan028 Sep 19, 2022
7d1cf0d
update mmocr.md
lvhan028 Sep 19, 2022
485e7ed
Merge branch 'dev-1.x' into codebase-deployment
lvhan028 Sep 19, 2022
f88835f
update sdk model inference for mmocr deployment
lvhan028 Sep 19, 2022
daaf209
update according to reviewer comments
lvhan028 Sep 19, 2022
234db34
update
lvhan028 Sep 19, 2022
63c2c25
update
lvhan028 Sep 19, 2022
54ff626
update mmedit
lvhan028 Sep 19, 2022
fb8c919
Merge branch 'dev-1.x' into codebase-deployment
lvhan028 Sep 20, 2022
cd5cffe
update mmpose deployment
lvhan028 Sep 20, 2022
6a9e78f
Merge branch 'dev-1.x' into codebase-deployment
lvhan028 Sep 21, 2022
877197a
check in face.png for mmedit deployment
lvhan028 Sep 21, 2022
5584fd2
Merge branch 'dev-1.x' into codebase-deployment
lvhan028 Sep 21, 2022
f5efd5b
update
lvhan028 Sep 21, 2022
aa8c6c2
fix according to reviewer comments
lvhan028 Sep 21, 2022
96a3b11
remove duplicate doc
lvhan028 Sep 22, 2022
7b5a0bb
update docs in english
lvhan028 Sep 22, 2022
6516d1d
update codebase documents in chinese
lvhan028 Sep 22, 2022
cfe7879
fix according to reviewers' comments
lvhan028 Sep 22, 2022
6fbf17f
update according to reviewer comments
lvhan028 Sep 24, 2022
92a8888
ObjectDetection -> Object Detection
lvhan028 Sep 24, 2022
74570a2
InstanceSegmentation -> Instance Segmentation
lvhan028 Sep 24, 2022
Binary file added demo/resources/cityscapes.png
Binary file added demo/resources/det.jpg
Binary file added demo/resources/text_det.jpg
Binary file added demo/resources/text_recog.jpg
173 changes: 158 additions & 15 deletions docs/en/04-supported-codebases/mmcls.md
# MMClassification Deployment

[MMClassification](https://github.com/open-mmlab/mmclassification) is an open-source image classification toolbox based on PyTorch. It is a part of the [OpenMMLab](https://openmmlab.com) project.
- [Installation](#installation)
- [Install mmcls](#install-mmcls)
- [Install mmdeploy](#install-mmdeploy)
- [Convert model](#convert-model)
- [Model Specification](#model-specification)
- [Model inference](#model-inference)
- [Backend model inference](#backend-model-inference)
- [SDK model inference](#sdk-model-inference)
- [Supported models](#supported-models)

______________________________________________________________________

[MMClassification](https://github.com/open-mmlab/mmclassification) aka `mmcls` is an open-source image classification toolbox based on PyTorch. It is a part of the [OpenMMLab](https://openmmlab.com) project.

## Installation

### Install mmcls

Please follow this [quick guide](https://github.com/open-mmlab/mmclassification/tree/1.x#installation) to install mmcls. If you have already done that, move on to [the next section](#install-mmdeploy).

### Install mmdeploy

There are several methods to install mmdeploy. Choose the one that best suits your target platform and device.

**Method I:** Install precompiled package

> **TODO**. MMDeploy has not yet released a precompiled package based on the dev-1.x branch.

**Method II:** Build using scripts

If your target platform is **Ubuntu 18.04 or later**, we encourage you to run the build [scripts](../01-how-to-build/build_from_script.md). For example, the following commands install mmdeploy together with the `ONNX Runtime` inference engine.

```shell
git clone --recursive -b dev-1.x https://github.com/open-mmlab/mmdeploy.git
cd mmdeploy
python3 tools/scripts/build_ubuntu_x64_ort.py $(nproc)
export PYTHONPATH=$(pwd)/build/lib:$PYTHONPATH
export LD_LIBRARY_PATH=$(pwd)/../mmdeploy-dep/onnxruntime-linux-x64-1.8.1/lib/:$LD_LIBRARY_PATH
```
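
As a quick, optional sanity check, the sketch below simply verifies that the installed packages can be imported. It is only a sketch: it assumes the build script above finished successfully and that `PYTHONPATH` and `LD_LIBRARY_PATH` are exported as shown.

```python
# A minimal sanity-check sketch, assuming the build script above succeeded and
# PYTHONPATH / LD_LIBRARY_PATH are set as in the previous commands.
import mmdeploy
import onnxruntime

print('mmdeploy version:', mmdeploy.__version__)
print('onnxruntime version:', onnxruntime.__version__)
```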

**Method III:** Build from source

If neither **I** nor **II** meets your requirements, [building mmdeploy from source](../01-how-to-build/build_from_source.md) is the last option.

## Convert model

You can use [tools/deploy.py](https://github.com/open-mmlab/mmdeploy/blob/dev-1.x/tools/deploy.py) to convert mmcls models to the specified backend models. Its detailed usage is described [here](https://github.com/open-mmlab/mmdeploy/blob/master/docs/en/02-how-to-run/convert_model.md#usage).

The command below shows an example of converting the `resnet18` model to an ONNX model that can be inferred by ONNX Runtime.

```shell
cd mmdeploy

# download resnet18 model from mmcls model zoo
mim download mmcls --config resnet18_8xb32_in1k --dest .

# convert mmcls model to onnxruntime model with dynamic shape
python tools/deploy.py \
    configs/mmcls/classification_onnxruntime_dynamic.py \
    resnet18_8xb32_in1k.py \
    resnet18_8xb32_in1k_20210831-fbbb1da6.pth \
    tests/data/tiger.jpeg \
    --work-dir mmdeploy_models/mmcls/ort \
    --device cpu \
    --show \
    --dump-info
```

It is crucial to specify the correct deployment config during model conversion. We have already provided built-in deployment config [files](https://github.com/open-mmlab/mmdeploy/tree/dev-1.x/configs/mmcls) for all supported backends of mmclassification. The config filename pattern is:

```
classification_{backend}-{precision}_{static | dynamic}_{shape}.py
```

- **{backend}:** inference backend, such as onnxruntime, tensorrt, pplnn, ncnn, openvino, coreml, etc.
- **{precision}:** fp16 or int8. When it is empty, fp32 is used
- **{static | dynamic}:** static shape or dynamic shape
- **{shape}:** input shape or shape range of a model

Therefore, in the above example, you can also convert `resnet18` to other backend models by changing the deployment config file `classification_onnxruntime_dynamic.py` to [others](https://github.com/open-mmlab/mmdeploy/tree/dev-1.x/configs/mmcls), e.g., converting to a tensorrt-fp16 model with `classification_tensorrt-fp16_dynamic-224x224-224x224.py`.

```{tip}
When converting mmcls models to tensorrt models, --device should be set to "cuda"
```
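
If you are unsure what settings a deployment config carries, you can load it and inspect a few fields. The snippet below is only a sketch: it assumes `mmengine` is installed (a dependency of the 1.x series), that you run it from the mmdeploy repository root, and the exact field names may vary across versions.

```python
# A small sketch for inspecting a deployment config. Assumes mmengine is
# available and the path is relative to the mmdeploy repository root.
from mmengine import Config

deploy_cfg = Config.fromfile('configs/mmcls/classification_onnxruntime_dynamic.py')
print(deploy_cfg.backend_config)  # backend settings, e.g. the backend type
print(deploy_cfg.onnx_config)     # ONNX export settings, e.g. opset version and input shape
```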

## Model Specification

Before moving on to the model inference chapter, let's learn more about the converted model structure, which is very important for model inference.

The converted model is located in the working directory specified during conversion, such as `mmdeploy_models/mmcls/ort` in the previous example. It includes:

```
mmdeploy_models/mmcls/ort
├── deploy.json
├── detail.json
├── end2end.onnx
└── pipeline.json
```

in which,

- **end2end.onnx**: backend model which can be inferred by ONNX Runtime
- **deploy.json**: meta information about backend model
- **pipeline.json**: inference pipeline of mmdeploy SDK
- **detail.json**: conversion parameters

The whole package **mmdeploy_models/mmcls/ort** is defined as an **mmdeploy SDK model**, i.e., an **mmdeploy SDK model** includes both the backend model and the inference meta information.
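
As an illustration only, the sketch below prints the top-level keys of each meta file in the example directory above; the exact fields are version dependent, so treat it as a peek rather than a specification of the file format.

```python
# A minimal sketch: list the top-level keys of the SDK model's meta files,
# assuming the example work dir produced by the conversion step above.
import json
from pathlib import Path

model_dir = Path('mmdeploy_models/mmcls/ort')
for name in ('deploy.json', 'pipeline.json', 'detail.json'):
    meta = json.loads((model_dir / name).read_text())
    print(f'{name}: {sorted(meta)}')
```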

## Model inference

### Backend model inference

MMDeploy provides a unified API named `inference_model` to run model inference, making the APIs of all inference backends transparent to users.

Taking the previously converted `end2end.onnx` model as an example, you can use the following code to run inference.

```python
from mmdeploy.apis import inference_model
result = inference_model(
    model_cfg='./resnet18_8xb32_in1k.py',
    deploy_cfg='configs/mmcls/classification_onnxruntime_dynamic.py',
    backend_files=['mmdeploy_models/mmcls/ort/end2end.onnx'],
    img='tests/data/tiger.jpeg',
    device='cpu')
print(result)
```

### SDK model inference

You can also perform SDK model inference as follows:

```python
from mmdeploy_python import Classifier
import cv2

img = cv2.imread('tests/data/tiger.jpeg')

# create a classifier
classifier = Classifier(model_path='./mmdeploy_models/mmcls/ort', device_name='cpu', device_id=0)
# perform inference
result = classifier(img)
# show inference result
for label_id, score in result:
    print(label_id, score)
```
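
Building on the block above, the optional sketch below, which is purely illustrative, picks the highest-scoring `(label_id, score)` pair from `result` and draws it on the image with OpenCV; the output filename is arbitrary.

```python
# Optional follow-up sketch: visualize the top-scoring label from `result`,
# i.e. the (label_id, score) pairs returned by the Classifier call above.
import cv2

top_label, top_score = max(result, key=lambda item: item[1])
vis = img.copy()
cv2.putText(vis, f'label {top_label}: {top_score:.2f}', (10, 30),
            cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
cv2.imwrite('mmcls_sdk_result.png', vis)
```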

Besides the Python API, the mmdeploy SDK also provides other FFIs (Foreign Function Interfaces), such as C, C++, C#, Java and so on. You can learn their usage from the [demos](https://github.com/open-mmlab/mmdeploy/tree/dev-1.x/demo).

## Supported models

| Model | TorchScript | ONNX Runtime | TensorRT | ncnn | PPLNN | OpenVINO |
| :--------------------------------------------------------------------------------------------------------- | :---------: | :----------: | :------: | :--: | :---: | :------: |
| [ResNet](https://github.com/open-mmlab/mmclassification/tree/master/configs/resnet) | Y | Y | Y | Y | Y | Y |
| [ResNeXt](https://github.com/open-mmlab/mmclassification/tree/master/configs/resnext) | Y | Y | Y | Y | Y | Y |
| [SE-ResNet](https://github.com/open-mmlab/mmclassification/tree/master/configs/seresnet) | Y | Y | Y | Y | Y | Y |
| [MobileNetV2](https://github.com/open-mmlab/mmclassification/tree/master/configs/mobilenet_v2) | Y | Y | Y | Y | Y | Y |
| [ShuffleNetV1](https://github.com/open-mmlab/mmclassification/tree/master/configs/shufflenet_v1) | Y | Y | Y | Y | Y | Y |
| [ShuffleNetV2](https://github.com/open-mmlab/mmclassification/tree/master/configs/shufflenet_v2) | Y | Y | Y | Y | Y | Y |
| [VisionTransformer](https://github.com/open-mmlab/mmclassification/tree/master/configs/vision_transformer) | Y | Y | Y | Y | ? | Y |
| [SwinTransformer](https://github.com/open-mmlab/mmclassification/tree/master/configs/swin_transformer) | Y | Y | Y | N | ? | N |