MMDeploy supports cross compilation for the Android platform. The model converter runs on a Linux host, while the SDK runs on the Android device. Building for Android therefore involves two steps:
- Build the model converter on Linux. Please refer to How to build linux.
- Build the SDK with the Android NDK toolchain on Linux.

This document only covers the second step: building the SDK with the Android toolchain on Linux.
- cmake

  Make sure your cmake version is >= 3.14.0. If not, you can follow the instructions below to install cmake 3.20.0. For other versions of cmake, please refer to the cmake website.

  ```bash
  wget https://github.com/Kitware/CMake/releases/download/v3.20.0/cmake-3.20.0-linux-x86_64.tar.gz
  tar -xzvf cmake-3.20.0-linux-x86_64.tar.gz
  sudo ln -sf $(pwd)/cmake-3.20.0-linux-x86_64/bin/* /usr/bin/
  ```
- ANDROID NDK 19+

  Make sure your android ndk version is >= 19.0. If not, you can follow the instructions below to install android ndk r23b. For other versions of the android ndk, please refer to the android ndk website. (You can verify both installations with the snippet right after this list.)

  ```bash
  wget https://dl.google.com/android/repository/android-ndk-r23b-linux.zip
  unzip android-ndk-r23b-linux.zip
  cd android-ndk-r23b
  export NDK_PATH=${PWD}
  ```
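Optionally, you can sanity-check both installations; the toolchain file path is the one referenced by the cmake recipes later in this document:

```bash
# optional check: cmake version and the NDK toolchain file used below
cmake --version                                      # expect >= 3.14.0
ls ${NDK_PATH}/build/cmake/android.toolchain.cmake   # should exist
```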
You can skip this chapter if you are only interested in the model converter.
| NAME | INSTALLATION |
| --- | --- |
| OpenCV (>=3.0) | Download the prebuilt OpenCV Android SDK and point `OPENCV_ANDROID_SDK_DIR` at its root directory; see the sketch below this table. |
| ncnn | A high-performance neural network inference computing framework with Android support. MMDeploy currently supports ncnn v20220216, which has to be downloaded with `git clone` and cross compiled with the NDK toolchain; see the sketch below this table. |
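The following is a minimal sketch of one way to install both dependencies. The OpenCV version and the extra ncnn cmake flags (e.g. `NCNN_VULKAN`) are assumptions, so check each project's own documentation; the build recipes later in this document rely only on the resulting `OPENCV_ANDROID_SDK_DIR` and `NCNN_DIR` variables.

```bash
# OpenCV: fetch the prebuilt Android SDK (4.5.4 is an example version)
export OPENCV_VERSION=4.5.4
wget https://github.com/opencv/opencv/releases/download/${OPENCV_VERSION}/opencv-${OPENCV_VERSION}-android-sdk.zip
unzip opencv-${OPENCV_VERSION}-android-sdk.zip
export OPENCV_ANDROID_SDK_DIR=${PWD}/OpenCV-android-sdk
```

```bash
# ncnn: clone the supported tag and cross compile it with the NDK toolchain
git clone -b 20220216 https://github.com/Tencent/ncnn.git
cd ncnn
git submodule update --init
export NCNN_DIR=${PWD}
mkdir -p build && cd build
cmake .. \
    -DCMAKE_TOOLCHAIN_FILE=${NDK_PATH}/build/cmake/android.toolchain.cmake \
    -DANDROID_ABI=arm64-v8a \
    -DANDROID_PLATFORM=android-30 \
    -DNCNN_VULKAN=ON \
    -DCMAKE_INSTALL_PREFIX=${PWD}/install
make -j$(nproc) && make install
```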
| NAME | VALUE | DEFAULT | REMARK |
| --- | --- | --- | --- |
| MMDEPLOY_BUILD_SDK | {ON, OFF} | OFF | Switch to build the MMDeploy SDK |
| MMDEPLOY_BUILD_SDK_PYTHON_API | {ON, OFF} | OFF | Switch to build the MMDeploy SDK Python package |
| MMDEPLOY_BUILD_SDK_JAVA_API | {ON, OFF} | OFF | Switch to build the MMDeploy SDK Java API |
| MMDEPLOY_BUILD_TEST | {ON, OFF} | OFF | Switch to build the MMDeploy SDK unit tests |
| MMDEPLOY_TARGET_DEVICES | {"cpu"} | cpu | Enable the target device. Even if you want to use ncnn's Vulkan acceleration, still fill in {"cpu"} here, because Vulkan acceleration applies only to the ncnn net; the rest of the inference pipeline still runs on the CPU. |
| MMDEPLOY_TARGET_BACKENDS | {"ncnn"} | N/A | Enable the inference engine. By default no target inference engine is set, since it highly depends on the use case. Only the ncnn backend is supported on the Android platform for now. After specifying the inference engine, its package path has to be passed to cmake as follows: 1. ncnn: `ncnn_DIR` is needed, e.g. `-Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn` |
| MMDEPLOY_CODEBASES | {"mmcls", "mmdet", "mmseg", "mmedit", "mmocr", "all"} | N/A | Enable the codebases' postprocessing modules. It MUST be set to a semicolon-separated list of codebase names. The currently supported codebases are 'mmcls', 'mmdet', 'mmedit', 'mmseg' and 'mmocr'. Instead of listing them one by one, you can also pass `all` to enable all of them, i.e. `-DMMDEPLOY_CODEBASES=all` |
| MMDEPLOY_SHARED_LIBS | {ON, OFF} | ON | Switch between building the MMDeploy SDK as shared or static libraries. For now you have to build the static library (OFF) for Android; the bug will be fixed soon. |
MMDeploy provides the recipe below for building the SDK for Android with ncnn as the inference engine.
- cpu + ncnn

  ```bash
  cd ${MMDEPLOY_DIR}
  mkdir -p build && cd build
  cmake .. \
      -DMMDEPLOY_BUILD_SDK=ON \
      -DMMDEPLOY_BUILD_SDK_JAVA_API=ON \
      -DOpenCV_DIR=${OPENCV_ANDROID_SDK_DIR}/sdk/native/jni/abi-arm64-v8a \
      -Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn \
      -DMMDEPLOY_TARGET_BACKENDS=ncnn \
      -DMMDEPLOY_CODEBASES=all \
      -DMMDEPLOY_SHARED_LIBS=OFF \
      -DCMAKE_TOOLCHAIN_FILE=${NDK_PATH}/build/cmake/android.toolchain.cmake \
      -DANDROID_ABI=arm64-v8a \
      -DANDROID_PLATFORM=android-30 \
      -DANDROID_CPP_FEATURES="rtti exceptions"
  make -j$(nproc) && make install
  ```
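After installation, the SDK examples can be found under `${MMDEPLOY_DIR}/build/install/example` and cross compiled against the installed SDK with the same toolchain settings: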
```bash
cd ${MMDEPLOY_DIR}/build/install/example
mkdir -p build && cd build
cmake .. \
    -DOpenCV_DIR=${OPENCV_ANDROID_SDK_DIR}/sdk/native/jni/abi-arm64-v8a \
    -Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn \
    -DMMDeploy_DIR=${MMDEPLOY_DIR}/build/install/lib/cmake/MMDeploy \
    -DCMAKE_TOOLCHAIN_FILE=${NDK_PATH}/build/cmake/android.toolchain.cmake \
    -DANDROID_ABI=arm64-v8a \
    -DANDROID_PLATFORM=android-30
make -j$(nproc)
```
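To try a built demo on a connected device, one option is to push it over adb together with a model produced by the model converter. The following is a hypothetical smoke test: the demo name `image_classification`, its argument order, and the paths are assumptions to adapt to what you actually built and converted.

```bash
# hypothetical smoke test: run a demo on an attached Android device via adb
adb push ./image_classification /data/local/tmp/
adb push ${MODEL_DIR} /data/local/tmp/model   # MODEL_DIR: model converter output
adb push ./test.jpg /data/local/tmp/
adb shell "cd /data/local/tmp && ./image_classification cpu ./model ./test.jpg"
```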