OpenVINO Model Server 2020.3
OpenVINO™ Model Server 2020.3 introduces support for Inference Engine version 2020.3.
Refer to the OpenVINO™ Release Notes to learn more about the enhancements. The most relevant to Model Server scenarios are:
- Introducing Long-Term Support (LTS), a new release type that provides longer-term maintenance and support with a focus on stability and compatibility
- Added support for new FP32 and INT8 models to enable more vision and text use cases: 3D U-Net, MobileFace, EAST, OpenPose, RetinaNet, and FaceNet
- Improved support for the AVX2 and AVX512 instruction sets in the CPU preprocessing module
- Added support for new model operations
- Introduced support for bfloat16 (BF16) data type for inferencing
- Included security and functionality bug fixes, and minor capability changes
The OpenVINO Model Server 2020.3 release includes the following changes and enhancements:
- Added documentation for Multi-Device Plugin usage, which enables load balancing across multiple devices for a single model (see the example after this list)
- Added a Quick start guide
- Documentation improvements
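A minimal sketch of serving a model on multiple devices with the 2020.3 image, assuming the ie_serving start command layout used in the documentation of that release; the model name, paths, port, and device list below are placeholders:

# Start the server with the Multi-Device Plugin; accelerator devices such as
# GPU or MYRIAD must additionally be passed into the container (e.g. with --device).
docker run -d --rm -v /models/face-detection:/opt/ml/face-detection:ro -p 9001:9001 \
  intelaipg/openvino-model-server:2020.3 \
  /ie-serving-py/start_server.sh ie_serving model \
  --model_path /opt/ml/face-detection --model_name face-detection \
  --port 9001 --target_device "MULTI:MYRIAD,CPU"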
Bug fixes:
- Fixed an unnecessary model reload that occurred when serving multiple versions of a model
- Fixed a race condition when the same model version was loaded and unloaded simultaneously
- Fixed a bug in the face detection example
You can pull the public OVMS Docker image, based on Clear Linux, with the following command:
docker pull intelaipg/openvino-model-server:2020.3
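A minimal sketch of running the pulled image to serve a single model and checking its status over the REST API, assuming the ie_serving start command and the --rest_port option available in this release; the model directory, model name, and ports are placeholders:

# Serve one model from a local directory over gRPC (9001) and REST (8001)
docker run -d --rm -v /models/resnet50:/opt/ml/resnet50:ro -p 9001:9001 -p 8001:8001 \
  intelaipg/openvino-model-server:2020.3 \
  /ie-serving-py/start_server.sh ie_serving model \
  --model_path /opt/ml/resnet50 --model_name resnet --port 9001 --rest_port 8001

# Query the model status (the REST API follows the TensorFlow Serving convention)
curl http://localhost:8001/v1/models/resnet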