
OpenVINO Model Server 2020.1

@dtrawins released this 23 Mar 20:55
· 240 commits to master since this release
1ae76c6

OpenVINO™ Model Server 2020.1 introduces support for Inference Engine version 2020.1.
Refer to the OpenVINO Release Notes to learn more about the enhancements. The most relevant to the model server use case are:

  • Inference Engine integrated with nGraph
  • Low-precision runtime for INT8
  • Added support for multiple new layers and operations
  • Numerous improvements in the plugin implementation

The OpenVINO Model Server 2020.1 release includes the following new features and changes:

  • Sped up inference output serialization – up to 40x faster – models with large outputs will have noticeably lower latency
  • Added an example client that sends inference requests from multiple cameras in parallel
  • Added support for TensorFlow 2.0 and Python 3.8 with backward compatibility
  • Updated functional tests to use IR models from the OpenVINO Model Zoo
  • Updated functional tests to use Minio for S3-compatible model storage
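The multi-camera client mentioned above fans inference requests out in parallel. A minimal sketch of that pattern is shown below; the real client talks to the model server over gRPC, while here `send_frame` is a hypothetical stand-in so the parallel structure is runnable on its own:

```python
# Sketch of a multi-camera client: each camera's frames are submitted
# to a thread pool so requests run in parallel rather than serially.
from concurrent.futures import ThreadPoolExecutor

def send_frame(camera_id, frame):
    # Placeholder for a gRPC Predict call to the model server
    # (hypothetical; the real client would serialize the frame
    # into a request and send it to OVMS).
    return {"camera": camera_id, "result": sum(frame)}

def run_cameras(streams):
    # One worker per camera; frames from all cameras are inferred concurrently.
    with ThreadPoolExecutor(max_workers=len(streams)) as pool:
        futures = [
            pool.submit(send_frame, cam, frame)
            for cam, frames in streams.items()
            for frame in frames
        ]
        return [f.result() for f in futures]

streams = {"cam0": [[1, 2], [3, 4]], "cam1": [[5, 6]]}
results = run_cameras(streams)
```

A thread pool suits this workload because each request is I/O-bound (waiting on the server), so Python threads overlap the network latency of the separate camera streams.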

Bug fixes:

  • Fixed model files detection and import for certain name patterns
  • Corrected the Kubernetes demo on GCP

Note: In version 2020.1 the CPU extensions library was removed; extensions are now included in the CPU plugin.
The extension library is now optional and needed only for custom layers.

You can pull the public OVMS Docker image, based on the OpenVINO runtime image, with the following command:

docker pull intelaipg/openvino-model-server:2020.1
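After pulling the image, the server can be started with a model mounted from the host. The sketch below is a deployment fragment, not taken from this release note: the mount path, model name, and port are illustrative placeholders.

```shell
# Assumed example: serve one model from a host directory
# (/models/my_model, the model name, and port 9000 are placeholders).
docker run -d -v /models/my_model:/opt/ml/my_model -p 9000:9000 \
  intelaipg/openvino-model-server:2020.1 \
  /ie-serving-py/start_server.sh ie_serving model \
  --model_path /opt/ml/my_model --model_name my_model --port 9000
```

The mounted directory is expected to contain numbered version subfolders with the IR model files (.xml and .bin).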