Hi all, I am working on running ML inference inside WASM as an embedded function, and tract looks like a great fit for that. Which crate should I use? Thanks in advance!
Well, guess what: use tract-tensorflow for TensorFlow models, tract-onnx for ONNX models, tract-nnef for NNEF models and tract-tflite for TensorFlow Lite :)

Using tract-nnef or tract-tflite makes smaller binaries than tract-onnx or tract-tensorflow.

NNEF and ONNX are the two most mature formats in tract. Support for TensorFlow is limited to 1.x and is on its way to deprecation. TensorFlow Lite support is brand new, still limited, and should be considered alpha-level.

The tract command line can load all four formats, and can dump models in NNEF or tflite format, so it can be used as a converter to NNEF or tflite.

To 1/ keep your binaries small and 2/ avoid using alpha or deprecated code, the best way to go for embedding is to convert your model to NNEF offline, using the tract command line, then write the embedded code against tract-nnef. It will pull in its dependencies (data, linalg, core): 1/ Convert your model to ONNX (using tract or some TensorFlow Python code). If tract happens to work out of the box on your TensorFlow model, you can skip the ONNX conversion... or you could even ship your model with code working on top of tract-tensorflow (bigger binaries).
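For the embedded side, the loading pattern is the same as in tract's README, just with the NNEF loader. A minimal sketch, assuming your model has already been converted offline to `model.nnef.tar` and takes a single f32 input; the path and the 1x3x224x224 shape are placeholders, not anything from the discussion above:

```rust
// Minimal tract-nnef inference sketch. Only tract-nnef is declared as a
// dependency; it pulls in tract-data, tract-linalg and tract-core itself.
use tract_nnef::prelude::*;

fn main() -> TractResult<()> {
    // Load the NNEF archive produced offline by the tract command line.
    let model = tract_nnef::nnef()
        .model_for_path("model.nnef.tar")? // placeholder path
        .into_optimized()?                 // run tract's graph optimizations
        .into_runnable()?;                 // freeze into an execution plan

    // Dummy all-zeros input, just to show the call shape.
    let input = Tensor::zero::<f32>(&[1, 3, 224, 224])?;
    let outputs = model.run(tvec!(input.into()))?;
    println!("output shape: {:?}", outputs[0].shape());
    Ok(())
}
```

The same three-step `model_for_path` / `into_optimized` / `into_runnable` chain works with `tract_onnx::onnx()` if you stay on ONNX instead of converting to NNEF.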