[TENSORFLOW] Tensorflow 2.x support #7498
Conversation
Force-pushed from 0f08d28 to e50f56c
@@ -3682,6 +3729,30 @@ def from_tensorflow(graph, layout="NHWC", shape=None, outputs=None):
    params : dict of str to tvm.nd.NDArray
        Dict of converted parameters stored in tvm.nd.NDArray format
    """
    from tensorflow.python.eager.function import ConcreteFunction

    if isinstance(graph, ConcreteFunction):
Would it be better if we created a wrapper over from_tensorflow() instead of overloading the parameter "graph" to be either a GraphDef or a ConcreteFunction?
Another question: is ConcreteFunction the right input for the TF frontend parser? How about starting from the saved_model format? When TVM is used to compile a model for inference, it starts from a saved_model, and that way there may be no need to use _build_signature_def in order to optimize the graph. Maybe _build_signature_def could become part of some utility code for convenience.
A wrapper would change the frontend API; we could seek the community's opinion on this.
I thought of saved_model, but we already have the TFParser helper (python/tvm/relay/frontend/tensorflow_parser.py) that handles the saved_model format. I think we could make from_tensorflow the single entry point for GraphDef / ConcreteFunction / SavedModel dir, with TFParser handling all the transforms until it yields a GraphDef.
TFHub URL to GraphDef could also be added to TFParser, but hub.KerasLayer sometimes has a bunch of args to handle, so we could hold that for now.
I will amend this PR with these changes and also the TF1.x / TF2.x compatibility we discussed.
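To make the single-entry-point idea concrete, here is a minimal sketch of a dispatcher over GraphDef / ConcreteFunction / SavedModel directory. This is not the PR's actual implementation: the wrapper name from_tensorflow_any is hypothetical, and the ConcreteFunction freezing path is an assumption for illustration.

```python
# Hedged sketch only: "from_tensorflow_any" is a hypothetical name; the PR
# keeps the existing from_tensorflow() entry point.
import tvm.relay as relay
from tvm.relay.frontend.tensorflow_parser import TFParser


def from_tensorflow_any(model, layout="NHWC", shape=None, outputs=None):
    """Accept a GraphDef, a ConcreteFunction, or a SavedModel directory."""
    from tensorflow.python.eager.function import ConcreteFunction

    if isinstance(model, str):
        # SavedModel / frozen .pb path: TFParser already knows how to
        # turn it into a GraphDef.
        graph_def = TFParser(model).parse()
    elif isinstance(model, ConcreteFunction):
        # Freeze captured variables into constants, then export a GraphDef.
        from tensorflow.python.framework.convert_to_constants import (
            convert_variables_to_constants_v2,
        )

        frozen_func = convert_variables_to_constants_v2(model)
        graph_def = frozen_func.graph.as_graph_def()
    else:
        # Assume it is already a GraphDef.
        graph_def = model

    return relay.frontend.from_tensorflow(
        graph_def, layout=layout, shape=shape, outputs=outputs
    )
```

With a dispatcher along these lines the public frontend API stays unchanged, and TFParser remains the single place that knows how to normalize the various input formats into a GraphDef.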
"debug_stripper", | ||
"arithmetic", | ||
"dependency", | ||
"arithmetic", |
Is line 3752 a duplicate of line 3750?
It appears that "arithmetic" and "dependency" optimizations may be ON by default, while "debug_stripper" is OFF by default: https://www.tensorflow.org/guide/graph_optimization. I have not verified this. Are these required?
"arithmetic" is not a duplicate, this transform happen again.
The transforms order was with a ref. from TensorflowJS project. This can be revised later as needed.
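For context, an optimizer list like this is typically handed to TensorFlow's grappler through a RewriterConfig, roughly as sketched below. This is an illustrative approximation of the pattern used by the TensorFlow.js converter, not the exact code in this PR; the helper name run_grappler and the way fetch nodes are attached via the "train_op" collection are assumptions.

```python
# Illustrative sketch of running grappler with an explicit optimizer list.
import tensorflow as tf
from tensorflow.core.protobuf import config_pb2, meta_graph_pb2
from tensorflow.python.grappler import tf_optimizer


def run_grappler(graph_def, output_names):
    """Apply the listed grappler passes to a frozen GraphDef."""
    config = config_pb2.ConfigProto()
    rewriter = config.graph_options.rewrite_options
    # Running "arithmetic" a second time lets it clean up nodes exposed
    # by the passes that ran in between.
    rewriter.optimizers.extend(
        ["debug_stripper", "arithmetic", "dependency", "arithmetic"]
    )

    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="")
        meta_graph = tf.compat.v1.train.export_meta_graph(
            graph_def=graph_def, graph=graph
        )

    # Mark the outputs as fetch nodes so grappler does not prune them away.
    fetches = meta_graph_pb2.CollectionDef()
    for name in output_names:
        fetches.node_list.value.append(name)
    meta_graph.collection_def["train_op"].CopyFrom(fetches)

    return tf_optimizer.OptimizeGraph(config, meta_graph)
```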
Force-pushed from 37a91dc to 413a1d0
@masahi @tqchen @siju-samuel @zhiics @anijain2305 Can you help review this PR? It brings in TF2.x support with the existing parser, plus re-arrangements to accommodate the brand-new TF2.x parser. @tqchen We need to refresh the docker image to install tensorflow-hub; the e2e models now come from TFHub.
@srkreddy1238 Can you try installing it via https://github.com/apache/tvm/blob/main/tests/scripts/task_ci_setup.sh?
Force-pushed from bd6e0b9 to 5096688
Thanks for contributing to TVM! Please refer to the contribution guideline https://tvm.apache.org/docs/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from reviewers by @-mentioning them in the pull request thread.