[Relay] Support for Tensorflow 2.0 #4102
@ic that was a good start to upgrade TF. We could use this issue to track the TF upgrade.
I'll be working on #4104, usually on Fridays.
What about backward compatibility with 1.12 or 1.14 while upgrading to 2.0? One easy option is to run the TF upgrade script to convert code to 2.0 and use that, but the result may not be native to 2.0 features.
Any ideas @tqchen @tmoreau89 @yongwww @soiferj ?
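For reference, the upgrade script mentioned above ships with TF 2.x as `tf_upgrade_v2`. A minimal sketch of an invocation (the file names here are hypothetical):

```shell
# Rewrite TF1-style API calls to their TF2 equivalents; the report file
# lists the changes made and spots that still need manual attention.
tf_upgrade_v2 --infile model_v1.py --outfile model_v2.py --reportfile report.txt
```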
IMHO, both 1.x and 2.0 should be maintained for a while; most working TF models were built on top of 1.x. Another option is to maintain the same from_tensorflow for both 1.x and 2.0. It should be possible to support both, since SavedModel is the universal format, though I'm not sure how much work is needed. For TVM CI, maybe we can build a different container for this kind of backward compatibility testing. @tqchen
Google seems to push for TF 2.x, perhaps to "catch up" with PyTorch and the like on developer happiness. Still, I support @yongwww's point: existing work outside Google is on TF 1.x (and 0.x :-)), and it will linger for some time. So this PR, and the sub-PR about fixing the tutorial, target full backward compatibility, using the
Thanks for the comments @yongwww & @ic. As @yongwww said, the frontend implementation may not be affected much, since SavedModel is universal apart from operator-specific API changes in 2.0. TF 2.0 has major modifications in the way we build the graph and execute (session, eager execution, tf.function, etc.). Test cases will demand more effort here. I would suggest dual compatibility by
It seems we have reached consensus on dual compatibility for now. In terms of implementation details, TF2 diverges too much; we could also create a separate code path and switch at the highest level.
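The "switch at the highest level" idea could be sketched roughly as below. This is a minimal illustration, not TVM's actual API; `parse_v1`/`parse_v2` are hypothetical stand-ins for the two parsers:

```python
# Hedged sketch: dispatch once at the entry point, based on the TF version
# string, so the two parser code paths stay fully separate.
def parse_v1(graph_def):
    # placeholder for the existing TF 1.x parser
    return ("v1", graph_def)

def parse_v2(graph_def):
    # placeholder for a TF 2.x parser
    return ("v2", graph_def)

def from_tensorflow(graph_def, tf_version):
    """Route to the v1 or v2 code path from the single public entry point."""
    major = int(tf_version.split(".")[0])
    return parse_v2(graph_def) if major >= 2 else parse_v1(graph_def)
```

This keeps the public entry point stable while letting the two implementations evolve independently.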
@ic @srkreddy1238 if you are working on this support, would you mind sharing the status?
Hi @ic @srkreddy1238 Would you mind sharing the latest status?
Sorry for the delay. The intention is to support TF1 and TF2 separately, trying to see if partial migrations can help. No progress since the last update, although I hope to have some time soon to get back to it.
Maybe we could speed up the progress. Many folks tell me they are interested in the MobileNet V3 model. However,
NOTE: the mainline CI now uses TensorFlow 2.0
Good to see CI moving to 2.0. In this context, most of this work is completed (except a few operators failing with the 2.0 API).
Hi @srkreddy1238, do you have any update? It would be super cool to see a PR.
We have looked into models for image classification, object detection, segmentation, etc. It seems we need to enable support for TensorList, control flow, functions, and some missing operations. We would like to work on this project from scratch. As discussed above, we might create a new TF frontend for TF 2.x (and I believe the parser for TF 1.x will be deprecated eventually, as the TF community stops supporting 1.x). Please comment here or ping me/us if anyone is interested in working with us on this. I synced with Siva @srkreddy1238 in the past few weeks; we will keep him and the community updated if we work together.
TF 2.x differs at the user-facing API level, but the resulting graph doesn't change a lot. My understanding and advice:
TF 2.x brings standardization without the user having to manually supply additional arguments like inputs, outputs, and (in a few cases) shapes, as we earlier asked for. Ref. https://github.com/srkreddy1238/tvm/tree/tf2.0 has the implementation.
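The point about no longer needing user-supplied input/output names and shapes comes from SavedModel signatures, which carry that metadata with the model. A small sketch of the idea (requires TensorFlow 2.x; the `Scale` module here is a made-up toy model):

```python
import tempfile
import tensorflow as tf

class Scale(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

mod = Scale()
path = tempfile.mkdtemp()
# Export an explicit serving signature so the SavedModel is self-describing.
tf.saved_model.save(mod, path, signatures=mod.__call__.get_concrete_function())

loaded = tf.saved_model.load(path)
# Input/output names, dtypes, and shapes travel with the signature,
# so a frontend need not ask the user for them.
fn = loaded.signatures["serving_default"]
out = fn(tf.constant([1.0, 2.0]))
```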
@srkreddy1238 yeah, I agree that the GraphDef protocol buffer definition changed only a little, so creating a new parser would end up duplicating much of the existing one. I tried to run the test cases on your shared branch and found that the existing parser works for most TF 2.x ops (previously I assumed we would have to modify most of the mappings in the parser). We can just update the existing parser for TF 2.x and remove the unnecessary parts once we stop supporting 1.x. Please rebase your work; it would be good to have it on top of TF 2.4 (the latest version is 2.4.0rc4). We can work on variable- and tensorlist-related ops at the same time.
@srkreddy1238: I am working on a new TF parser from scratch, to be introduced as ~/python/tvm/relay/frontend/tensorflow2.py. I will share this work soon. The new parser will support the control flow v2 ops (While/If) instead of the control flow v1 ops (Enter/Exit/Merge). A lot of the existing parser code that deals with control flow v1 is not needed in the new parser. The new parser will support TensorList* ops instead of TensorArray* ops; TensorArray* ops are no longer needed in TF 2.x models. It will also handle stateful LSTM models (and similar stateful TF models). My goals this year include leveraging the TF XLA infrastructure to simplify the parser. TF 2.x has introduced many significant changes and deprecated some old ways of doing things, so I think a fresh parser is the better way. We can still maintain the current parser for a while for TF 1.x models. @yongwww: fyi.
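The v1-versus-v2 control flow difference described above is visible directly in a traced graph. A small sketch (requires TensorFlow 2.x, where control flow v2 is the default):

```python
import tensorflow as tf

@tf.function
def count_to(n):
    i = tf.constant(0)
    while i < n:  # AutoGraph lowers this to a functional while loop
        i += 1
    return i

graph = count_to.get_concrete_function(tf.constant(5)).graph
op_types = {op.type for op in graph.get_operations()}
# Under control flow v2, the loop is a single functional While/StatelessWhile
# node with the body stored as a graph function; the v1 dataflow-style
# Enter/Exit/Merge/Switch ops no longer appear in the GraphDef.
```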
@wx3000 thanks for the input. I see TF 2.x as an amendment rather than a replacement of the 1.x parser. The new operator support needed in 2.x can go in as an amendment over the existing operator/parser support, and the control flow ops and List ops can be extended on the existing parser infra. Ref. #7498
I couldn't find a concrete motivation for a brand new parser that mostly overlaps the existing one. @yongwww any thoughts?
@srkreddy1238: I took a brief look at PR #7498. It is useful for simple TF2 models where most operators did not change much from TF1 to TF2. I am still not convinced that extending the current TF parser will be good in the long term, considering the complexity of supporting TF control flow v1 and v2 in the same code base. A clean start may provide a better foundation to build future work on.
@wx3000 I understand the complexity around the control flow ops, but I am worried about maintaining two independent copies of ops across parsers. Given the complexity of the existing parser, and to avoid redundant ops across parsers, here are my thoughts.
@srkreddy1238: I agree we should avoid duplicating op implementations. This will speed things up for TF 2.x model support and keep the code clean. Great idea! I think we can have a series of small PRs, roughly in this order:
We will soon submit items 1-3 from the above list. I don't know who will be working on the rest at this point, but hopefully someone will pick it up.
I agree with the main ideas you proposed above, @srkreddy1238 @wx3000: have a common .py for op mappings shared by v1 and v2, with exclusive ops going into the individual parsers (v1, v2).
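The shared-op-map layout agreed on above could be sketched as follows. This is an illustrative outline only; the table names and the `_identity` placeholder converter are hypothetical, not TVM's actual convert maps:

```python
# Hedged sketch: one common converter table shared by both parsers,
# with version-exclusive ops layered on top at lookup time.
def _identity(op):
    # placeholder converter standing in for a real op-to-Relay translation
    return op

COMMON_OPS = {"Add": _identity, "Relu": _identity, "Conv2D": _identity}
V1_ONLY = {"Enter": _identity, "Exit": _identity, "Merge": _identity}
V2_ONLY = {"StatelessWhile": _identity, "TensorListReserve": _identity}

def get_convert_map(tf_major):
    """Merge the common table with the ops exclusive to one TF major version."""
    extra = V2_ONLY if tf_major >= 2 else V1_ONLY
    return {**COMMON_OPS, **extra}
```

Keeping the shared converters in one module means a fix to, say, `Conv2D` lands in both parsers at once, while the control-flow-specific ops stay isolated.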
Hi! What is the status of TensorFlow 2 support, or the schedule for it? We use TensorFlow 2.x. Should TVM users still convert models back to TF 1.12, or can we go with TF2 models?
Please open a new issue if TF 2 support is desired. I'm not aware of anyone working actively on this now. |
TensorFlow 2.x is supported in TVM, no need to convert models back to TF 1.x. |
Hi team, what is the status of supporting TF 2.x in TVM? I don't see any example of parsing a TF 2.x model in TVM.
TF2 has been released this week and has become the stable version. Some APIs are breaking or not supported in Relay.
[Note: I may have been wrong on the resize op thing. Same problem with TF 1.14.]
For example, the `half_pixel_centers` attribute in the resize op was introduced in TF2 (separate issue #4101), and namespaces like `tf.gfile`, `tf.graph_util`, `tf.GraphDef`, etc. are deprecated. Walking through the TF tutorial, here are some elements to support TF2. Definitely insufficient, so just for getting started on the support.