[UX] Changes in new Relax Parser #211
Thanks @yongwww for summarizing the parser requirements in one place. Can we also add #197 to the list? It is more of a piece of sugar support than a bug in the current parser, but an important one IMO. There is a very short example in Support R.emit_te sugar (#197).
My 2 cents on C3: most imperative languages rely on the user to provide function signatures rather than deducing them. In Relax we would instead rely on shape/type deduction to deduce the function signature. This is an odd choice, given that we cannot deduce every function signature in a single pass (because of function calls). For example, how many passes would it take to deduce the return type of a function foo that calls other functions whose signatures have not been deduced yet?
If we assume that the user provides all function signatures, we can break this long chain and ensure that the shapes/types of all AST nodes are known in two passes: (1) add all function signatures to a list, and (2) deduce the types of all values within each function. I am not sure whether we support recursion, but if we do, that would also require user-provided function signatures.
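As a purely hypothetical illustration of the chain described above (foo, bar, and baz are invented names), deducing foo's return type without annotations requires resolving each callee first, one pass per link:

```python
# Hypothetical call chain: with no annotated return types, deducing
# foo's return type requires first deducing baz's, then bar's.
def foo(x):
    return bar(x)

def bar(x):
    return baz(x)

def baz(x):
    return x + 1  # only here is the type finally grounded

# With user-annotated signatures, two passes suffice:
#   pass 1: record  foo: int -> int, bar: int -> int, baz: int -> int
#   pass 2: check each body against the recorded signatures
assert foo(1) == 2
```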
Would we not be able to use a unification-style type inference algorithm (like Hindley-Milner Algorithm W, which was what we based Relay's type inference on) to handle cases like that, including recursion? OCaml includes imperative features and is still able to infer types in all cases. We would have to be careful with our type system, but I think inference would be possible. (This is the sort of thing a full language spec might help us to reason about haha.) The way this would work would be to assign types with type variables where they are not known (so a function with two arguments initially has a type built from fresh type variables), and then unify those variables against the constraints imposed by each use.
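A minimal sketch of the unification idea, assuming a toy representation where type variables are strings starting with "?" and function/compound types are tuples (this is not Relay's actual implementation):

```python
# Toy unification sketch: unknown types start as type variables and get
# resolved by unifying the constraints each use imposes.
def resolve(t, subst):
    # follow substitution chains for type variables ("?name")
    while isinstance(t, str) and t.startswith("?") and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = resolve(a, subst), resolve(b, subst)
    if a == b:
        return
    if isinstance(a, str) and a.startswith("?"):
        subst[a] = b
    elif isinstance(b, str) and b.startswith("?"):
        subst[b] = a
    elif isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            unify(x, y, subst)
    else:
        raise TypeError(f"cannot unify {a} with {b}")

# f initially has type (?a, ?b); its uses then force both variables to int
subst = {}
unify(("?a", "?b"), ("int", "?b"), subst)  # argument position used as int
unify("?b", "int", subst)                  # result position used as int
assert resolve("?a", subst) == "int" and resolve("?b", subst) == "int"
```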
On C1, regarding whether to use square brackets or parentheses for notation, I would support in principle following Python's conventions wherever possible to avoid surprises. Square brackets look a little odd, but I think deviations from Python should be carefully considered (e.g., we do deviate from Python with regard to lexical scoping, and I think there are good reasons for that).
On C7, I wonder if we should have some kind of conversion (implicit or explicit) between tensors that are scalars (zero-rank tensors) and IntImms/FloatImms etc. PackedFuncs can use either. Can TIR PrimFuncs take IntImms/FloatImms as arguments? What about doing computations with shapes?
Also, I would really be in favor of not requiring functions to have a return! We could have an implicit return of an empty tuple instead.
Thanks for the pointer to Relay's type inference system. I'll take your word for it that it is doable :) and someone familiar with the algorithm could work towards a prototype of the Relax type inference system.
It might be my bias from the LLVM family of IRs, but I believe annotating function signatures is not much of a burden (we ask users to annotate parameters already, so it is just the return type). It seems pretty straightforward to me that with function signatures annotated, we can infer the types of all subexpressions and relieve the Relax type system of inferring types across function calls.
To be clear, you are advocating for skipping the return statement only when there is nothing to return?
I also do not think that annotating types is that much of a burden, but those who like Python may not want to do it. (You make the good point that we already expect argument types to be annotated, so requiring the return type as well is not a large additional burden.) Just tossing ideas out there: perhaps something along the lines of C++'s return type deduction.
Yes, I think we should treat the lack of a return statement as implicitly returning an empty tuple.
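As a hypothetical sketch of that desugaring, using Python's ast module as a stand-in for the Relax parser (the helper name is invented):

```python
import ast

# Hypothetical sketch: if a function body has no trailing return, the
# parser inserts `return ()` (an empty tuple) instead of raising an error.
def add_implicit_return(src: str) -> ast.Module:
    tree = ast.parse(src)
    fn = tree.body[0]
    assert isinstance(fn, ast.FunctionDef)
    if not isinstance(fn.body[-1], ast.Return):
        # append `return ()` so every function body ends with a return
        fn.body.append(ast.Return(value=ast.Tuple(elts=[], ctx=ast.Load())))
        ast.fix_missing_locations(tree)
    return tree

tree = add_implicit_return("def f(x):\n    y = x\n")
assert isinstance(tree.body[0].body[-1], ast.Return)
```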
Another request: we should be able to parse Python conditional (if) expressions.
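A minimal hypothetical example of such an expression (the function name is invented here):

```python
# A Python conditional ("if") expression: a single expression with no
# if/else statement blocks, which the Relax parser would need to accept.
def relu_scalar(x: int) -> int:
    return x if x > 0 else 0

assert relu_scalar(3) == 3
assert relu_scalar(-2) == 0
```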
Here is another case that is not handled that might be reasonable to parse too:

```python
@R.function
def branch(x: Tensor((), "int32")):
    if x:
        return x
    else:
        return x
```

The parser currently complains that the body needs to end with a returned expression. It might be hard to formulate a general rule here; it may be worth spec'ing out in more detail. Do we require both branches of a conditional to have a return if either one does? Do we create an implicit else branch if there is no else branch provided? Etc.
There is also a unit test that should pass but fails with the current parser.
Closing for now; let us open new issues for the new parser.
More and more UX issues/feature requests have been created in the past few weeks, and some of them are related to the Relax parser. The new Relax parser is under active development (thanks to @junrushao1994 @Hzfengsy @cyx-6 @yelite @YuchenJin); for more details about the new parser/printer, please take a look at TVMScript Unified Printer and metaprogramming. We would like to summarize the changes we are going to make in the new parser; any comments or suggestions are very welcome!
- Function type annotations such as `Callable[["float32"], "float32"]`, to make them more pythonic.
- Shape notation: `Tensor((2, 3), "float32")` vs. `Tensor([2, 3], "float32")`.
- Currently the return type of a Relax function could be something like `Tensor(None, "float32", ndim=2)`; even when the return shape can be deduced, we should use the deduced shape (e.g. `Tensor((m, n), "float32")`) for this case.
- For a var with no meaningful value (e.g. the result of an `ExternFunc` with side effects), an empty tuple will be used as the type of the var ([UX][TVMScript] Feature requests #202). Any other suggestions for this scenario?
- `Tensor((32, 32), "float32")`.
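As a loose analogy for the "more pythonic" spellings listed above, using plain Python typing (the names here are invented for illustration):

```python
from typing import Callable, Tuple

# The proposed function-type spelling Callable[["float32"], "float32"]
# mirrors typing.Callable:
scale: Callable[[float], float] = lambda x: x * 2.0

# The proposed shape spelling uses a parenthesized tuple,
# Tensor((2, 3), "float32"), just as Python tuples do:
shape: Tuple[int, int] = (2, 3)

assert scale(1.5) == 3.0
assert shape == (2, 3)
```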