Why does the torch-to-iree pass generate a func named main$async automatically? #17776

Closed
bailuan opened this issue Jul 1, 2024 · 5 comments
Labels
integrations/pytorch (PyTorch integration work) · support (Request support or ask a question)

Comments

@bailuan

bailuan commented Jul 1, 2024

What happened?

When I try to use iree-opt --torch-to-iree to transform IR from torch to linalg, I find that my function named main becomes main$async automatically. What should I do if I just want a function named main, not an async function named main$async?

Steps to reproduce your issue

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

What component(s) does this issue relate to?

MLIR

Version information

No response

Additional context

No response

@bailuan added the bug 🐞 (Something isn't working) label Jul 1, 2024
@ScottTodd added the support (Request support or ask a question) and integrations/pytorch (PyTorch integration work) labels and removed the bug 🐞 (Something isn't working) label Jul 1, 2024
@ScottTodd
Member

If you just want to convert from torch to linalg, use torch-mlir-opt from https://github.com/llvm/torch-mlir. IREE's use of torch-mlir is in service of IREE compilation and may introduce other dialects as needed for that purpose.

The --torch-to-iree pass should be creating both a function named main and a function named main$async. That's documented here: https://github.com/iree-org/iree/blob/main/compiler/plugins/input/Torch/InputConversion/FuncConversion.cpp and tested here: https://github.com/iree-org/iree/blob/main/compiler/plugins/input/Torch/InputConversion/test/func_conversion.mlir.

@bailuan
Author

bailuan commented Jul 2, 2024

Thanks. Another question: what should I do if I just want to use a Python script to transform the IR, rather than building from source to use torch-mlir-opt?

@bailuan
Author

bailuan commented Jul 2, 2024

I have tried to compile from source using the suggested command line:

    cmake -GNinja -Bbuild \
      -DCMAKE_BUILD_TYPE=Release \
      -DPython3_FIND_VIRTUALENV=ONLY \
      -DLLVM_ENABLE_PROJECTS=mlir \
      -DLLVM_EXTERNAL_PROJECTS="torch-mlir" \
      -DLLVM_EXTERNAL_TORCH_MLIR_SOURCE_DIR="$PWD" \
      -DMLIR_ENABLE_BINDINGS_PYTHON=ON \
      -DLLVM_TARGETS_TO_BUILD=host \
      externals/llvm-project/llvm

but I did not find torch-mlir-opt in build/bin/.

@ScottTodd
Member

Thanks. Another question: what should I do if I just want to use a Python script to transform the IR, rather than building from source to use torch-mlir-opt?

The IREE Python bindings expose some of the MLIR Python bindings (https://mlir.llvm.org/docs/Bindings/Python/) via from iree.compiler import ir. See https://iree.dev/reference/bindings/python/#installing-iree-packages for installation instructions. Upstream MLIR and torch-mlir also build some subset of the python bindings into packages.
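
For reference, here is a minimal sketch of that workflow in Python, assuming the iree-compiler package is installed as described above and that the upstream ir.Context and ir.Module classes are among the bindings it exposes; the example IR below is purely illustrative:

    # Minimal sketch: parse and re-print MLIR text using the MLIR Python
    # bindings bundled with the iree-compiler package (hypothetical example IR).
    from iree.compiler import ir

    EXAMPLE_ASM = """
    module {
      func.func @main(%arg0: i32) -> i32 {
        return %arg0 : i32
      }
    }
    """

    with ir.Context():
        module = ir.Module.parse(EXAMPLE_ASM)  # raises on invalid IR
        print(module)                          # round-trip the module back to text

From there, transformations follow the upstream MLIR Python binding APIs linked above.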

I have tried to compile from source using the suggested command line: cmake -GNinja -Bbuild […] externals/llvm-project/llvm, but I did not find torch-mlir-opt in build/bin/.

Please direct torch-mlir support questions to the torch-mlir project. Did you build after configuring? See https://github.com/llvm/torch-mlir/blob/main/docs/development.md#build-commands

@bailuan
Author

bailuan commented Jul 2, 2024

ok, I will try this.
