
[AutoTVM][Autoscheduler] Default build funcs inherit PassContext #11632

Merged

Conversation

AndrewZhaoLuo (Contributor)

See #11618 for the earlier discussion.

These funcs, used for building during the tuning process, now inherit PassContext, so we can inject things like pass instrumentation into the build process during tuning.
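For context, a minimal sketch of what "pass instrumentation" means here; `RecordPassNames` is a hypothetical instrument, while `pass_instrument` and `PassContext(instruments=...)` are standard TVM APIs:

```python
import tvm
from tvm import relay
from tvm.ir.instrument import pass_instrument


# Hypothetical instrument: record the name of each pass as it runs.
@pass_instrument
class RecordPassNames:
    def __init__(self):
        self.names = []

    def run_before_pass(self, mod, info):
        self.names.append(info.name)


x = relay.var("x", shape=(4,), dtype="float32")
mod = tvm.IRModule.from_expr(relay.Function([x], relay.nn.relu(x)))

recorder = RecordPassNames()
with tvm.transform.PassContext(opt_level=3, instruments=[recorder]):
    relay.build(mod, target="llvm")

print(recorder.names[:5])  # first few passes that ran during the build
```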

Note that because subprocesses are used for building by default, we can't naively do something like

```python
with PassContext(...):
    tune()
```

since new processes will not inherit the parent's context.
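A minimal sketch of the problem (assuming TVM's default opt_level of 2; the spawned worker sees the default context rather than the parent's):

```python
import multiprocessing

import tvm


def worker(_):
    # In a freshly spawned process, PassContext.current() falls back
    # to the default context, not the parent's.
    return tvm.transform.PassContext.current().opt_level


if __name__ == "__main__":
    with tvm.transform.PassContext(opt_level=1):
        with multiprocessing.get_context("spawn").Pool(1) as pool:
            print(pool.map(worker, [0]))  # [2] (the default), not [1]
```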

What this does allow us to do is wrap the existing build functions in new PassContexts and call those instead, which also makes things a little more extensible.
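A minimal sketch of the idea (the wrapper below is illustrative, not the PR's actual code, and assumes the builder serializes the callable to its worker with something like cloudpickle, as TVM's popen-based workers do):

```python
import tvm


def wrap_build_func(build_func):
    """Illustrative: capture the caller's PassContext now, and re-enter
    it around build_func so the context takes effect inside the worker
    process that eventually calls the wrapper."""
    ctx = tvm.transform.PassContext.current()

    def wrapped(*args, **kwargs):
        with ctx:
            return build_func(*args, **kwargs)

    return wrapped
```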

areusch (Contributor) commented Jun 9, 2022

@AndrewZhaoLuo ping me when ready for another look

AndrewZhaoLuo (Contributor Author)

Synced offline with @areusch and created #11656 to track this unintuitive behavior.

areusch (Contributor) commented Jun 9, 2022

i'm good with this, but can you add a unit test to verify check_gpu is applied?

AndrewZhaoLuo (Contributor Author) commented Jun 10, 2022

Added a test, PTAL @areusch

It's a bit of a doozy, though, since I had to patch a lot of things.

(inline review on the new test:)

```python
    callbacks=(lambda _tuner, _inputs, rs: results.extend(rs),),
)

assert len(results) == 1
```
areusch (Contributor):

want to check the pass also succeeded? i think if one of those asserts fails we just get a measure error here

AndrewZhaoLuo (Contributor Author):

Assertions will fail the test, so I think it's safe; we just want to make sure the proper passes are run.

We don't want to check for success in tuning, since the tuning process on GPU is actually flaky (see test_tuning_gpu() above).
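A minimal sketch of that strategy (illustrative names; pass names are written to a file because in-memory instrument state would not survive the builder's worker subprocesses, and a plain `relay.build` stands in for the tuning build step):

```python
import os
import tempfile

import tvm
from tvm import relay
from tvm.ir.instrument import pass_instrument

LOG = os.path.join(tempfile.mkdtemp(), "passes.log")


# Illustrative instrument: append each pass name to a file so the
# record survives the worker subprocess boundary.
@pass_instrument
class FilePassRecorder:
    def run_before_pass(self, mod, info):
        with open(LOG, "a") as f:
            f.write(info.name + "\n")


x = relay.var("x", shape=(4,), dtype="float32")
mod = tvm.IRModule.from_expr(relay.Function([x], relay.nn.relu(x)))

with tvm.transform.PassContext(opt_level=3, instruments=[FilePassRecorder()]):
    relay.build(mod, target="llvm")  # stand-in for the tuning build step

names = open(LOG).read().splitlines()
assert len(names) > 0  # the instrument observed passes
# The real test would assert a specific pass ran (e.g. the GPU
# verification pass) rather than asserting that tuning succeeded.
```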

areusch (Contributor):

ah ok, maybe the runner is where exceptions don't fail the test. good enough, if you've proven locally that an exception causes the test to fail.
