To test the device handling of our torch-related code, we should add the option to manually run tests with CUDA support.
Suggestion:
Add a --with-cuda option:
    # conftest.py
    def pytest_addoption(parser):
        parser.addoption(
            "--with-cuda",
            action="store_true",
            default=False,
            help="Use CUDA for testing if available",
        )
Add a device fixture, which can be used in tests:
    # conftest.py
    import pytest


    @pytest.fixture(scope="session")
    def device(request):
        import torch

        use_cuda = request.config.getoption("--with-cuda")
        if use_cuda and torch.cuda.is_available():
            return torch.device("cuda")
        else:
            return torch.device("cpu")
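For illustration, a test could then request the fixture and run on whichever device was selected; the test name and tensor shape below are made up:

    # test_device_handling.py -- hypothetical example test using the fixture
    import torch


    def test_tensor_moves_to_selected_device(device):
        # Create a tensor on the device chosen via --with-cuda (or the CPU fallback).
        x = torch.ones(2, 3, device=device)
        assert x.device.type == device.type

When --with-cuda is omitted or CUDA is unavailable, the same test runs on the CPU, so it remains usable in environments without a GPU.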
Manually run: pytest --with-cuda