Python test improvements #388

Merged: 5 commits merged into dev from decker/python-test-improvements on Jan 22, 2025

Conversation


@amdecker (Collaborator) commented on Jan 22, 2025

Addresses

Changes

  • The Python test no longer fails on expected failures
  • The Python test's console log level is now "INFO" by default, but it can be changed from the console (a hedged sketch of such a flag follows this list) [EDIT by GLT]:
python tests/test_against_cache.py -l warning
  • The assertion now reports the shot and column of the failure, e.g. AssertionError: Comparison failed on shot 1150805012, column sxr
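
As a minimal sketch of the -l shortcut above (not the code in tests/test_against_cache.py; the function name and the accepted level names are assumptions), an argparse flag can be mapped onto the standard logging levels like this:

    # Hypothetical sketch of a "-l/--log-level" console flag; the actual
    # wiring in tests/test_against_cache.py may differ.
    import argparse
    import logging

    def parse_console_log_level(default="info"):
        parser = argparse.ArgumentParser(description="Run the Python test against the cache")
        parser.add_argument(
            "-l", "--log-level",
            default=default,
            type=str.lower,
            choices=["debug", "info", "warning", "error", "critical"],
            help="console log level (default: %(default)s)",
        )
        args, _ = parser.parse_known_args()
        return getattr(logging, args.log_level.upper())

    if __name__ == "__main__":
        logging.basicConfig(level=parse_console_log_level())
        logging.getLogger(__name__).info("console log level set")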

@amdecker requested a review from gtrevisan on January 22, 2025 at 11:07
if "PYTEST_CURRENT_TEST" in os.environ or not expect_failure:
assert (
not data_difference.failed
), f"Comparison failed on shot {data_difference.shot_id}, column {
@gtrevisan (Member) commented:
your last commit introduces this newline, but it breaks stuff for me with:

Traceback (most recent call last):
  File "disruption-py/tests/test_against_cache.py", line 18, in <module>
    from tests.utils.eval_against_sql import (
  File "disruption-py/tests/utils/eval_against_sql.py", line 188
    ), f"Comparison failed on shot {data_difference.shot_id}, column {
       ^
SyntaxError: unterminated string literal (detected at line 188)

did the newline solve stuff for you? which python version are you using?
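
For reference, an f-string whose replacement field spans a newline only parses on Python 3.12+ (PEP 701); older interpreters raise exactly this "unterminated string literal" error. A minimal sketch of a form that also parses on older versions, assuming the column attribute is called data_column (the excerpt above is cut off before the attribute name):

    # Sketch only: keep each f-string literal on one line and rely on implicit
    # string concatenation inside the parentheses; this parses on Python < 3.12.
    # "data_column" is an assumed attribute name, not taken from the diff.
    assert not data_difference.failed, (
        f"Comparison failed on shot {data_difference.shot_id}, "
        f"column {data_difference.data_column}"
    )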

@gtrevisan force-pushed the decker/python-test-improvements branch from 2c67b89 to cdb0dfc on January 22, 2025 at 14:32
@gtrevisan (Member) commented:

for C-MOD, I have 35 passed, 11 xfailed for pytest, but a lot of Expected failure and succeeded for python -- is that expected?

on a separate note, now that you added a quick shortcut to switch the log-level from the cli, I'll put INFO back as default.

@amdecker (Collaborator, Author) commented:

> for C-MOD, I have 35 passed, 11 xfailed for pytest, but a lot of Expected failure and succeeded for python -- is that expected?

I took a look, and this happens because we expect failure by column rather than by shot and column. For a given column, some shots pass the test and some shots fail. If all shots succeeded, pytest would mark the test as XPASS, but as soon as one shot fails it marks it XFAIL.
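
As a rough sketch of that behaviour (compare_to_cache, SHOTS, and EXPECTED_FAILURE_COLUMNS are illustrative names, not the project's API): when xfail is attached per column and each test loops over every shot, one failing shot turns the whole column into XFAIL, while a column whose shots all pass is reported as XPASS.

    # Illustrative only: per-column xfail with all shots looped inside one test.
    import pytest

    SHOTS = [1150805012, 1150805013]        # assumed example shot ids
    EXPECTED_FAILURE_COLUMNS = {"sxr"}      # assumed columns expected to differ

    def compare_to_cache(shot, column):
        # Stand-in for the real cache comparison.
        return True

    @pytest.mark.parametrize(
        "column",
        [
            pytest.param(col, marks=pytest.mark.xfail(reason="known difference"))
            if col in EXPECTED_FAILURE_COLUMNS
            else col
            for col in ["ip", "sxr"]
        ],
    )
    def test_column_against_cache(column):
        # One test per column: a single failing shot makes pytest report the
        # whole column as XFAIL; if every shot passes, it is reported as XPASS.
        for shot in SHOTS:
            assert compare_to_cache(shot, column), (
                f"Comparison failed on shot {shot}, column {column}"
            )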

@gtrevisan merged commit b16e3dc into dev on Jan 22, 2025
10 checks passed
@gtrevisan deleted the decker/python-test-improvements branch on January 22, 2025 at 18:58