Fix flaky parity whisper test #1447
Conversation
I'd be inclined to decorate the flaky tests with xfail:
@pytest.mark.xfail(strict=False, raises=ValueError)
This avoids the more invasive changes mentioned in the shh test comments ⛳️
Ah, I like that better. It does feel cleaner. Thanks!
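As a minimal sketch of the suggested decorator, the flaky test could be marked like the following. The `flaky_shh_post` helper is hypothetical, standing in for the intermittent `shh.post` call; with `strict=False`, the test is reported as xfail when the `ValueError` occurs and as xpass when it does not, so neither outcome fails the suite.

```python
import pytest


def flaky_shh_post(payload):
    # Hypothetical stand-in for the parity shh.post call, which
    # intermittently raises a ValueError instead of returning a result.
    raise ValueError("Message not posted")


# strict=False: an unexpected pass (xpass) does not fail the suite,
# and an expected ValueError is reported as xfail rather than a failure.
@pytest.mark.xfail(strict=False, raises=ValueError)
def test_shh_post():
    assert flaky_shh_post({"topic": "0x12345678"})
```

The `raises=ValueError` argument keeps the mark narrow: any other exception type still fails the test outright instead of being silently tolerated.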
a couple of lines to remove, then it's all good
Got a verbal approval from @pipermerriam
What was wrong?
Our Parity integration tests were sometimes flaking out on the shh.post method. Related to Issue #1408.
How was it fixed?
Since we don't care what comes back from Parity, only that we're interfacing with it correctly, we added a check for an error response and verify that it's the expected error. Otherwise, the test continues as usual.
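The pattern described above could be sketched roughly as follows. The `call_shh_post` wrapper and the exact error text are hypothetical, not taken from the actual diff: a known, expected error from the node is tolerated, while any other exception still propagates and fails the test.

```python
def call_shh_post(shh_post, payload):
    """Call a shh.post-style function, tolerating one known node error.

    Hypothetical sketch: `shh_post` is any callable that either returns a
    result or raises ValueError, and "Message not posted" stands in for
    the expected Parity error text.
    """
    try:
        return shh_post(payload)
    except ValueError as err:
        # Expected flaky rejection from the node: accept it and move on.
        if "Message not posted" in str(err):
            return None
        # Anything else is a real problem; let the test fail.
        raise
```

A test using this wrapper would assert on the returned value when the call succeeds, and simply continue when the known error is swallowed, which matches the intent of checking only that we interface with the node correctly.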
Todo:
Cute Animal Picture