Inconsistency between dataset title validation result versus validation tab #7094

Closed

esxkang opened this issue on Jan 20, 2023 · 2 comments

Labels: bug Bug report

esxkang commented Jan 20, 2023

Describe the bug
Datasets are indicating results with failed assertions next to the dataset title (e.g. "Dataset is failing 3 / 464 assertions"), but the Validation tab only returns a random 100 results and shows "All assertions have passed" based on 100 out of 100 assertions. There is no way to see the failed assertions.

To Reproduce
Steps to reproduce the behavior:

  1. Go to a dataset that has more than 100 assertions, including some failed assertions.
  2. Click on the Validation tab.
  3. Observe the X mark with the "failed assertions" description next to the dataset title.
  4. Compare it against the validation results shown below the dataset title to see the inconsistency.

Expected behavior
I expect to be able to easily see/identify the failed assertions on the Validation tab. Perhaps there should be an option to show all assertions and to sort or filter by passed and failed assertions. I also expect consistency in the counts: if there are 464 assertions, the Validation tab shouldn't show only 100.

Screenshots
(Screenshot attached showing the failing-assertions indicator next to the dataset title alongside the Validation tab.)

Desktop (please complete the following information):

  • OS: Windows
  • Browser: Chrome
  • Version: 109.0.5414.75
esxkang added the bug (Bug report) label on Jan 20, 2023
@atulsaurav (Contributor) commented:

It looks like it's hardcoded in GraphQL to only pull 100 assertions. Here is the code. A quick fix may be to increase this count to 1000, maybe... a better fix may be to lazily load the next 100 rows as the user is scrolling the list. In any case, the assertion count needs to be consistent with the count displayed next to the X mark near the dataset title.
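
(For illustration only: a hardcoded fetch like that typically looks something like the sketch below. The query shape, the field names, and the @apollo/client hook are assumptions, not the actual DataHub code.)

```typescript
// Minimal sketch (not DataHub's actual code): a hook that fetches dataset
// assertions with a hardcoded page size, so anything beyond the first
// `count` results is silently dropped from the Validation tab.
import { gql, useQuery } from '@apollo/client';

// Hypothetical query shape; the real field names may differ.
export const GET_DATASET_ASSERTIONS = gql`
    query getDatasetAssertions($urn: String!, $start: Int!, $count: Int!) {
        dataset(urn: $urn) {
            assertions(start: $start, count: $count) {
                total
                assertions {
                    urn
                }
            }
        }
    }
`;

export function useDatasetAssertions(urn: string) {
    // Hardcoded count: only the first 100 assertions are ever fetched, so the
    // tab can report "all passed" even when a later assertion is failing.
    return useQuery(GET_DATASET_ASSERTIONS, {
        variables: { urn, start: 0, count: 100 },
    });
}
```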

@chriscollins3456 (Collaborator) commented:

Hey @esxkang and @atulsaurav! Thanks so much for this issue and for pointing this out. I think there are two options here:

  1. a quick fix to increase that hardcoded number from 100 -> 1000
  2. keep the request size at 100, but paginate our assertions while also fixing the header on the Validation tab so the counts are correct.

I think 2 is the better solution, but it will take a bit more effort. So I implemented 1 to fix this situation in the short term, and I filed a ticket for our team to pick up number 2 in our quality improvement work.
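
(For reference, a paginated approach along the lines of option 2 could look roughly like the sketch below. The hook, the fetchMore/updateQuery merge, and the imported query are assumptions for illustration, not the shipped fix.)

```typescript
// Rough sketch (assumption, not the shipped fix): paginate assertions with
// Apollo's fetchMore so the list keeps loading pages as the user scrolls,
// instead of capping the results at a fixed count.
import { useQuery } from '@apollo/client';
import { GET_DATASET_ASSERTIONS } from './useDatasetAssertions'; // hypothetical module from the sketch above

const PAGE_SIZE = 100;

export function usePaginatedAssertions(urn: string) {
    const { data, loading, fetchMore } = useQuery(GET_DATASET_ASSERTIONS, {
        variables: { urn, start: 0, count: PAGE_SIZE },
    });

    const loaded = data?.dataset?.assertions?.assertions?.length ?? 0;
    const total = data?.dataset?.assertions?.total ?? 0;

    // Call this from an onScroll / IntersectionObserver handler when the user
    // nears the bottom of the assertions list.
    const loadNextPage = () => {
        if (loading || loaded >= total) return;
        fetchMore({
            variables: { urn, start: loaded, count: PAGE_SIZE },
            // Append the new page onto the previously fetched assertions.
            updateQuery: (prev, { fetchMoreResult }) => {
                if (!fetchMoreResult) return prev;
                return {
                    dataset: {
                        ...fetchMoreResult.dataset,
                        assertions: {
                            ...fetchMoreResult.dataset.assertions,
                            assertions: [
                                ...prev.dataset.assertions.assertions,
                                ...fetchMoreResult.dataset.assertions.assertions,
                            ],
                        },
                    },
                };
            },
        });
    };

    return { data, loading, total, loadNextPage };
}
```

The header on the Validation tab would then read its counts from `total` rather than from the length of whatever happened to be fetched, which keeps it consistent with the badge next to the dataset title.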

Here's the PR for 1: #7215

I'm going to close this issue for now but if you find this fix doesn't work for you, please re-open!
