Set up CircleCI test summary [npmcollaborators] #2526
Conversation
https://circleci.com/docs/api/#test-metadata I remember looking for this API a while back and found nothing, so I think this may be relatively new. Seems like it may now finally be possible to have a Test Results badge for Circle (akin to AppVeyor, Jenkins, Azure DevOps, etc.)!
@@ -157,6 +157,8 @@
"minimist": "^1.2.0",
"mkdirp": "^0.5.1",
"mocha": "^5.0.0",
"mocha-env-reporter": "^3.0.0",
What about using one of the multi reporters, like mocha-multi-reporters or mocha-multi? That way we can still keep the terminal (spec) reporter as well for the benefit of local development and also get the results in the xml file that Circle needs.
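For reference, a minimal sketch of the mocha-multi-reporters approach (the config file name and reporter list are illustrative, not from this PR), keeping spec output on the terminal while also writing JUnit XML for Circle:

```sh
# Illustrative only: config file name and reporter list are assumptions.
cat > multi-reporter-config.json <<'EOF'
{
  "reporterEnabled": "spec, mocha-junit-reporter"
}
EOF
mocha --reporter mocha-multi-reporters --reporter-options configFile=multi-reporter-config.json
```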
mocha-env-reporter should use the default reporter outside of CI, though if there's some other benefit to showing both I could change it.
Nice! I remember looking at that a while back but I see now where I may have gotten confused by the list of CI servers in the readme
@@ -157,6 +157,8 @@
"minimist": "^1.2.0",
"mkdirp": "^0.5.1",
"mocha": "^5.0.0",
"mocha-env-reporter": "^3.0.0",
"mocha-junit-reporter": "^1.18.0",
Also FWIW, Circle can parse the "xunit" format that is emitted by mocha's native/bundled xunit reporter, which could save having to add another dev dependency.
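For illustration, a minimal sketch of that approach (the output path is hypothetical, not taken from this PR):

```sh
# Use mocha's bundled xunit reporter and write the report to a file;
# the path is illustrative only.
mocha --reporter xunit --reporter-options output=./reports/xunit/results.xml
```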
Oh, nice! I’ll give that a shot.
I'm not sure how well the xunit reporter will play with mocha-env-reporter though. I always pass the file path for the xunit file via the reporter options (--reporter-options output=./foo/bar.xml), but I have no clue if that can be specified via an env variable or some other mechanism.
It looks like with xunit there's not a way to configure the output file through the environment, whereas mocha-junit-reporter supports this. I opened mochajs/mocha#3621 with this ask.
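As a sketch of that environment-based configuration (the path is illustrative, not from this PR), mocha-junit-reporter reads the report location from the MOCHA_FILE environment variable:

```sh
# mocha-junit-reporter writes wherever MOCHA_FILE points;
# the path here is hypothetical.
MOCHA_FILE=./reports/junit/results.xml mocha --reporter mocha-junit-reporter
```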
Makes sense, not worth fighting xunit here IMO :) My guess is that they will point you to one of the community reporters (like xunit-file) that support an env-var-defined output file, at which point we're right back to what the junit reporter already gives us.
Heh, possibly. I have a soft spot for Mocha, especially after some recent unpleasant experiences with Jest doing unexpected things with setup. However, I do find the way Mocha handles reporters and configuration to be frustrating and fiddly, and to require more effort than it should. I was happy to see some upcoming improvements to the config mentioned in mochajs/mocha#3612. 🤞
I'm a big fan of Mocha too. I've only worked with Jest in one project, but I do hear a lot of good things about it. What sort of setup issues have you seen with Jest?
I hit these issues:
- Tests are run even when beforeAll throws error jestjs/jest#2713 (it won't be fixed in Jasmine so it's waiting for Jest Circus)
- Stop testcase execution on failure in beforeEach jasmine/jasmine#577
- Fail with error & stack trace in asynchronous code jasmine/jasmine#529
I know my approval doesn't count for much, but I can't wait til this gets merged! My browser crashes ~75% of the time on Circle when trying to scroll down to the test failures, especially on the daily service tests job, so this will be awesome 😄
Not at all. Thanks for reviewing and feel free to add comments on other open PRs where you are able to provide feedback :) Code review is one of the areas where we have a bit of a bottleneck.
I appreciate the reviews, from both of you!
This reverts commit 5e13553.
See https://circleci.com/docs/2.0/collect-test-data/
This provides a list of failed tests in the Circle UI and a summary of which tests have failed. It should make interpreting test results easier.
It also provides information about how frequently individual tests fail.
These seem particularly helpful in the daily service tests, which will also need a couple of tweaks: https://circleci.com/gh/badges/daily-tests/33
I'm not sure if it's possible to access these totals through the API, though if it is, it would be cool to show how many service tests have failed. That would make it a little more fun to keep on top of them. 😉
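As a rough sketch of the wiring this describes (the paths and npm script name are assumptions for illustration, not taken from this PR), the test step writes JUnit XML into a directory that the CircleCI job then picks up with a store_test_results step, per the collect-test-data docs linked above:

```sh
# Hypothetical CI test step; paths and script name are illustrative.
# The JUnit XML written here is what CircleCI's store_test_results step
# (pointed at the reports directory) parses for the test summary UI.
mkdir -p reports/junit
MOCHA_FILE=reports/junit/test-results.xml npm test
```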
Example failures:
https://circleci.com/gh/badges/shields/29091
https://circleci.com/gh/badges/shields/29100