add Stryker action to update dashboard #77
Conversation
When I tried adding the env with the STRYKER_DASHBOARD_API_KEY secret, it looks like the Stryker task ran and the following error was simply ignored:

> 22:27:10 (2681) ERROR DashboardReporter Could not upload report. StrykerError: Error HTTP PUT https://dashboard.stryker-mutator.io/api/reports/github.com/xmldom/xmldom/stryker-action. Unauthorized. Did you provide the correct api key in the "STRYKER_DASHBOARD_API_KEY" environment variable?

Unfortunately I've been having trouble configuring the dashboard for this project: stryker-mutator/stryker-dashboard#156
I got it working with the dashboard that is based on my @brodybits fork; check this badge, which also links to the mutation report. Future updates to master would be reflected on that dashboard, since I have configured the @brodybits dashboard settings in `stryker.conf.json`.

The purpose of this workaround is to make the dynamic mutation report functionality available for contributors until I get a chance to further investigate the issue in stryker-mutator/stryker-dashboard#156. Unfortunately I am not super familiar with all of the stack components of the stryker-dashboard, so this could take a while.

I would also like to add the mutation badge, with a link to the mutation report, once #78 is reviewed and merged. Once we resolve the issue in stryker-mutator/stryker-dashboard#156, we should be able to drop the special dashboard configuration item in `stryker.conf.json`.

I would appreciate some feedback on whether this kind of workaround is acceptable or not. /cc @karfau
Personally I don't have any issues with this approach; see the detailed question below.

I don't know anything about how quickly they react on their side, so if you think the issue will persist for a while, it makes sense not to wait on a resolution.

Another question I have: do we want to let that check happen in PRs and comment in the PR when the score is lower than on master? I have no idea how difficult this would be, whether it would confuse contributors, etc., so landing this as a first step is already a good step.

PS: You don't need to /cc me, I'm watching the repo.
using @brodybits dashboard location as a temporary workaround solution as configured by the following lines in stryker.conf.json: "dashboard": { "project": "github.com/brodybits/xmldom", "version": "master" },
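For readability, the dashboard override referenced in that commit message, as it appears in `stryker.conf.json`:

```json
"dashboard": {
  "project": "github.com/brodybits/xmldom",
  "version": "master"
},
```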
force-pushed from bb8174d to f375c10
A major drawback I forgot to mention before is that Stryker generally takes a while to run, about 35–40 minutes or so on this project. It may be a bit faster if we switched to something like Jest and configured Stryker to use the Jest test runner rather than the generic command test runner. I think they are working on optimizing Stryker, so I am hoping for a major speedup.

I did update the trigger branches (see my comment), rebase, and add the Stryker badge. I updated the first commit message to hopefully make the existence of the temporary workaround solution extra clear in the commit history.

Unfortunately I do not expect the issue to be fixed on the dashboard side for a while. I did raise a few other issues, and they suggested I make a proposal. They do seem to be pretty responsive in general, but seem to be extremely busy with a new optimization release right now. Assuming I would find and fix the underlying problem, it could then take some additional time to go through the review & rework cycle.

I do think the workaround is sadly a bit counterintuitive for people not super familiar with how the Stryker dashboard works with GitHub; I hope the commit message makes it a little clearer.
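For reference, a minimal sketch of what pointing Stryker at the Jest runner could look like in `stryker.conf.json`, assuming the `@stryker-mutator/jest-runner` plugin were added as a devDependency (not part of this PR, and untested here):

```json
{
  "testRunner": "jest",
  "coverageAnalysis": "off",
  "reporters": ["progress", "dashboard"]
}
```

Whether that alone recovers most of the runtime would still have to be measured on CI.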
Yup. I changed the default branch of my fork and removed the master branch from it, so I think it should be safe now. I did accidentally push an update of this proposal to my fork and noticed it did not trigger the action. Actions do not seem to be triggered on a fork unless they are explicitly enabled in the Actions tab.
Shouldn't be, especially now that I switched the default branch and removed master from my fork. In general, the Stryker dashboard pretty much publishes as directed, if validated by the secret key.
I think it would be nice to check for reduced mutation score, and check for changes with surviving mutants in general. I think it would be ideal for an action or bot to run on each PR, each change to a PR, and send a message with the updated mutation report. I think this would need some support from the Stryker dashboard, probably best to hold off for now.
Got it :)
Thanks @karfau for the timely review. I did follow your suggestion to restrict the trigger to master, and took the liberty to include the Stryker report badge. I do wish the Stryker badge had the word "report" in its label, as I suggested in stryker-mutator/stryker-dashboard#158. I did raise follow-up issue #80 to fix the dashboard location at some point. I would like to try merging a recent PR, such as #67, to check that the dashboard report is updated properly.
WOW 😱 The first time I ran it, it only took 10 minutes. We could change the command that Stryker executes to run the tests to the content of the … Maybe you can try locally if that makes a difference? I really hope so!
We could try that, but I doubt it would make much of a difference, since I guess that vows would just run all scripts within the same process. As I tried to explain before, Stryker is much slower when it uses the "command" test runner, which spawns a new process every time, rather than the Jest runner, which should be much more streamlined. You may want to take a look at stryker-mutator/stryker-js#2269 where they are working on a major optimization.
OK, I'll check that myself.
… vs …

Executing this thousands of times of course has an impact.
Makes sense now. I actually did not completely understand the existence of the `pretest` script or the use of the XMLDOM_ASSERT_STRICT setting when merging PR #59.

I do think it would be good for someone to investigate the effect on the Stryker execution time, ideally on GitHub, which is where we encounter the super-long execution time. Maybe just a draft PR or two with an extra `npm run stryker` step in the workflow.

I am now thinking of an idea to reorganize the package scripts, something like this (not tested):

```json
"scripts": {
  "start": "nodemon --watch package.json --watch lib --watch test --exec 'npm --silent run test'",
  "stryker": "stryker run",
  "test": "npm-run-all test:assert test:unit",
  "test:assert": "XMLDOM_ASSERT_STRICT=1 npm run test:vows:assert",
  "test:unit": "npm run test:vows",
  "test:vows": "vows 'test/**/*.vows.js'",
  "test:vows:assert": "vows test/assert.vows.js",
  "// end": "exit 1"
},
```

and configure Stryker to use the `npm run test:unit` command (I think it should be possible, I forgot how to do it).
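For what it's worth, the command that Stryker's command runner executes can be overridden via the `commandRunner` option in `stryker.conf.json`; a rough sketch of that idea, reusing the `test:unit` script name proposed above (untested):

```json
{
  "testRunner": "command",
  "commandRunner": {
    "command": "npm run test:unit"
  }
}
```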
Makes total sense. I can take care of these two changes, one by one.
💖 Thanks for using the Stryker Dashboard! @brodybits, did you ever find out why it wasn't working on the core xmldom repository?
Thanks for asking. From the Chrome dev tools, it looks to me like the server sends a list of some of the orgs that I am an owner of, but not all of them. I was able to change a list entry to select the xmldom org, but then the server sent an empty list (zero repositories) that it sees for the xmldom org. As an owner and public member of the xmldom org, I am completely baffled by this behavior.

I am now thinking I would have to install a copy of the server, following the stryker-dashboard documentation, unless there is some kind of test server that I could be trusted to play around with. A lot of the technology in the stryker-dashboard stack is pretty new to me. Unfortunately I am pretty overloaded by other priorities and have no idea when I would really find the spare time to investigate. This is why I have the workaround solution in place. So annoying, or "zo vervelend" as they say (I lived in the Amsterdam area for over 12 years, back in the States now).

Stryker sure is a nice tool for understanding the quality of the test coverage. It would be nice if we could see the change in test coverage in a PR in progress, and whether code changes are properly tested. I have done this kind of thing by hand before, which is definitely labor-intensive. I don't know what it would take to get this kind of enhancement working with GitHub.
Tests for `test/assert.js` need to run before the vows test suite, not as part of them.

- Rename `test/assert.vows.js` to `test/assert.vows.checks.js`
- Script `test:assert` only runs those tests
- Script `test:assert:strict` runs them with strict assertions
- Script `test:unit` and `test:vows` are currently the same (until we add another testing framework to the mix)
- Update script `test` to run strict assertions only on `test/assert.vows.checks.js`

In the end `npm [run] test` executes the same amount of tests on CI, but it's easier to run a subset.

Added devDependency [`npm-run-all`](https://www.npmjs.com/package/npm-run-all)

#82 #77 (comment)

Co-authored-by: Christian Bewernitz <[email protected]>
Co-authored-by: Chris Brody <[email protected]>
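For context, `npm-run-all` simply runs the listed scripts in sequence, so the updated `test` script presumably chains the strict assertion checks and the unit tests along these lines (an illustrative guess, not the merged contents):

```json
"test": "npm-run-all test:assert:strict test:unit"
```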
as a Node.js action

- updated `stryker-*` from the list of trigger branches before merging (maybe)

resolves #60

/cc @karfau