[ISSUE-20] Developer tools milestone #24
Conversation
- Finishes adding to definition of done
- Adds Internationalization as related milestone that won't be in place
documentation/milestones/individual_milestones/developer_tools.md
Looking great! A few comments about metrics etc.
- Code quality checks metric: Total runtime of all code quality checks
- Security checks metrics
  - Total runtime of all security checks
  - Number of security vulnerabilities not caught until production
These three ("Number of...") are good metrics, but I'm thinking about whether there would be a way to automate their calculation and reporting. It seems a little tricky to do so?
We don't want to be proposing metrics that are hard or too time-consuming to track.
Good call! I simplified and cleaned up all of the metrics to keep them a bit more focused in this commit:
- Total runtime of all CI/CD checks
- Number of bug tickets opened in the last 30 days
- Number of days it takes to close a bug ticket
- Percentage of code covered by unit tests
- Percentage of code covered by integration tests
- Number of days it takes to resolve a security vulnerability reported in the grants API project
- Number of days it takes to update an upstream dependency once a version which resolves a CVE has been released
- Dependency management metrics
  - Number of days it takes to update an upstream dependency once a new minor or major version has been released
Hmm. Some dev teams prefer to leave dependencies untouched until you need new features (or there's a security vulnerability), given that updates occasionally break things. So it may not always be needed for us to update when new versions have been released.
@lucasmbrown-usds This is worth discussing further!
@daphnegold was proposing the use of Renovate in the FE code quality tools, which automates the creation of PRs to adopt dependency updates, but you make a good point that a fair amount of time can be sunk into simply making updates when they're not needed.
On past projects I've found regularly updating dependencies to be helpful because otherwise teams can wind up scrambling to make pretty significant updates once a CVE is identified, or they delay updates for years because they become so large and painful to make -- e.g. in my previous project the team was still using Airflow 1.x and had tried and failed to upgrade to Airflow 2.x a few times because they were so far behind on updates.
That being said, I like the idea of striking this as a core metric, and took it out of the list!
Got it, I appreciate your point about making updates incrementally instead of waiting for a "big bang" massive update. I defer to you and the dev team on your preference for whether we try to keep everything updated or keep everything frozen!
If you and the dev team prefer to keep everything updated, then I'm fine adding this metric back in!
documentation/milestones/individual_milestones/developer_tools.md
*Are there multiple services that are being connected for the first time in PROD?*

1. **Task Runner & Secrets Management:** In addition to deploying these services separately, this milestone should also support a strategy for injecting secrets into the task runner during the CI/CD pipeline for running integration tests.
Nice, this is a helpful flag!
- [ ] Developers can change the value of this config variable in their local environment without changing it for other developers
- [ ] An integration test has been created to validate that this variable is injected at runtime
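The runtime-injection check above could be sketched as a small test helper. Note this is only a sketch: the variable name `GRANTS_API_URL` is hypothetical, not something from the milestone doc.

```python
import os

# Hypothetical config variable name, used purely for illustration.
CONFIG_VAR = "GRANTS_API_URL"

def check_variable_injected(env=os.environ):
    """Return the injected value, raising if it is missing or empty."""
    value = env.get(CONFIG_VAR)
    if not value:
        raise RuntimeError(f"{CONFIG_VAR} was not injected at runtime")
    return value

# Sanity check against a fake environment rather than real CI:
print(check_variable_injected({CONFIG_VAR: "https://api.example.test"}))
```

In a real integration test this would run inside the CI/CD pipeline, where the secret store (not a local `.env` file) populates the environment.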
### Proposed metrics for measuring goals/value/definition of done |
Are there any code quality scoring tools (not just linters) we can implement to automate the calculation of metrics related to these things?
Things like https://codeclimate.com/quality or https://www.sonarsource.com/products/sonarqube/
It would be great to use a tool like Code Climate.
### Proposed metrics for measuring goals/value/definition of done

- Test framework metrics
I think developer tools have a couple main goals: reducing bugs, and making the daily experience of software developers more joyful and effective.
With the former, maybe we can measure and track the number of bug tickets created, perhaps as a percent of all tickets. (I'm guessing we'll probably label bug tickets somehow, e.g., with the label `bug`.) I'm not attached to that metric, I'm just brainstorming.
Maybe we can measure the latter goal somehow too -- like a pulse survey that runs once a month/quarter asking internal and external developers about the enjoyability of working with the codebase.
I've definitely worked on a few projects where it started out really fun and clean to work in the codebase, and gradually it became annoying as tech debt and weird patterns piled up.
I like this! I simplified the metrics to the following:
- Total runtime of all CI/CD checks
- Number of bug tickets opened in the last 30 days
- Number of days it takes to close a bug ticket
- Percentage of code covered by unit tests
- Percentage of code covered by integration tests
But it could be good to revisit the developer satisfaction idea once we have a better measurement strategy set up. Perhaps we could even run a Slack poll or incorporate something like this into our retros?
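As a rough sketch of how a couple of these could be automated: the issue fields and the `bug` label below are assumptions about how tickets might be structured, not the project's actual schema, and in practice the records would come from the GitHub API rather than a hardcoded list.

```python
from datetime import datetime, timedelta

# Hypothetical issue records for illustration only.
issues = [
    {"labels": ["bug"], "opened": datetime(2023, 5, 1), "closed": datetime(2023, 5, 4)},
    {"labels": ["bug"], "opened": datetime(2023, 5, 10), "closed": None},
    {"labels": ["feature"], "opened": datetime(2023, 4, 2), "closed": datetime(2023, 4, 20)},
]

def bugs_opened_last_30_days(issues, today):
    """Count issues labeled 'bug' opened within the last 30 days."""
    cutoff = today - timedelta(days=30)
    return sum(1 for i in issues if "bug" in i["labels"] and i["opened"] >= cutoff)

def avg_days_to_close(issues):
    """Average days from open to close, over closed issues only."""
    durations = [(i["closed"] - i["opened"]).days for i in issues if i["closed"]]
    return sum(durations) / len(durations) if durations else None

today = datetime(2023, 5, 15)
print(bugs_opened_last_30_days(issues, today))  # 2
print(avg_days_to_close(issues))                # 10.5
```

The coverage metrics, by contrast, would come for free from the test runner's coverage reports rather than needing a script like this.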
Good ideas!
Evaluate and adopt a set of tools to enforce the following code quality checks:

- Code Linting
For future reference (not for this milestone but for the implementation / ADRs, so feel free to track this wherever): I like pre-commit hooks that handle end-of-file fixing, sorting imports / removing unused imports, etc.; Prettier for YAML / Markdown; immutability enforcement for at least JavaScript; enforcing type hints for Python; docstring enforcement; and tools like `darglint` that check whether a docstring matches the actual function signature.
I'm out of touch with what's trendy these days, so these are just ideas for consideration!
I updated the Code Linting bullet with some of the specific suggestions you made. Many of the tools that we've selected (e.g. `ruff`, `black`) do incorporate some of these capabilities (unused import cleanup, import sorting, EOF fixing, etc.).
I hadn't heard of `darglint`, but it sounds cool. Unfortunately, it looks like it's no longer supported as a project. There might be some other projects that have sprung up since it was sunset, though, that we can investigate further.
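For reference, wiring these tools into pre-commit might look roughly like the sketch below. The hook repos and ids are the standard ones for these tools, but the `rev` pins are placeholders and would need to match real releases:

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0  # placeholder pin
    hooks:
      - id: end-of-file-fixer
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.1.0  # placeholder pin
    hooks:
      - id: ruff
  - repo: https://github.com/psf/black
    rev: 23.9.1  # placeholder pin
    hooks:
      - id: black
```

Running `pre-commit install` once per clone would then apply these checks on every commit.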
Looking great! One final nit: based on the discussion with me and @acouch in the comments, could we add implementing a tool like https://codeclimate.com/quality or equivalent, and then using the metrics from that as a key metric we track?
We could leave it TBD for now, but something like, "1 quantitative code quality metric from our code quality scoring tool".
@lucasmbrown-usds Good call, I just added a couple of points about Code Quality scoring in this commit and this commit |
LGTM!
Summary
Proposes the first draft of the individual milestone definition for Developer Tools.
Fixes: #20
Changes Proposed
- docs/milestones/individual_milestones/developer_tools.md
- docs/milestones/milestone_short_descriptions.md under the Developer Tools section

Instructions for Review
Open Questions