[ISSUE-20] Developer tools milestone #24

Merged: widal001 merged 17 commits into main from issue-20-dev-tools on Jul 17, 2023

Conversation

widal001 (Collaborator) commented Jun 20, 2023

Summary

Proposes the first draft of the individual milestone definition for Developer Tools.

Fixes: #20

Changes Proposed

  • Creates docs/milestones/individual_milestones/developer_tools.md
  • Links this milestone to docs/milestones/milestone_short_descriptions.md under the Developer Tools section

Instructions for Review

  1. Schedule a meeting to review this milestone definition
  2. Document and implement feedback from the meeting

Open Questions

  • Can we remove Logging from the list of deliverables for this milestone and complete this in Infrastructure-as-Code instead?

widal001 added the draft (Not yet ready for review) label on Jun 20, 2023
widal001 added 3 commits on June 20, 2023:
- Finishes adding to definition of done
- Adds Internationalization as related milestone that won't be in place
widal001 added the docs: deliverable (Deliverable specification ticket) label on Jul 5, 2023
widal001 marked this pull request as ready for review on July 6, 2023
widal001 removed the draft (Not yet ready for review) label on Jul 6, 2023
lucasmbrown-usds (Collaborator) left a comment

Looking great! A few comments about metrics etc.

- Code quality checks metric: Total runtime of all code quality checks
- Security checks metrics
  - Total runtime of all security checks
  - Number of security vulnerabilities not caught until production
Collaborator

These three ("Number of...") are good metrics, but I'm thinking about whether there would be a way to automate their calculation and reporting. It seems a little tricky to do so?

We don't want to be proposing metrics that are hard or too time-consuming to track.
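(For what it's worth, the issue-count style metrics are probably the easiest to automate, since they can be pulled from the GitHub search API. The sketch below is only an illustration; the repository slug and the `bug` label are assumptions, not settled conventions for this project.)

```python
# Rough sketch only: count issues labeled "bug" opened in the last 30 days.
# The repo slug and label name are placeholders, not project conventions.
from datetime import date, timedelta

import requests


def bug_tickets_opened_last_30_days(repo: str) -> int:
    since = (date.today() - timedelta(days=30)).isoformat()
    resp = requests.get(
        "https://api.github.com/search/issues",
        params={"q": f"repo:{repo} is:issue label:bug created:>={since}"},
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["total_count"]


if __name__ == "__main__":
    # Hypothetical repository slug, for illustration only.
    print(bug_tickets_opened_last_30_days("HHS/grants-api"))
```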

Collaborator Author

Good call! I simplified and cleaned up all of the metrics to keep them a bit more focused in this commit:

- Total runtime of all CI/CD checks
- Number of bug tickets opened in the last 30 days
- Number of days it takes to close a bug ticket
- Percentage of code covered by unit tests
- Percentage of code covered by integration tests

- Number of days it takes to resolve a security vulnerability reported in the grants API project
- Number of days it takes to update an upstream dependency once a version which resolves a CVE has been released
- Dependency management metrics
  - Number of days it takes to update an upstream dependency once a new minor or major version has been released
Collaborator

Hmm. Some dev teams prefer to leave dependencies untouched until they need new features (or there's a security vulnerability), given that updates occasionally break things. So we may not always need to update when new versions have been released.

Collaborator Author

@lucasmbrown-usds This is worth discussing further!

@daphnegold was proposing the use of Renovate in the FE code quality tools, which automates the creation of PRs to adopt dependency updates, but you make a good point that a fair amount of time can be sunk into simply making updates when they're not needed.

On past projects I've found regularly updating dependencies to be helpful, because otherwise teams can wind up scrambling to make pretty significant updates once a CVE is identified, or they delay updates for years because they become so large and painful to make. For example, on my previous project the team was still using Airflow 1.x and had tried and failed to upgrade to Airflow 2.x a few times because they were so far behind on updates.

That being said, I like the idea of striking this as a core metric, and I took it out of the list!
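(If a dependency-freshness metric ever comes back, one rough way to approximate it for Python dependencies is the public PyPI JSON API, which reports the latest version and its upload time. The sketch below is illustrative only; the package names are not the project's actual pins.)

```python
# Rough sketch: for a dependency, report how many days the latest PyPI release
# has been available. Package names here are illustrative placeholders.
from datetime import datetime, timezone

import requests


def days_since_latest_release(package: str) -> float:
    data = requests.get(f"https://pypi.org/pypi/{package}/json", timeout=30).json()
    latest = data["info"]["version"]
    # Use the upload time of the first file published for the latest release.
    uploaded = datetime.fromisoformat(
        data["releases"][latest][0]["upload_time_iso_8601"].replace("Z", "+00:00")
    )
    return (datetime.now(timezone.utc) - uploaded).total_seconds() / 86400


if __name__ == "__main__":
    for pkg in ["fastapi", "sqlalchemy"]:  # illustrative package names
        print(pkg, round(days_since_latest_release(pkg), 1), "days since latest release")
```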

Collaborator

Got it, I appreciate your point about making updates incrementally instead of waiting for a "big bang" massive update. I defer to you and the dev team on your preference for whether we try to keep everything updated or keep everything frozen!

Collaborator

If you and the dev team prefer to keep everything updated, then I'm fine adding this metric back in!


*Are there multiple services that are being connected for the first time in PROD?*

1. **Task Runner & Secrets Management:** In addition to deploying these services separately, this milestone should also support a strategy for injecting secrets into the task runner during the CI/CD pipeline for running integration tests.
Collaborator

nice, this is a helpful flag!
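(For illustration, the test-side contract for that secret injection could be as simple as the sketch below, assuming the pipeline exposes the secret as an environment variable. The variable name is hypothetical, not a settled convention for this project.)

```python
# Rough sketch: an integration test that expects a secret injected by the CI/CD
# pipeline into the task runner's environment. GRANTS_API_DB_PASSWORD is a
# hypothetical variable name used only for this example.
import os

import pytest

DB_PASSWORD = os.environ.get("GRANTS_API_DB_PASSWORD")


@pytest.mark.skipif(
    DB_PASSWORD is None,
    reason="Secret not injected; this check only runs inside the CI/CD pipeline.",
)
def test_db_password_is_injected_at_runtime():
    # The pipeline should inject a non-empty value; the secret is never committed.
    assert DB_PASSWORD
```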

- [ ] Developers can change the value of this config variable in their local environment without changing it for other developers
- [ ] An integration test has been created to validate that this variable is injected at runtime
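(A minimal sketch of those two checklist items, assuming the config variable is read from the environment with a per-developer fallback. The module, variable, and URL names are hypothetical.)

```python
# config.py -- hypothetical example of a config variable with a local override.
# Each developer can set API_BASE_URL in their own environment (e.g. a .env file)
# without affecting anyone else; the value below is only a fallback default.
import os

API_BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8080")


# test_config_injection.py -- validates that the variable is injected at runtime.
def test_api_base_url_can_be_overridden(monkeypatch):
    monkeypatch.setenv("API_BASE_URL", "https://staging.example.gov")

    import importlib

    import config

    importlib.reload(config)  # re-read the environment after the override
    assert config.API_BASE_URL == "https://staging.example.gov"
```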

### Proposed metrics for measuring goals/value/definition of done
Collaborator

Are there any code quality scoring tools (not just linters) we can implement to automate the calculation of metrics related to these things?

Things like https://codeclimate.com/quality or https://www.sonarsource.com/products/sonarqube/

Collaborator

It would be great to use a tool like Code Climate.


### Proposed metrics for measuring goals/value/definition of done

- Test framework metrics
Collaborator

I think developer tools have a couple main goals: reducing bugs, and making the daily experience of software developers more joyful and effective.

With the former, maybe we can measure and track the number of bug tickets created, perhaps as a percent of all tickets. (I'm guessing we'll probably label bug tickets somehow, e.g., with the label bug). Idk, I'm not attached to that metric, I'm just brainstorming.

Maybe we can measure the latter goal somehow too -- like a pulse survey that runs once a month/quarter asking internal and external developers about the enjoyability of working with the codebase.

I've definitely worked on a few projects where it started out really fun and clean to work in the codebase, and gradually it became annoying as tech debt and weird patterns piled up.

Collaborator Author

I like this! I simplified the metrics to the following:

- Total runtime of all CI/CD checks
- Number of bug tickets opened in the last 30 days
- Number of days it takes to close a bug ticket
- Percentage of code covered by unit tests
- Percentage of code covered by integration tests

But it could be good to revisit the developer satisfaction idea once we have a better measurement strategy set up. Perhaps we could even do a Slack poll or incorporate something like this into our retros?

Collaborator

Good ideas!


Evaluate and adopt a set of tools to enforce the following code quality checks:

- Code Linting
Collaborator

For future reference (not for this milestone but for the implementation / ADRs, so feel free to track this wherever): I like things like pre-commit hooks that do end-of-file fixing, sorting imports / removing unused imports, etc.; Prettier for YAML / Markdown; immutability enforcement for at least JavaScript; enforcing typestrings for Python; docstring enforcement; and tools like darglint that check whether the docstring matches the actual function.

I'm out of touch with what's trendy these days, so these are just ideas for consideration!

Collaborator Author

I updated the Code Linting bullet with some of the specific suggestions you made. Many of the tools that we've selected (e.g. ruff, black) do incorporate some of these capabilities (e.g. unused import cleanup, sorting imports, EOF fixing, etc.).

I hadn't heard of darglint, but it sounds cool; unfortunately, it looks like it's no longer supported as a project. It looks like there might be some other projects that have sprung up since it was sunset, though, that we can investigate further.

lucasmbrown-usds (Collaborator) left a comment

Looking great! One final nit: based on the discussion with me and @acouch in the comments, could we add implementing a tool like https://codeclimate.com/quality or equivalent, and then using the metrics from that as a key metric we track?

We could leave it TBD for now, but something like, "1 quantitative code quality metric from our code quality scoring tool".

widal001 (Collaborator Author) commented Jul 17, 2023

@lucasmbrown-usds Good call, I just added a couple of points about Code Quality scoring in this commit and this commit

lucasmbrown-usds (Collaborator) left a comment

LGTM!

widal001 merged commit 41c6ddc into main on Jul 17, 2023
widal001 deleted the issue-20-dev-tools branch on July 17, 2023 at 20:42
Labels
docs: deliverable (Deliverable specification ticket)
Development

Successfully merging this pull request may close these issues.

[Milestone Doc]: Developer Tools
3 participants