# [ISSUE-20] Developer tools milestone #24

Merged · 17 commits · Jul 17, 2023
`documentation/milestones/individual_milestones/developer_tools.md` (new file, +196 lines)
# Developer Tools

| Field | Value |
| --------------- | -------------- |
| Document Status | Draft |
| Epic Link | TODO: Add Link |
| Epic Dashboard | TODO: Add Link |
| Target Release | TODO: Add Date |
| Product Owner | Lucas Brown |
| Document Owner | Billy Daly |
| Lead Developer | TODO: Add Name |
| Lead Designer | TODO: Add Name |

## Short description

Select and implement a set of developer tools that automate key code quality and security checks within the backend codebase.

## Goals

### Business description & value

We must incorporate an effective set of developer tools into our backend codebase to ensure that code contributions from maintainers (HHS staff members and contractors) and open source contributors meet key standards and do not introduce bugs or security vulnerabilities.

While enforcing compliance with these standards may increase the time and energy required for individual contributions, adopting an effective set of tools can increase the speed of delivery over time by reducing the overhead associated with reviewing new code and identifying potential bugs before they are deployed to production.

### User stories

- As a **full-time HHS staff member**, I want to:
  - ensure that the codebase meets certain quality standards, so that it will be easier to onboard future maintainers and developers to the project
  - have a mechanism for catching potential code issues during development or code review, so that we are not introducing bugs or security vulnerabilities in production
- As an **open source contributor**, I want to:
  - be able to reference documentation explaining how to use the developer tools, so that I don't have to learn how to use these tools on my own in order to contribute to the project
  - have full test coverage for the codebase, so that I know when I've introduced code that changes or breaks existing behavior
  - have code formatting and standards enforced with automated tooling, so that I don't have to learn and check that my code adheres to those standards manually
  - be able to report security vulnerabilities to project maintainers directly, so that they can quickly create and deploy a fix before the vulnerability is made public
- As a **maintainer of the project**, I want to:
  - have code quality checks run automatically on each push, so that formatting, linting, or security issues are caught before being deployed to production
  - ensure that new contributions meet certain thresholds for test coverage, so that code contributions from internal and external developers also include tests which validate the code's behavior
  - be automatically notified when updates are available for project dependencies, so that I can easily evaluate and adopt these updates and ensure our dependencies don't become stale
  - be notified when a security vulnerability is detected in our project or in an upstream dependency, so that we can work to quickly address the vulnerability and deploy a fix
  - securely manage and rotate keys and secrets related to the project, so that we minimize the risk of secrets being exposed or compromised

## Technical description

### Automated test framework

- Unit tests
- Integration tests
- Test coverage
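
As a rough sketch of the shape this could take (assuming the backend lands on Python with pytest, which has not yet been decided), a unit test exercises a single function in isolation, while a coverage plugin such as pytest-cov reports which lines the suite actually touches. The `apply_fee` function below is purely illustrative:

```python
import pytest


def apply_fee(amount: float, fee_rate: float = 0.05) -> float:
    """Return the amount with a processing fee applied (illustrative only)."""
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return amount * (1 + fee_rate)


def test_apply_fee_adds_default_fee():
    # Unit test: verifies one function in isolation.
    assert apply_fee(100.0) == pytest.approx(105.0)


def test_apply_fee_rejects_negative_amounts():
    # Covering failure paths keeps the coverage report meaningful.
    with pytest.raises(ValueError):
        apply_fee(-1.0)
```

Running `pytest --cov` (with pytest-cov installed) would then produce the coverage percentages referenced in the metrics and definition of done below, and `--cov-fail-under` can enforce a minimum threshold.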

### Code quality checks

- Code Linting
> **Collaborator:** For future reference (not for this milestone but for the implementation / ADRs, so feel free to track this wherever), I like pre-commit hooks that do things like end-of-file fixing and sorting imports / removing unused imports; Prettier for YAML / Markdown; immutability enforcement for at least JavaScript; enforcing type hints for Python; docstring enforcement; tools like darglint that check whether the docstring matches the actual function; etc.
>
> I'm out of touch with what's trendy these days, so these are just ideas for consideration!

> **Collaborator (Author):** I updated the Code Linting bullet with some of the specific suggestions you made. Many of the tools that we've selected (e.g. ruff, black) do incorporate some of these capabilities (unused import cleanup, sorting imports, EOF fixing, etc.).
>
> I hadn't heard of darglint, but it sounds cool; unfortunately, it looks like it's no longer supported as a project. There might be some other projects that have sprung up since it was sunset, though, that we can investigate further.

- Auto-formatting
- Type checking
- Commit conventions
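
To illustrate what static type checking buys (again assuming Python, with a checker such as mypy; none of these choices are final), annotated signatures let tooling catch a whole class of bugs before the code ever runs:

```python
def parse_grant_id(raw: str) -> int:
    """Convert a raw grant identifier string to an integer ID (illustrative)."""
    return int(raw.strip())


# A static type checker such as mypy would reject the following call before
# it ever runs, with an error along the lines of:
#     Argument 1 to "parse_grant_id" has incompatible type "int"; expected "str"
#
# parse_grant_id(12345)
```

Auto-formatters (e.g. black) and linters (e.g. ruff) discussed in the thread above would run over the same code, typically via pre-commit hooks locally and again on every push.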

### Security checks

- Secrets scanning
- Upstream CVE monitoring
- Vulnerability reporting mechanism
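
As a loose illustration of what secrets scanning does (real tools such as gitleaks or detect-secrets ship hundreds of rules plus entropy-based detection, and no tool has been selected yet), a scanner searches committed content for known credential patterns:

```python
import re

# Two simplified patterns; purely illustrative, not a real rule set.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),           # shape of an AWS access key ID
    re.compile(r"(?i)api[_-]?key\s*=\s*\S+"),  # generic "api_key = ..." assignment
]


def line_looks_secret(line: str) -> bool:
    """Return True if a line appears to contain a credential."""
    return any(pattern.search(line) for pattern in SECRET_PATTERNS)


assert line_looks_secret('api_key = "not-a-real-secret"')
assert not line_looks_secret("timeout_seconds = 30")
```

In practice such checks would run both as a pre-commit hook and on every push, alongside upstream CVE monitoring (e.g. GitHub's Dependabot alerts).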

### Dependency management

- Dependency conflicts
- Dependency updates
- Dependency funding

### Config & secrets management

- Loading config variables
- Secrets storage & sharing
- Runtime injection
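
A minimal sketch of the loading pattern, assuming Python; the variable names `ENVIRONMENT` and `DATABASE_URL` are placeholders rather than decisions. Config values come from the environment so they can differ per deployment, and secrets are injected at runtime by the platform instead of living in the repository:

```python
import os


class Config:
    """Reads configuration from environment variables at startup."""

    def __init__(self) -> None:
        # Non-secret config with a sensible local default.
        self.environment = os.environ.get("ENVIRONMENT", "dev")
        # Secret injected at runtime; failing fast here beats a
        # confusing error deep inside the application later.
        self.database_url = os.environ["DATABASE_URL"]


config = Config()  # raises KeyError immediately if DATABASE_URL is unset
```

An integration test can then assert that `config.database_url` is populated in each deployed environment, per the definition of done below.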

### Definition of done

- [ ] For all of the tools:
  - [ ] Code required to configure and run the tools is deployed to `main` & PROD
  - [ ] Instructions for how to adopt and use these tools are clearly documented in a public space
  - [ ] At least 5 internal developers/maintainers have adopted and run these tools on a cloned version of the repo
  - [ ] At least 3 open source contributors have adopted and run these tools on a forked version of the repo
  - [ ] ADRs documenting the selection of these tools and standards have been created and approved
- [ ] **Automated testing framework** is live and meets the following conditions:
  - [ ] At least 1 unit test has been added to the codebase
  - [ ] At least 1 integration test has been added to the codebase
  - [ ] Unit tests are run on every push to the remote GitHub repository
  - [ ] Integration tests are run at least once before merging new code into `main`
  - [ ] Code which fails any of the unit or integration tests is blocked from merging into `main`
  - [ ] A report on the percentage of code covered by tests is available after every test run
  - [ ] Code whose test coverage falls below a certain threshold is blocked from merging into `main`
- [ ] **Code quality checks** are live and meet the following conditions:
  - [ ] All checks are run on every push to the remote GitHub repository
  - [ ] The most important checks are run on every local commit
  - [ ] Code which fails any of these checks is blocked from merging into `main`
- [ ] **Security checks** are live and meet the following conditions:
  - [ ] At least 1 (test) security vulnerability report has been submitted
  - [ ] Maintainers are notified within 1 hour of a vulnerability being reported within the grants API codebase
  - [ ] Maintainers are notified within 72 hours of a vulnerability being reported on an upstream dependency
  - [ ] Security checks are run on every push to the remote GitHub repository
  - [ ] Code which fails any of these security checks is blocked from merging into `main`
- [ ] **Dependency management** is live and meets the following conditions:
  - [ ] At least 2 upstream dependencies have been added to the project
  - [ ] Compatible versions of upstream dependencies are automatically detected when a new dependency is added
  - [ ] Maintainers are notified when a new minor or major version of an upstream dependency is available
- [ ] **Config & secrets management** is live and meets the following conditions:
  - [ ] At least 1 configuration variable has been added to the project
  - [ ] This config variable has different values in different environments (e.g. dev, staging, and prod)
  - [ ] At least 2 internal developers can access a shared value of this variable locally
  - [ ] Developers can change the value of this config variable in their local environment without changing it for other developers
  - [ ] An integration test has been created to validate that this variable is injected at runtime

### Proposed metrics for measuring goals/value/definition of done
> **Collaborator:** Are there any code quality scoring tools (not just linters) we can implement to automate the calculation of metrics related to these things?
>
> Things like https://codeclimate.com/quality or https://www.sonarsource.com/products/sonarqube/

> **Collaborator:** It would be great to use a tool like Code Climate.


- Test framework metrics
> **Collaborator:** I think developer tools have a couple of main goals: reducing bugs, and making the daily experience of software developers more joyful and effective.
>
> For the former, maybe we can measure and track the number of bug tickets created, perhaps as a percent of all tickets. (I'm guessing we'll probably label bug tickets somehow, e.g. with the label `bug`.) I'm not attached to that metric; I'm just brainstorming.
>
> Maybe we can measure the latter goal somehow too, like a pulse survey that runs once a month or quarter asking internal and external developers about the enjoyability of working with the codebase.
>
> I've definitely worked on a few projects where it started out really fun and clean to work in the codebase, and gradually it became annoying as tech debt and weird patterns piled up.

> **Collaborator (Author):** I like this! I simplified the metrics to the following:
>
> - Total runtime of all CI/CD checks
> - Number of bug tickets opened in the last 30 days
> - Number of days it takes to close a bug ticket
> - Percentage of code covered by unit tests
> - Percentage of code covered by integration tests
>
> But it could be good to revisit the developer satisfaction idea once we have a better measurement strategy set up. Perhaps we could even do a Slack poll or incorporate something like this in our retros?

> **Collaborator:** Good ideas!

  - Percentage of code covered by unit tests
  - Percentage of code covered by integration tests
  - Total runtime of all unit tests
  - Total runtime of all integration tests
- Code quality checks metric: Total runtime of all code quality checks
- Security checks metrics
  - Total runtime of all security checks
  - Number of security vulnerabilities not caught until production
> **Collaborator:** These three ("Number of...") are good metrics, but I'm thinking about whether there would be a way to automate their calculation and reporting. It seems a little tricky to do so?
>
> We don't want to be proposing metrics that are hard or too time-consuming to track.

> **Collaborator (Author):** Good call! I simplified and cleaned up all of the metrics in this commit to keep them a bit more focused:
>
> - Total runtime of all CI/CD checks
> - Number of bug tickets opened in the last 30 days
> - Number of days it takes to close a bug ticket
> - Percentage of code covered by unit tests
> - Percentage of code covered by integration tests

  - Number of days it takes to resolve a security vulnerability reported in the grants API project
  - Number of days it takes to update an upstream dependency once a version which resolves a CVE has been released
- Dependency management metrics
  - Number of days it takes to update an upstream dependency once a new minor or major version has been released
> **Collaborator:** Hmm. Some dev teams prefer to leave dependencies untouched until you need new features (or there's a security vulnerability), given that updates occasionally break things. So it may not always be necessary for us to update when new versions have been released.

> **Collaborator (Author):** @lucasmbrown-usds This is worth discussing further!
>
> @daphnegold was proposing the use of Renovate in the FE code quality tools, which automates the creation of PRs to adopt dependency updates, but you make a good point that a fair amount of time can be sunk into making updates when they're not needed.
>
> On past projects I've found regularly updating dependencies to be helpful, because otherwise teams can wind up scrambling to make pretty significant updates once a CVE is identified, or they delay updates for years because they become so large and painful to make. For example, on my previous project the team was still using Airflow 1.x and had tried and failed to upgrade to Airflow 2.x a few times because they were so far behind on updates.
>
> That being said, I like the idea of striking this as a core metric, and I took it out of the list!

> **Collaborator:** Got it, I appreciate your point about making updates incrementally instead of waiting for a "big bang" of a massive update. I defer to you and the dev team on your preference for whether we try to keep everything updated or keep everything frozen!

> **Collaborator:** If you and the dev team prefer to keep everything updated, then I'm fine adding this metric back in!

  - Number of dependencies that received funding through this project
- Config & secrets management metric: Average age across all tokens/secrets

### Destination for live updating metrics

Not yet known

## Planning

### Assumptions & dependencies

What capabilities / milestones do we expect to be in place at the beginning of work
on this milestone?

- [ ] **DB & API Plan:** Choosing a set of developer tools will depend heavily on the language selected for the API.
- [ ] **Onboard Dev Team:** The dev team should be involved in the selection and implementation of these tools.

Are there any notable capabilities / milestones that we do NOT expect to be in place at
the beginning of work on this milestone?

- **CI/CD:** While the checks should run automatically each time code is pushed to GitHub, these checks will be incorporated more formally into a CI/CD pipeline in a separate milestone.
- **Internationalization:** While this milestone will involve content that needs to be translated, we are not likely to have the mechanism for supporting translation in place by the time work starts on this milestone.

### Open questions

- [ ] [to be added]

### Not doing

The following work will *not* be completed as part of this milestone:

1. [to be added]

## Integrations

### Translations

Does this milestone involve delivering any content that needs translation?

- Instructions for adopting and using developer tools
- Instructions for reporting security vulnerabilities

If so, when will English-language content be locked? Then when will translation be
started and completed?

- Languages to support TBD
- Translation timeline TBD

### Services going into PROD for the first time

This can include services going into PROD behind a feature flag that is not turned on.

1. None

### Services being integrated in PROD for the first time

Are there multiple services that are being connected for the first time in PROD?

1. None

### Data being shared publicly for the first time

Are there any fields being shared publicly that have never been shared in PROD before?

1. None

### Security considerations


`documentation/milestones/milestone_short_descriptions.md` (2 additions)

Diagram short name: `Dev-Tools`

Dependencies: `None`

Milestone definition: [Developer tools milestone](./individual_milestones/developer_tools.md)

Install developer tools for backend, including:

- Automated test framework