
[Rollout Infra] Start reading the GCS feature flags and expose them in a global Var #4869

Merged
merged 9 commits into from
Jan 23, 2025

Conversation

Contributor

@gmechali gmechali commented Jan 21, 2025

Uploaded feature_flags.json to the datcom-website-[ENV]-resources GCS buckets in the following format:
[{ "name": "autocomplete", "enabled": true }, { "name": "dev_place_experiment", "enabled": false }, { "name": "dev_place_ga", "enabled": false }]
On server startup, the server ingests this file and exposes the feature status through api/features.
Also adds an api/features/{feature_name} endpoint that reports whether a given feature is enabled.
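Under that format, the flag list folds into a simple name-to-enabled map. A minimal sketch of the idea (the helper names here are illustrative, not the PR's actual functions):

```python
import json

# Example payload in the format stored in the GCS bucket.
FEATURE_FLAGS_JSON = """
[{"name": "autocomplete", "enabled": true},
 {"name": "dev_place_experiment", "enabled": false},
 {"name": "dev_place_ga", "enabled": false}]
"""


def load_feature_flags(raw_json: str) -> dict:
  """Parse the flag file into a {name: enabled} dict."""
  return {flag["name"]: flag["enabled"] for flag in json.loads(raw_json)}


def is_feature_enabled(flags: dict, feature_name: str) -> bool:
  """Backs a lookup like api/features/{feature_name}; unknown flags are off."""
  return flags.get(feature_name, False)


flags = load_feature_flags(FEATURE_FLAGS_JSON)
print(is_feature_enabled(flags, "autocomplete"))  # True
```

Treating unknown flag names as disabled is one possible policy; the PR may handle missing flags differently.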

Starts reading the feature flags for place autocomplete, place page experiment rollout and place page GA rollout.

The feature flags were uploaded as:

  • autocomplete: Enabled for all environments
  • dev place experiment: Enabled for autopush
  • dev place GA rollout: Enabled for autopush

This implementation is as outlined in https://docs.google.com/document/d/19CqgKt6aIbFs5v2zhm3TagDfPJF-K_SpH-9VPP9FeRw/edit?resourcekey=0-qM2XvR7dYp4j0jXshva04g&tab=t.0

In a follow-up, I will add the fallback flag values, ingest them in the load_feature_flags method, and write the script to auto-update the GCS buckets. See PR #4871.

…al var declared in base.py. Creates a helper to interact with them from both the TS and py code. Reads the flags for autocomplete, place page experiment and place page GA.
@gmechali gmechali marked this pull request as ready for review January 21, 2025 22:20
@gmechali gmechali requested a review from beets January 21, 2025 22:20
@gmechali gmechali changed the title [Rollout Infra] Start reading the GCS feature flags and expose them in the new Features API. [Rollout Infra] Start reading the GCS feature flags and expose them in a global Var Jan 21, 2025
@gmechali gmechali requested a review from dwnoble January 22, 2025 16:36
env_for_bucket = os.environ.get('FLASK_ENV')
if env_for_bucket in ['local', 'integration_test', 'test', 'webdriver']:
  env_for_bucket = 'autopush'
elif env_for_bucket == 'production':
  env_for_bucket = 'prod'  # assumed completion; the diff excerpt is cut off here
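Per the PR description, the bucket name is derived from the environment as datcom-website-[ENV]-resources. A hedged sketch of that mapping (the helper name and the 'production' → 'prod' mapping are assumptions, not taken from the PR):

```python
def feature_flag_bucket(flask_env: str) -> str:
  """Map a Flask environment to the GCS bucket holding its feature flags.

  Bucket naming follows datcom-website-[ENV]-resources from the PR
  description; this helper is an illustrative assumption.
  """
  env = flask_env
  if env in ('local', 'integration_test', 'test', 'webdriver'):
    env = 'autopush'  # local-style environments read the autopush flags
  elif env == 'production':
    env = 'prod'
  return f'datcom-website-{env}-resources'


print(feature_flag_bucket('local'))  # datcom-website-autopush-resources
```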
Contributor:

We should also think about what to do for the 'staging' environment and various legacy custom data commons (e.g., 'feedingamerica', and others defined in server/app_env/*.py)

For staging, I think we should read the 'prod' bucket.
For the legacy custom DCs, we can either read the prod bucket (but that assumes the right read permissions are in place), or maybe we can just read a local feature flag file. Another option: keep the prod feature flag file checked into this repo, and for all environments beyond the ones you have listed here, read that locally checked-in version? wdyt?

Contributor Author:

I would argue that staging should still read its own bucket, but that we should aim to keep the staging flags and prod flags in sync. They should only differ if we need to test something in staging before launching it to prod. So temporary diffs.
Do you agree?

For custom DC, I agree with your take that it should read the local file. If custom DC customers want to support launches on the fly, they can make their own bucket, store their flags there, and update the target. I'll leave that conversation for the following PR, where I actually add the local files!
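One way the local-file fallback discussed here could look; this is a sketch under assumptions (the function names are hypothetical, and the real loader will live in load_feature_flags per the follow-up PR):

```python
import json
import logging


def load_flags_with_fallback(gcs_loader, local_path):
  """Try the GCS loader first; fall back to a locally checked-in flag file.

  gcs_loader is any callable returning the raw flag JSON text, or None on
  failure. This mirrors the reviewer's suggestion of reading a local copy
  for environments without bucket access.
  """
  raw = None
  try:
    raw = gcs_loader()
  except Exception:
    logging.warning("Could not reach GCS; falling back to local flags.")
  if raw is None:
    with open(local_path) as f:
      raw = f.read()
  return {flag["name"]: flag["enabled"] for flag in json.loads(raw)}
```

Injecting the loader as a callable keeps the fallback logic testable without real GCS credentials.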

server/lib/util.py (resolved)
logging.error("Bucket not found: " + bucket_name)
return {}

blob = bucket.get_blob("feature_flags.json")
Contributor:

this might need a try/except block too

Contributor Author:

So actually, if you look at the method implementation, it already nests the read call within a try/except and returns None on failure.
So this should never raise an exception.
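That pattern (get_blob returning None rather than raising when the blob is missing) can be sketched as follows; the Fake* classes are stand-ins so the sketch runs without a real google.cloud.storage bucket:

```python
import json
import logging


def read_feature_flags(bucket):
  """Read feature_flags.json from a bucket.

  get_blob is expected to return None (not raise) when the blob is
  missing or unreadable, so no try/except is needed around it.
  """
  blob = bucket.get_blob("feature_flags.json")
  if blob is None:
    logging.warning("Feature flag file not found in the bucket.")
    return {}
  return {f["name"]: f["enabled"] for f in json.loads(blob.download_as_text())}


# Stand-ins for the GCS client objects, for illustration only.
class FakeBlob:
  def __init__(self, text):
    self._text = text

  def download_as_text(self):
    return self._text


class FakeBucket:
  def __init__(self, blob=None):
    self._blob = blob

  def get_blob(self, name):
    return self._blob


empty = read_feature_flags(FakeBucket())  # -> {} (missing blob, no exception)
```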

logging.warning("Feature flag file not found in the bucket.")

# Create the dictionary using a dictionary comprehension
feature_flag_dict = {
    flag["name"]: flag["enabled"]
    for flag in feature_flags
    if "name" in flag and "enabled" in flag
}  # assumed completion; the diff excerpt is cut off here
Contributor:

Why not just store the feature flag configuration as an object/dictionary in JSON?

Contributor Author:

So I did load it as JSON, but building the dict this way plays it a little safer, since we now know that every feature flag must have a name and an enabled value.
So in case the file that was read in has bad data, malformed entries don't get added to the flags.
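The point about bad data can be seen directly: entries missing either key simply drop out when the comprehension filters on both (the guard shown here is illustrative of the approach described above):

```python
raw_flags = [
    {"name": "autocomplete", "enabled": True},
    {"name": "broken_flag"},   # missing "enabled": dropped
    {"enabled": True},         # missing "name": dropped
]

# Only well-formed entries survive into the flag dict.
feature_flag_dict = {
    flag["name"]: flag["enabled"]
    for flag in raw_flags
    if "name" in flag and "enabled" in flag
}

print(feature_flag_dict)  # {'autocomplete': True}
```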

server/routes/place/html.py (outdated, resolved)
server/lib/shared.py (outdated, resolved)
@gmechali gmechali requested a review from dwnoble January 23, 2025 18:49
@dwnoble dwnoble (Contributor) left a comment

Thanks Gabe!

Last thing (can be in a follow-up): can you add some instructions, either to the README or inline in one of the feature_flags files, on how to enable feature flags and how to update the feature flag files (including which GCP account they're in)?

@gmechali (Contributor Author):

Yeah absolutely, I'll do that after I'm done implementing. And I'll clean up the doc too!

@gmechali gmechali merged commit 9e36d52 into datacommonsorg:master Jan 23, 2025
9 checks passed
@gmechali gmechali deleted the rollout branch January 23, 2025 19:51