gateway: add an authenticating reverse proxy
Before this change, we relied on an authenticating reverse proxy called gateway, in use since the inception of the project as part of a package of services we got from SNCF Réseau. The front-end was responsible for authenticating with keycloak and providing a Bearer token for the reverse proxy to check.

This change introduces a new authenticating reverse proxy, which replaces the previous gateway. The front-end now entirely delegates authentication to the backend: on startup, gateway/auth/login is called, which either returns a username if the client is authenticated, or a redirection URL if the user needs to log in.

The new gateway supports three authentication backends:

- A default mock backend, which reports every user as authenticated under the same username
- An OpenID Connect backend, used to authenticate interactive users. The application does not support session refresh, and does not forward logout events to the identity provider.
- A static Bearer token backend, which is not meant to be used by interactive users

Co-Authored-By: Valentin Chanas <[email protected]>
Co-Authored-By: Élysæ <[email protected]>
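For illustration only, a minimal sketch of the startup check from a client's point of view. The host name, URL layout and JSON field names are assumptions, not taken from the gateway's actual API:

# hedged sketch: probe the login endpoint and branch on the (assumed) response shape
response=$(curl -s -c cookies.txt -b cookies.txt "https://osrd.example.org/gateway/auth/login")
case "$(jq -r '.type' <<<"$response")" in
    success)  echo "already authenticated as $(jq -r '.username' <<<"$response")" ;;
    redirect) echo "not authenticated, log in at $(jq -r '.url' <<<"$response")" ;;
esac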
Showing 107 changed files with 7,246 additions and 504 deletions.
@@ -0,0 +1,5 @@
# shell scripts must not be converted to CRLF, otherwise the CR could be interpreted
# as part of the program name if the script ever were to be interpreted under linux.
# it's especially an issue for scripts copied to a docker container.
# https://unix.stackexchange.com/questions/721844/linux-bash-shell-script-error-cannot-execute-required-file-not-found
*.sh text eol=lf
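As a quick sanity check (a sketch, not part of the commit), one can list tracked shell scripts that still contain carriage returns and re-apply the attributes to files already in the index:

# list tracked *.sh files that still contain a CR byte (assumes GNU grep and xargs)
git ls-files '*.sh' | xargs -r grep -l $'\r'
# re-apply the .gitattributes rules to already-checked-in files (Git >= 2.16)
git add --renormalize .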
@@ -0,0 +1,53 @@
#!/usr/bin/env bash

# args contains the arguments to the bake metadata generation script
args=()

# the image pattern is used to deduce the image name from the container name
image_pattern='ghcr.io/osrd-project/unstable/osrd-{container}'
args+=(--image-pattern "$image_pattern")

# get the list of containers
containers=$(sed -En 's/^target\s+"base-([a-z-]*)"\s*\{\s*\}$/\1/p' < docker/docker-bake.hcl)
for container in $containers; do
    args+=("--container" "$container")
done

is_protected_branch() {
    case "$target_ref" in
        dev|staging|prod) true;;
        *) false;;
    esac
}

cache_from() {
    printf "type=registry,mode=max,ref=%s" "$1"
}

cache_to() {
    cache_from "$1"
    printf ",compression=zstd"
}

# if we're on a pull request
if [ -n "$GITHUB_BASE_REF" ]; then
    target_ref="$GITHUB_BASE_REF"
    pr_id="${GITHUB_REF%/merge}"
    pr_id="${pr_id#refs/pull/}"
    pr_cache_pat="${image_pattern}:pr-${pr_id}-cache"
    args+=(--cache-to-pattern "$(cache_to "${pr_cache_pat}")")
    args+=(--cache-from-pattern "$(cache_from "${pr_cache_pat}")")

    target_ref="$GITHUB_BASE_REF"
    if is_protected_branch "$target_ref"; then
        target_cache_pat="${image_pattern}:${target_ref}-cache"
        args+=(--cache-from-pattern "$(cache_from "${target_cache_pat}")")
    fi
# we're on dev, staging or prod
elif [ "$GITHUB_REF_PROTECTED" = 'true' ]; then
    ref_cache_pat="${image_pattern}:${GITHUB_REF_NAME}-cache"
    args+=(--cache-to-pattern "$(cache_to "${ref_cache_pat}")")
    args+=(--cache-from-pattern "$(cache_from "${ref_cache_pat}")")
fi

exec "$(dirname "$0")"/bake-metadata.py "${args[@]}" 'base-{container}' "$@"
@@ -0,0 +1,102 @@
#!/usr/bin/env python3
"""
Generates a bake file for multiple containers, given tags and labels
as produced by the docker/metadata-action action.
"""

import sys
import json

from argparse import ArgumentParser, FileType


def _parser():
    parser = ArgumentParser(description=__doc__)
    parser.add_argument(
        "--container", dest="containers", action="append", help="input container names"
    )

    pat_doc = ", where {container} is the base container name"
    parser.add_argument(
        "--image-pattern",
        dest="image_patterns",
        action="append",
        help="output image patterns" + pat_doc,
    )
    parser.add_argument(
        "--cache-to-pattern",
        dest="cache_to_patterns",
        action="append",
        help="cache-to target patterns" + pat_doc,
    )
    parser.add_argument(
        "--cache-from-pattern",
        dest="cache_from_patterns",
        action="append",
        help="cache-from target patterns" + pat_doc,
    )
    parser.add_argument("target_pattern", help="output target name pattern" + pat_doc)
    parser.add_argument(
        "input_metadata",
        type=FileType("r"),
        help="tags and labels JSON file input, as produced by docker/metadata-action",
    )
    return parser


def container_tags(container, image_patterns, input_tags):
    """Make a list of tags to apply to a container"""
    container_images = (pat.format(container=container) for pat in image_patterns)
    return [f"{image}:{tag}" for image in container_images for tag in input_tags]


def generate_bake_file(
    input_metadata,
    containers,
    target_pattern,
    image_patterns,
    cache_to_patterns,
    cache_from_patterns,
):
    input_labels = input_metadata["labels"]
    input_tags = [tag.split(":")[1] for tag in input_metadata["tags"]]

    bake_targets = {}
    for container in containers:
        target_manifest = {
            "tags": container_tags(container, image_patterns, input_tags),
            "labels": input_labels,
        }

        if cache_to_patterns:
            target_manifest["cache-to"] = [
                pat.format(container=container) for pat in cache_to_patterns
            ]

        if cache_from_patterns:
            target_manifest["cache-from"] = [
                pat.format(container=container) for pat in cache_from_patterns
            ]

        target = target_pattern.format(container=container)
        bake_targets[target] = target_manifest

    return {"target": bake_targets}


def main(args=None):
    args = _parser().parse_args(args)
    input_metadata = json.load(args.input_metadata)
    bake_file = generate_bake_file(
        input_metadata,
        args.containers,
        args.target_pattern,
        args.image_patterns,
        args.cache_to_patterns,
        args.cache_from_patterns,
    )
    json.dump(bake_file, sys.stdout)


if __name__ == "__main__":
    main()
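To make the data flow concrete, a small hedged example; the container names, tags and file locations are illustrative, not taken from the repository:

# fabricate a metadata-action-style input (tags and labels are made up)
cat > metadata.json <<'EOF'
{
  "tags": ["ghcr.io/osrd-project/osrd:pr-1234", "ghcr.io/osrd-project/osrd:sha-abcdef0"],
  "labels": {"org.opencontainers.image.revision": "abcdef0"}
}
EOF
# for two hypothetical containers, this prints a {"target": {...}} bake file where
# base-core gets tags like ghcr.io/osrd-project/unstable/osrd-core:pr-1234
./bake-metadata.py \
    --container core --container editoast \
    --image-pattern 'ghcr.io/osrd-project/unstable/osrd-{container}' \
    'base-{container}' metadata.json | jq .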
@@ -0,0 +1,68 @@
#!/usr/bin/env bash

# gh cli is buggy as of the writing of this script, and wants both
# of these environment variables to understand it's a script
export CLICOLOR=0 CLICOLOR_FORCE=0

: ${ORG:=osrd-project}
: ${RETENTION_PERIOD_DAYS:=30}
: ${PACKAGE_FILTER_REGEX:='.*'} # this regex matches everything
: ${EXCLUDE_TAGS_REGEX:='^\b$'} # this regex never matches anything

# The retention period in seconds (RETENTION_PERIOD_DAYS days, 30 by default)
retention_period_seconds=$((RETENTION_PERIOD_DAYS * 24 * 60 * 60))

# The threshold time for package version deletion
retention_threshold=$(($(date +%s) - retention_period_seconds))

select_packages() {
    jq 'map(select(.name | test($regex_filter)))' --arg regex_filter "$1"
}

# packages ORG [PACKAGE-TYPE]
packages() {
    gh api "orgs/$1/packages?package_type=${2:-container}"
}

# takes a json list of packages, and outputs newline separated url-encoded package names
packages_names() {
    jq -r '.[].name | @uri'
}

# package_versions ORG PACKAGE [PACKAGE-TYPE]
# the org and package names must be url-encoded
package_versions() {
    gh api "orgs/$1/packages/${3:-container}/$2/versions"
}

# takes a json array stream of package versions, and selects the ones last updated before
# the given unix timestamp
select_expired_versions() {
    jq 'map(select(.updated_at | fromdate | . <= $exptime))' --argjson exptime "$1"
}

# takes a json list of package versions, and filters out those with an excluded tag
exclude_tagged_versions() {
    jq 'map(select(.metadata.container.tags | any(test($exclude_tags_regex)) | not))' --arg exclude_tags_regex "$1"
}

# delete_package_version ORG PACKAGE VERSION [PACKAGE-TYPE]
delete_package_version() {
    echo "Deleting package $2 version $3"
    gh api --method DELETE "orgs/$1/packages/${4:-container}/$2/versions/$3"
}

delete_package_versions() {
    jq '.[].id' | while IFS= read -r version; do
        delete_package_version "$ORG" "$package" "$version"
    done
}

# for all packages in the organization that match the regex
packages "$ORG" | select_packages "${PACKAGE_FILTER_REGEX}" | packages_names | while IFS= read -r package; do
    echo "Processing package $package..."
    package_versions "$ORG" "$package" | \
        select_expired_versions "$retention_threshold" | \
        exclude_tagged_versions "${EXCLUDE_TAGS_REGEX}" | \
        delete_package_versions
done
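A hedged usage sketch; the script name and the regex values are illustrative, and the gh CLI must be authenticated with a token allowed to delete packages (typically the read:packages and delete:packages scopes):

# keep the last 30 days of container versions, never deleting versions tagged dev, staging or prod
ORG=osrd-project \
RETENTION_PERIOD_DAYS=30 \
PACKAGE_FILTER_REGEX='.*' \
EXCLUDE_TAGS_REGEX='^(dev|staging|prod)$' \
    ./cleanup-container-versions.sh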