Update v9.5.2 #74
Conversation
* Rename package to satokengen to make it clearer that it is for service account tokens
add selector for items in the megamenu
* Give more detail and context on http_addr in grafana.ini

  This particular configuration item is misleadingly named (it's about networking addresses rather than HTTP) and there are a bunch of ways users can misread, misconfigure, and create a situation where grafana won't start due to a bind error. I'm trying to outline a couple of extra things about what this covers and provide good default advice, leaving room for specialists who know what they're doing to still use this, with more clarity.

* Update _index.md
* Apply suggestions from code review

  Co-authored-by: Christopher Moyer <[email protected]>

* Update _index.md: removed recommendation note about http_addr
* makes prettier

---------

Co-authored-by: Christopher Moyer <[email protected]>
Co-authored-by: Chris Moyer <[email protected]>
…ackend combinations (grafana#65497)

* define 3 feature toggles for rollout phases
* Pass feature toggles along
* Implement first feature toggle
* Try a different strategy with fall-throughs to specific configurations
* Apply toggle overrides once outside of backend composition
* Emit log messages when we coerce backends
* Run code generator for feature toggle files
* Improve wording in flag descs
* Re-run generator
* Use code-generated constants instead of plain strings
* Use converted enum values rather than strings for pre-parsing
* Docs: Fix markup for URL
* updates absolute URL to a relref

---------

Co-authored-by: Chris Moyer <[email protected]>
* Add click tracking to data links
* Use constants, mock window.open call
* Add comment about possibly missing reporting data
* Remove superfluous if
…ana#63714)

* Tracing: Pass OTLP address and propagation format to plugins
* Fix unit tests
* Fix indentation
* Fix plugin manager integration tests
* Goimports
* Pass plugin version to plugins
* Do not add GF_PLUGIN_VERSION if plugin version is not set, add tests
* Allow disabling plugins distributed tracing on a per-plugin basis
* Moved disabled plugins to tracing.opentelemetry config section
* Pre-allocate DisabledPlugins map to the correct size
* Moved disable tracing setting flags in plugin settings
* Renamed plugin env vars for tracing endpoint and propagation
* Fix plugin initializer tests
* Refactoring: Moved OpentelemetryCfg from pkg/infra to pkg/plugins
* Changed GetSection to Section in parseSettingsOpentelemetry
* Add tests for NewOpentelemetryCfg
* Fix test case names in TestNewOpentelemetryCfg
* OpenTelemetry: Remove redundant error checks
* Move checkbox outside of SeachItem
* Add li element
* Update scenes to latest
* Replace FormatRegistryID with VariableFormatID
* Remove scene demos that were moved to scenes repo, fix the remaining demos
* Fix grafana Monitoring app
* DashboardsLoader migration
* Fix test
… tests (grafana#65625)

* add selector to close dashboard settings and use it in e2e tests
* check visibility of add panel button
* check visibility of add new panel button before clicking
update Alerts & incidents to Alerts & IRM
…) for request timeouts (grafana#65434)

* Logs Volume: identify timeouts and provide remediation action
* Supplementary result error: refactor updated component API
* Create helper to identify timeout errors
* Update timeout identifying function
* Add unit test
* Update panel unit test
* Update public/app/features/explore/utils/logsVolumeResponse.ts

  Co-authored-by: Giordano Ricci <[email protected]>

* Use some instead of reduce
* Change alert type to info
* Add comment
* Remove unnecessary optional chaining
* Remove unnecessary condition
* Remove unnecessary wrapping arrow function

---------

Co-authored-by: Giordano Ricci <[email protected]>
* TraceQL - configurable static fields for new UI
* TraceQL - filter out static fields from Tags section. Added tooltip to static fields
* Add more units to duration validation. Improve duration field tooltip with accepted units
* Better control of delete button on SearchField
* Move new config behind feature toggle
* Special title for intrinsic "name"
* Fix tests
* Move static fields not in the datasource to the Tags section
* Start using the useAsync hook in the Tempo TraceQL configuration page to retrieve the tags and datasource
* Fix tests
* Fix test. Use useAsync to retrieve options in SearchField
* Remove ability to set a default value in filter configuration. Removed type from filter, dynamic filters are now any filters not present in the datasource config
* Updated the static filters tooltip
* Replace useState + useEffect with useMemo for scopedTag
grafana#65354)

* Elasticsearch: Add benchmarks for processing of requests and responses
* Update, use package variable to eliminate possible optimization
* Update, require no error
* BrowseDashboards: Fix move to General folder not working
* simplify isOpen
update codeowners per new team name
* wip for move folders
* hello
* Polish up move dashboard results messages
* tests
* fix other test
* tweak messages when things can't be moved
* tweak messages when things can't be moved
* fix tests
* remove comment
* restore failOnConsole
* .
* Fix move modal not opening due to dodgy rebase
add delete confirmation to delete modal
* Table: Fix column filter popup shadow
* fix border
* baldm0mma/update_stat_docs/ update stat docs
* baldm0mma/update_stat_docs/ update content
* baldm0mma/update_stat_docs/ update show text
* baldm0mma/update_stat_docs/ update color mode text
* Update docs/sources/panels-visualizations/visualizations/stat/index.md

  Update to fully plural

  Co-authored-by: Isabel <[email protected]>

* Update docs/sources/panels-visualizations/visualizations/stat/index.md

  Update to singular

  Co-authored-by: Isabel <[email protected]>

---------

Co-authored-by: Isabel <[email protected]>
…a#67653)

Docs: Fix broken link for no data and error handling (grafana#67617)

(cherry picked from commit 1013a3e)

Co-authored-by: Alfredo <[email protected]>
Chore: Upgrade Go to 1.20.4
release v9.5.2
🤖 OpenAI High-level summary

The changes involve updates to documentation, configuration files, Docker blocks, and some code refactoring. The majority of the changes do not affect the code logic or functionality. However, some modifications may impact the behavior of the code and require a thorough review.
"typecheck",
"asciicheck",
"errorlint",
"sqlclosecheck",
"revive",
]
🤖 OpenAI
The removal of the `sqlclosecheck` linter might lead to issues with unclosed SQL connections going unnoticed. Please consider adding it back.
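For context, this is a minimal sketch of the pattern `sqlclosecheck` enforces: deferring `Close` on a result set immediately after the error check. The `rows` type below is a stand-in for `*sql.Rows` so the sketch runs without a database; it is illustrative, not code from this PR.

```go
package main

import "fmt"

// rows stands in for *sql.Rows so the sketch runs without a database.
type rows struct{ closed bool }

func (r *rows) Close() error { r.closed = true; return nil }
func (r *rows) Next() bool   { return false }

// query simulates db.Query returning a result set that must be closed.
func query() (*rows, error) { return &rows{}, nil }

// readAll shows the pattern sqlclosecheck enforces: Close is deferred
// right after the error check, so the result set can never leak.
func readAll() *rows {
	r, err := query()
	if err != nil {
		panic(err)
	}
	defer r.Close() // omitting this deferred Close is what sqlclosecheck flags
	for r.Next() {
		// scan each row here
	}
	return r
}

func main() {
	r := readAll()
	fmt.Println("closed:", r.closed) // prints "closed: true"
}
```

Without the linter, the missing `defer r.Close()` compiles fine and only shows up later as a connection-pool leak under load.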
"console": "integratedTerminal",
"internalConsoleOptions": "neverOpen",
"port": 9229
},
{
"name": "Debug Go test",
"type": "go",
"request": "launch",
"mode": "test",
"program": "${workspaceFolder}/${relativeFileDirname}",
"showLog": true
}
🤖 OpenAI
The new "Debug Go test" configuration has been added, but it's missing the `env` and `cwd` properties that were present in the previous configurations. This might cause issues when running tests that depend on environment variables or the current working directory.
Suggested change:

"name": "Debug Go test",
"type": "go",
"request": "launch",
"mode": "test",
"program": "${workspaceFolder}/${relativeFileDirname}",
"env": {},
"cwd": "${workspaceFolder}",
"showLog": true
main:
if: contains(github.event.issue.labels.*.name, 'type/epic')
runs-on: ubuntu-latest
steps:
- name: Check if issue is in child or parent projects
run: |
gh api graphql -f query='
query($org: String!, $repo: String!) {
repository(name: $repo, owner: $org) {
issue (number: ${{ github.event.issue.number }}) {
projectItems(first:20) {
nodes {
id,
project {
number,
title
},
fieldValueByName(name:"Status") {
... on ProjectV2ItemFieldSingleSelectValue {
optionId
name
}
}
}
}
}
}
}' -f org=$ORGANIZATION -f repo=$REPO > projects_data.json

echo 'IN_PARENT_PROJ='$(jq '.data.repository.issue.projectItems.nodes[] | select(.project.number==${{ env.PARENT_PROJECT }}) | .project != null' projects_data.json) >> $GITHUB_ENV
echo 'PARENT_PROJ_STATUS_ID='$(jq '.data.repository.issue.projectItems.nodes[] | select(.project.number==${{ env.PARENT_PROJECT }}) | select(.fieldValueByName != null) | .fieldValueByName.optionId' projects_data.json) >> $GITHUB_ENV
echo 'ITEM_ID='$(jq '.data.repository.issue.projectItems.nodes[] | select(.project.number==${{ env.PARENT_PROJECT }}) | .id' projects_data.json) >> $GITHUB_ENV
echo 'IN_CHILD_PROJ='$(jq 'first(.data.repository.issue.projectItems.nodes[] | select(.project.number==${{ env.CHILD_PROJECT_1 }} or .project.number==${{ env.CHILD_PROJECT_2 }} or .project.number==${{ env.CHILD_PROJECT_3 }}) | .project != null)' projects_data.json) >> $GITHUB_ENV
echo 'CHILD_PROJ_STATUS='$(jq -r '.data.repository.issue.projectItems.nodes[] | select(.project.number==${{ env.CHILD_PROJECT_1 }} or .project.number==${{ env.CHILD_PROJECT_2 }} or .project.number==${{ env.CHILD_PROJECT_3 }}) | select(.fieldValueByName != null) | .fieldValueByName.name' projects_data.json) >> $GITHUB_ENV
- name: Get parent project project data
if: env.IN_CHILD_PROJ
run: |
🤖 OpenAI
There is a potential issue with the environment variable names. The variables `ORGANIZATION` and `REPO` are used in the GraphQL query invocation, but they are not defined in the environment. Instead, you have defined `ORGANIZATION` as `${{ github.repository_owner }}` and `REPO` as `${{ github.event.repository.name }}`. To fix this, you should update the environment variable references to match the ones used in the GraphQL query.
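A hedged sketch of one possible fix: declare the values in the workflow's `env` block so the shell expansions resolve inside the `run` script. The exact placement (job level vs. step level) in this workflow is an assumption.

```yaml
env:
  ORGANIZATION: ${{ github.repository_owner }}
  REPO: ${{ github.event.repository.name }}
```

With the variables declared this way, `$ORGANIZATION` and `$REPO` in the `gh api graphql` call expand as intended; alternatively the call site can interpolate `${{ env.ORGANIZATION }}` and `${{ env.REPO }}` directly.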
Suggested change:

}' -f org=${{ env.ORGANIZATION }} -f repo=${{ env.REPO }} > projects_data.json
### Enhancement/feature?

1. Label the issue `type/feature-request` and at least one `area/*` or `datasource/*` label.
2. Move on to [prioritizing the issue](#4-prioritization-of-issues).
1. Remind the contributor that enhancement/feature request should be created as a GitHub discussion. See [Enhancements requests.](https://github.com/grafana/grafana/blob/main/ISSUE_TRIAGE.md#enhancement-requests)
2. Label the issue `type/feature-request` and at least one `area/*` or `datasource/*` label.

### Documentation issue?
🤖 OpenAI
In the new hunk, step 1 of "Enhancement/feature?" section refers to a non-existent link. Please update the link to point to the correct section.
Suggested change:

1. Remind the contributor that enhancement/feature request should be created as a GitHub discussion. See [Enhancements requests](#enhancement-requests).
@echo "generate code from .cue files"
go generate ./pkg/plugins/plugindef
go generate ./kinds/gen.go
go generate ./pkg/framework/coremodel
go generate ./public/app/plugins
go generate ./public/app/plugins/gen.go
go generate ./pkg/kindsys/report.go

gen-go: $(WIRE) gen-cue
@echo "generate go files"
$(WIRE) gen -tags $(WIRE_TAGS) ./pkg/server ./pkg/cmd/grafana-cli/runner
$(WIRE) gen -tags $(WIRE_TAGS) ./pkg/server

fix-cue: $(CUE)
@echo "formatting cue files"
$(CUE) fix kinds/**/*.cue
$(CUE) fix public/app/plugins/**/**/*.cue

gen-jsonnet:
go generate ./devenv/jsonnet

build-go: $(MERGED_SPEC_TARGET) gen-go ## Build all Go binaries.
build-go: gen-go ## Build all Go binaries.
@echo "build go files"
$(GO) run build.go $(GO_BUILD_FLAGS) build

build-backend: ## Build Grafana backend.
@echo "build backend"
$(GO) run build.go $(GO_BUILD_FLAGS) build-backend

build-server: ## Build Grafana server.
@echo "build server"
$(GO) run build.go $(GO_BUILD_FLAGS) build-server
🤖 OpenAI
There is a missing `go generate` command for the `./pkg/framework/coremodel` package in the new hunk. To fix this, add the following line after line 74:
Suggested change:

go generate ./pkg/framework/coremodel
# Communication

Grafana uses a _bus_ to pass messages between different parts of the application. All communication over the bus happens synchronously.
Grafana use dependency injection and method calls on Go interfaces to
communicate between different parts of the backend.
🤖 OpenAI
The new hunk has a grammatical error in the description. It should be "Grafana uses" instead of "Grafana use".
Suggested change:

# Communication

Grafana uses dependency injection and method calls on Go interfaces to
communicate between different parts of the backend.
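The bus-to-DI migration these docs describe can be sketched as follows: a hypothetical service receives a `Store` dependency through its constructor, so cross-component communication becomes a plain synchronous method call on a Go interface. All names here are illustrative, not Grafana's actual types or wiring.

```go
package main

import "fmt"

// Store is a hypothetical interface a service depends on; in the DI style,
// the concrete implementation is supplied by the wiring layer rather than
// resolved through a message bus at call time.
type Store interface {
	Get(id int) (string, error)
}

type memStore struct{ data map[int]string }

func (s memStore) Get(id int) (string, error) {
	v, ok := s.data[id]
	if !ok {
		return "", fmt.Errorf("not found: %d", id)
	}
	return v, nil
}

// Service receives its dependency via the constructor: a direct,
// synchronous interface call replaces a bus dispatch.
type Service struct{ store Store }

func NewService(store Store) *Service { return &Service{store: store} }

func (s *Service) Title(id int) (string, error) { return s.store.Get(id) }

func main() {
	svc := NewService(memStore{data: map[int]string{1: "Overview"}})
	title, _ := svc.Title(1)
	fmt.Println(title) // prints "Overview"
}
```

Swapping `memStore` for a real implementation (or a test double) requires no change to `Service`, which is the main payoff of constructor injection over a shared bus.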
# 2. Login to drone and export the env variables (token and server) shown here: https://drone.grafana.net/account
# 3. Run `make drone`
# More information about this process here: https://github.com/grafana/deployment_tools/blob/master/docs/infrastructure/drone/signing.md
"""
This module returns a Drone configuration including pipelines and secrets.
"""

load('scripts/drone/events/pr.star', 'pr_pipelines')
load('scripts/drone/events/main.star', 'main_pipelines')
load('scripts/drone/pipelines/docs.star', 'docs_pipelines')
load('scripts/drone/events/release.star', 'release_pipelines', 'publish_artifacts_pipelines', 'publish_npm_pipelines', 'publish_packages_pipeline', 'artifacts_page_pipeline')
load('scripts/drone/pipelines/publish_images.star', 'publish_image_pipelines_public', 'publish_image_pipelines_security')
load('scripts/drone/version.star', 'version_branch_pipelines')
load('scripts/drone/events/cron.star', 'cronjobs')
load('scripts/drone/vault.star', 'secrets')
load("scripts/drone/events/pr.star", "pr_pipelines")
load("scripts/drone/events/main.star", "main_pipelines")
load(
"scripts/drone/events/release.star",
"artifacts_page_pipeline",
"enterprise2_pipelines",
"enterprise_pipelines",
"integration_test_pipelines",
"oss_pipelines",
"publish_artifacts_pipelines",
"publish_npm_pipelines",
"publish_packages_pipeline",
)
load(
"scripts/drone/pipelines/publish_images.star",
"publish_image_pipelines_public",
"publish_image_pipelines_security",
)
load("scripts/drone/pipelines/github.star", "publish_github_pipeline")
load("scripts/drone/pipelines/aws_marketplace.star", "publish_aws_marketplace_pipeline")
load("scripts/drone/version.star", "version_branch_pipelines")
load("scripts/drone/events/cron.star", "cronjobs")
load("scripts/drone/vault.star", "secrets")

def main(ctx):
edition = 'oss'
return pr_pipelines(edition=edition) + main_pipelines(edition=edition) + release_pipelines() + \
publish_image_pipelines_public() + publish_image_pipelines_security() + \
publish_artifacts_pipelines('security') + publish_artifacts_pipelines('public') + \
publish_npm_pipelines('public') + publish_packages_pipeline() + artifacts_page_pipeline() + \
version_branch_pipelines() + cronjobs(edition=edition) + secrets()
def main(_ctx):
return (
pr_pipelines() +
main_pipelines() +
oss_pipelines() +
enterprise_pipelines() +
enterprise2_pipelines() +
enterprise2_pipelines(
prefix = "custom-",
trigger = {"event": ["custom"]},
) +
publish_image_pipelines_public() +
publish_image_pipelines_security() +
publish_github_pipeline("public") +
publish_github_pipeline("security") +
publish_aws_marketplace_pipeline("public") +
publish_artifacts_pipelines("security") +
publish_artifacts_pipelines("public") +
publish_npm_pipelines() +
publish_packages_pipeline() +
artifacts_page_pipeline() +
version_branch_pipelines() +
integration_test_pipelines() +
cronjobs() +
secrets()
)
🤖 OpenAI
The new hunk has removed the `docs_pipelines` import and usage from the `.drone.star` file. If this is intentional and the documentation pipeline is no longer needed, then the change is fine. However, if the documentation pipeline is still required, you should add it back.
Suggested change:

# 2. Login to drone and export the env variables (token and server) shown here: https://drone.grafana.net/account
# 3. Run `make drone`
# More information about this process here: https://github.com/grafana/deployment_tools/blob/master/docs/infrastructure/drone/signing.md
"""
This module returns a Drone configuration including pipelines and secrets.
"""

load("scripts/drone/events/pr.star", "pr_pipelines")
load("scripts/drone/events/main.star", "main_pipelines")
load("scripts/drone/pipelines/docs.star", "docs_pipelines")
load(
"scripts/drone/events/release.star",
"artifacts_page_pipeline",
"enterprise2_pipelines",
"enterprise_pipelines",
"integration_test_pipelines",
"oss_pipelines",
"publish_artifacts_pipelines",
"publish_npm_pipelines",
"publish_packages_pipeline",
)
load(
"scripts/drone/pipelines/publish_images.star",
"publish_image_pipelines_public",
"publish_image_pipelines_security",
)
load("scripts/drone/pipelines/github.star", "publish_github_pipeline")
load("scripts/drone/pipelines/aws_marketplace.star", "publish_aws_marketplace_pipeline")
load("scripts/drone/version.star", "version_branch_pipelines")
load("scripts/drone/events/cron.star", "cronjobs")
load("scripts/drone/vault.star", "secrets")

def main(_ctx):
return (
pr_pipelines() +
main_pipelines() +
docs_pipelines() +
oss_pipelines() +
enterprise_pipelines() +
enterprise2_pipelines() +
enterprise2_pipelines(
prefix = "custom-",
trigger = {"event": ["custom"]},
) +
publish_image_pipelines_public() +
publish_image_pipelines_security() +
publish_github_pipeline("public") +
publish_github_pipeline("security") +
publish_aws_marketplace_pipeline("public") +
publish_artifacts_pipelines("security") +
publish_artifacts_pipelines("public") +
publish_npm_pipelines() +
publish_packages_pipeline() +
artifacts_page_pipeline() +
version_branch_pipelines() +
integration_test_pipelines() +
cronjobs() +
secrets()
)
const { baseDirectory } = resolver;
const cli = new ESLint({ cwd: baseDirectory });

await Promise.all(
filePaths.map(async (filePath) => {
const linterOptions = (await cli.calculateConfigForFile(filePath)) as Linter.Config;
const eslintConfigFiles = await glob('**/.eslintrc');
const eslintConfigMainPaths = eslintConfigFiles.map((file) => path.resolve(path.dirname(file)));

const rules: Partial<Linter.RulesRecord> = {
'@typescript-eslint/no-explicit-any': 'error',
};
const baseRules: Partial<Linter.RulesRecord> = {
'@typescript-eslint/no-explicit-any': 'error',
'@grafana/no-aria-label-selectors': 'error',
};

const isTestFile =
filePath.endsWith('.test.tsx') || filePath.endsWith('.test.ts') || filePath.includes('__mocks__');
const nonTestFilesRules: Partial<Linter.RulesRecord> = {
...baseRules,
'@typescript-eslint/consistent-type-assertions': ['error', { assertionStyle: 'never' }],
};

if (!isTestFile) {
rules['@typescript-eslint/consistent-type-assertions'] = [
'error',
{
assertionStyle: 'never',
},
];
}
// group files by eslint config file
// this will create two file groups for each eslint config file
// one for test files and one for non-test files
const fileGroups: Record<string, string[]> = {};

const runner = new ESLint({
baseConfig: {
...linterOptions,
rules,
},
useEslintrc: false,
cwd: baseDirectory,
});
for (const filePath of filePaths) {
let configPath = eslintConfigMainPaths.find((configPath) => filePath.startsWith(configPath)) ?? '';
const isTestFile =
filePath.endsWith('.test.tsx') ||
filePath.endsWith('.test.ts') ||
filePath.includes('__mocks__') ||
filePath.includes('public/test/');

if (isTestFile) {
configPath += '-test';
}
if (!fileGroups[configPath]) {
fileGroups[configPath] = [];
}
fileGroups[configPath].push(filePath);
}

const lintResults = await runner.lintFiles([filePath]);
lintResults
.filter((lintResult) => lintResult.source)
.forEach((lintResult) => {
const { messages } = lintResult;
const file = fileTestResult.addFile(filePath, '');
messages.forEach((message, index) => {
file.addIssue(0, 0, message.message, `${index}`);
});
for (const configPath of Object.keys(fileGroups)) {
const rules = configPath.endsWith('-test') ? baseRules : nonTestFilesRules;
// this is by far the slowest part of this code. It takes eslint about 2 seconds just to find the config
const linterOptions = (await cli.calculateConfigForFile(fileGroups[configPath][0])) as Linter.Config;
const runner = new ESLint({
baseConfig: {
...linterOptions,
rules: rules,
},
useEslintrc: false,
cwd: baseDirectory,
});
const lintResults = await runner.lintFiles(fileGroups[configPath]);
lintResults
.filter((lintResult) => lintResult.source)
.forEach((lintResult) => {
const { messages } = lintResult;
const filePath = lintResult.filePath;
const file = fileTestResult.addFile(filePath, '');
messages.forEach((message, index) => {
file.addIssue(0, 0, message.message, `${index}`);
});
})
);
});
}
});
}
🤖 OpenAI
The new implementation of the `countEslintErrors` function is more efficient: it groups files by their ESLint configuration and processes them together. However, there's a potential issue with the way you're constructing the `configPath` variable for test files (line 65): appending `-test` to the config path may cause issues when trying to access the file groups later. Instead, consider using a separate object to store the information about whether a file group is for test files or not.
const { baseDirectory } = resolver;
const cli = new ESLint({ cwd: baseDirectory });
await Promise.all(
filePaths.map(async (filePath) => {
const linterOptions = (await cli.calculateConfigForFile(filePath)) as Linter.Config;
const eslintConfigFiles = await glob('**/.eslintrc');
const eslintConfigMainPaths = eslintConfigFiles.map((file) => path.resolve(path.dirname(file)));
const rules: Partial<Linter.RulesRecord> = {
'@typescript-eslint/no-explicit-any': 'error',
};
const baseRules: Partial<Linter.RulesRecord> = {
'@typescript-eslint/no-explicit-any': 'error',
'@grafana/no-aria-label-selectors': 'error',
};
const isTestFile =
filePath.endsWith('.test.tsx') || filePath.endsWith('.test.ts') || filePath.includes('__mocks__');
const nonTestFilesRules: Partial<Linter.RulesRecord> = {
...baseRules,
'@typescript-eslint/consistent-type-assertions': ['error', { assertionStyle: 'never' }],
};
if (!isTestFile) {
rules['@typescript-eslint/consistent-type-assertions'] = [
'error',
{
assertionStyle: 'never',
},
];
}
// group files by eslint config file
// this will create two file groups for each eslint config file
// one for test files and one for non-test files
const fileGroups: Record<string, string[]> = {};
const runner = new ESLint({
baseConfig: {
...linterOptions,
rules,
},
useEslintrc: false,
cwd: baseDirectory,
});
for (const filePath of filePaths) {
let configPath = eslintConfigMainPaths.find((configPath) => filePath.startsWith(configPath)) ?? '';
const isTestFile =
filePath.endsWith('.test.tsx') ||
filePath.endsWith('.test.ts') ||
filePath.includes('__mocks__') ||
filePath.includes('public/test/');
if (isTestFile) {
configPath += '-test';
}
if (!fileGroups[configPath]) {
fileGroups[configPath] = [];
}
fileGroups[configPath].push(filePath);
}
const lintResults = await runner.lintFiles([filePath]);
lintResults
.filter((lintResult) => lintResult.source)
.forEach((lintResult) => {
const { messages } = lintResult;
const file = fileTestResult.addFile(filePath, '');
messages.forEach((message, index) => {
file.addIssue(0, 0, message.message, `${index}`);
});
for (const configPath of Object.keys(fileGroups)) {
const rules = configPath.endsWith('-test') ? baseRules : nonTestFilesRules;
// this is by far the slowest part of this code. It takes eslint about 2 seconds just to find the config
const linterOptions = (await cli.calculateConfigForFile(fileGroups[configPath][0])) as Linter.Config;
const runner = new ESLint({
baseConfig: {
...linterOptions,
rules: rules,
},
useEslintrc: false,
cwd: baseDirectory,
});
const lintResults = await runner.lintFiles(fileGroups[configPath]);
lintResults
.filter((lintResult) => lintResult.source)
.forEach((lintResult) => {
const { messages } = lintResult;
const filePath = lintResult.filePath;
const file = fileTestResult.addFile(filePath, '');
messages.forEach((message, index) => {
file.addIssue(0, 0, message.message, `${index}`);
});
})
);
});
}
});
}
const { baseDirectory } = resolver;
const cli = new ESLint({ cwd: baseDirectory });
const eslintConfigFiles = await glob('**/.eslintrc');
const eslintConfigMainPaths = eslintConfigFiles.map((file) => path.resolve(path.dirname(file)));
const baseRules: Partial<Linter.RulesRecord> = {
  '@typescript-eslint/no-explicit-any': 'error',
  '@grafana/no-aria-label-selectors': 'error',
};
const nonTestFilesRules: Partial<Linter.RulesRecord> = {
  ...baseRules,
  '@typescript-eslint/consistent-type-assertions': ['error', { assertionStyle: 'never' }],
};
// group files by eslint config file
// this will create two file groups for each eslint config file
// one for test files and one for non-test files
const fileGroups: Record<string, { filePaths: string[]; isTestGroup: boolean }> = {};
for (const filePath of filePaths) {
  let configPath = eslintConfigMainPaths.find((configPath) => filePath.startsWith(configPath)) ?? '';
  const isTestFile =
    filePath.endsWith('.test.tsx') ||
    filePath.endsWith('.test.ts') ||
    filePath.includes('__mocks__') ||
    filePath.includes('public/test/');
  if (!fileGroups[configPath]) {
    fileGroups[configPath] = { filePaths: [], isTestGroup: isTestFile };
  }
  fileGroups[configPath].filePaths.push(filePath);
}
for (const configPath of Object.keys(fileGroups)) {
  const { filePaths, isTestGroup } = fileGroups[configPath];
  const rules = isTestGroup ? baseRules : nonTestFilesRules;
  // this is by far the slowest part of this code. It takes eslint about 2 seconds just to find the config
  const linterOptions = (await cli.calculateConfigForFile(filePaths[0])) as Linter.Config;
  const runner = new ESLint({
    baseConfig: {
      ...linterOptions,
      rules: rules,
    },
    useEslintrc: false,
    cwd: baseDirectory,
  });
  const lintResults = await runner.lintFiles(filePaths);
  lintResults
    .filter((lintResult) => lintResult.source)
    .forEach((lintResult) => {
      const { messages } = lintResult;
      const filePath = lintResult.filePath;
      const file = fileTestResult.addFile(filePath, '');
      messages.forEach((message, index) => {
        file.addIssue(0, 0, message.message, `${index}`);
      });
    });
}
});
}
FROM ${GO_IMAGE} as go-builder

ARG COMMIT_SHA=""
ARG BUILD_BRANCH=""
ARG GO_BUILD_TAGS="oss"
ARG WIRE_TAGS="oss"
ARG BINGO="true"

# Install build dependencies
RUN if grep -i -q alpine /etc/issue; then \
    apk add --no-cache gcc g++ make; \
    apk add --no-cache gcc g++ make git; \
    fi

WORKDIR /tmp/grafana

COPY go.* ./
COPY .bingo .bingo

RUN go mod download && \
    go install github.com/bwplotka/bingo@latest && \
    bingo get
RUN go mod download
RUN if [[ "$BINGO" = "true" ]]; then \
    go install github.com/bwplotka/bingo@latest && \
    bingo get -v; \
    fi

COPY embed.go Makefile build.go package.json ./
COPY cue.mod cue.mod
COPY kinds kinds
COPY local local
COPY packages/grafana-schema packages/grafana-schema
COPY public/app/plugins public/app/plugins
COPY public/api-spec.json public/api-spec.json
COPY public/api-merged.json public/api-merged.json
COPY pkg pkg
COPY scripts scripts
COPY conf conf
COPY .github .github

ENV COMMIT_SHA=${COMMIT_SHA}
ENV BUILD_BRANCH=${BUILD_BRANCH}

RUN make build-go
RUN make build-go GO_BUILD_TAGS=${GO_BUILD_TAGS} WIRE_TAGS=${WIRE_TAGS}

FROM ${BASE_IMAGE} as tgz-builder
🤖 OpenAI

There is a potential issue with the conditional check for the `BINGO` environment variable in line 51. The current condition uses double square brackets (`[[` and `]]`), which are not POSIX-compliant and may cause issues in some shells like `dash`. It's recommended to use single square brackets (`[` and `]`) for better compatibility.
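A quick demonstration of the portability point (a minimal sketch; `BINGO` is set inline here rather than coming from a Docker build argument):

```shell
# POSIX sh (dash, or busybox ash in Alpine images) does not implement [[ ]],
# but the single-bracket test below is understood by all of them as well as bash.
BINGO="true"
if [ "$BINGO" = "true" ]; then
  echo "bingo enabled"
fi
```

Running this through `dash` succeeds, while the `[[ ]]` form typically fails there with `[[: not found`, since `[[` is a bash/ksh extension rather than a builtin.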
FROM ${GO_IMAGE} as go-builder

ARG COMMIT_SHA=""
ARG BUILD_BRANCH=""
ARG GO_BUILD_TAGS="oss"
ARG WIRE_TAGS="oss"
ARG BINGO="true"

# Install build dependencies
RUN if grep -i -q alpine /etc/issue; then \
    apk add --no-cache gcc g++ make git; \
    fi

WORKDIR /tmp/grafana

COPY go.* ./
COPY .bingo .bingo

RUN go mod download
RUN if [ "$BINGO" = "true" ]; then \
    go install github.com/bwplotka/bingo@latest && \
    bingo get -v; \
    fi

COPY embed.go Makefile build.go package.json ./
COPY cue.mod cue.mod
COPY kinds kinds
COPY local local
COPY packages/grafana-schema packages/grafana-schema
COPY public/app/plugins public/app/plugins
COPY public/api-merged.json public/api-merged.json
COPY pkg pkg
COPY scripts scripts
COPY conf conf
COPY .github .github

ENV COMMIT_SHA=${COMMIT_SHA}
ENV BUILD_BRANCH=${BUILD_BRANCH}

RUN make build-go GO_BUILD_TAGS=${GO_BUILD_TAGS} WIRE_TAGS=${WIRE_TAGS}

FROM ${BASE_IMAGE} as tgz-builder
-----BEGIN PRIVATE KEY-----
MIIJRAIBADANBgkqhkiG9w0BAQEFAASCCS4wggkqAgEAAoICAQDOQQHtZEMgy/WJ
f7gqYMko8UIImFkr5Q1QsIQWalX/34jS8d2Pnl5WPZ+7FqG56ZZ0L4aiBcv7JsA+
Eu8nCeXk6WiMf25yjsi5Aj8PwLXgjAr+q28Gi6EKRvCX/6zcYYOrLt2YWTc706aW
VIAOUikVFKKgRzqX/yyyF96WzJde/7mQq7zsJFgNMt0L8yHFrl0DXOpRme++t965
zjVnwCPeQU1Jd5FswoD56ObSQD4Fz4xdeE/hPiAG6TWBHQGjTLawQNvSaUnUf8Ua
QlnqfXaCPcgFgx+t8nCPTQ9VS9cCr/yu1+J4qBm9ytvBHt8nu6lpZE67GaSeMYk+
HXUfObh3rnczfgGQM1gyv97j0qS26/u8Mbwo4fHt/Xqvdl7wIvU1jLgNeJyimVmj
+l3I9git23Vs8xnh/Cq5+A85FVmbf/PYkOl4djYO2o/XiMUNbl9jN0nhm3xhmC0D
O4FtkHhAwWg4r70aR9kzP/wu0+ugQq12EQOSzKAhs+bJ+7Iju0li0nhjbxhvDacT
xcndJHXPKG+MJOBCjTrfEGA+IANUPVVy+me3efX/lTCC0tInNUu7xrP4ePiKrtGN
5h2bAYUd5xxFm/+7x02aOoyS31REZydymzyzik3dUl6pBfj8bqfbgKgHLc3O7awa
BKcbpQ6RUZz1ahuM9GsrN2mfGb56twIDAQABAoICAQC30wSOua+CiYWbOLPBQWXI
Ec5SrsW683h8k/YFrFxdorLlnx9jIzjdECtOqYsWkXPAtaL+wL/eZdks+d8gIhg9
fzHpFqIvg8at7ayas5zmNqqN8+kOImAzMqq4Le24tUiOx0RExi8OKULt9P30xZFL
8opG43T7hzgkulVufYXj63NN12KX7IR1rwGH5vgQoajQmH1WoSYmCuZ9H6ZFg9/z
9KuXcPOxxWECW2tttEgTEbn+jH5VN/N/dVZ0bo/JJVQlnojM7X4Z4I6hKsXM0mlP
BxVApG8Vr2BWt82UrfVBtew1QL/d3aSjnSti4cruNLl27ymkY+RuAaRaMhLy85TN
q3vWXoVA0vJiG3mk3vr9pzO3tkbSS4i/xcnRkl62YMY59bZ3YZlGcSqOj+ohcZNs
9IMtn3ivqqSYWCysuyOa5vIhoa/ZmFHVavudGlfQ9CRS08G5u2KNkjyUAlFXwEgy
dXg4FYNhqcZ9vDZ5/kwhgpoP1et/SS+BInM/1sXX23wyRtaq6lEkMmZZ0TWvdkkp
16kDWTfbrol8WTKLB5GAs0tV4sTIe56iH2tuNA8qmXer004R3Ye+9qzIu7tsztlJ
XzX9xwqKGouSuXrVkLTgrwNBLqJUsf7YcPoCTsF7GsucOkpPCxmEPoYT4IzY/jRc
om0YU9VBGdjLY6mBZqE3YQKCAQEA94YN6Ti6rohtNHJFmhx+eqyes8FcQ4iIwmBi
lYC4uBPD8gsNfpFzC3AJgIet0Ua20y9qVLflMRzyRvn51c6b+lq+nzQhFcBmTaZv
LAmo7eefaxGtTptn3kh8c/qw4SBcla5UUbU6AQiTl0rbu7HSxdNxVnIWci2L0wRE
4rzn9daONkfR03wAKVL3UGaGpG5AZtCxolVg0UK/xnRlJtONkMtPIOlKqAe0FQVz
x640bz9zhx1+YFHbuVAgcXaE8SFws35IEwOy0wse+5qcMTmF7D7IO5djxwbnQXOn
txmQYjcQPea7atDwqP7uSuxLlm8eS4H35/E0ej5NSdHXCZDhPwKCAQEA1VEoRUpt
ikRtv6+e9pJH18ypRMyX22F7rWHHKrDcIDQiq4e7gheQCR4jEYx4BoDysZA4Kj7k
yX4N+JsjUl67R3H8kkVCKcp5yEJenXQNYWfXm5FjUrBtYpPhtcUfJpvN2KJDC8SK
zEeVhdbyG7/AQdWmufCqlsclYXLTu9nUUoWx9r5EIgpX0qMbIwmFurrrU5K57VZq
Y3mjoQ5xyLOeXbyLw/0k6fPYPcMFcEop3x3RCliK7ZQbXtEeUT9lMKn51BnYIuzL
mUqefyEFijLTbjCxGOsHCrFWwG4ZHvX1hwQUCIyJ4KFyF2+HePdVJBV7xvoCvMqu
qi9eUy66yF4QiQKCAQEAkQ6/wkjbpoMPNKgvQASo3q2hCAkuT+mI9BjWTXU1S7NR
ptxxWRZWTIKpHujkpXuCkzr7XpCywViMZ6oBjxpY8JwTkGtVZFDv41vBVRaIKHwQ
QggG/8U1VuA2ByYIWLbFa3bRtVy2lHp5PDXDlji9I/ApQJb1ro4gnlanJBW+/qRx
jUADqKPH2G/lvfmHuH9gQKDXKSJN7fTnPJ9j1CAwm4V6ZvgxJ030bElYs9HUqsmN
o8ZWFI6EbRKXIMdrW4DUcmx0QqhtOR8QVjUiifqDEJMudzXeZHoYNpRX2ochxV0t
qIk6lnvCPdfAwKnAFnYiU+/j38rMjUNLuhI+qj18swKCAQEAmQQ9ce7gwrUxULZG
vjLSTC9oIIzuRZUY5ZdlCe+SB5u+XvLfg4MpdBDlMPplQq0O6mY8k7lH3m2I0hNe
9AA2MRrIBZiezGGMmCVbv3B3BK87j0FsUO1zmang18XOAaDzuPk0ROW4dq7GyBoW
0+bpn5AFqA3GIKA75To8Pd0UnBmBPjDfqNdohhUwKyLyha79jZaLvn8lk7O2t5on
7sNBz5WnYHBfIV8kd7FEXncksAKwZ8554dBDDfHN4JQsHwtexVHAo9Tee1h2d1UV
pzusRLey4d/9ziLFywVvCbF/bVl0kWlWogbNU5hSQmiQ73BluKawfgKZ5kRSUjMM
Exy+mQKCAQBIrmOVr5WJDtG5g78w0ALC4b9o+vVb8bW+jFjVULt4uyfMAq4HEdTw
69Sp6uNpZ+QFNUyUEMxdhEtfXliQipOLFtPLzF+U/x2izHyf4mp4I09b8kBUbt2H
t/ajOe3a66X4jk/PQGr6MkuZdPvz61KwxXMufgYTtHpMcdZXs3/baMaDly5F9CbN
YHseNkWS+pjRLEJ+SoiiBylmBCwm8FMkVKiTOu1/kHYg+IaDVOGK4SfNKAKA/zP6
DpkOGekQdLcv0DqDGzNQrGWpMCJ2CF3kq8USX9F3aHlQLvTrjWQDTTE5lNCaVo3N
4qTzfwHOQoia5dVH9t46UancuuVvC0vk
-----END PRIVATE KEY-----
🤖 OpenAI
This private key should not be stored in the repository. It is a security risk to expose sensitive information like this. Remove the private key from the repository and consider using environment variables or a secrets management system to securely store and access sensitive data.
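A minimal sketch of that approach for a key like the one above (the variable name and fallback path are illustrative, not something this repo defines):

```shell
# Take the private key location from the environment instead of committing a
# PEM file; fall back to a conventional Docker secrets path if it is unset.
: "${GRAFANA_SAML_KEY_PATH:=/run/secrets/grafana_saml_key.pem}"
echo "reading private key from $GRAFANA_SAML_KEY_PATH"
```

The key material itself then lives in a secrets store or a mounted secret file, never in version control.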
active_sync_enabled = true
```

## SAML Setup

**Warning:** SLO

Grafana expects SLO support to be communicated in the metadata.

Single Logout is not supported by authentik.
https://github.com/goauthentik/authentik/issues/3321

**Warning:** Client signature validation

Grafana expects the IdP to retrieve the client's public key from the metadata.
Authentik does not seem to support this, so client signature verification is set
as optional.

```ini
[auth.saml]
enabled = true
certificate_path = devenv/docker/blocks/auth/authentik/cert.crt
private_key_path = devenv/docker/blocks/auth/authentik/key.pem
idp_metadata_url = http://localhost:9000/api/v3/providers/saml/2/metadata/?download
assertion_attribute_name = http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name
assertion_attribute_login = http://schemas.goauthentik.io/2021/02/saml/username
assertion_attribute_email = http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress
assertion_attribute_groups = http://schemas.xmlsoap.org/claims/Group
assertion_attribute_org = http://schemas.xmlsoap.org/claims/Group
allow_sign_up = true
single_logout = false # not supported by authentik
signature_algorithm = rsa-sha256
allow_idp_initiated = true
org_mapping = admin:1:Admin, editor:1:Editor, viewer:1:Viewer
```

# OAuth Setup

```ini
[auth.generic_oauth]
name = authentik
enabled = true
client_id = 43e8d2746fe2e508325a23cdf816d6ddd12e94f1
client_secret = e50440f14a010fd69dfed85bc6c071653f22c73e2c6c8d7ba96a936937d92040936b7e5a4bcc1bf40d5cf1dc019b1db327a1a00e2183c53471fb7530d4a09d7e
scopes = openid email profile
auth_url = http://localhost:9000/application/o/authorize/
token_url = http://localhost:9000/application/o/token/
api_url = http://localhost:9000/application/o/userinfo/
role_attribute_path = contains(groups[*], 'admin') && 'Admin' || contains(groups[*], 'editor') && 'Editor' || 'Viewer'

[auth]
signout_redirect_url = http://localhost:9000/application/o/grafana-oidc/end-session/
```

## Available users and groups

*authentik admin*:
🤖 OpenAI
There is a potential security issue with the client secret being hardcoded in the configuration file (line 80). It's recommended to use environment variables or a secrets management system to store sensitive information like this.
[auth.generic_oauth]
name = authentik
enabled = true
client_id = 43e8d2746fe2e508325a23cdf816d6ddd12e94f1
client_secret = ${OAUTH_CLIENT_SECRET}
scopes = openid email profile
auth_url = http://localhost:9000/application/o/authorize/
token_url = http://localhost:9000/application/o/token/
api_url = http://localhost:9000/application/o/userinfo/
role_attribute_path = contains(groups[*], 'admin') && 'Admin' || contains(groups[*], 'editor') && 'Editor' || 'Viewer'

[auth]
signout_redirect_url = http://localhost:9000/application/o/grafana-oidc/end-session/
Make sure to set the `OAUTH_CLIENT_SECRET` environment variable to the appropriate value when running the application.
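Alternatively, Grafana's standard `GF_<SECTION>_<KEY>` environment overrides can supply the whole value, so no placeholder is needed in the ini file at all (sketch below; the secret value is a dummy):

```shell
# GF_AUTH_GENERIC_OAUTH_CLIENT_SECRET overrides client_secret in the
# [auth.generic_oauth] section, taking precedence over the ini file.
export GF_AUTH_GENERIC_OAUTH_CLIENT_SECRET="dummy-devenv-secret"
echo "client_secret override ${GF_AUTH_GENERIC_OAUTH_CLIENT_SECRET:+is set}"
```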
--

CREATE TABLE public.client_scope_client (
    client_id character varying(36) NOT NULL,
    scope_id character varying(36) NOT NULL,
    client_id character varying(255) NOT NULL,
    scope_id character varying(255) NOT NULL,
    default_scope boolean DEFAULT false NOT NULL
);
🤖 OpenAI

The `client_id` and `scope_id` columns have been changed from `character varying(36)` to `character varying(255)`. If this change is intentional and compatible with the rest of the system, then it's fine. Otherwise, consider reverting the column types back to `character varying(36)`.
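For context on the original width: Keycloak's generated IDs are canonical UUID strings, which are always exactly 36 characters, so `varying(36)` was a tight fit and `varying(255)` still accommodates them (quick check using an ID from this dump):

```shell
# A canonical UUID is 32 hex digits plus 4 hyphens = 36 characters.
uuid="11c67f5b-dde7-4680-b05b-c9c59d78bda4"
echo "${#uuid}"   # → 36
```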

CREATE TABLE public.realm_attribute (
    name character varying(255) NOT NULL,
    value character varying(255),
    realm_id character varying(36) NOT NULL
    realm_id character varying(36) NOT NULL,
    value text
);
🤖 OpenAI

The `value` column has been changed from `character varying(255)` to `text`. If this change is intentional and compatible with the rest of the system, then it's fine. Otherwise, consider reverting the column type back to `character varying(255)`.
-- Data for Name: client_attributes; Type: TABLE DATA; Schema: public; Owner: keycloak
--

COPY public.client_attributes (client_id, value, name) FROM stdin;
11c67f5b-dde7-4680-b05b-c9c59d78bda4 S256 pkce.code.challenge.method
2f521d09-7304-4b5e-a94b-7cc7300b8b50 S256 pkce.code.challenge.method
230081b5-9161-45c3-9e08-9eda5412f7f7 S256 pkce.code.challenge.method
805aebc8-9d01-42b6-bcce-6ce48ca63ef0 S256 pkce.code.challenge.method
09b79548-8426-4c0e-8e0b-7488467532c7 true backchannel.logout.session.required
09b79548-8426-4c0e-8e0b-7488467532c7 false backchannel.logout.revoke.offline.tokens
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.server.signature
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.server.signature.keyinfo.ext
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.assertion.signature
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.client.signature
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.encrypt
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.authnstatement
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.onetimeuse.condition
09b79548-8426-4c0e-8e0b-7488467532c7 false saml_force_name_id_format
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.multivalued.roles
09b79548-8426-4c0e-8e0b-7488467532c7 false saml.force.post.binding
09b79548-8426-4c0e-8e0b-7488467532c7 false exclude.session.state.from.auth.response
09b79548-8426-4c0e-8e0b-7488467532c7 false tls.client.certificate.bound.access.tokens
09b79548-8426-4c0e-8e0b-7488467532c7 false client_credentials.use_refresh_token
09b79548-8426-4c0e-8e0b-7488467532c7 false display.on.consent.screen
09b79548-8426-4c0e-8e0b-7488467532c7 backchannel.logout.url
169f1dea-80f0-4a99-8509-9abb70ab0a5c true backchannel.logout.session.required
169f1dea-80f0-4a99-8509-9abb70ab0a5c false backchannel.logout.revoke.offline.tokens
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.server.signature
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.server.signature.keyinfo.ext
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.assertion.signature
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.client.signature
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.encrypt
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.authnstatement
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.onetimeuse.condition
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml_force_name_id_format
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.multivalued.roles
169f1dea-80f0-4a99-8509-9abb70ab0a5c false saml.force.post.binding
169f1dea-80f0-4a99-8509-9abb70ab0a5c false exclude.session.state.from.auth.response
169f1dea-80f0-4a99-8509-9abb70ab0a5c false tls.client.certificate.bound.access.tokens
169f1dea-80f0-4a99-8509-9abb70ab0a5c false client_credentials.use_refresh_token
169f1dea-80f0-4a99-8509-9abb70ab0a5c false display.on.consent.screen
COPY public.client_attributes (client_id, name, value) FROM stdin;
11c67f5b-dde7-4680-b05b-c9c59d78bda4 pkce.code.challenge.method S256
2f521d09-7304-4b5e-a94b-7cc7300b8b50 pkce.code.challenge.method S256
230081b5-9161-45c3-9e08-9eda5412f7f7 pkce.code.challenge.method S256
805aebc8-9d01-42b6-bcce-6ce48ca63ef0 pkce.code.challenge.method S256
09b79548-8426-4c0e-8e0b-7488467532c7 backchannel.logout.session.required true
09b79548-8426-4c0e-8e0b-7488467532c7 backchannel.logout.revoke.offline.tokens false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.server.signature false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.server.signature.keyinfo.ext false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.assertion.signature false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.client.signature false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.encrypt false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.authnstatement false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.onetimeuse.condition false
09b79548-8426-4c0e-8e0b-7488467532c7 saml_force_name_id_format false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.multivalued.roles false
09b79548-8426-4c0e-8e0b-7488467532c7 saml.force.post.binding false
09b79548-8426-4c0e-8e0b-7488467532c7 exclude.session.state.from.auth.response false
09b79548-8426-4c0e-8e0b-7488467532c7 tls.client.certificate.bound.access.tokens false
09b79548-8426-4c0e-8e0b-7488467532c7 client_credentials.use_refresh_token false
09b79548-8426-4c0e-8e0b-7488467532c7 display.on.consent.screen false
09b79548-8426-4c0e-8e0b-7488467532c7 backchannel.logout.url
169f1dea-80f0-4a99-8509-9abb70ab0a5c backchannel.logout.session.required true
169f1dea-80f0-4a99-8509-9abb70ab0a5c backchannel.logout.revoke.offline.tokens false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.server.signature false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.server.signature.keyinfo.ext false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.assertion.signature false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.client.signature false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.encrypt false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.authnstatement false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.onetimeuse.condition false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml_force_name_id_format false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.multivalued.roles false
169f1dea-80f0-4a99-8509-9abb70ab0a5c saml.force.post.binding false
169f1dea-80f0-4a99-8509-9abb70ab0a5c exclude.session.state.from.auth.response false
169f1dea-80f0-4a99-8509-9abb70ab0a5c tls.client.certificate.bound.access.tokens false
169f1dea-80f0-4a99-8509-9abb70ab0a5c client_credentials.use_refresh_token false
169f1dea-80f0-4a99-8509-9abb70ab0a5c display.on.consent.screen false
2f521d09-7304-4b5e-a94b-7cc7300b8b50 post.logout.redirect.uris +
a5a8fed6-0bca-4646-9946-2fe84175353b post.logout.redirect.uris +
09b79548-8426-4c0e-8e0b-7488467532c7 post.logout.redirect.uris +
11c67f5b-dde7-4680-b05b-c9c59d78bda4 post.logout.redirect.uris +
230081b5-9161-45c3-9e08-9eda5412f7f7 post.logout.redirect.uris +
eed689c6-49da-4d91-98eb-cd495bcc07a3 post.logout.redirect.uris +
169f1dea-80f0-4a99-8509-9abb70ab0a5c post.logout.redirect.uris +
805aebc8-9d01-42b6-bcce-6ce48ca63ef0 post.logout.redirect.uris +
\.
🤖 OpenAI

The order of the columns in the `COPY public.client_attributes` statement has been changed. If this change is intentional and compatible with the rest of the system, then it's fine. Otherwise, consider reverting the column order back to the original.
sn: ldap-viewer
cn: ldap-viewer

dn: cn=ldap-carl,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-carl
cn: ldap-carl

dn: cn=ldap-daniel,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-daniel
cn: ldap-daniel

dn: cn=ldap-leo,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-leo
cn: ldap-leo

dn: cn=ldap-tobias,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-tobias
cn: ldap-tobias

dn: cn=ldap-torkel,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-torkel
cn: ldap-torkel

# admin for posix group (without support for memberOf attribute)
dn: uid=ldap-posix-admin,ou=users,dc=grafana,dc=org
mail: [email protected]
🤖 OpenAI
It seems that the new user entries are missing some attributes like `userPassword`, `objectClass`, and others. To maintain consistency with the old entries, you should add these attributes to the new entries as well.
sn: ldap-viewer
cn: ldap-viewer
dn: cn=ldap-carl,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-carl
cn: ldap-carl
dn: cn=ldap-daniel,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-daniel
cn: ldap-daniel
dn: cn=ldap-leo,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-leo
cn: ldap-leo
dn: cn=ldap-tobias,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-tobias
cn: ldap-tobias
dn: cn=ldap-torkel,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-torkel
cn: ldap-torkel
# admin for posix group (without support for memberOf attribute)
dn: uid=ldap-posix-admin,ou=users,dc=grafana,dc=org
mail: [email protected]
dn: uid=ldap-viewer,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-viewer
cn: ldap-viewer
dn: uid=ldap-posix-admin,ou=users,dc=grafana,dc=org
mail: [email protected]
userPassword: grafana
objectClass: person
objectClass: top
objectClass: inetOrgPerson
objectClass: organizationalPerson
sn: ldap-posix-admin
cn: ldap-posix-admin
[client]
host=mysql
port=3306
user=grafana
password=password
🤖 OpenAI
The MySQL credentials are hardcoded in the configuration file. This can lead to security vulnerabilities. Use environment variables to store sensitive information like usernames and passwords.
[client]
host=mysql
port=3306
user=grafana
password=password
[client]
host=mysql
port=3306
user=${MYSQL_USER}
password=${MYSQL_PASSWORD}
async function lokiSendLogLine(timestampNs, line, tags) {
  const data = {
    streams: [
      {
🤖 OpenAI
The function `lokiSendLogLine` now takes a `timestampNs` parameter instead of `timestampMs`. However, the old hunk shows that the conversion from milliseconds to nanoseconds was done inside the function. To maintain consistency and avoid confusion, consider moving the conversion back into the function.
async function lokiSendLogLine(timestampNs, line, tags) {
  const data = {
    streams: [
      {
async function lokiSendLogLine(timestampMs, line, tags) {
  const timestampNs = `${timestampMs}000000`;
  const data = {
    streams: [
      {
function escapeLogFmtValue(value) {
  if (logFmtProblemRe.test(value)) {
    throw new Error(`invalid logfmt-value: ${key}`)
    throw new Error(`invalid logfmt-value: ${value}`)
  }
🤖 OpenAI
There's a typo in the error message. It should be `value` instead of `key`.
function escapeLogFmtValue(value) {
  if (logFmtProblemRe.test(value)) {
    throw new Error(`invalid logfmt-value: ${key}`)
    throw new Error(`invalid logfmt-value: ${value}`)
  }
if (logFmtProblemRe.test(value)) {
  throw new Error(`invalid logfmt-value: ${value}`)
}
FROM golang:1.19

ADD webhook-listener.go /go/src/webhook/webhook-listener.go

WORKDIR /go/src/webhook

RUN mkdir /tmp/logs
RUN go build -o /bin webhook-listener.go

ENV PORT 8080

ENTRYPOINT [ "/bin/webhook-listener" ]
🤖 OpenAI
It seems that you have added a new Dockerfile for the webhook listener. However, there is no proper error handling in case the build process fails. To improve maintainability and provide better error messages, consider adding a `.dockerignore` file to exclude unnecessary files from the build context and use multi-stage builds to separate the build process from the final image.
FROM golang:1.19
ADD webhook-listener.go /go/src/webhook/webhook-listener.go
WORKDIR /go/src/webhook
RUN mkdir /tmp/logs
RUN go build -o /bin webhook-listener.go
ENV PORT 8080
ENTRYPOINT [ "/bin/webhook-listener" ]
# Build stage
FROM golang:1.19 AS builder
WORKDIR /go/src/webhook
ADD . .
RUN go build -o /bin/webhook-listener webhook-listener.go
# Final stage
FROM golang:1.19
COPY --from=builder /bin/webhook-listener /bin/webhook-listener
ENV PORT 8080
RUN mkdir /tmp/logs
ENTRYPOINT [ "/bin/webhook-listener" ]

func updateFingerprints(v Data) {
	mu.Lock()
	defer mu.Unlock()
	for _, alert := range v.Alerts {
		m, ok := fingerprints[alert.Fingerprint]
		if !ok {
			m = make(map[time.Time]tracker)
		}

		t, ok := m[alert.StartsAt]
		if !ok {
			t = tracker{
				Updates:  0,
				Statuses: []string{},
			}
		}

		t.Updates += 1
		t.Statuses = append(t.Statuses, alert.Status)

		m[alert.StartsAt] = t
		fingerprints[alert.Fingerprint] = m
	}
}
🤖 OpenAI
There is a potential data race issue in the `updateFingerprints` function. The function locks and unlocks the mutex, but it would be better to use a `sync.RWMutex` instead of a regular `sync.Mutex`. This will allow multiple readers to access the shared data simultaneously without blocking each other, while still ensuring that only one writer can modify the data at a time.
func updateFingerprints(v Data) {
	mu.Lock()
	defer mu.Unlock()
	for _, alert := range v.Alerts {
		m, ok := fingerprints[alert.Fingerprint]
		if !ok {
			m = make(map[time.Time]tracker)
		}
		t, ok := m[alert.StartsAt]
		if !ok {
			t = tracker{
				Updates:  0,
				Statuses: []string{},
			}
		}
		t.Updates += 1
		t.Statuses = append(t.Statuses, alert.Status)
		m[alert.StartsAt] = t
		fingerprints[alert.Fingerprint] = m
	}
}
var (
	fingerprints = make(Fingerprints)
	mu           sync.RWMutex
	waitSeconds  int
	logFile      bool
	logFileName  = filepath.Join(os.TempDir(), "/logs/webhook-listener.log")
	dumpDir      = filepath.Join(os.TempDir(), "/logs/dumps")
)
...
func updateFingerprints(v Data) {
	mu.Lock()
	defer mu.Unlock()

func saveDump(data []byte) {
	if !logFile {
		return
	}

	if len(data) == 0 {
		fmt.Println("empty dump - not saving")
		return
	}
	ts := time.Now().UnixNano()
	name := path.Join(dumpDir, fmt.Sprintf("%d.json", ts))
	for i := 1; i <= 1000; i++ {
		if _, err := os.Stat(name); os.IsNotExist(err) {
			break
		}
		name = path.Join(dumpDir, fmt.Sprintf("%d_%04d.json", ts, i))
	}
	log.Printf("saving dump to %s", name)
	err := os.WriteFile(name, data, os.ModePerm)
	if err != nil {
		log.Printf("cannot save to file %s: %s\n", name, err)
	}
}
🤖 OpenAI
In the `saveDump` function, you are using `os.ModePerm` as the file permission when calling `os.WriteFile`. This will give read, write, and execute permissions to everyone, which might not be desired. Consider using more restrictive permissions like `0644` (owner read/write, group read, others read).
func saveDump(data []byte) {
	if !logFile {
		return
	}
	if len(data) == 0 {
		fmt.Println("empty dump - not saving")
		return
	}
	ts := time.Now().UnixNano()
	name := path.Join(dumpDir, fmt.Sprintf("%d.json", ts))
	for i := 1; i <= 1000; i++ {
		if _, err := os.Stat(name); os.IsNotExist(err) {
			break
		}
		name = path.Join(dumpDir, fmt.Sprintf("%d_%04d.json", ts, i))
	}
	log.Printf("saving dump to %s", name)
	err := os.WriteFile(name, data, os.ModePerm)
	if err != nil {
		log.Printf("cannot save to file %s: %s\n", name, err)
	}
}
err := os.WriteFile(name, data, 0644)
@openapi: ignore
Summary by OpenAI
New Feature:
Bug fix:
Documentation:
Refactor: removed `FC` type from component declaration