Merge branch 'main' into add-http-code-elb
paulhcsun authored Jul 24, 2024
2 parents 7149cc9 + 64afa72 commit 5c0c9ed
Showing 4 changed files with 50 additions and 23 deletions.
21 changes: 0 additions & 21 deletions .github/workflows/closed-issue-message.yml

This file was deleted.

47 changes: 47 additions & 0 deletions .github/workflows/lock-issue-pr-with-message.yml
@@ -0,0 +1,47 @@
name: Lock Closed Issues and PRs with message

on:
  pull_request_target:
    types: [closed]
  issues:
    types: [closed]

jobs:
  auto_comment:
    permissions:
      pull-requests: write
      issues: write
    runs-on: ubuntu-latest
    steps:
      - uses: aws-actions/closed-issue-message@v1
        with:
          repo-token: "${{ secrets.GITHUB_TOKEN }}"
          message: |
            Comments on closed issues and PRs are hard for our team to see.
            If you need help, please open a new issue that references this one.
            If you wish to keep having a conversation with other community members under this issue, feel free to do so.
  lock:
    permissions:
      pull-requests: write
      issues: write
    runs-on: ubuntu-latest
    needs: auto_comment
    steps:
      - name: Lock closed issue or PR
        run: |
          if [ "${{ github.event_name }}" == "issues" ]; then
            ISSUE_NUMBER=${{ github.event.issue.number }}
            ISSUE_URL=https://api.github.com/repos/${{ github.repository }}/issues/${ISSUE_NUMBER}/lock
          else
            ISSUE_NUMBER=${{ github.event.pull_request.number }}
            ISSUE_URL=https://api.github.com/repos/${{ github.repository }}/issues/${ISSUE_NUMBER}/lock
          fi
          curl -s -X PUT -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
            -H "Accept: application/vnd.github.v3+json" \
            ${ISSUE_URL} \
            -d @- <<EOF
          {
            "lock_reason": "resolved"
          }
          EOF
1 change: 1 addition & 0 deletions packages/aws-cdk-lib/aws-rds/README.md
@@ -938,6 +938,7 @@ Data in S3 buckets can be imported to and exported from certain database engines
functionality, set the `s3ImportBuckets` and `s3ExportBuckets` properties for import and export respectively. When
configured, the CDK automatically creates and configures IAM roles as required.
Additionally, the `s3ImportRole` and `s3ExportRole` properties can be used to set this role directly.
Note: To use `s3ImportRole` and `s3ExportRole` with Aurora PostgreSQL, you must also enable the S3 import and export features when you create the DatabaseClusterEngine.
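For example, here is a minimal sketch of an Aurora PostgreSQL cluster wired up for S3 import and export (assuming aws-cdk-lib v2, an existing `vpc`, and a stack scope `this`; the role and construct IDs are illustrative only):

import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as iam from 'aws-cdk-lib/aws-iam';
import * as rds from 'aws-cdk-lib/aws-rds';

declare const vpc: ec2.Vpc;

// Role that RDS assumes to read from and write to S3 (illustrative).
const s3Role = new iam.Role(this, 'RdsS3Role', {
  assumedBy: new iam.ServicePrincipal('rds.amazonaws.com'),
});

new rds.DatabaseCluster(this, 'Cluster', {
  // For Aurora PostgreSQL, the engine version must declare S3 import/export support;
  // custom versions declare the features explicitly via the third argument.
  engine: rds.DatabaseClusterEngine.auroraPostgres({
    version: rds.AuroraPostgresEngineVersion.of('15.5', '15', {
      s3Import: true,
      s3Export: true,
    }),
  }),
  writer: rds.ClusterInstance.provisioned('Writer'),
  vpc,
  s3ImportRole: s3Role,
  s3ExportRole: s3Role,
});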

You can read more about loading data to (or from) S3 here:

4 changes: 2 additions & 2 deletions packages/aws-cdk-lib/aws-rds/lib/cluster.ts
@@ -253,7 +253,7 @@ interface DatabaseClusterBaseProps {
* This feature is only supported by the Aurora database engine.
*
* This property must not be used if `s3ImportBuckets` is used.
*
* To use this property with Aurora PostgreSQL, the S3 import feature must be enabled when creating the DatabaseClusterEngine.
* For MySQL:
* @see https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.LoadFromS3.html
*
@@ -284,7 +284,7 @@ interface DatabaseClusterBaseProps {
* This feature is only supported by the Aurora database engine.
*
* This property must not be used if `s3ExportBuckets` is used.
*
* To use this property with Aurora PostgreSQL, the S3 export feature must be enabled when creating the DatabaseClusterEngine.
* For MySQL:
* @see https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html
*
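The doc comments above also note that these role properties are mutually exclusive with `s3ImportBuckets`/`s3ExportBuckets`. A minimal sketch of the bucket variant, where the CDK creates and attaches the roles itself (assuming an existing `vpc` and bucket, and a stack scope `this`; construct IDs are illustrative):

import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as rds from 'aws-cdk-lib/aws-rds';
import * as s3 from 'aws-cdk-lib/aws-s3';

declare const vpc: ec2.Vpc;
declare const importBucket: s3.Bucket;

new rds.DatabaseCluster(this, 'BucketCluster', {
  // Predefined Aurora PostgreSQL versions already declare which S3 features they support.
  engine: rds.DatabaseClusterEngine.auroraPostgres({
    version: rds.AuroraPostgresEngineVersion.VER_15_2,
  }),
  writer: rds.ClusterInstance.provisioned('Writer'),
  vpc,
  // The CDK creates the import role and grants it read access to the bucket.
  s3ImportBuckets: [importBucket],
});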
