Replies: 1 comment
To test this version, here is a quick how-to.

Initial setup

Clone the branch where the code for SDLFv2 currently sits:
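A sketch of the commands; the branch name here is a placeholder, use the branch mentioned in the release notes:

```bash
# Branch name is a placeholder -- replace with the actual SDLFv2 branch
git clone -b 2.0 https://github.com/awslabs/aws-serverless-data-lake-framework.git
cd aws-serverless-data-lake-framework
```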
Print the deployment script's help; it is useful for understanding the commands used in the rest of this post:
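Assuming the script follows the usual help-flag convention:

```bash
./deploy.sh -h
```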
Deploy cross-account IAM roles in the child accounts; these are necessary for the DevOps CICD pipelines:
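Something along these lines; the option names are hypothetical, check the help output for the real interface:

```bash
# Hypothetical options -- consult ./deploy.sh -h for the actual interface
./deploy.sh crossaccount -p <child-account-profile>
```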
Seven roles starting with `sdlf-` should have been created.

Continue deploying SDLF, this time deploying the DevOps account resources:
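Again with a hypothetical option, to be checked against the help output:

```bash
# Hypothetical option -- consult ./deploy.sh -h for the actual interface
./deploy.sh devops -p <devops-account-profile>
```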
A set of CodeCommit repositories (one per SDLF module) should have been created.
Two CloudFormation stacks should have been created.

Child Accounts and Teams

In CodeCommit, an empty repository called `sdlf-main` should exist. Clone it and create a file named `foundations-{domain}-{env}.yaml`:
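A rough sketch of what that could look like; the actual schema comes from the foundations module, and these key names are hypothetical:

```bash
# Key names are hypothetical -- check the foundations module for the real schema
cat > foundations-datalake-dev.yaml <<'EOF'
pChildAccountId: "111111111111"
pEnvironment: dev
EOF
```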
Push the file to the repository. In SDLFv1, this was done by editing the `sdlf-foundations` repository. In the same repository, deploy a new team: create a file named `teams-{domain}-{env}.yaml` and push it as well:
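For example (the team definition keys are hypothetical):

```bash
# Key names are hypothetical -- check the team module for the real schema
cat > teams-datalake-dev.yaml <<'EOF'
engineering:
  pTeamName: engineering
EOF

git add foundations-datalake-dev.yaml teams-datalake-dev.yaml
git commit -m "Deploy foundations and the engineering team"
git push
```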
Pipelines and Datasets

A new CodeCommit repository called `sdlf-{domain}-{team name}-main` should have been created.
Clone it and create a file named `pipelines-{env}.yaml`:
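A hypothetical sketch, using the default stageA and stageB pipeline mentioned in the release notes below:

```bash
# Key names are hypothetical -- each stage is declared in its own block and
# attached to a pipeline
cat > pipelines-dev.yaml <<'EOF'
stages:
  stageA:
    module: sdlf-stageA
  stageB:
    module: sdlf-stageB
pipelines:
  main:
    stages:
      - stageA
      - stageB
EOF
```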
Then another file named `parameters-{env}.json`:
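For instance (the parameter names are hypothetical):

```bash
# Parameter names are hypothetical
cat > parameters-dev.json <<'EOF'
[
  { "ParameterKey": "pTeamName", "ParameterValue": "engineering" }
]
EOF
```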
Push both files to the repository. There are several changes here compared to SDLFv1: each stage is declared in its own block and attached to a pipeline by name. In the same repository, deploy a new dataset: create a file named `datasets-{env}.yaml`:
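A hypothetical sketch; `pPipelineDetails` takes a JSON object, as explained just below:

```bash
# Dataset keys are hypothetical -- pPipelineDetails carries a JSON object with
# everything the pipeline stages need
cat > datasets-dev.yaml <<'EOF'
legislators:
  pPipelineDetails: >-
    {"main": {"stageB": {"NumberOfWorkers": 10}}}
EOF
```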
Push the file to the repository. The main difference here compared to SDLFv1 is that all the details a pipeline may require are passed as a JSON object to the `pPipelineDetails` parameter. Feel free to deploy https://github.com/awslabs/aws-serverless-data-lake-framework/tree/main/sdlf-utils/pipeline-examples/legislators and confirm everything works as intended. Thanks for testing!
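The legislators example ships with its own deployment script; the exact invocation may differ in SDLFv2:

```bash
# Exact invocation may differ in SDLFv2 -- see the example's README
cd sdlf-utils/pipeline-examples/legislators
./deploy.sh
```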
Work is ongoing on a new major version of the Serverless Data Lake Framework. This is a pre-release, not ready for production workloads.
What's New

- SDLF constructs are now distributed as CloudFormation modules. `deploy.sh` takes care of deploying the CICD infrastructure used to build these modules and registers them in the private CloudFormation registry of each account. Modules are updated whenever there is a change to their source repository.
- A domain name `pDomain` (which defaults to `datalake`) can be provided when deploying foundations.
- Deployments target three environments (`dev`, `test`, `prod`).
- Foundations and teams are defined in a repository called `sdlf-main`: foundations in `foundations-{domain}-{env}.yaml` and teams in `teams-{domain}-{env}.yaml`. `master`, `test` and `dev` branches are expected. Parameters can be provided in `parameters-{env}.json`.
- Pipelines and datasets are defined in a team repository called `sdlf-{domain}-{team name}-main`: pipelines in `pipelines-{env}.yaml` and datasets in `datasets-{env}.yaml`. `master`, `test` and `dev` branches are expected. Parameters can be provided in `parameters-{env}.json`.
- Dataset mappings used to be stored in `sdlf-datalakeLibrary`. They are no longer needed and have been removed, replaced by the `pPipelineDetails` parameter when defining a dataset in `sdlf-dataset`. This parameter goes even further and can be used to store more information that stages can use. These details are stored in the Datasets DynamoDB table (as was already the case in SDLFv1).
- Stages can subscribe to events matching a pattern (`pEventPattern` in the example) and then process these events on a schedule (`pSchedule`); see the sketch after this list.
- Monitoring tools (CloudTrail, ELK, SNS) now live in a dedicated module, `sdlf-monitoring`. SDLF works even when `sdlf-monitoring` is not deployed.
- Data quality has been moved to its own stage, `sdlf-stage-dataquality`. It can now be used as an example of how to add a third stage to the default stageA and stageB pipeline.
- Apart from `deploy.sh`, there are no more shell scripts.
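As a rough illustration of the event pattern and schedule pair (the surrounding file shape is hypothetical; the JSON pattern follows the EventBridge format):

```bash
# Hypothetical parameter file showing pEventPattern and pSchedule side by side
cat > stage-parameters-example.yaml <<'EOF'
pEventPattern: >-
  {"source": ["aws.s3"], "detail-type": ["Object Created"]}
pSchedule: "rate(5 minutes)"
EOF
```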
Full Changelog: 1.5.2...2.0.0-beta.0
This discussion was created from the release Serverless Data Lake Framework 2.0.0-beta.0.