How to create AWS resources to run a CDK application locally using SAM?
I have an application called Example4Be with a stack that creates the non-Lambda AWS resources (just DynamoDB in this case, but it could be SQS, SNS, etc.):
import * as cdk from "aws-cdk-lib"
import {Construct} from "constructs"
import * as dynamodb from "aws-cdk-lib/aws-dynamodb"

export class Example4BeResourcesStack extends cdk.Stack {
  public readonly table: dynamodb.Table

  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props)
    this.table = new dynamodb.Table(this, "Hits", {
      partitionKey: {
        name: "path",
        type: dynamodb.AttributeType.STRING
      }
    })
  }
}
And then there's a stack that uses it:
import * as cdk from "aws-cdk-lib"
import * as lambda from "aws-cdk-lib/aws-lambda"
import * as apigw from "aws-cdk-lib/aws-apigateway"
import {Construct} from "constructs"
import {Example4BeResourcesStack} from "./example4-be-resources-stack"

export class Example4BeStack extends cdk.Stack {
  public readonly hcEndpoint: cdk.CfnOutput

  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props)
    const resources = new Example4BeResourcesStack(scope, `${id}Resources`)
    const hello = new lambda.Function(this, "HelloHandler", {
      runtime: lambda.Runtime.NODEJS_14_X,
      code: lambda.Code.fromAsset("lambda"),
      handler: "hello.handler",
      environment: {
        HITS_TABLE_NAME: resources.table.tableName
      }
    })
    resources.table.grantReadWriteData(hello)

    // defines an API Gateway REST API resource backed by our "hello" function
    const gateway = new apigw.LambdaRestApi(this, "Endpoint", {
      handler: hello
    })
    this.hcEndpoint = new cdk.CfnOutput(this, "GatewayUrl", {
      value: gateway.url
    })
  }
}
I have a file, app.ts, that instantiates the stack:
#!/usr/bin/env node
import * as cdk from "aws-cdk-lib"
import {Example4BeStack} from "../lib/example4-be-stack";
const app = new cdk.App()
new Example4BeStack(app, "Example4BePipeline")
and I run it with:
cdk synth --no-staging > template.yaml
sam local start-api
but this doesn't seem to create the DynamoDB table. I guess there's a cdk deploy
that I'm missing? How should this be done?
I'm happy to add more of the application if you want, there's CodePipeline involved in the actual deployment.
SAM-CDK integration runs emulated lambdas locally, calling cloud-deployed resources as required.
(1) Deploy your stacks using cdk deploy
Our local-lambdas will interact with deployed-resource (1) dependencies (tables, queues, etc.). It's an AWS best practice to use a separate account for testing (2).
(2) Tell SAM about the lambdas' cloud-side environment variables.
// locals.json is used by SAM to resolve lambda env vars
{
// the lambda resource id from sam-template.yaml
"HandlerFunctionEDBA20C2": {
// our lambda has an environment variable named TABLE_NAME
// this is deployed-resource in cloud - tip: define a CfnOutput in your stack. cdk deploy will emit the values to the terminal
"TABLE_NAME": "CdkTsPlayLocalTestingStack-SuperTable2CB94566-1VPR5PAXO0GM5"
},
"AnotherLambdaCD21C99": {
"QUEUE_ARN": "<queue-arn>"
}
}
or you can have a single set of variables apply to all functions:
{
"Parameters": {
"TABLE_NAME": "CdkTsPlayLocalTestingStack-SuperTable2CB94566-1VPR5PAXO0GM5"
}
}
(3) Now you can iterate locally
Make a change to your lambdas locally, then let the SAM CLI do its thing (3):
# synth the app template, output a sam format template
npx cdk synth '*' -a 'ts-node ./bin/app.ts' --profile <testing-profile> --no-staging > cdk.out/sam-template.yaml
# call a local function
sam local invoke HandlerFunctionEDBA20C2 \
--template cdk.out/sam-template.yaml \
--profile <testing-profile> \
--event lib/stacks/testing/LocalTestingStack/test-events/create.json \
--env-vars lib/stacks/testing/LocalTestingStack/locals.json
(4) Or integrate with automated tests
sam local start-lambda starts the emulator; automated tests then run against the local endpoint.
What is happening where?
SAM currently has 3 local test commands:
SAM command | lambda | apigw | other (dynamo, etc.) | what for
---|---|---|---|---
sam local invoke | local | - | cloud | test a local lambda
sam local start-lambda | local | - | cloud | watch mode - call a lambda with SDK or CLI
sam local start-api | local | local | cloud | watch mode - test local lambda via a local endpoint
Note that lambdas are the center of all this goodness. We can mock event inputs into local-lambdas. We can emulate an api/sdk endpoint. We can interact with deployed-resources referenced by lambda SDK calls. But other setups are not currently covered. For instance, we can't test a lambda DynamoDB streams handler. Let's hope the local functionality will grow over time.
(1) I use the term "local-lambda" to mean a local version of the lambda, which sam local will run in a container. "deployed-resources" refers to our lambda's cloud-deployed integrations (e.g. DynamoDB tables, queues, etc.).
(2) At this point, testing does not touch your pipeline stack. Once you are done with local testing, push your changes to GitHub, which will kick off the pipeline deploy to the (staging and) prod accounts. You can have additional testing steps run as part of the pipeline stages.
(3) The preview sam-beta-cdk CLI makes it somewhat easier to integrate CDK with SAM than the vanilla sam CLI. For instance, it seems to create the SAM .yaml for us. But the two tools appear to have the same underlying functionality.
Since @fedonev has already answered the question, I will focus on a potential implementation using CodePipeline (specifically CDK Pipelines in Python), since I was already working on one. I restricted myself to not using anything experimental or in preview.
Code for the example: https://github.com/KMK-Git/aws-cdk-sam-testing-demo
The Flow
Stages
Source
source = pipelines.CodePipelineSource.connection(
    "KMK-Git/aws-cdk-sam-testing-demo",
    "main",
    connection_arn=ssm.StringParameter.value_for_string_parameter(
        self,
        "codestar_connection_arn",
    ),
)
The source code repository is configured using CodeStar connections. The connection was created manually beforehand, and I am fetching its ARN from SSM Parameter Store.
Build, SelfMutate and UploadAssets
cdk_codepipeline = pipelines.CodePipeline(
    self,
    "Pipeline",
    synth=pipelines.ShellStep(
        "Synth",
        input=source,
        install_commands=[
            "pip install -r requirements.txt",
            "npm install -g aws-cdk",
        ],
        commands=[
            "cdk synth",
        ],
    ),
)
- The Synth stage is used to synthesize the CDK code into CloudFormation templates.
- The SelfMutate stage is a CDK feature to update the pipeline inside the pipeline itself. This allows you to make changes to your pipeline even after the initial deployment.
- The Assets stage uploads all cdk assets to their destinations, using the cdk-assets command. Since I have separated my stacks into two separate stages, each one gets its own assets upload step.
In case you are not using CDK Pipelines, here is the build spec for all stages:
For Synth:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "pip install -r requirements.txt",
        "npm install -g aws-cdk"
      ]
    },
    "build": {
      "commands": [
        "cdk synth"
      ]
    }
  },
  "artifacts": {
    "base-directory": "cdk.out",
    "files": "**/*"
  }
}
For SelfMutate:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "npm install -g aws-cdk"
      ]
    },
    "build": {
      "commands": [
        "cdk -a . deploy PipelineStack --require-approval=never --verbose"
      ]
    }
  }
}
For Assets:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "npm install -g cdk-assets"
      ]
    },
    "build": {
      "commands": [
        "cdk-assets --path \"assembly-LambdaStage/LambdaStageLambdasStackABCD123.assets.json\" --verbose publish \"longrandomstring:current_account-current_region\""
      ]
    }
  }
}
Unit Testing
testing = pipelines.CodeBuildStep(
    "UnitTesting",
    input=source,
    install_commands=[
        "pip install -r requirements.txt -r requirements-dev.txt",
    ],
    commands=[
        "pytest --cov",
    ],
    env={
        "QUEUE_URL": "SampleQueue",
        "TABLE_NAME": "SampleTest",
    },
    build_environment=codebuild.BuildEnvironment(
        build_image=codebuild.LinuxBuildImage.STANDARD_5_0,
        privileged=True,
        compute_type=codebuild.ComputeType.SMALL,
    ),
)
This is a simple step, which runs any unit tests in your code.
In case you are not using CDK Pipelines, here is the build spec:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "pip install -r requirements.txt -r requirements-dev.txt"
      ]
    },
    "build": {
      "commands": [
        "pytest --cov"
      ]
    }
  }
}
Deploy Supporting resources
cdk_codepipeline.add_stage(
    supporting_resources_stage,
    pre=[
        testing,
        pipelines.ConfirmPermissionsBroadening(
            "CheckSupporting", stage=supporting_resources_stage
        ),
    ],
)
This deploys the supporting resources, which are required for your sam local tests to run properly. The permissions broadening step stops the pipeline and forces manual approval in case of any broadening of IAM permissions. You can add unit testing in its own separate stage as well.
In case you are not using CDK Pipelines, here is the build spec:
{
  "version": "0.2",
  "phases": {
    "build": {
      "commands": [
        "npm install -g aws-cdk",
        "export PIPELINE_NAME=\"$(node -pe '`${process.env.CODEBUILD_INITIATOR}`.split(\"/\")[1]')\"",
        "payload=\"$(node -pe 'JSON.stringify({ \"PipelineName\": process.env.PIPELINE_NAME, \"StageName\": process.env.STAGE_NAME, \"ActionName\": process.env.ACTION_NAME })' )\"",
        "ARN=$CODEBUILD_BUILD_ARN",
        "REGION=\"$(node -pe '`${process.env.ARN}`.split(\":\")[3]')\"",
        "ACCOUNT_ID=\"$(node -pe '`${process.env.ARN}`.split(\":\")[4]')\"",
        "PROJECT_NAME=\"$(node -pe '`${process.env.ARN}`.split(\":\")[5].split(\"/\")[1]')\"",
        "PROJECT_ID=\"$(node -pe '`${process.env.ARN}`.split(\":\")[6]')\"",
        "export LINK=\"https://$REGION.console.aws.amazon.com/codesuite/codebuild/$ACCOUNT_ID/projects/$PROJECT_NAME/build/$PROJECT_NAME:$PROJECT_ID/?region=$REGION\"",
        "export PIPELINE_LINK=\"https://$REGION.console.aws.amazon.com/codesuite/codepipeline/pipelines/$PIPELINE_NAME/view?region=$REGION\"",
        "if cdk diff -a . --security-only --fail $STAGE_PATH/\\*; then aws lambda invoke --function-name PipelineStack-PipelinePipelinesSecurityCheckCDKalpha-numeric --invocation-type Event --payload \"$payload\" lambda.out; export MESSAGE=\"No security-impacting changes detected.\"; else [ -z \"${NOTIFICATION_ARN}\" ] || aws sns publish --topic-arn $NOTIFICATION_ARN --subject \"$NOTIFICATION_SUBJECT\" --message \"An upcoming change would broaden security changes in $PIPELINE_NAME.\nReview and approve the changes in CodePipeline to proceed with the deployment.\n\nReview the changes in CodeBuild:\n\n$LINK\n\nApprove the changes in CodePipeline (stage $STAGE_NAME, action $ACTION_NAME):\n\n$PIPELINE_LINK\"; export MESSAGE=\"Deployment would make security-impacting changes. Click the link below to inspect them, then click Approve if all changes are expected.\"; fi"
      ]
    }
  },
  "env": {
    "exported-variables": [
      "LINK",
      "MESSAGE"
    ]
  }
}
sam local testing
sam_cli_test_step = pipelines.CodeBuildStep(
    "SAMTesting",
    input=source,
    env_from_cfn_outputs={
        "QUEUE_URL": supporting_resources_stage.stack.queue_url,
        "TABLE_NAME": supporting_resources_stage.stack.table_name,
    },
    install_commands=[
        "pip install -r requirements.txt",
        "npm install -g aws-cdk",
        "mkdir testoutput",
    ],
    commands=[
        'cdk synth -a "python synth_lambdas_stack.py" -o sam.out',
        'echo "{\\""SqsLambdaFunction\\"": {\\""QUEUE_URL\\"": \\""$QUEUE_URL\\""},'
        + '\\""DynamodbLambdaFunction\\"": {\\""TABLE_NAME\\"": \\""$TABLE_NAME\\"" }}"'
        + " > locals.json",
        'sam local invoke -t "sam.out/LambdasStack.template.json" --env-vars locals.json'
        + ' --no-event "DynamodbLambdaFunction"',
        'sam local invoke -t "sam.out/LambdasStack.template.json" --env-vars locals.json'
        + ' --no-event "SqsLambdaFunction"',
        "nohup sam local start-api -t sam.out/LambdasStack.template.json"
        + " --env-vars locals.json > testoutput/testing.log & ",
        "",
        "sleep 30",
        "curl --fail http://127.0.0.1:3000/sqs",
        "curl --fail http://127.0.0.1:3000/dynamodb",
    ],
    build_environment=codebuild.BuildEnvironment(
        build_image=codebuild.LinuxBuildImage.STANDARD_5_0,
        privileged=True,
        compute_type=codebuild.ComputeType.SMALL,
    ),
    primary_output_directory="testoutput/",
    role_policy_statements=[
        iam.PolicyStatement(
            actions=[
                "sqs:SendMessage",
                "sqs:GetQueueAttributes",
                "sqs:GetQueueUrl",
            ],
            resources=["*"],
        ),
        iam.PolicyStatement(
            actions=[
                "dynamodb:BatchWriteItem",
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:DeleteItem",
            ],
            resources=["*"],
        ),
    ],
)
- In your supporting resources stack, you will need to define any parameters your Lambda code needs, such as resource names and ARNs, as stack outputs. I have provided these values to my Lambda functions using environment variables.
- In my stack code, I have specified the logical ids of the CloudFormation Lambda function resource instead of relying on CDK's auto generated value.
sqs_lambda_base: _lambda.CfnFunction = sqs_lambda.node.default_child
sqs_lambda_base.override_logical_id("SqsLambdaFunction")
- I have created a separate app file which only synthesizes my Lambda stack, instead of the full Pipeline stack. In theory you should be able to use the synth output of the full stack, but this is simpler to configure.
- I have added permissions needed by my Lambdas to interact with the resources created by my supporting resources stack.
- sam-beta-cdk can be used to simplify some of this workflow, but I did not use it here since it is still in preview.
In case you are not using CDK Pipelines, here is the build spec:
{
  "version": "0.2",
  "phases": {
    "install": {
      "commands": [
        "pip install -r requirements.txt",
        "npm install -g aws-cdk",
        "curl --version",
        "mkdir testoutput"
      ]
    },
    "build": {
      "commands": [
        "cdk synth -a \"python synth_lambdas_stack.py\" -o sam.out",
        "echo \"{\\\"\"SqsLambdaFunction\\\"\": {\\\"\"QUEUE_URL\\\"\": \\\"\"$QUEUE_URL\\\"\"},\\\"\"DynamodbLambdaFunction\\\"\": {\\\"\"TABLE_NAME\\\"\": \\\"\"$TABLE_NAME\\\"\" }}\" > locals.json",
        "sam local invoke -t \"sam.out/LambdasStack.template.json\" --env-vars locals.json --no-event \"DynamodbLambdaFunction\"",
        "sam local invoke -t \"sam.out/LambdasStack.template.json\" --env-vars locals.json --no-event \"SqsLambdaFunction\"",
        "nohup sam local start-api -t sam.out/LambdasStack.template.json --env-vars locals.json > testoutput/testing.log & ",
        "",
        "sleep 30",
        "curl --fail http://127.0.0.1:3000/sqs",
        "curl --fail http://127.0.0.1:3000/dynamodb"
      ]
    }
  },
  "artifacts": {
    "base-directory": "testoutput/",
    "files": "**/*"
  }
}
Deploy Lambda functions
cdk_codepipeline.add_stage(
    lambdas_stage,
    pre=[
        sam_cli_test_step,
        pipelines.ConfirmPermissionsBroadening(
            "CheckLambda", stage=lambdas_stage
        ),
    ],
)
This is similar to the deploy supporting resources stage. You can add sam local testing in its own separate stage as well.
Notes:
- DynamoDB specifically has support for local testing (DynamoDB Local) if you don't want to deploy it first.
- While my pipeline only deploys one set of resources, you can create different stages deploying to different environments/accounts.
- The sam local testing step defined here is very simple. You can use features like API testing tools to run a whole suite of test cases. Here is an official AWS sample that uses selenium to test a web server, with the APIs running on sam local.
Limitations
- API Gateway authorizers are not supported for sam local testing, so you won't be able to test Cognito authorizers if you are planning to use them.