Migrate to Terraform (#3)

* infra: exit sceptre, enter terraform
* infra: ignores, vars
* infra: tweaks
* ci: remove sceptre validation
* ci: bootstrap
* build: deps
* ci: py setup
* docs: add docs to tasks
* docs: update
Marc Cataford 2021-03-19 13:38:54 -04:00 committed by GitHub
parent f27200b29c
commit 9a81426840
22 changed files with 518 additions and 239 deletions


@@ -6,17 +6,10 @@ jobs:
     runs-on: ubuntu-latest
     steps:
      - uses: actions/checkout@v2
-     - uses: actions/setup-python@v2
-       with:
-         python-version: '3.8.2'
      - name: Install dependencies
-       run: |
-         sudo apt install python3-venv
-         python3 -m pip install --upgrade pip setuptools wheel pipx
-     - name: Validate templates
-       run: |
-         cd infrastructure
-         python3 -m pipx run sceptre validate app/app.yaml
-         python3 -m pipx run sceptre validate bootstrap/bootstrap.yaml
-       env:
-         AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY }}
-         AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_KEY }}
+       run: CI=1 . script/bootstrap
      - name: Lint
-       run: python3 -m pipx run black . --check
+       run: inv lint

.gitignore

@@ -2,3 +2,7 @@
 *.pyc
 *.zip
 *_out.json
+*.terraform*
+*tfstate*
+*tfvars

README.md

@@ -5,17 +5,15 @@
 ## Overview
 
-AWS Lambdas are fun, but often the amount of boilerplate involved in getting a project off the ground hinders the fun. From setting up a local environment to writing out a Cloudformation template, the overhead of Lambda-based greenfield projects can be daunting. No more. Just use this repository as a template or clone it and jump straight into the action! This repository offers a quick template you can use, complete with a Docker setup for local development and invocation commands that you can use to package and deploy small Lambdas.
+AWS Lambdas are fun, but often the amount of boilerplate involved in getting a project off the ground hinders the fun. From setting up a local environment to writing out infrastructure templates, the overhead of Lambda-based greenfield projects can be daunting. No more. Just use this repository as a template or clone it and jump straight into the action! This repository offers a quick template you can use, complete with a Docker setup for local development and invocation commands that you can use to package and deploy Lambda-based apps.
+
+Use this as a foundation and tweak it to your use case!
 
 ## Local development
 
 To get started, hit the bootstrap script with `. script/bootstrap`. This will set up a Python 3.8 virtualenv with some basic tools that will make your life easier.
 
-The base Lambda handler is at `src/base.py` and all the infrastructure templates and Sceptre configuration are in `infrastructure`.
+The base Lambda handler is at `src/base.py` and all the Terraform configurations are in `infrastructure`.
 
-[Read more about Sceptre](https://sceptre.cloudreach.com/latest/index.html)
+[Read more about Terraform](https://www.terraform.io/docs/index.html)
 
 [Read more about AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html)
 
@@ -23,18 +21,11 @@
 This template uses PyInvoke, all commands are of the format `inv <command> <parameters>`.
 
-|Command|Description|
-|---|---|
-|`app.start`|Start your Lambda in Docker.|
-|`app.stop`|Stop and remove the container.|
-|`app.invoke-function <function name> <Serialized JSON payload>`|Invokes the given local Lambda by container name|
-|`stack.deploy`|Packages your code and deploys the stack|
-|`stack.teardown-app`|Tears down the application stack|
-|`stack.teardown-bootstrap`|Tears down the bootstrap stack|
+Use `inv --list` for the full list of commands.
 
 ## Deployment
 
-The base setup assumes that your Lambda handler is located in `src.base`. Doing `inv stack.deploy` will zip up your `src` directory, create an S3 bucket for your development artifacts, upload the source code archive to S3 and kickstart the Cloudformation deployment of your stack.
+Deployment is in three steps: on first setup, you will need to make sure that your `bootstrap` environment is ready via `inv cloud.apply bootstrap`. Then, you should package and upload your lambdas' source with `inv cloud.pack` and `inv cloud.push`. Finally, you can deploy your application resources with `inv cloud.apply app`.
 
 ## Contributing

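For context, everything in this commit assumes a Lambda entry point at `src.base.handler` sitting behind an API Gateway proxy integration. A minimal handler compatible with that contract could look like the sketch below (`src/base.py` itself is not part of this diff, so this is illustrative only):

import json


def handler(event, context):
    # API Gateway AWS_PROXY integrations expect a dict with a statusCode
    # and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "Hello from Lambda!", "path": event.get("path")}),
    }

With `pytest` already in the dev dependencies, even a one-liner like `assert handler({}, None)["statusCode"] == 200` is enough to make `inv test` meaningful.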
infrastructure/app/app.tf Normal file

@@ -0,0 +1,82 @@
provider "aws" {
  profile = "default"
  region  = var.aws_region
}

resource "aws_iam_role" "lambda_role" {
  name               = "lambda_role"
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}

resource "aws_lambda_function" "lambda_func" {
  function_name = "boilerplate_function"
  role          = aws_iam_role.lambda_role.arn
  handler       = "src.base.handler"
  runtime       = "python3.8"
  s3_bucket     = var.artifacts_bucket_name
  s3_key        = var.lambda_archive_name
}

resource "aws_api_gateway_rest_api" "gateway" {
  name        = "boilerplate"
  description = "Lambda Boilerplate"
}

resource "aws_api_gateway_resource" "lambda_proxy" {
  rest_api_id = aws_api_gateway_rest_api.gateway.id
  parent_id   = aws_api_gateway_rest_api.gateway.root_resource_id
  path_part   = "{proxy+}"
}

resource "aws_api_gateway_method" "lambda_proxy" {
  rest_api_id   = aws_api_gateway_rest_api.gateway.id
  resource_id   = aws_api_gateway_resource.lambda_proxy.id
  http_method   = "ANY"
  authorization = "NONE"
}

resource "aws_api_gateway_integration" "lambda" {
  rest_api_id             = aws_api_gateway_rest_api.gateway.id
  resource_id             = aws_api_gateway_resource.lambda_proxy.id
  http_method             = aws_api_gateway_method.lambda_proxy.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.lambda_func.invoke_arn
}

resource "aws_api_gateway_deployment" "lambda" {
  depends_on = [
    aws_api_gateway_integration.lambda
  ]

  rest_api_id = aws_api_gateway_rest_api.gateway.id
  stage_name  = "test"
}

resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.lambda_func.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.gateway.execution_arn}/*/*"
}

output "base_url" {
  value = aws_api_gateway_deployment.lambda.invoke_url
}

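Once both projects are applied, the `base_url` output above is the quickest smoke test. A sketch of how it could be exercised from Python (assumes a deployed stack, `terraform` on the PATH, and the `requests` package, which the dev lockfile below already pulls in transitively):

import json
import subprocess

import requests

# Read the Terraform outputs of the app project as JSON.
outputs = json.loads(
    subprocess.run(
        ["terraform", "output", "-json"],
        cwd="infrastructure/app",
        capture_output=True,
        check=True,
        text=True,
    ).stdout
)
base_url = outputs["base_url"]["value"]

# Any path and method under the {proxy+} resource is routed to the Lambda.
response = requests.post(f"{base_url}/ping", json={"ping": "pong"})
print(response.status_code, response.text)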

@@ -0,0 +1,11 @@
variable "aws_region" {
  type = string
}

variable "artifacts_bucket_name" {
  type = string
}

variable "lambda_archive_name" {
  type = string
}


@@ -0,0 +1,3 @@
output "artifacts_bucket_name" {
  value = aws_s3_bucket.artifacts.bucket
}


@@ -0,0 +1,5 @@
provider "aws" {
  profile = "default"
  region  = var.aws_region
}


@@ -0,0 +1,8 @@
resource "aws_s3_bucket" "artifacts" {
  bucket = var.artifacts_bucket_name
  acl    = "private"

  tags = {
    Name = var.artifacts_bucket_name
  }
}


@@ -0,0 +1,7 @@
variable "artifacts_bucket_name" {
  type = string
}

variable "aws_region" {
  type = string
}


@@ -1,8 +0,0 @@
-template_path: app.yaml
-parameters:
-  # Lambda source artifacts bucket name
-  ArtifactsBucketName: !stack_output bootstrap/bootstrap.yaml::ArtifactsBucketNameTest
-  # Lambda zip key
-  SourceKey: {{ var.source_key | default("") }}
-  ApiStageName: "v0"


@@ -1,2 +0,0 @@
-project_code: app
-region: us-east-1


@@ -1,4 +0,0 @@
-template_path: bootstrap.yaml
-parameters:
-  # Bucket name for the artifacts S3 bucket. Used solely to store versioned lambda zips.
-  ArtifactsBucketName: my-cool-bucket-name


@@ -1,2 +0,0 @@
-project_code: app
-region: us-east-1


@@ -1,2 +0,0 @@
-project_code: app
-region: us-east-1


@@ -1,99 +0,0 @@
-AWSTemplateFormatVersion: 2010-09-09
-Parameters:
-  ArtifactsBucketName:
-    Type: String
-    Description: Bucket storing the function source.
-  SourceKey:
-    Type: String
-  ApiStageName:
-    Type: String
-Resources:
-  # Lambda function
-  Function:
-    Type: AWS::Lambda::Function
-    Properties:
-      Code:
-        S3Bucket: !Ref ArtifactsBucketName
-        S3Key: !Ref SourceKey
-      FunctionName: sampleFunction
-      Handler: src.base.handler
-      Role: !GetAtt LambdaExecutionRole.Arn
-      Runtime: python3.8
-  # Roles
-  LambdaExecutionRole:
-    Type: AWS::IAM::Role
-    Properties:
-      AssumeRolePolicyDocument:
-        Version: 2012-10-17
-        Statement:
-          - Effect: Allow
-            Principal:
-              Service:
-                - lambda.amazonaws.com
-            Action:
-              - sts:AssumeRole
-  ApiGatewayRole:
-    Type: AWS::IAM::Role
-    Properties:
-      AssumeRolePolicyDocument:
-        Version: 2012-10-17
-        Statement:
-          - Effect: Allow
-            Principal:
-              Service:
-                - apigateway.amazonaws.com
-            Action:
-              - sts:AssumeRole
-      Path: '/'
-      Policies:
-        - PolicyName: LambdaAccess
-          PolicyDocument:
-            Version: 2012-10-17
-            Statement:
-              - Effect: Allow
-                Action: 'lambda:*'
-                Resource: !GetAtt Function.Arn
-  RestApi:
-    Type: AWS::ApiGateway::RestApi
-    Properties:
-      Name: lambda-api
-  GatewayResource:
-    Type: AWS::ApiGateway::Resource
-    Properties:
-      ParentId: !GetAtt RestApi.RootResourceId
-      PathPart: lambda
-      RestApiId: !Ref RestApi
-  # This template only defines a POST method, but others can easily be defined
-  # by duping this resource and tweaking it for other resources, opnames or URIs.
-  PostMethod:
-    Type: AWS::ApiGateway::Method
-    Properties:
-      HttpMethod: POST
-      AuthorizationType: NONE
-      Integration:
-        Credentials: !GetAtt ApiGatewayRole.Arn
-        IntegrationHttpMethod: POST
-        Type: AWS_PROXY
-        Uri: !Sub 'arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${Function.Arn}/invocations'
-      OperationName: lambda
-      ResourceId: !Ref GatewayResource
-      RestApiId: !Ref RestApi
-  ApiStage:
-    Type: AWS::ApiGateway::Stage
-    Properties:
-      DeploymentId: !Ref ApiDeployment
-      RestApiId: !Ref RestApi
-      StageName: !Ref ApiStageName
-  ApiDeployment:
-    Type: AWS::ApiGateway::Deployment
-    DependsOn: PostMethod
-    Properties:
-      RestApiId: !Ref RestApi
-Outputs:
-  ApiGatewayId:
-    Value: !Ref RestApi
-    Export:
-      Name: RestApiId


@@ -1,18 +0,0 @@
-AWSTemplateFormatVersion: 2010-09-09
-Parameters:
-  ArtifactsBucketName:
-    Type: String
-    Description: Bucket storing the function source
-Resources:
-  DevBucket:
-    Type: AWS::S3::Bucket
-    Properties:
-      BucketName: !Ref ArtifactsBucketName
-      AccessControl: Private
-      VersioningConfiguration:
-        Status: Enabled
-Outputs:
-  ArtifactsBucketNameTest:
-    Value: !Ref ArtifactsBucketName
-    Export:
-      Name: ArtifactsBucketNameTest

requirements.in Normal file (empty)

requirements.txt Normal file

@@ -0,0 +1,6 @@
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#

requirements_dev.in Normal file

@@ -0,0 +1,6 @@
boto3
invoke
black
pytest
moto
pytest-cov

requirements_dev.txt

@@ -1,27 +1,192 @@
+#
+# This file is autogenerated by pip-compile
+# To update, run:
+#
+#    pip-compile --output-file=requirements_dev.txt requirements_dev.in
+#
 appdirs==1.4.4
+    # via black
-attrs==20.1.0
+attrs==20.3.0
+    # via
+    #   jsonschema
+    #   pytest
+aws-sam-translator==1.33.0
+    # via cfn-lint
+aws-xray-sdk==2.6.0
+    # via moto
+black==20.8b1
+    # via -r requirements_dev.in
-boto3==1.14.55
+boto3==1.16.59
+    # via
+    #   -r requirements_dev.in
+    #   aws-sam-translator
+    #   moto
+boto==2.49.0
+    # via moto
-botocore==1.17.55
+botocore==1.19.59
+    # via
+    #   aws-xray-sdk
+    #   boto3
+    #   moto
+    #   s3transfer
+certifi==2020.12.5
+    # via requests
+cffi==1.14.4
+    # via cryptography
+cfn-lint==0.44.4
+    # via moto
+chardet==4.0.0
+    # via requests
 click==7.1.2
+    # via black
-colorama==0.3.9
+coverage==5.3.1
+    # via pytest-cov
+cryptography==3.3.1
+    # via
+    #   moto
+    #   python-jose
+    #   sshpubkeys
 decorator==4.4.2
+    # via networkx
-docutils==0.15.2
+docker==4.4.1
+    # via moto
+ecdsa==0.14.1
+    # via
+    #   moto
+    #   python-jose
+    #   sshpubkeys
+future==0.18.2
+    # via aws-xray-sdk
+idna==2.10
+    # via
+    #   moto
+    #   requests
+iniconfig==1.1.1
+    # via pytest
-invoke==1.4.1
+invoke==1.5.0
+    # via -r requirements_dev.in
-Jinja2==2.11.2
+jinja2==2.11.2
+    # via moto
 jmespath==0.10.0
+    # via
+    #   boto3
+    #   botocore
+jsondiff==1.2.0
+    # via moto
+jsonpatch==1.28
+    # via cfn-lint
+jsonpickle==1.5.0
+    # via aws-xray-sdk
+jsonpointer==2.0
+    # via jsonpatch
+jsonschema==3.2.0
+    # via
+    #   aws-sam-translator
+    #   cfn-lint
+junit-xml==1.9
+    # via cfn-lint
-MarkupSafe==1.1.1
+markupsafe==1.1.1
+    # via
+    #   jinja2
+    #   moto
+mock==4.0.3
+    # via moto
+more-itertools==8.6.0
+    # via moto
+moto==1.3.16
+    # via -r requirements_dev.in
+mypy-extensions==0.4.3
+    # via black
-networkx==2.1
+networkx==2.5
+    # via cfn-lint
-packaging==16.8
+packaging==20.8
+    # via pytest
-pathspec==0.8.0
+pathspec==0.8.1
+    # via black
+pluggy==0.13.1
+    # via pytest
+py==1.10.0
+    # via pytest
 pyasn1==0.4.8
+    # via
+    #   python-jose
+    #   rsa
+pycparser==2.20
+    # via cffi
 pyparsing==2.4.7
+    # via packaging
+pyrsistent==0.17.3
+    # via jsonschema
+pytest-cov==2.11.1
+    # via -r requirements_dev.in
+pytest==6.2.1
+    # via
+    #   -r requirements_dev.in
+    #   pytest-cov
 python-dateutil==2.8.1
+    # via
+    #   botocore
+    #   moto
+python-jose[cryptography]==3.2.0
+    # via moto
+pytz==2020.5
+    # via moto
-PyYAML==5.3.1
+pyyaml==5.4.1
+    # via
+    #   cfn-lint
+    #   moto
-regex==2020.7.14
+regex==2020.11.13
+    # via black
+requests==2.25.1
+    # via
+    #   docker
+    #   moto
+    #   responses
+responses==0.12.1
+    # via moto
-rsa==4.5
+rsa==4.7
+    # via python-jose
-s3transfer==0.3.3
+s3transfer==0.3.4
+    # via boto3
 six==1.15.0
+    # via
+    #   aws-sam-translator
+    #   cfn-lint
+    #   cryptography
+    #   docker
+    #   ecdsa
+    #   jsonschema
+    #   junit-xml
+    #   moto
+    #   python-dateutil
+    #   python-jose
+    #   responses
+    #   websocket-client
+sshpubkeys==3.1.0
+    # via moto
-toml==0.10.1
+toml==0.10.2
+    # via
+    #   black
+    #   pytest
-typed-ast==1.4.1
+typed-ast==1.4.2
+    # via black
-typing==3.7.4.3
+typing-extensions==3.7.4.3
+    # via black
-urllib3==1.25.10
+urllib3==1.26.2
+    # via
+    #   botocore
+    #   requests
+    #   responses
+websocket-client==0.57.0
+    # via docker
+werkzeug==1.0.1
+    # via moto
+wrapt==1.12.1
+    # via aws-xray-sdk
+xmltodict==0.12.0
+    # via moto
+zipp==3.4.0
+    # via moto
+
+# The following packages are considered to be unsafe in a requirements file:
+# setuptools

script/bootstrap

@@ -1,15 +1,23 @@
-VENV=lambda-boilerplate.venv
+if [ "${CI:-0}" -eq 1 ]; then
+    {
+        python -m pip install -U pip pip-tools &&
+        pip install -r requirements_dev.txt
+    }
+else
+    VENV=lambda-boilerplate.venv
 
-#################################################################
-# Bootstrapping sets up the Python 3.8 venv that allows the use #
-# of the invoke commands.                                       #
-#################################################################
+    #################################################################
+    # Bootstrapping sets up the Python 3.8 venv that allows the use #
+    # of the invoke commands.                                       #
+    #################################################################
 
-{
-    pyenv virtualenv-delete -f $VENV
-    pyenv virtualenv $VENV &&
-    pyenv activate $VENV &&
-    python -m pip install -U pip &&
-    pip install -r requirements_dev.txt &&
-    echo "✨ Good to go! ✨"
-}
+    {
+        pyenv virtualenv-delete -f $VENV
+        pyenv virtualenv $VENV &&
+        pyenv activate $VENV &&
+        python -m pip install -U pip pip-tools &&
+        pip install -r requirements_dev.txt &&
+        echo "✨ Good to go! ✨"
+    }
+fi

tasks.py

@@ -3,42 +3,101 @@ import boto3
 import os
 from pathlib import Path
 import hashlib
+import re
+from typing import List, Dict
+
+BASE_PATH = str(Path(__file__).parent.absolute())
+VARIABLES_PATH = "../variables.tfvars"
+
+HELP_SEGMENTS = {
+    "project": "Project name, either app or bootstrap",
+    "archive": "Archive file",
+    "function_name": "Name of the Lambda to invoke locally (as defined in the Cloudformation template)",
+    "payload": "JSON payload to include in the trigger event",
+    "fix": "Whether to fix errors",
+    "env": "Environment (dev or prod)",
+    "package": "Target package (incl. range)",
+}
+
+
+def _compose_path(path: str) -> str:
+    return str(Path(BASE_PATH, path).absolute())
+
+
+def _build_help_dict(segments: List[str]) -> Dict[str, str]:
+    return {segment: HELP_SEGMENTS[segment] for segment in segments}
+
+
+PROJECT_PATHS = {
+    "app": _compose_path("infrastructure/app"),
+    "bootstrap": _compose_path("infrastructure/bootstrap"),
+}
+
 #####################
-# Stack invocations #
+# Cloud invocations #
 #####################
 
-@task(name="teardown-app")
-def teardown_app(ctx):
-    with ctx.cd("infrastructure"):
-        ctx.run("sceptre delete app/app.yaml -y")
+@task(name="plan", help=_build_help_dict(["project"]))
+def cloud_plan(ctx, project):
+    """
+    Builds the Terraform plan for the given project.
+    """
+    with ctx.cd(PROJECT_PATHS[project]):
+        ctx.run(f"terraform plan --var-file {VARIABLES_PATH}")
 
-@task(name="teardown-bootstrap")
-def teardown_bootstrap(ctx):
-    with ctx.cd("infrastructure"):
-        ctx.run("sceptre delete bootstrap/bootstrap.yaml -y")
+@task(name="apply", help=_build_help_dict(["project"]))
+def cloud_apply(ctx, project):
+    """
+    Applies infrastructure changes to the given project.
+    """
+    with ctx.cd(PROJECT_PATHS[project]):
+        # Taint the function and its permission so each apply redeploys
+        # the latest archive.
+        ctx.run("terraform taint --allow-missing aws_lambda_function.lambda_func")
+        ctx.run("terraform taint --allow-missing aws_lambda_permission.apigw")
+        ctx.run(f"terraform apply --var-file {VARIABLES_PATH}")
 
-@task(name="deploy")
-def stack_deploy(ctx):
-    path = Path(__file__).parent
-    with ctx.cd("infrastructure"):
-        ctx.run("sceptre launch bootstrap/bootstrap.yaml -y")
-
-    ctx.run("zip lambda_function.zip src/*")
-
-    with open("lambda_function.zip", "rb") as src:
-        srchash = hashlib.md5(src.read()).hexdigest()
-
-    new_archive_name = f"lambda_function_{srchash}.zip"
-    ctx.run(f"mv lambda_function.zip {new_archive_name}")
-    ctx.run(
-        f"aws s3 cp {new_archive_name} s3://mcat-dev-test-bucket-artifacts-2 && rm {new_archive_name}"
-    )
-
-    with ctx.cd("infrastructure"):
-        ctx.run(f"sceptre --var source_key={new_archive_name} launch app/app.yaml -y")
+@task(name="destroy", help=_build_help_dict(["project"]))
+def cloud_destroy(ctx, project):
+    """
+    Destroys resources associated with the given project.
+    """
+    with ctx.cd(PROJECT_PATHS[project]):
+        ctx.run(f"terraform destroy --var-file {VARIABLES_PATH}")
+
+@task(name="pack")
+def cloud_pack(ctx):
+    """
+    Prepares and packages the source code for lambdas.
+    """
+    with ctx.cd(BASE_PATH):
+        ctx.run("pip install -r requirements.txt --target package/")
+        ctx.run("zip -r lambda_function.zip src/*")
+
+    with ctx.cd(_compose_path("package")):
+        ctx.run("zip -r ../lambda_function.zip ./")
+
+@task(name="push", help=_build_help_dict(["archive"]))
+def cloud_push(ctx, archive):
+    """
+    Pushes the given archive to S3.
+    """
+    artifacts_bucket = None
+    with ctx.cd(PROJECT_PATHS["bootstrap"]):
+        out = ctx.run("terraform output", hide="out").stdout
+        artifacts_bucket_match = re.match(
+            r"artifacts_bucket_name = (?P<bucket_name>[0-9a-zA-Z\-]+)\n", out
+        )
+        artifacts_bucket = artifacts_bucket_match.group("bucket_name")
+
+    with ctx.cd(BASE_PATH):
+        ctx.run(f"aws s3 cp {archive} s3://{artifacts_bucket}", hide="out")
+    print(f"Uploaded {archive} to s3 ({artifacts_bucket})!")
 
 #####################
@@ -47,43 +106,109 @@ def stack_deploy(ctx):
 @task(name="start")
-def app_start(ctx):
+def local_start(ctx):
+    """
+    Starts your stack locally.
+    """
     ctx.run("docker-compose up -d --build")
 
 @task(name="stop")
-def app_stop(ctx):
+def local_stop(ctx):
+    """
+    Stops your local stack.
+    """
     ctx.run("docker-compose down")
 
 @task(
-    name="invoke-function",
-    help={
-        "function_name": "Name of the Lambda to invoke locally (as defined in the Cloudformation template)",
-        "payload": "JSON payload to include in the trigger event",
-    },
+    name="invoke",
+    help=_build_help_dict(["function_name", "payload"]),
 )
-def app_invoke_function(ctx, function_name, payload):
+def local_invoke(ctx, function_name, payload):
+    """
+    Triggers the local lambda with the given payload.
+    """
     ctx.run(
         f"aws lambda invoke --endpoint http://localhost:9001 --no-sign-request --function-name {function_name} --log-type Tail --payload {payload} {function_name}_out.json"
     )
 
+#####################
+# Other invocations #
+#####################
+
+@task(name="lock")
+def lock_requirements(ctx):
+    """
+    Builds the pip lockfiles.
+    """
+    with ctx.cd(BASE_PATH):
+        ctx.run("python -m piptools compile requirements.in", hide="both")
+        ctx.run(
+            "python -m piptools compile requirements_dev.in --output-file requirements_dev.txt",
+            hide="both",
+        )
+
+@task(name="update", help=_build_help_dict(["env", "package"]))
+def update_requirements(ctx, env, package):
+    """
+    Updates a package and regenerates the lockfiles.
+    """
+    deps = None
+    if env == "prod":
+        deps = "requirements.in"
+    elif env == "dev":
+        deps = "requirements_dev.in"
+    else:
+        raise ValueError("Invalid env")
+
+    with ctx.cd(BASE_PATH):
+        ctx.run(f"python -m piptools compile {deps} --upgrade-package {package}")
+
+@task(name="lint", help=_build_help_dict(["fix"]))
+def lint(ctx, fix=False):
+    """
+    Lints the codebase.
+    """
+    with ctx.cd(BASE_PATH):
+        ctx.run("black *.py **/*.py" + (" --check" if not fix else ""))
+
+@task(name="test")
+def test(ctx):
+    """
+    Runs tests.
+    """
+    with ctx.cd(BASE_PATH):
+        ctx.run("pytest --cov=src")
+
 ns = Collection()
 
-# Stack invocations manage the Cloudformation flows
-stack = Collection("stack")
-stack.add_task(stack_deploy)
-stack.add_task(teardown_app)
-stack.add_task(teardown_bootstrap)
+local = Collection("local")
+local.add_task(local_start)
+local.add_task(local_stop)
+local.add_task(local_invoke)
 
-# App invocations manage local containers
-app = Collection("app")
-app.add_task(app_start)
-app.add_task(app_stop)
-app.add_task(app_invoke_function)
+cloud = Collection("cloud")
+cloud.add_task(cloud_plan)
+cloud.add_task(cloud_apply)
+cloud.add_task(cloud_destroy)
+cloud.add_task(cloud_pack)
+cloud.add_task(cloud_push)
 
-ns.add_collection(stack)
-ns.add_collection(app)
+project = Collection("requirements")
+project.add_task(lock_requirements)
+project.add_task(update_requirements)
+
+ns.add_collection(local)
+ns.add_collection(cloud)
+ns.add_collection(project)
+
+ns.add_task(lint)
+ns.add_task(test)
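As a closing usage note, `local.invoke` shells out to the AWS CLI, but the same local endpoint can be driven from Python, which is handy in tests. A sketch (assumes the docker-compose service is up on port 9001; the function name below is hypothetical and must match whatever the local container registers):

import json

import boto3

# Point the Lambda client at the local container instead of AWS; the
# local endpoint does not validate credentials, so dummies are fine.
client = boto3.client(
    "lambda",
    endpoint_url="http://localhost:9001",
    region_name="us-east-1",
    aws_access_key_id="local",
    aws_secret_access_key="local",
)

result = client.invoke(
    FunctionName="boilerplate_function",  # hypothetical: match your local setup
    Payload=json.dumps({"ping": "pong"}),
)
print(json.loads(result["Payload"].read()))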