Deploy from mono repo #31

Open: wants to merge 41 commits into base: main

41 commits:
b46fbcc - tf changes (smohiudd, Jun 2, 2023)
05b591b - lambda db init (smohiudd, Jun 5, 2023)
b7eb968 - tf resource renaming (smohiudd, Jun 5, 2023)
7c0f68e - minor changes (smohiudd, Jun 5, 2023)
c3ed84f - minor tf changes (smohiudd, Jun 5, 2023)
5cb6d71 - Merge pull request #1 from NASA-IMPACT/feature/tf-changes (smohiudd, Jun 6, 2023)
956c2ed - Add cicd (slesaad, Jun 6, 2023)
19b4d31 - Assign env to update-workflows (slesaad, Jun 6, 2023)
648dd82 - Rename the action ref (slesaad, Jun 6, 2023)
4d504a7 - Move action to correct location (slesaad, Jun 6, 2023)
b535117 - Add `shell` (slesaad, Jun 6, 2023)
7bed530 - Permit script to run (slesaad, Jun 6, 2023)
29cf318 - Add expression parenthesis (slesaad, Jun 6, 2023)
e3a3f8b - Update env var to use $ (slesaad, Jun 6, 2023)
e1adb8e - Revert (slesaad, Jun 6, 2023)
f5aa036 - tf env variable, remove dns (smohiudd, Jun 9, 2023)
dbd79e9 - Merge pull request #3 from NASA-IMPACT/fix/tf-env-variables (smohiudd, Jun 9, 2023)
3f63d5b - Merge pull request #2 from NASA-IMPACT/update-workflows (smohiudd, Jun 9, 2023)
580ab96 - change tf vars (smohiudd, Jun 9, 2023)
12abbdf - update env sync and github action (smohiudd, Jun 9, 2023)
f36372d - update actions vars (smohiudd, Jun 9, 2023)
6cbe1a8 - remove python dependencies from actions (smohiudd, Jun 9, 2023)
466c8ca - change chmod for script (smohiudd, Jun 9, 2023)
53f9b1e - add bash (smohiudd, Jun 9, 2023)
a4edbcd - update actions tf version (smohiudd, Jun 9, 2023)
69c0117 - remove python from actions (smohiudd, Jun 9, 2023)
8e6c1e4 - tf changes (smohiudd, Jun 9, 2023)
712750a - ecr user changes (smohiudd, Jun 9, 2023)
8a14201 - ecr user change (smohiudd, Jun 9, 2023)
f717acf - permission boundary (smohiudd, Jun 9, 2023)
0e5a217 - permission boundary fix (smohiudd, Jun 9, 2023)
2334c73 - various tf changes (smohiudd, Jun 12, 2023)
82fdade - load balance port 80 (smohiudd, Jun 12, 2023)
48e7401 - fast api http (smohiudd, Jun 12, 2023)
ceeabe8 - lb ingress 443 rule (smohiudd, Jun 12, 2023)
b1bffc6 - add example env file (smohiudd, Jun 12, 2023)
21ab473 - Deplot feature api (amarouane-ABDELHAK, Jul 26, 2023)
467cadd - Deploy features api from mono-repo (amarouane-ABDELHAK, Oct 31, 2023)
5704752 - Deploy features api from mono-repo (amarouane-ABDELHAK, Oct 31, 2023)
0f5f355 - Add permission boundaries (amarouane-ABDELHAK, Oct 31, 2023)
4f943b1 - Add permission boundaries (amarouane-ABDELHAK, Oct 31, 2023)
7 changes: 7 additions & 0 deletions .example.env
@@ -0,0 +1,7 @@
ENV=dev
AWS_REGION=us-west-2
STATE_BUCKET_NAME=ghgc-features-tf-state-bucket
PROJECT_NAME=ghgc-features-api
REGISTRY_NAME=ghgc-features-api-registry
SERVICE_PORT=8080
VPC_ID=
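The `.example.env` above is a template: copied to `.env`, filled in, and then loaded by the deploy tooling. One common way to load such a file (a sketch using standard POSIX `set -a` auto-export, not something this PR prescribes; `/tmp/demo.env` is a stand-in path):

```shell
# Create a stand-in env file with two of the variables from .example.env.
cat > /tmp/demo.env <<'EOF'
ENV=dev
SERVICE_PORT=8080
EOF

set -a            # auto-export every variable assigned from here on
. /tmp/demo.env   # source the file; ENV and SERVICE_PORT become exported
set +a

echo "$ENV:$SERVICE_PORT"   # -> dev:8080
```

With the variables exported, child processes such as `terraform` can read them directly.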
59 changes: 59 additions & 0 deletions .github/actions/terraform-deploy/action.yml
@@ -0,0 +1,59 @@
name: Deploy
inputs:
  env_aws_secret_name:
    required: true
    type: string
  env-file:
    type: string
    default: ".env"
  dir:
    required: false
    type: string
    default: "."
  script_path:
    type: string

runs:
  using: "composite"

  steps:
    - name: Set up Python
      uses: actions/setup-python@v4
      with:
        python-version: "3.10"
        cache: "pip"

    - name: Install python dependencies
      shell: bash
      working-directory: ${{ inputs.dir }}
      run: |
        python -m pip install --upgrade pip
        python -m pip install boto3

    - name: Get relevant environment configuration from aws secrets
      shell: bash
      working-directory: ${{ inputs.dir }}
      run: |
        if [[ -z "${{ inputs.script_path }}" ]]; then
          ./scripts/sync-env.sh ${{ inputs.env_aws_secret_name }}
        else
          python ${{ inputs.script_path }} --secret-id ${{ inputs.env_aws_secret_name }}
          source ${{ inputs.env-file }}
          echo "PREFIX=feature-api-${STAGE}" >> ${{ inputs.env-file }}
          echo "REGISTRY_NAME=feature-api-${STAGE}" >> ${{ inputs.env-file }}
          echo "ENV=${STAGE}" >> ${{ inputs.env-file }}
          echo "PROJECT_NAME=veda-${STAGE}" >> ${{ inputs.env-file }}
        fi

    - name: Setup Terraform
      uses: hashicorp/setup-terraform@v2
      with:
        terraform_version: 1.3.9

    - name: Deploy
      shell: bash
      working-directory: ${{ inputs.dir }}
      run: |
        bash ./scripts/deploy.sh .env <<< init
        bash ./scripts/deploy.sh .env <<< deploy
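The Deploy step feeds `init` and `deploy` to `scripts/deploy.sh` via bash here-strings (`<<<`), which suggests the script reads its subcommand from stdin rather than from an argument. `deploy.sh` itself is not part of this diff; the stand-in below only illustrates the here-string mechanism under that assumption:

```shell
# Stand-in for scripts/deploy.sh (hypothetical): read one word from stdin.
cat > /tmp/deploy-demo.sh <<'EOF'
read -r command            # the here-string arrives on stdin
echo "running: $command"
EOF

bash /tmp/deploy-demo.sh <<< init     # prints "running: init"
bash /tmp/deploy-demo.sh <<< deploy   # prints "running: deploy"
```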

60 changes: 60 additions & 0 deletions .github/workflows/cicd.yml
@@ -0,0 +1,60 @@
name: CICD 🚀

permissions:
  id-token: write
  contents: read

on:
  push:
    branches:
      - main
      - dev
      - production
      - update-workflows

jobs:
  define-environment:
    name: Set ✨ environment ✨
    runs-on: ubuntu-latest
    steps:
      - name: Set the environment based on the branch
        id: define_environment
        run: |
          if [ "${{ github.ref }}" = "refs/heads/main" ]; then
            echo "env_name=staging" >> $GITHUB_OUTPUT
          elif [ "${{ github.ref }}" = "refs/heads/dev" ]; then
            echo "env_name=development" >> $GITHUB_OUTPUT
          elif [ "${{ github.ref }}" = "refs/heads/production" ]; then
            echo "env_name=production" >> $GITHUB_OUTPUT
          elif [ "${{ github.ref }}" = "refs/heads/update-workflows" ]; then
            echo "env_name=development" >> $GITHUB_OUTPUT
          fi
      - name: Print the environment
        run: echo "The environment is ${{ steps.define_environment.outputs.env_name }}"

    outputs:
      env_name: ${{ steps.define_environment.outputs.env_name }}

  deploy:
    name: Deploy to ${{ needs.define-environment.outputs.env_name }} 🚀
    runs-on: ubuntu-latest
    needs: [define-environment]
    if: ${{ needs.define-environment.outputs.env_name }}
    environment: ${{ needs.define-environment.outputs.env_name }}
    concurrency: ${{ needs.define-environment.outputs.env_name }}

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ secrets.DEPLOYMENT_ROLE_ARN }}
          role-session-name: "ghgc-features-api-github-${{ needs.define-environment.outputs.env_name }}-deployment"
          aws-region: "us-west-2"

      - name: Run deployment
        uses: "./.github/actions/terraform-deploy"
        with:
          env_aws_secret_name: ${{ secrets.ENV_AWS_SECRET_NAME }}
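The `define-environment` job maps the pushed branch to a deployment environment (main → staging, dev and update-workflows → development, production → production). That mapping can be checked locally without a runner; the sketch below rewrites the workflow's if/elif chain as a shell function (`map_env` is a hypothetical name, and the argument stands in for `${{ github.ref }}`):

```shell
# Branch-to-environment mapping, as in the define-environment job.
map_env() {
  case "$1" in
    refs/heads/main)             echo staging ;;
    refs/heads/dev)              echo development ;;
    refs/heads/production)       echo production ;;
    refs/heads/update-workflows) echo development ;;
    *)                           echo "" ;;   # unmatched refs skip deploy
  esac
}

map_env refs/heads/main   # -> staging
map_env refs/heads/dev    # -> development
```

An empty result matters downstream: the `deploy` job's `if:` condition is the environment name itself, so pushes to unmapped branches produce no deployment.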
4 changes: 2 additions & 2 deletions README.md
@@ -1,6 +1,6 @@
-# VEDA Features API
+# GHGC Features API

-Hosting and serving collections of vector data features for VEDA
+Hosting and serving collections of vector data features for GHGC

---

13 changes: 13 additions & 0 deletions db/Dockerfile
@@ -0,0 +1,13 @@
FROM --platform=linux/amd64 public.ecr.aws/lambda/python:3.9

# WORKDIR /tmp

RUN pip install boto3 requests "urllib3<2" psycopg["binary"] -t "${LAMBDA_TASK_ROOT}"

COPY ./handler.py ${LAMBDA_TASK_ROOT}

# https://stackoverflow.com/a/61746719
# Turns out, asyncio is part of python
# RUN rm -rf /asset/asyncio*

CMD ["handler.handler"]
200 changes: 200 additions & 0 deletions db/handler.py
@@ -0,0 +1,200 @@
"""Bootstrap Postgres db."""

import asyncio
import json
import os

import boto3
import psycopg

# import requests
from psycopg import sql
from psycopg.conninfo import make_conninfo

# def send(
# event,
# context,
# responseStatus,
# responseData,
# physicalResourceId=None,
# noEcho=False,
# ):
# """
# Copyright 2016 Amazon Web Services, Inc. or its affiliates. All Rights Reserved.
# This file is licensed to you under the AWS Customer Agreement (the "License").
# You may not use this file except in compliance with the License.
# A copy of the License is located at http://aws.amazon.com/agreement/ .
# This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, express or implied.
# See the License for the specific language governing permissions and limitations under the License.

# Send response from AWS Lambda.

# Note: The cfnresponse module is available only when you use the ZipFile property to write your source code.
# It isn't available for source code that's stored in Amazon S3 buckets.
# For code in buckets, you must write your own functions to send responses.
# """
# responseUrl = event["ResponseURL"]

# print(responseUrl)

# responseBody = {}
# responseBody["Status"] = responseStatus
# responseBody["Reason"] = (
# "See the details in CloudWatch Log Stream: " + context.log_stream_name
# )
# responseBody["PhysicalResourceId"] = physicalResourceId or context.log_stream_name
# responseBody["StackId"] = event["StackId"]
# responseBody["RequestId"] = event["RequestId"]
# responseBody["LogicalResourceId"] = event["LogicalResourceId"]
# responseBody["NoEcho"] = noEcho
# responseBody["Data"] = responseData

# json_responseBody = json.dumps(responseBody)

# print("Response body:\n" + json_responseBody)

# headers = {"content-type": "", "content-length": str(len(json_responseBody))}

# try:
# response = requests.put(responseUrl, data=json_responseBody, headers=headers)
# print("Status code: " + response.reason)
# except Exception as e:
# print("send(..) failed executing requests.put(..): " + str(e))


def get_secret(secret_name):
    """Get Secrets from secret manager."""
    print(f"Fetching {secret_name}...")
    client = boto3.client(service_name="secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


def create_db(cursor, db_name: str) -> None:
    """Create DB."""
    cursor.execute(
        sql.SQL("SELECT 1 FROM pg_catalog.pg_database WHERE datname = %s"),
        [db_name],
    )
    if cursor.fetchone():
        print(f"database {db_name} exists, not creating DB")
    else:
        print(f"database {db_name} not found, creating...")
        cursor.execute(
            sql.SQL("CREATE DATABASE {db_name}").format(db_name=sql.Identifier(db_name))
        )


def create_user(cursor, username: str, password: str) -> None:
    """Create User."""
    cursor.execute(
        sql.SQL(
            "DO $$ "
            "BEGIN "
            "  IF NOT EXISTS ( "
            "    SELECT 1 FROM pg_roles "
            "    WHERE rolname = {user}) "
            "  THEN "
            "    CREATE USER {username} "
            "    WITH PASSWORD {password}; "
            "  ELSE "
            "    ALTER USER {username} "
            "    WITH PASSWORD {password}; "
            "  END IF; "
            "END "
            "$$; "
        ).format(
            username=sql.Identifier(username),
            # The password and the rolname comparison are values, not identifiers,
            # so they must be wrapped as literals; psycopg's SQL.format rejects
            # bare Python strings.
            password=sql.Literal(password),
            user=sql.Literal(username),
        )
    )


def create_permissions(cursor, db_name: str, username: str) -> None:
    """Add permissions."""
    cursor.execute(
        sql.SQL(
            "GRANT CONNECT ON DATABASE {db_name} TO {username};"
            "GRANT CREATE ON DATABASE {db_name} TO {username};"  # Allow schema creation
            "GRANT USAGE ON SCHEMA public TO {username};"
            "ALTER DEFAULT PRIVILEGES IN SCHEMA public "
            "GRANT ALL PRIVILEGES ON TABLES TO {username};"
            "ALTER DEFAULT PRIVILEGES IN SCHEMA public "
            "GRANT ALL PRIVILEGES ON SEQUENCES TO {username};"
        ).format(
            db_name=sql.Identifier(db_name),
            username=sql.Identifier(username),
        )
    )


def register_extensions(cursor) -> None:
    """Add PostGIS extension."""
    cursor.execute(sql.SQL("CREATE EXTENSION IF NOT EXISTS postgis;"))


def handler(event, context):
    """Lambda Handler."""
    print(f"Event: {event}")

    if event["tf"]["action"] not in ["create", "update"]:
        print("failed")
        return 0
        # return send(event, context, "SUCCESS", {"msg": "No action to be taken"})

    try:
        connection_params = get_secret(os.environ["CONN_SECRET_ARN"])
        user_params = event["user_params"]
        print("Connecting to DB...")
        con_str = make_conninfo(
            dbname=connection_params.get("dbname", "postgres"),
            user=connection_params["username"],
            password=connection_params["password"],
            host=connection_params["host"],
            port=connection_params["port"],
        )
        with psycopg.connect(con_str, autocommit=True) as conn:
            with conn.cursor() as cur:
                print("Creating database...")
                create_db(cursor=cur, db_name=user_params["dbname"])

                print("Creating user...")
                create_user(
                    cursor=cur,
                    username=user_params["username"],
                    password=user_params["password"],
                )

                print("Setting permissions...")
                create_permissions(
                    cursor=cur,
                    db_name=user_params["dbname"],
                    username=user_params["username"],
                )

        # Install extensions on the user DB with superuser permissions, since
        # they will otherwise fail to install when run as the non-superuser
        # within the pgstac migrations.
        print("Connecting to DB...")
        con_str = make_conninfo(
            dbname=user_params["dbname"],
            user=connection_params["username"],
            password=connection_params["password"],
            host=connection_params["host"],
            port=connection_params["port"],
        )
        with psycopg.connect(con_str, autocommit=True) as conn:
            with conn.cursor() as cur:
                print("Registering PostGIS ...")
                register_extensions(cursor=cur)

    except Exception as e:
        print(e)
        # Stringify the exception so the returned payload is JSON-serializable.
        return {"message": str(e)}

    print("Event Complete")
    return {"message": connection_params}
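The handler only bootstraps the database for Terraform `create`/`update` actions; any other action short-circuits. A minimal sketch of that guard, with an event shaped like the one `handler.py` expects (`should_bootstrap` is a hypothetical helper name, not part of the PR):

```python
def should_bootstrap(event: dict) -> bool:
    """Return True only for Terraform create/update events."""
    return event.get("tf", {}).get("action") in ("create", "update")


# Events mirror the {"tf": {"action": ...}, "user_params": {...}} shape
# that handler() reads.
print(should_bootstrap({"tf": {"action": "create"}}))  # True
print(should_bootstrap({"tf": {"action": "delete"}}))  # False
```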
2 changes: 2 additions & 0 deletions docs/IACHOWTO.md
@@ -22,6 +22,8 @@ $ tfenv use 1.3.9
5. we also use Terraform "workspaces" so our infra state stays nicely separated in the same S3 bucket. Some quick samples of how to interact with that:

```bash
$ AWS_PROFILE=<account> terraform workspace new west2-staging

$ AWS_PROFILE=<account> terraform workspace list
* default
west2-staging
25 changes: 25 additions & 0 deletions scripts/build.sh
@@ -0,0 +1,25 @@
#!/bin/sh
export TARGET_ENVIRONMENT=dev
export TARGET_PROJECT_NAME=ghgc-features-api

cd wfs3-app/

aws ecr describe-repositories \
| jq '.repositories | map(.repositoryUri)' \
| grep $TARGET_PROJECT_NAME | grep $TARGET_ENVIRONMENT \
| sed -E 's/"|,//g' \
| xargs -I {} bash -c "aws ecr get-login-password | docker login --username AWS --password-stdin {}"

aws ecr describe-repositories \
| jq '.repositories | map(.repositoryUri)' \
| grep $TARGET_PROJECT_NAME | grep $TARGET_ENVIRONMENT \
| sed -E 's/"|,//g' \
| xargs -I {} docker build -t {}:latest ../wfs3-app/

aws ecr describe-repositories \
| jq '.repositories | map(.repositoryUri)' \
| grep $TARGET_PROJECT_NAME | grep $TARGET_ENVIRONMENT \
| sed -E 's/"|,//g' \
| xargs -I {} docker images --format "{{json . }}" {} \
| grep '"Tag":"latest"' \
| jq '"\(.Repository):\(.Tag)"' \
| xargs -I{} docker push {}
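Each pipeline in `build.sh` repeats the same cleanup step because `jq '.repositories | map(.repositoryUri)'` emits a JSON array, so every matched line still carries surrounding quotes and a trailing comma; `sed -E 's/"|,//g'` strips both before the URI reaches docker. A quick check of just that step (the registry URI below is a made-up example):

```shell
printf '%s\n' '"123456789012.dkr.ecr.us-west-2.amazonaws.com/demo",' \
  | sed -E 's/"|,//g'
# -> 123456789012.dkr.ecr.us-west-2.amazonaws.com/demo
```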