Merge pull request #17 from fecgov/feature/40-test-cloud-gov
Deploy to cloud.gov
lbeaufort authored Feb 7, 2022
2 parents fee05bf + 423949e commit 735c198
Showing 25 changed files with 436 additions and 272 deletions.
5 changes: 5 additions & 0 deletions .cfignore
@@ -0,0 +1,5 @@
# Database creation script
db

# git secrets config
install-git-secrets-hook.sh
38 changes: 23 additions & 15 deletions .circleci/README.md
@@ -1,19 +1,29 @@
# CircleCI Configuration
## Environment Variables
When configuring CircleCI, you will need to set environment varaialbes the database
When configuring CircleCI, you will need to set environment variables for the database
configuration as follows:
```
FECFILE_DB_HOST=localhost
FECFILE_DB_USERNAME=postgres
FECFILE_DB_PASSWORD=postgres
FECFILE_DB_NAME=postgres
DATABASE_URL="postgres://postgres:[email protected]/postgres"
FECFILE_TEST_DB_NAME="postgres"
FECFILE_FEC_WEBSITE_API_KEY=
```
Notes:
* There is no default FECFILE_FEC_WEBSITE_API_KEY; you must obtain and set this value yourself
* The FECFILE_DB_HOST value here is different from the one you need for your local docker-compose configuration.

CircleCI will attempt to deploy commits made to specific branches:
* branch __develop__ -> cloud.gov dev space
* branch __release__* (any branch starting with release) -> cloud.gov staging space
* branch __prod__ -> cloud.gov prod space

Authentication must be configured in a set of environment variables (see the sketch after this list):
* $FEC_CF_USERNAME_DEV
* $FEC_CF_PASSWORD_DEV
* $FEC_CF_USERNAME_STAGE
* $FEC_CF_PASSWORD_STAGE
* $FEC_CF_USERNAME_PROD
* $FEC_CF_PASSWORD_PROD
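
The deploy job added to `.circleci/config.yml` in this PR drives the push with `invoke deploy --branch $CIRCLE_BRANCH --login`. The repository's actual invoke task is not part of this diff, so the following is only a minimal sketch of how such a task might map a branch to a cloud.gov space and to the credential variables above; the space names, org variable, and helper names are assumptions.

```
# Hypothetical sketch of an invoke "deploy" task (the real tasks.py is not shown
# in this commit). It maps the current branch to a cloud.gov space, reads the
# matching FEC_CF_* credentials, and shells out to the Cloud Foundry CLI.
import os

from invoke import task


def _space_for(branch):
    """Return the target space for a branch, or None if it should not deploy."""
    if branch == "develop":
        return "dev"
    if branch.startswith("release"):
        return "stage"
    if branch == "prod":
        return "prod"
    return None


@task
def deploy(ctx, branch, login=False):
    space = _space_for(branch)
    if space is None:
        print(f"Branch {branch} is not configured for deployment; skipping.")
        return
    if login:
        user = os.environ[f"FEC_CF_USERNAME_{space.upper()}"]      # e.g. FEC_CF_USERNAME_DEV
        password = os.environ[f"FEC_CF_PASSWORD_{space.upper()}"]  # e.g. FEC_CF_PASSWORD_DEV
        org = os.environ.get("FEC_CF_ORG", "fec-example-org")      # hypothetical org name
        ctx.run(
            f"cf login -a https://api.fr.cloud.gov -u {user} -p {password} -o {org} -s {space}",
            hide=True,  # keep credentials out of the build log
        )
    ctx.run("cf push")
```

In CircleCI itself the credentials never appear in `config.yml`; they come from the project environment variables listed above.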

# Using CircleCI local CLI

## Install circleci local
Install on Linux or Mac with:
@@ -30,23 +40,21 @@ circleci config validate
```

## Run the CircleCI Job locally
You can run a CircleCI job locally and avoid the change/commit/wait loop you would otherwise need to actually run the changes on CircleCI.
This can save a lot of time when trying to debug an issue in CI.
```
circleci local execute --job JOB_NAME
```

## Necessary Environment Variables
The Django backend expects to find the database login info in the environment.
To run in the local CircleCI for the django unit tests (for example), use the following:

```
circleci local execute -e FECFILE_DB_HOST=localhost \
-e FECFILE_DB_USERNAME=postgres \
-e FECFILE_DB_PASSWORD=postgres \
-e FECFILE_DB_NAME=postgres \
-e FECFILE_FEC_WEBSITE_API_KEY=${FECFILE_FEC_WEBSITE_API_KEY}\
circleci local execute -e DATABASE_URL=${DATABASE_URL} \
-e FECFILE_FEC_WEBSITE_API_KEY=${FECFILE_FEC_WEBSITE_API_KEY} \
-e FECFILE_TEST_DB_NAME=${FECFILE_TEST_DB_NAME} \
--job unit-test
```

48 changes: 37 additions & 11 deletions .circleci/config.yml
@@ -13,14 +13,15 @@ jobs:

dependency-check:
docker:
- image: cimg/python:3.8
- image: cimg/python:3.7

steps:
- checkout

- python/install-packages:
pkg-manager: pip
pip-dependency-file: django-backend/requirements-test.txt
pip-dependency-file: requirements-test.txt


- run:
name: Run dependency check
@@ -34,7 +35,7 @@ jobs:
awk -v "today=${today}" '{ if ($2 > today || $2 == "") print "-i", $1}' | # print any line after today
xargs echo # put all the output from previous command on one line
)
export command="safety check -r django-backend/requirements.txt --full-report $ignores"
export command="safety check -r requirements.txt --full-report $ignores"
echo "----------------------------------------------------"
echo "If you need to modify the ignore list for the safety"
@@ -68,29 +69,28 @@ jobs:
- run:
name: Create unified requirements so CircleCI can cache them
command: |
cd ~/project/django-backend
cd ~/project/
ls -l
cat requirements.txt > requirements-all.txt
echo >> requirements-all.txt # blank in case new newline at end of requirements.txt
cat requirements-test.txt >> requirements-all.txt
- python/install-packages:
pkg-manager: pip
app-dir: ~/project/django-backend/
app-dir: ~/project/
pip-dependency-file: requirements-all.txt

- run:
name: Wait for the database to be active
command: pip install psycopg2 psycopg2-binary retry && python wait_for_db.py
command: python wait_for_db.py
working_directory: ~/project/django-backend/

- run:
name: Load test database fixture
command: |
sudo apt-get update &&
sudo apt-get install postgresql-client-12 &&
export PGPASSWORD=${FECFILE_DB_PASSWORD} &&
psql -h ${FECFILE_DB_HOST} ${FECFILE_DB_NAME} ${FECFILE_DB_USERNAME} < fec_clean_dev_db_backup-20211227.sql
psql ${DATABASE_URL} < fec_clean_dev_db_backup-20211227.sql
working_directory: ~/project/db

- run:
@@ -127,7 +127,6 @@ jobs:
command: coverage xml -o ~/project/coverage-reports/coverage.xml
working_directory: ~/project/django-backend


# Sonar cloud setup and scanning
- run:
name: Create sonar-scanner cache directory if it doesn't exist
@@ -162,12 +161,39 @@ jobs:
key: v1-sonarcloud-scanner-4.6.2.2472
paths: /tmp/cache/scanner

deploy:

docker:
- image: cimg/python:3.7

steps:
- checkout

- python/install-packages:
pkg-manager: pip
pip-dependency-file: requirements.txt

- run:
name: Installs for deploy
command: |
mkdir -p $HOME/bin
export PATH=$HOME/bin:$PATH
curl -L "https://cli.run.pivotal.io/stable?release=linux64-binary&version=7.1.0" | tar xzv -C $HOME/bin
- deploy:
name: Deploy API
command: |
export PATH=$HOME/bin:$PATH
invoke deploy --branch $CIRCLE_BRANCH --login
# Invoke jobs via workflows
# See: https://circleci.com/docs/2.0/configuration-reference/#workflows
workflows:
test: # This is the name of the workflow, feel free to change it to better match your workflow.
jobs:
- test
- dependency-check


- deploy:
requires:
- test
- dependency-check
3 changes: 3 additions & 0 deletions .gitignore
@@ -67,3 +67,6 @@ testem.log
Thumbs.db
.vscode/settings.json
front-end/.vscode/launch.json

# cloud foundry
.cfmeta
6 changes: 3 additions & 3 deletions django-backend/Dockerfile → Dockerfile
@@ -3,13 +3,13 @@ ENV PYTHONUNBUFFERED=1

RUN mkdir /opt/nxg_fec
WORKDIR /opt/nxg_fec
ADD requirements.txt /opt/nxg_fec/
RUN pip3 install -r requirements.txt
ADD requirements.txt /opt
RUN pip3 install -r /opt/requirements.txt

RUN mv /etc/localtime /etc/localtime.backup && ln -s /usr/share/zoneinfo/EST5EDT /etc/localtime

RUN useradd nxgu --no-create-home --home /opt/nxg_fec && chown -R nxgu:nxgu /opt/nxg_fec
user nxgu

EXPOSE 8080
ENTRYPOINT ["sh", "-c", "pip3 install -r requirements.txt && python wait_for_db.py && gunicorn --bind 0.0.0.0:8080 fecfiler.wsgi -w 10 -t 200 --reload"]
ENTRYPOINT ["sh", "-c", "cp /opt/requirements.txt . && pip3 install -r requirements.txt && python wait_for_db.py && gunicorn --bind 0.0.0.0:8080 fecfiler.wsgi -w 10 -t 200 --reload"]
1 change: 1 addition & 0 deletions Procfile
@@ -0,0 +1 @@
web: bin/run.sh
8 changes: 3 additions & 5 deletions README.md
@@ -30,11 +30,9 @@ When running docker-compose you will need to be in the root directory of the pro
You should set the following environment variables in the shell where you are running 'docker-compose up -d'.
Proper values for the development variables are shown here as an example (a settings sketch follows the block):
```
export FECFILE_DB_HOST=db
export FECFILE_DB_USERNAME=postgres
export FECFILE_DB_PASSWORD=postgres
export FECFILE_DB_NAME=postgres
export FECFILE_URL=localhost
export DATABASE_URL="postgres://postgres:[email protected]/postgres"
export FECFILE_TEST_DB_NAME="postgres"
export DJANGO_SECRET_KEY="If_using_test_db_use_secret_key_in_cloud.gov"
```
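
With everything collapsed into a single `DATABASE_URL`, the Django settings presumably build their `DATABASES` entry from that one value. The project's settings module is not shown in this diff, so the snippet below is only a sketch of the common pattern using the `dj-database-url` helper; the dependency and the option values are assumptions, not something this commit confirms.

```
# Sketch only: deriving Django's DATABASES setting from DATABASE_URL.
import os

import dj_database_url

DATABASE_URL = os.environ["DATABASE_URL"]  # e.g. postgres://postgres:[email protected]/postgres

DATABASES = {
    # parse() expands the URL into ENGINE/NAME/USER/PASSWORD/HOST/PORT
    "default": dj_database_url.parse(DATABASE_URL, conn_max_age=600),
}

# Separate test database name, matching FECFILE_TEST_DB_NAME above
DATABASES["default"]["TEST"] = {"NAME": os.environ.get("FECFILE_TEST_DB_NAME", "postgres")}
```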
### Shut down the containers
`docker-compose down`
8 changes: 8 additions & 0 deletions bin/run.sh
@@ -0,0 +1,8 @@
cd django-backend

# Run migrations
./manage.py makemigrations
./manage.py migrate --noinput

# Run application
python wait_for_db.py && gunicorn --bind 0.0.0.0:8080 fecfiler.wsgi -w 9 -t 200 --reload
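
Both `bin/run.sh` and the Docker ENTRYPOINT call `wait_for_db.py` before starting gunicorn so the app does not race the database. That script's contents are not part of this diff; a minimal sketch of what such a wait loop typically looks like, polling the same `DATABASE_URL`, would be:

```
# Hypothetical stand-in for wait_for_db.py: poll the database until it accepts
# connections (or give up) so migrations and gunicorn never start against a dead DB.
import os
import sys
import time

import psycopg2

DATABASE_URL = os.environ["DATABASE_URL"]
RETRIES = 30        # assumed limit: roughly a minute of waiting
DELAY_SECONDS = 2

for attempt in range(1, RETRIES + 1):
    try:
        psycopg2.connect(DATABASE_URL).close()
        print(f"Database is up (attempt {attempt})")
        sys.exit(0)
    except psycopg2.OperationalError as err:
        print(f"Database not ready (attempt {attempt}/{RETRIES}): {err}")
        time.sleep(DELAY_SECONDS)

sys.exit("Database never became available")
```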
30 changes: 6 additions & 24 deletions django-backend/fecfiler/core/transactions_chk_csv_duplicates.py
@@ -1,4 +1,3 @@
import hashlib
import os
import psycopg2
import pandas as pd
@@ -7,6 +6,7 @@
from pandas.util import hash_pandas_object
import numpy
from psycopg2.extensions import register_adapter, AsIs
from django.conf import settings


def addapt_numpy_float64(numpy_float64):
@@ -20,12 +20,6 @@ def addapt_numpy_int64(numpy_int64):
register_adapter(numpy.float64, addapt_numpy_float64)
register_adapter(numpy.int64, addapt_numpy_int64)


PG_HOST = os.getenv('FECFILE_DB_HOST', 'localhost')
PG_PORT = os.getenv('FECFILE_DB_PORT', 5432)
PG_DATABASE = os.getenv('FECFILE_DB_NAME', 'postgres')
PG_USER = os.getenv('FECFILE_DB_USERNAME', 'postgres')
PG_PASSWORD = os.getenv('FECFILE_DB_PASSWORD', 'postgres')
SQS_QUEUE_NAME = os.getenv('SQS_QUEUE_NAME')


@@ -50,13 +44,7 @@ def check_for_file_hash_in_db(cmteid, filename, hash, fecfilename):
""" insert a transactions_file_details """
selectsql = """SELECT cmte_id, md5, file_name, create_date FROM public.transactions_file_details WHERE cmte_id = %s AND file_name = %s AND md5 = %s AND fec_file_name = %s;"""

conn = psycopg2.connect(
user=PG_USER,
password=PG_PASSWORD,
host=PG_HOST,
port=PG_PORT,
database=PG_DATABASE
)
conn = psycopg2.connect(settings.DATABASE_URL)
cur = conn.cursor()
cur.execute(selectsql, (cmteid, filename, hash, fecfilename))
dbhash = cur.fetchone()
@@ -82,13 +70,7 @@ def load_file_hash_to_db(cmteid, filename, hash, fecfilename):
insertsql = """INSERT INTO transactions_file_details(cmte_id, file_name, md5, fec_file_name)
VALUES(%s, %s, %s, %s);"""

conn = psycopg2.connect(
user=PG_USER,
password=PG_PASSWORD,
host=PG_HOST,
port=PG_PORT,
database=PG_DATABASE
)
conn = psycopg2.connect(settings.DATABASE_URL)
cur = conn.cursor()
cur.execute(insertsql, (cmteid, filename, hash, fecfilename))
conn.commit()
@@ -140,17 +122,17 @@ def file_verify_upload(request):
bucket = AWS_STORAGE_IMPORT_CONTACT_BUCKET_NAME
file_name = request.data.get("fileName")
csv_obj = client.get_object(Bucket=bucket, Key=file_name)
hashlib.sha1(pd.util.hash_pandas_object(df).values).hexdigest()
#filepath = dirname(dirname(os.getcwd()))+"/csv/"
filename = "Disbursements_1q2020.csv"
hash_value = generate_md5_hash(filepath+filename)
fileexists = check_for_file_hash_in_db('C00000018', filename, hash_value)
if fileexists is None:
load_file_hash_to_db('C00000018', filename, hash_value)
print('File loaded successfully!!!')
else:
print('File exists in DB')
return JsonResponse(contacts, status=status.HTTP_201_CREATED, safe=False)
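
The recurring change in this file (and in `transactions_validate_csv.py` below) is that the five separate `PG_*` settings collapse into a single call, which works because `psycopg2.connect()` accepts a full libpq connection URI as its one argument. A standalone illustration with an example URL (in the application the value comes from `settings.DATABASE_URL`):

```
# Illustration only: psycopg2 accepts a connection URI, so one DATABASE_URL
# replaces the separate host/port/database/user/password keyword arguments.
import psycopg2

database_url = "postgres://postgres:[email protected]:5432/postgres"  # example value

conn = psycopg2.connect(database_url)
with conn, conn.cursor() as cur:   # the connection context manager wraps a transaction
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()
```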
38 changes: 6 additions & 32 deletions django-backend/fecfiler/core/transactions_validate_csv.py
@@ -1,35 +1,23 @@
import hashlib
import os
import os.path
from os import path
import psycopg2
import pandas as pd
from pandas_schema import Column, Schema
from pandas_schema.validation import MatchesPatternValidation, InListValidation
from pandas_schema.validation import MatchesPatternValidation
import boto3
from botocore.exceptions import ClientError
from io import StringIO
from pandas.util import hash_pandas_object
import numpy
from psycopg2.extensions import register_adapter, AsIs
import logging
import time
import re
from sqlalchemy import create_engine
from sqlalchemy.types import String
from sqlalchemy.types import NVARCHAR
from sqlalchemy.types import Text
from django.db import connection
from django.conf import settings

# Postgres Database Settings - local
PG_HOST = os.getenv('FECFILE_DB_HOST', 'localhost')
PG_PORT = os.getenv('FECFILE_DB_PORT', '5432')
PG_DATABASE = os.getenv('FECFILE_DB_NAME', 'postgres')
PG_USER = os.getenv('FECFILE_DB_USERNAME', 'postgres')
PG_PASSWORD = os.getenv('FECFILE_DB_PASSWORD', 'postgres')
SQS_QUEUE_NAME = os.getenv('SQS_QUEUE_NAME')


BACKEND_DB_HOST = os.getenv('BACKEND_DB_HOST')
BACKEND_DB_PORT = os.getenv('BACKEND_DB_PORT')
BACKEND_DB_NAME = os.getenv('BACKEND_DB_NAME')
@@ -57,7 +45,7 @@ def save_data_from_excel_to_db(data):
postgreSQLTable = 'ref_forms_scheds_format_specs'
try:
# "postgres://PG_USER:PG_PASSWORD@PG_HOST:PG_PORT/PG_DATABASE"
connectionstring = "postgres://" + PG_USER + ":" + PG_PASSWORD + "@" + PG_HOST + ":" + PG_PORT + "/" + PG_DATABASE
connectionstring = settings.DATABASE_URL
engine = create_engine(connectionstring, pool_recycle=3600)
postgreSQLConnection = engine.connect()
data.to_sql(postgreSQLTable, postgreSQLConnection, if_exists='append', index=False, dtype={'AUTO-GENERATE': Text})
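
One caveat for this particular call: SQLAlchemy 1.4 and later no longer accept the `postgres://` scheme used in the example `DATABASE_URL` values in this PR, so depending on the version pinned in `requirements.txt` the URL may need a small rewrite before it reaches `create_engine()`. A hedged sketch:

```
# Sketch: normalize the scheme for newer SQLAlchemy releases, which only accept
# "postgresql://". This is a no-op if the URL already uses that scheme.
from django.conf import settings
from sqlalchemy import create_engine

url = settings.DATABASE_URL
if url.startswith("postgres://"):
    url = "postgresql://" + url[len("postgres://"):]

engine = create_engine(url, pool_recycle=3600)
```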
@@ -176,11 +164,7 @@ def schema_validation(dataframe, schema, bktname, key, errorfilename):

def build_schemas(formname, sched, trans_type):
try:
connection = psycopg2.connect(user=PG_USER,
password=PG_PASSWORD,
host=PG_HOST,
port=PG_PORT,
database=PG_DATABASE)
connection = psycopg2.connect(settings.DATABASE_URL)
cursor = connection.cursor()
cursor.execute("SELECT rfsfs.formname, rfsfs.schedname, rfsfs.transaction_type, rfsfs.field_description, rfsfs.type, rfsfs.required FROM public.ref_forms_scheds_format_specs rfsfs WHERE rfsfs.formname = %s AND rfsfs.schedname = %s AND rfsfs.transaction_type = %s and rfsfs.type IS NOT NULL", (formname, sched, trans_type))
format_specs = cursor.fetchall()
@@ -415,13 +399,7 @@ def check_data_processed(md5, fecfilename):
WHERE tfd.fec_file_name = %s
ORDER BY create_Date DESC limit 1;
'''
conn = psycopg2.connect(
user=PG_USER,
password=PG_PASSWORD,
host=PG_HOST,
port=PG_PORT,
database=PG_DATABASE
)
conn = psycopg2.connect(settings.DATABASE_URL)
cur = conn.cursor()
cur.execute(selectsql, (fecfilename,))
if cur.rowcount == 1:
@@ -450,11 +428,7 @@ def load_transactions_from_temp_perm_tables(fecfilename):
try:
res = ''
selectsql = '''import_schedules'''
conn = psycopg2.connect(user=PG_USER,
password=PG_PASSWORD,
host=PG_HOST,
port=PG_PORT,
database=PG_DATABASE)
conn = psycopg2.connect(settings.DATABASE_URL)
cur = conn.cursor()
cur.callproc(selectsql, (fecfilename,))
if cur.rowcount == 1: