1.4.1 #28

Merged
merged 2 commits on Aug 23, 2021

11 changes: 10 additions & 1 deletion CHANGELOG.md
@@ -143,4 +143,13 @@

### Added
* Dry Run mode for both single and multi-account modes.
* Added README Documentation for Dry Run modes.

## [1.4.1] - 2021-08-23

### Added
* Tagging for VPC Flow Log Resources in single account mode.
* Cleanup options for VPC Flow Logs and CloudTrails created by Assisted Log Enabler for AWS.
* README Documentation
  * Added details in the Cleanup section to reflect VPC Flow Logs and CloudTrail commands.
  * Added section about the Shared Responsibility Model.
13 changes: 11 additions & 2 deletions README.md
@@ -295,12 +295,21 @@ Once the logs have been enabled, you can safely remove any of the downloaded fil

For any AWS IAM Roles that are created, either manually or using AWS CloudFormation StackSets, those can be safely deleted upon enablement of logs through the Assisted Log Enabler for AWS.

NEW! A cleanup mode is available within the Assisted Log Enabler for AWS (currently only for single account, Amazon Route 53 Resover Query Logs). Collected logs within Amazon S3 will NOT be removed, however, logging resources can be removed by following the below commands:
A cleanup mode is available within the Assisted Log Enabler for AWS (currently only for single account). Collected logs within Amazon S3 will NOT be removed, however, logging resources can be removed by following the below commands:
```
# To remove Amazon Route 53 Resolver Query Logs (single account):
# To remove Amazon Route 53 Resolver Query Log resources created by Assisted Log Enabler for AWS (single account):
python3 assisted_log_enabler.py --mode cleanup --single_r53querylogs
# To remove Amazon VPC Flow Log resources created by Assisted Log Enabler for AWS (single account):
python3 assisted_log_enabler.py --mode cleanup --single_vpcflow
# To remove AWS CloudTrail trails created by Assisted Log Enabler for AWS (single account):
python3 assisted_log_enabler.py --mode cleanup --single_cloudtrail
```

## Shared Responsibility Model
All resources created fall into the customer side of the Shared Responsibility Model.

For AWS customers, please refer to the following link for more information about the Shared Responsibility Model: [Link](https://aws.amazon.com/compliance/shared-responsibility-model/)

## Additional Tools
For analyzing logs created by Assisted Log Enabler for AWS, consider taking a look at the AWS Security Analytics Bootstrap, a tool that provides an Amazon Athena analysis environment that's quick to deploy, ready to use, and easy to maintain. [Link to GitHub repository.](https://github.com/awslabs/aws-security-analytics-bootstrap)

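The cleanup commands above only remove resources that Assisted Log Enabler for AWS tagged with workflow=assisted-log-enabler. As a rough illustration (not part of this pull request; it assumes default credentials and region), the same tag check can be previewed with boto3 before running cleanup:

```
# Illustrative preview only: lists CloudTrail trails carrying the
# workflow=assisted-log-enabler tag, without deleting anything.
import boto3

cloudtrail = boto3.client('cloudtrail')
trail_arns = [trail['TrailARN'] for trail in cloudtrail.describe_trails()['trailList']]

tagged_trails = []
if trail_arns:
    for resource in cloudtrail.list_tags(ResourceIdList=trail_arns)['ResourceTagList']:
        tags = {tag['Key']: tag['Value'] for tag in resource.get('TagsList', [])}
        if tags.get('workflow') == 'assisted-log-enabler':
            tagged_trails.append(resource['ResourceId'])

print(tagged_trails)
```
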
11 changes: 10 additions & 1 deletion assisted_log_enabler.py
@@ -85,7 +85,10 @@ def assisted_log_enabler():
    function_parser_group.add_argument('--cloudtrail', action='store_true', help=' Turns on AWS CloudTrail.')

    cleanup_parser_group = parser.add_argument_group('Cleanup Options', 'Use these flags to choose which resources you want to turn logging off for.')
    cleanup_parser_group.add_argument('--single_r53querylogs', action='store_true', help=' Turns on Amazon Route 53 Resolver Query Logs.')
    cleanup_parser_group.add_argument('--single_r53querylogs', action='store_true', help=' Removes Amazon Route 53 Resolver Query Log resources created by Assisted Log Enabler for AWS.')
    cleanup_parser_group.add_argument('--single_cloudtrail', action='store_true', help=' Removes AWS CloudTrail trails created by Assisted Log Enabler for AWS.')
    cleanup_parser_group.add_argument('--single_vpcflow', action='store_true', help=' Removes Amazon VPC Flow Log resources created by Assisted Log Enabler for AWS.')
    cleanup_parser_group.add_argument('--single_all', action='store_true', help=' Turns off all of the log types within the Assisted Log Enabler for AWS.')

    dryrun_parser_group = parser.add_argument_group('Dry Run Options', 'Use these flags to run Assisted Log Enabler for AWS in Dry Run mode.')
    dryrun_parser_group.add_argument('--single_account', action='store_true', help=' Runs Assisted Log Enabler for AWS in Dry Run mode for a single AWS account.')
@@ -123,6 +126,12 @@ def assisted_log_enabler():
    elif args.mode == 'cleanup':
        if args.single_r53querylogs:
            ALE_cleanup_single.run_r53_cleanup()
        elif args.single_cloudtrail:
            ALE_cleanup_single.run_cloudtrail_cleanup()
        elif args.single_vpcflow:
            ALE_cleanup_single.run_vpcflow_cleanup()
        elif args.single_all:
            ALE_cleanup_single.lambda_handler(event, context)
    elif args.mode == 'dryrun':
        if args.single_account:
            ALE_dryrun_single.lambda_handler(event, context)
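
As a quick illustration (not part of the change), the new cleanup flags parse as plain argparse booleans; a simplified mirror of the definitions above shows how a cleanup invocation is dispatched:

```
# Illustrative only: simplified mirror of the new cleanup arguments.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--mode')
parser.add_argument('--single_r53querylogs', action='store_true')
parser.add_argument('--single_cloudtrail', action='store_true')
parser.add_argument('--single_vpcflow', action='store_true')

args = parser.parse_args(['--mode', 'cleanup', '--single_vpcflow'])
# With these values, the dispatch above calls ALE_cleanup_single.run_vpcflow_cleanup().
print(args.mode, args.single_vpcflow)
```
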
101 changes: 90 additions & 11 deletions subfunctions/ALE_cleanup_single.py
@@ -20,17 +20,7 @@
timestamp_date_string = str(timestamp_date)


logFormatter = '%(asctime)s - %(levelname)s - %(message)s'
logging.basicConfig(format=logFormatter, level=logging.INFO)
logger = logging.getLogger()
logger.setLevel(logging.INFO)
output_handle = logging.FileHandler('ALE_' + timestamp_date_string + '.log')
output_handle.setLevel(logging.INFO)
logger.addHandler(output_handle)
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
output_handle.setFormatter(formatter)


cloudtrail = boto3.client('cloudtrail')
region = os.environ['AWS_REGION']


@@ -40,6 +30,8 @@
# 1. Remove the Route 53 Resolver Query Logging Resources created by Assisted Log Enabler
def r53_cleanup():
    """Function to clean up Route 53 Query Logging Resources"""
    logging.info("Note: This script can take a while to finish, depending on how many Route 53 Query Log resources created by Assisted Log Enabler for AWS exist (about 60 seconds per Query Log resource).")
    time.sleep(1)
    for aws_region in region_list:
        logging.info("---- LINE BREAK BETWEEN REGIONS ----")
        logging.info("Cleaning up Route 53 Query Logging Resources in region " + aws_region + ".")
@@ -112,6 +104,93 @@ def r53_cleanup():
            logging.error(exception_handle)


# 2. Remove the CloudTrail Logging Resources created by Assisted Log Enabler.
def cloudtrail_cleanup():
    """Function to clean up CloudTrail Logs"""
    logging.info("Cleaning up CloudTrail Logs.")
    try:
        logging.info("Cleaning up CloudTrail Logs created by Assisted Log Enabler for AWS.")
        trail_list: list = []
        removal_list: list = []
        logging.info("DescribeTrails API Call")
        cloudtrail_trails = cloudtrail.describe_trails()
        for trail in cloudtrail_trails['trailList']:
            trail_list.append(trail['TrailARN'])
        logging.info("Listing CloudTrail trails created by Assisted Log Enabler for AWS.")
        print("Full trail list")
        print(trail_list)
        for removal_trail in trail_list:
            logging.info("Checking tags for trails created by Assisted Log Enabler for AWS.")
            logging.info("ListTags API Call")
            trail_tags = cloudtrail.list_tags(
                ResourceIdList=[removal_trail]
            )
            for tag_lists in trail_tags['ResourceTagList']:
                for key_info in tag_lists['TagsList']:
                    print(key_info)
                    if key_info['Key'] == 'workflow' and key_info['Value'] == 'assisted-log-enabler':
                        removal_list.append(removal_trail)
        print("Trails to be removed")
        print(removal_list)
        for delete_trail in removal_list:
            logging.info("Deleting trails created by Assisted Log Enabler for AWS.")
            logging.info("DeleteTrail API Call")
            cloudtrail.delete_trail(
                Name=delete_trail
            )
            logging.info(delete_trail + " has been deleted.")
            time.sleep(1)
    except Exception as exception_handle:
        logging.error(exception_handle)


# 3. Remove the VPC Flow Log Resources created by Assisted Log Enabler for AWS.
def vpcflow_cleanup():
    """Function to clean up VPC Flow Logs"""
    logging.info("Cleaning up VPC Flow Logs created by Assisted Log Enabler for AWS.")
    for aws_region in region_list:
        try:
            logging.info("---- LINE BREAK BETWEEN REGIONS ----")
            logging.info("Cleaning up VPC Flow Logs created by Assisted Log Enabler for AWS in region " + aws_region + ".")
            removal_list: list = []
            ec2 = boto3.client('ec2', region_name=aws_region)
            logging.info("DescribeFlowLogs API Call")
            vpc_flow_logs = ec2.describe_flow_logs(
                Filter=[
                    {
                        'Name': 'tag:workflow',
                        'Values': [
                            'assisted-log-enabler'
                        ]
                    },
                ]
            )
            for flow_log_id in vpc_flow_logs['FlowLogs']:
                print(flow_log_id['FlowLogId'])
                removal_list.append(flow_log_id['FlowLogId'])
            print(removal_list)
            logging.info("DeleteFlowLogs API Call")
            delete_logs = ec2.delete_flow_logs(
                FlowLogIds=removal_list
            )
            logging.info("Deleted Flow Logs that were created by Assisted Log Enabler for AWS.")
            time.sleep(1)
        except Exception as exception_handle:
            logging.error(exception_handle)


def run_vpcflow_cleanup():
    """Function to run the vpcflow_cleanup function"""
    vpcflow_cleanup()
    logging.info("This is the end of the script. Please feel free to validate that logging resources have been cleaned up.")


def run_cloudtrail_cleanup():
    """Function to run the cloudtrail_cleanup function"""
    cloudtrail_cleanup()
    logging.info("This is the end of the script. Please feel free to validate that logging resources have been cleaned up.")


def run_r53_cleanup():
    """Function to run the r53_cleanup function"""
    r53_cleanup()
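
Both new cleanup functions decide what to delete based on the workflow=assisted-log-enabler tag. A minimal sketch (illustrative only, not part of the pull request; the region is an assumption) of previewing the VPC Flow Logs that vpcflow_cleanup() would remove in a single region:

```
# Illustrative preview only: uses the same tag filter as vpcflow_cleanup(),
# but does not delete anything. The region is an assumption.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')
response = ec2.describe_flow_logs(
    Filter=[{'Name': 'tag:workflow', 'Values': ['assisted-log-enabler']}]
)
print([flow_log['FlowLogId'] for flow_log in response['FlowLogs']])
```
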
13 changes: 12 additions & 1 deletion subfunctions/ALE_single_account.py
@@ -140,7 +140,18 @@ def flow_log_activator(region_list, account_number, unique_end):
                TrafficType='ALL',
                LogDestinationType='s3',
                LogDestination='arn:aws:s3:::aws-log-collection-' + account_number + '-' + region + '-' + unique_end + '/vpcflowlogs',
                LogFormat='${version} ${account-id} ${interface-id} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status} ${vpc-id} ${type} ${tcp-flags} ${subnet-id} ${sublocation-type} ${sublocation-id} ${region} ${pkt-srcaddr} ${pkt-dstaddr} ${instance-id} ${az-id} ${pkt-src-aws-service} ${pkt-dst-aws-service} ${flow-direction} ${traffic-path}'
                LogFormat='${version} ${account-id} ${interface-id} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status} ${vpc-id} ${type} ${tcp-flags} ${subnet-id} ${sublocation-type} ${sublocation-id} ${region} ${pkt-srcaddr} ${pkt-dstaddr} ${instance-id} ${az-id} ${pkt-src-aws-service} ${pkt-dst-aws-service} ${flow-direction} ${traffic-path}',
                TagSpecifications=[
                    {
                        'ResourceType': 'vpc-flow-log',
                        'Tags': [
                            {
                                'Key': 'workflow',
                                'Value': 'assisted-log-enabler'
                            },
                        ]
                    }
                ]
            )
            logging.info("VPC Flow Logs are turned on.")
        except Exception as exception_handle:
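
Because the single-account mode now applies the workflow tag through TagSpecifications at creation time, the tag can be checked on a Flow Log after the fact; a short sketch (illustrative only; the region and Flow Log ID are placeholders):

```
# Illustrative verification only: the region and Flow Log ID are placeholders.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')
response = ec2.describe_flow_logs(FlowLogIds=['fl-0123456789abcdef0'])
for flow_log in response['FlowLogs']:
    tags = {tag['Key']: tag['Value'] for tag in flow_log.get('Tags', [])}
    print(flow_log['FlowLogId'], tags.get('workflow'))  # expect 'assisted-log-enabler'
```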