
2024-06-25 | MAIN --> PROD | DEV (e00d9db) --> STAGING #4018

Merged
merged 2 commits from main into prod on Jun 25, 2024

Conversation

@jadudm (Contributor) commented on Jun 25, 2024

This is an auto-generated pull request to merge main into prod for a staging release on 2024-06-25, with e00d9db as the last commit to be merged.

* Add secondary db to tf to run db_to_db backup

* add secondary db to local stack

* Update command in makefile

* bump s3 version and add tag

* add dev maintained s3

* update service bindings

* Remove binding steps from deploy

* Add historic data load

* Remove s3 bucket sharing

We opted to remove this because the decision was made that staging
would have its own dedicated backups bucket; if files in the prod s3
bucket need to be shared, we can create a new bucket for the specific
purpose of syncing and then sharing them. (A Terraform sketch of the
per-environment setup follows this commit list.)

* Add dedicated backups bucket in each environment

* give org_name for backups bucket

* Preliminary bash script for backups

* chmod +x

* File Rename

* Function Modifications

* Add restore script

* Add backup workflow

* fix typo

* chmod +x

* Small modifications to ensure util works properly

* Have the version be an input

* Update db2db operation

* Testing workflow

* scheduled_backup workflow test

version bump to v0.1.2 of util

* s3_restore workflow test

* db_restore workflow test

* Backup workflow

Run via workflow_dispatch:

* Quote to prevent globbing

* Delete - File no longer used

* Rename and replace workflow call

Potentially going to delete

* Update pre-deploy backup call

* Add restore workflow

* New scheduled backup workflow

Now with a matrix, for all environments

* CODEOWNERS update

* Add docs

* Point source to correct repo

Though the redirect will still happen, the repo was moved to gsa-tts org

* Version bump and modify backup logic

* Change folder path

* Add daily backup option

* Update verbiage and workflow options

* change pathing for s3 dumps

* deploy_backup task test

* scheduled_backup task test

* Increase task instance size

* scheduled_backup task test v2

* daily_backup task test

* typo fixes

* s3_restore task test

* s3_restore task test v2

* db_restore task test

* Final cleanup

* remove (restore test)

* typo fix

* remove restore workflows

per discussion with matt/tim

* workflow cleanup and removal of unused items

* Fix a small rebase issue
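
Taken together, these commits add a second "snapshot" database (fac-snapshot-db) and a dedicated backups bucket to the Terraform for each environment, which is what the plans below create for staging and production. A minimal sketch of how the two cloudfoundry_service_instance resources might be declared, using hypothetical variable names for the space and service-plan GUIDs (the plans only ever show the resolved GUIDs):

    # Sketch only: attribute names mirror the plan output below;
    # the var.* names are hypothetical placeholders.
    resource "cloudfoundry_service_instance" "rds" {
      name         = "fac-snapshot-db"
      space        = var.cf_space_guid        # hypothetical variable
      service_plan = var.rds_service_plan_id  # a literal GUID in the real plan
      json_params  = jsonencode({ storage = 50 })
      tags         = ["rds"]
    }

    resource "cloudfoundry_service_instance" "bucket" {
      name         = "backups"
      space        = var.cf_space_guid
      service_plan = var.s3_service_plan_id   # hypothetical variable
      tags         = ["s3"]
    }
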
@jadudm requested a review from a team as a code owner on June 25, 2024 10:32
@jadudm added the autogenerated (Automated pull request creation) and automerge (Used for automated deployments) labels on Jun 25, 2024
github-actions bot (Contributor) commented on Jun 25, 2024

Terraform plan for staging

Plan: 2 to add, 0 to change, 0 to destroy.
Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
+   create

Terraform will perform the following actions:

  # module.staging-backups-bucket.cloudfoundry_service_instance.bucket will be created
+   resource "cloudfoundry_service_instance" "bucket" {
+       id                             = (known after apply)
+       name                           = "backups"
+       replace_on_params_change       = false
+       replace_on_service_plan_change = false
+       service_plan                   = "021bb2a3-7e11-4fc2-b06b-d9f5938cd806"
+       space                          = "7bbe587a-e8ee-4e8c-b32f-86d0b0f1b807"
+       tags                           = [
+           "s3",
        ]
    }

  # module.staging.module.snapshot-database.cloudfoundry_service_instance.rds will be created
+   resource "cloudfoundry_service_instance" "rds" {
+       id                             = (known after apply)
+       json_params                    = jsonencode(
            {
+               storage = 50
            }
        )
+       name                           = "fac-snapshot-db"
+       replace_on_params_change       = false
+       replace_on_service_plan_change = false
+       service_plan                   = "815c6069-289a-4444-ba99-40f0fa03a8f5"
+       space                          = "7bbe587a-e8ee-4e8c-b32f-86d0b0f1b807"
+       tags                           = [
+           "rds",
        ]
    }

Plan: 2 to add, 0 to change, 0 to destroy.

Warning: Argument is deprecated

  with module.staging-backups-bucket.cloudfoundry_service_instance.bucket,
  on /tmp/terraform-data-dir/modules/staging-backups-bucket/s3/main.tf line 14, in resource "cloudfoundry_service_instance" "bucket":
  14:   recursive_delete = var.recursive_delete

Since CF API v3, recursive delete is always done on the cloudcontroller side.
This will be removed in future releases

(and 6 more similar warnings elsewhere)

✅ Plan applied in Deploy to Staging Environment #229
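
The deprecation warning above points at the recursive_delete argument inside the shared s3 module. A hedged sketch of the eventual cleanup, assuming the module currently threads the flag through a variable as the warning's source line shows (the var.* input names are hypothetical):

    resource "cloudfoundry_service_instance" "bucket" {
      name         = var.name            # hypothetical module inputs
      space        = var.cf_space_id
      service_plan = var.service_plan_id
      tags         = var.tags

      # recursive_delete = var.recursive_delete
      # Deprecated: since CF API v3 the cloud controller always deletes
      # recursively, so this argument (and the variable feeding it) can be
      # dropped once the provider removes it.
    }
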


Terraform plan for production

Plan: 8 to add, 4 to change, 0 to destroy.
Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
+   create
!~  update in-place

Terraform will perform the following actions:

  # module.production.cloudfoundry_app.postgrest will be updated in-place
!~  resource "cloudfoundry_app" "postgrest" {
!~      docker_image                    = "ghcr.io/gsa-tts/fac/postgrest@sha256:4b903ac223ea5f583bd870a328a3cf54d19267e5b5abfff896863b37f7cb68b6" -> "ghcr.io/gsa-tts/fac/postgrest@sha256:08852a35ccf68490cf974e2b1a47d19480457c24b2244fa9f302ed785bd89462"
        id                              = "70ac44be-3507-4867-a75f-c2d1ab12ee89"
        name                            = "postgrest"
#        (17 unchanged attributes hidden)

#        (1 unchanged block hidden)
    }

  # module.production.module.clamav.cloudfoundry_app.clamav_api will be updated in-place
!~  resource "cloudfoundry_app" "clamav_api" {
!~      docker_image                    = "ghcr.io/gsa-tts/fac/clamav@sha256:b0e61e765f6c9a861cb8a4fbcfbd1df3e45fcbfa7cd78cd67c16d2e540d5301d" -> "ghcr.io/gsa-tts/fac/clamav@sha256:ba95b2eab2464f762071de942b60190be73c901a17a143b234ac3a53dc947d68"
        id                              = "5d0afa4f-527b-472a-8671-79a60335417f"
        name                            = "fac-av-production"
#        (17 unchanged attributes hidden)

#        (1 unchanged block hidden)
    }

  # module.production.module.file_scanner_clamav.cloudfoundry_app.clamav_api will be updated in-place
!~  resource "cloudfoundry_app" "clamav_api" {
!~      docker_image                    = "ghcr.io/gsa-tts/fac/clamav@sha256:b0e61e765f6c9a861cb8a4fbcfbd1df3e45fcbfa7cd78cd67c16d2e540d5301d" -> "ghcr.io/gsa-tts/fac/clamav@sha256:ba95b2eab2464f762071de942b60190be73c901a17a143b234ac3a53dc947d68"
        id                              = "a14bb29f-8276-4967-9754-cf9c4187ebe3"
        name                            = "fac-av-production-fs"
#        (17 unchanged attributes hidden)

#        (1 unchanged block hidden)
    }

  # module.production.module.https-proxy.cloudfoundry_app.egress_app will be updated in-place
!~  resource "cloudfoundry_app" "egress_app" {
        id                              = "5e81ca8b-99cf-41f8-ae42-76652d51a44c"
        name                            = "https-proxy"
!~      source_code_hash                = "e246274fca627d48afccde010de949371f24b6c9974c48aa91044acd36654fa8" -> "9fcf4a7f6abfc9a220de2b8bb97591ab490a271ac0933b984f606f645319e1a4"
#        (21 unchanged attributes hidden)

#        (1 unchanged block hidden)
    }

  # module.production.module.newrelic.newrelic_alert_policy.alert_policy will be created
+   resource "newrelic_alert_policy" "alert_policy" {
+       account_id          = (known after apply)
+       id                  = (known after apply)
+       incident_preference = "PER_POLICY"
+       name                = "production-alert-policy"
    }

  # module.production.module.newrelic.newrelic_notification_channel.email_channel will be created
+   resource "newrelic_notification_channel" "email_channel" {
+       account_id     = 3919076
+       active         = true
+       destination_id = (known after apply)
+       id             = (known after apply)
+       name           = "production_email_notification_channel"
+       product        = "IINT"
+       status         = (known after apply)
+       type           = "EMAIL"

+       property {
+           key           = "subject"
+           value         = "{{issueTitle}}"
#            (2 unchanged attributes hidden)
        }
    }

  # module.production.module.newrelic.newrelic_notification_destination.email_destination will be created
+   resource "newrelic_notification_destination" "email_destination" {
+       account_id = 3919076
+       active     = true
+       guid       = (known after apply)
+       id         = (known after apply)
+       last_sent  = (known after apply)
+       name       = "email_destination"
+       status     = (known after apply)
+       type       = "EMAIL"

+       property {
+           key           = "email"
+           value         = "[email protected], [email protected], [email protected]"
#            (2 unchanged attributes hidden)
        }
    }

  # module.production.module.newrelic.newrelic_nrql_alert_condition.error_transactions will be created
+   resource "newrelic_nrql_alert_condition" "error_transactions" {
+       account_id                   = 3919076
+       aggregation_delay            = "120"
+       aggregation_method           = "event_flow"
+       aggregation_window           = 60
+       enabled                      = true
+       entity_guid                  = (known after apply)
+       id                           = (known after apply)
+       name                         = "Error Transactions (%)"
+       policy_id                    = (known after apply)
+       type                         = "static"
+       violation_time_limit         = (known after apply)
+       violation_time_limit_seconds = 259200

+       critical {
+           operator              = "above"
+           threshold             = 5
+           threshold_duration    = 300
+           threshold_occurrences = "all"
        }

+       nrql {
+           query = "SELECT percentage(count(*), WHERE error is true) FROM Transaction"
        }

+       warning {
+           operator              = "above"
+           threshold             = 3
+           threshold_duration    = 300
+           threshold_occurrences = "all"
        }
    }

  # module.production.module.newrelic.newrelic_nrql_alert_condition.infected_file_found will be created
+   resource "newrelic_nrql_alert_condition" "infected_file_found" {
+       account_id                   = (known after apply)
+       aggregation_delay            = "120"
+       aggregation_method           = "event_flow"
+       aggregation_window           = 60
+       enabled                      = true
+       entity_guid                  = (known after apply)
+       fill_option                  = "static"
+       fill_value                   = 0
+       id                           = (known after apply)
+       name                         = "Infected File Found!"
+       policy_id                    = (known after apply)
+       type                         = "static"
+       violation_time_limit         = (known after apply)
+       violation_time_limit_seconds = 259200

+       critical {
+           operator              = "above_or_equals"
+           threshold             = 1
+           threshold_duration    = 300
+           threshold_occurrences = "at_least_once"
        }

+       nrql {
+           query = "SELECT count(*) FROM Log WHERE tags.space_name ='production' and message LIKE '%ScanResult.INFECTED%'"
        }
    }

  # module.production.module.newrelic.newrelic_one_dashboard.search_dashboard will be created
+   resource "newrelic_one_dashboard" "search_dashboard" {
+       account_id  = (known after apply)
+       guid        = (known after apply)
+       id          = (known after apply)
+       name        = "Search Dashboard (production)"
+       permalink   = (known after apply)
+       permissions = "public_read_only"

+       page {
+           guid = (known after apply)
+           name = "Search"

+           widget_billboard {
+               column         = 1
+               height         = 3
+               id             = (known after apply)
+               legend_enabled = true
+               row            = 1
+               title          = "Searches Per Hour"
+               width          = 3

+               nrql_query {
+                   account_id = (known after apply)
+                   query      = "SELECT count(*) as 'Total', rate(count(*), 1 minute) as 'Per Minute' FROM Transaction where request.uri like '%/dissemination/search%' and request.method = 'POST' and appName = 'gsa-fac-production' since 1 hours AGO COMPARE WITH 1 week ago"
                }
            }

+           widget_line {
+               column         = 4
+               height         = 3
+               id             = (known after apply)
+               legend_enabled = true
+               row            = 1
+               title          = "Search Traffic"
+               width          = 6

+               nrql_query {
+                   account_id = (known after apply)
+                   query      = "SELECT count(*) FROM Transaction where request.uri like '%/dissemination/search%' and request.method = 'POST' and appName = 'gsa-fac-production' since 4 hours AGO COMPARE WITH 1 week ago TIMESERIES"
                }
            }
+           widget_line {
+               column         = 1
+               height         = 3
+               id             = (known after apply)
+               legend_enabled = true
+               row            = 2
+               title          = "Search Response Time"
+               width          = 6

+               nrql_query {
+                   account_id = (known after apply)
+                   query      = "FROM Metric SELECT average(newrelic.timeslice.value) WHERE appName = 'gsa-fac-production' WITH METRIC_FORMAT 'Custom/search' TIMESERIES SINCE 1 day ago COMPARE WITH 1 week ago"
                }
            }
        }
    }

  # module.production.module.newrelic.newrelic_workflow.alert_workflow will be created
+   resource "newrelic_workflow" "alert_workflow" {
+       account_id            = (known after apply)
+       destinations_enabled  = true
+       enabled               = true
+       enrichments_enabled   = true
+       guid                  = (known after apply)
+       id                    = (known after apply)
+       last_run              = (known after apply)
+       muting_rules_handling = "DONT_NOTIFY_FULLY_MUTED_ISSUES"
+       name                  = "production_alert_workflow"
+       workflow_id           = (known after apply)

+       destination {
+           channel_id            = (known after apply)
+           name                  = (known after apply)
+           notification_triggers = (known after apply)
+           type                  = (known after apply)
        }

+       issues_filter {
+           filter_id = (known after apply)
+           name      = "filter"
+           type      = "FILTER"

+           predicate {
+               attribute = "labels.policyIds"
+               operator  = "EXACTLY_MATCHES"
+               values    = (known after apply)
            }
        }
    }

  # module.production.module.snapshot-database.cloudfoundry_service_instance.rds will be created
+   resource "cloudfoundry_service_instance" "rds" {
+       id                             = (known after apply)
+       json_params                    = jsonencode(
            {
+               storage = 50
            }
        )
+       name                           = "fac-snapshot-db"
+       replace_on_params_change       = false
+       replace_on_service_plan_change = false
+       service_plan                   = "58b899e8-eb36-441f-b406-d2f5b1e49c00"
+       space                          = "5593dba8-7023-49a5-bdbe-e809fe23edf9"
+       tags                           = [
+           "rds",
        ]
    }

Plan: 8 to add, 4 to change, 0 to destroy.

Warning: Argument is deprecated

  with module.domain.cloudfoundry_service_instance.external_domain_instance,
  on /tmp/terraform-data-dir/modules/domain/domain/main.tf line 45, in resource "cloudfoundry_service_instance" "external_domain_instance":
  45:   recursive_delete = var.recursive_delete

Since CF API v3, recursive delete is always done on the cloudcontroller side.
This will be removed in future releases

(and 6 more similar warnings elsewhere)

📝 Plan generated in Pull Request Checks #3225
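
Most of the New Relic IDs in the plan above are "known after apply", so the output cannot show how the alerting resources reference one another. A hedged sketch of the usual wiring, reusing this plan's resource names and attributes; the cross-references (which resource's id feeds which argument) are an assumption about how the module connects them:

    resource "newrelic_alert_policy" "alert_policy" {
      name                = "production-alert-policy"
      incident_preference = "PER_POLICY"
    }

    # Where alert emails land.
    resource "newrelic_notification_destination" "email_destination" {
      name = "email_destination"
      type = "EMAIL"

      property {
        key   = "email"
        value = var.alert_email_addresses   # hypothetical variable; a comma-separated list in the plan
      }
    }

    # The channel binds a message template to the destination.
    resource "newrelic_notification_channel" "email_channel" {
      name           = "production_email_notification_channel"
      type           = "EMAIL"
      product        = "IINT"
      destination_id = newrelic_notification_destination.email_destination.id

      property {
        key   = "subject"
        value = "{{issueTitle}}"
      }
    }

    # The workflow routes issues from the alert policy to the channel.
    resource "newrelic_workflow" "alert_workflow" {
      name                  = "production_alert_workflow"
      muting_rules_handling = "DONT_NOTIFY_FULLY_MUTED_ISSUES"

      issues_filter {
        name = "filter"
        type = "FILTER"

        predicate {
          attribute = "labels.policyIds"
          operator  = "EXACTLY_MATCHES"
          values    = [newrelic_alert_policy.alert_policy.id]
        }
      }

      destination {
        channel_id = newrelic_notification_channel.email_channel.id
      }
    }
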


File Coverage Missing
All files 87%
api/serializers.py 88% 177-178 183 188
api/test_views.py 95% 103
api/uei.py 88% 87 118-119 163 167-168
api/views.py 98% 198-199 337-338
audit/forms.py 60% 31-38 109-116
audit/intake_to_dissemination.py 88% 57-62 264 308-316
audit/mixins.py 96% 28
audit/test_commands.py 91%
audit/test_intakelib.py 88% 154-158
audit/test_manage_submission_access_view.py 98% 15 19
audit/test_mixins.py 90% 159-160 164-166 254-255 259-261
audit/test_validators.py 95% 439 443 611-612 851 858 865 872 1117-1118 1149-1150 1175-1180
audit/test_views.py 98% 132
audit/test_workbooks_should_fail.py 88% 58 87-88 92
audit/test_workbooks_should_pass.py 87% 59 74-76
audit/utils.py 86% 9 19 60-62 65
audit/validators.py 93% 138 190 279 419-420 435-436 519-520 622-626 631-635 651-660
audit/cross_validation/additional_ueis.py 93% 33
audit/cross_validation/check_award_ref_declaration.py 90%
audit/cross_validation/check_award_reference_uniqueness.py 93%
audit/cross_validation/check_certifying_contacts.py 87%
audit/cross_validation/check_findings_count_consistency.py 87% 35
audit/cross_validation/check_ref_number_in_cap.py 91%
audit/cross_validation/check_ref_number_in_findings_text.py 91%
audit/cross_validation/errors.py 78% 30 77
audit/cross_validation/naming.py 93% 229
audit/cross_validation/submission_progress_check.py 91% 83 126 174 182-183
audit/cross_validation/tribal_data_sharing_consent.py 81% 33 36 40
audit/cross_validation/validate_general_information.py 65% 77 81-84 96 99
audit/fixtures/dissemination.py 71% 38
audit/fixtures/single_audit_checklist.py 55% 160-197 245-254
audit/intakelib/exceptions.py 71% 7-9 12
audit/intakelib/intermediate_representation.py 91% 27-28 73 91 129 200-203 212-213 283-284
audit/intakelib/mapping_audit_findings.py 97% 55
audit/intakelib/mapping_audit_findings_text.py 97% 54
audit/intakelib/mapping_federal_awards.py 93% 92
audit/intakelib/mapping_util.py 79% 21 25 29 63 99 104-105 114-120 130 145 150
audit/intakelib/checks/check_all_unique_award_numbers.py 79% 24
audit/intakelib/checks/check_cluster_names.py 75% 20-25
audit/intakelib/checks/check_cluster_total.py 95% 99
audit/intakelib/checks/check_finding_reference_pattern.py 74% 34 44-45
audit/intakelib/checks/check_findings_grid_validation.py 89% 59
audit/intakelib/checks/check_has_all_the_named_ranges.py 95% 66
audit/intakelib/checks/check_is_a_workbook.py 69% 20
audit/intakelib/checks/check_loan_balance_entries.py 83% 28
audit/intakelib/checks/check_look_for_empty_rows.py 91% 18
audit/intakelib/checks/check_no_major_program_no_type.py 76% 18 27
audit/intakelib/checks/check_no_repeat_findings.py 88% 21
audit/intakelib/checks/check_other_cluster_names.py 81% 23 33
audit/intakelib/checks/check_passthrough_name_when_no_direct.py 83% 11 49 58
audit/intakelib/checks/check_sequential_award_numbers.py 82% 25 35
audit/intakelib/checks/check_start_and_end_rows_of_all_columns_are_same.py 89% 14
audit/intakelib/checks/check_state_cluster_names.py 81% 23 33
audit/intakelib/checks/check_version_number.py 73% 30 40-41
audit/intakelib/checks/runners.py 95% 187 217
audit/intakelib/common/util.py 90% 22 39
audit/intakelib/transforms/xform_rename_additional_notes_sheet.py 81% 14
audit/management/commands/load_fixtures.py 47% 40-46
audit/models/models.py 85% 58 60 65 67 209 215 227 239-242 260 437 455-456 464 486 584-585 589 597 606 612
audit/views/audit_info_form_view.py 27% 25-74 77-117 120-137
audit/views/manage_submission.py 86% 73-80
audit/views/manage_submission_access.py 98% 113-114
audit/views/pre_dissemination_download_view.py 78% 15-16 21-22 29-39
audit/views/submission_progress_view.py 89% 117 182-183
audit/views/tribal_data_consent.py 34% 23-41 44-79
audit/views/unlock_after_certification.py 57% 28-51 73-87
audit/views/upload_report_view.py 26% 32-35 44 91-117 120-170 178-209
audit/views/views.py 53% 74 81-100 123-124 198-199 220-230 257 268-269 280-281 283-287 329-342 345-359 364-377 394-400 405-425 452-456 461-490 533-537 542-562 589-593 598-627 670-674 679-691 694-704 709-721 754-768
census_historical_migration/change_record.py 98% 30
census_historical_migration/end_to_end_core.py 26% 57-89 93-111 116-155 161-187 246-258 263 273-307
census_historical_migration/invalid_record.py 94% 50 54 58 62 66
census_historical_migration/migration_result.py 75% 17 21 25 29 33-42 46
census_historical_migration/report_type_flag.py 96% 19
census_historical_migration/test_federal_awards_xforms.py 99% 219-220
census_historical_migration/sac_general_lib/audit_information.py 91% 28 82-87 336
census_historical_migration/sac_general_lib/cognizant_oversight.py 68% 11
census_historical_migration/sac_general_lib/general_information.py 86% 166-167 177-178 186-187 195-200 233-255 354-355
census_historical_migration/sac_general_lib/sac_creator.py 90% 34
census_historical_migration/sac_general_lib/utils.py 84% 35 62-71
census_historical_migration/transforms/xform_remove_hyphen_and_pad_zip.py 92% 18
census_historical_migration/transforms/xform_retrieve_uei.py 67% 10
census_historical_migration/transforms/xform_string_to_bool.py 87% 17
census_historical_migration/workbooklib/additional_eins.py 84% 58-60 67-77
census_historical_migration/workbooklib/additional_ueis.py 77% 27-29 36-46
census_historical_migration/workbooklib/corrective_action_plan.py 46% 49-51 65 93-125 134-153
census_historical_migration/workbooklib/excel_creation_utils.py 69% 110 119-124 129-136 140-158 171-175 189-192
census_historical_migration/workbooklib/federal_awards.py 77% 181-184 262-301 487 554-562 572-597 621-622 918-1022
census_historical_migration/workbooklib/findings.py 69% 81-86 154-174 179-199 216-218 330-354
census_historical_migration/workbooklib/findings_text.py 46% 50-51 67 97-129 138-160
census_historical_migration/workbooklib/notes_to_sefa.py 66% 34-38 101-102 104-105 107-108 136-143 152-160 168-170 279-326
census_historical_migration/workbooklib/post_upload_utils.py 21% 22-35 66-83 89-111
census_historical_migration/workbooklib/secondary_auditors.py 88% 128-130 186-205
census_historical_migration/workbooklib/workbook_builder.py 38% 14-17 26-41
census_historical_migration/workbooklib/workbook_builder_loader.py 41% 18-30
config/error_handlers.py 94% 22
config/test_settings.py 92% 33-34 49-50
config/urls.py 72% 88
dissemination/file_downloads.py 81% 43-61 91-93
dissemination/forms.py 87% 135 144 255
dissemination/search.py 88% 113 115 119 127-128
dissemination/summary_reports.py 78% 274 300-302 306-310 421 438 459 511-575 603 638-640 664-672
dissemination/test_search.py 93% 51-66 473-474 579-596 608-632 644-669 677-693
dissemination/test_summary_reports.py 98%
dissemination/views.py 75% 134 140-142 159-225 268 298 300 336 387 389 391 469-474
dissemination/migrations/0002_general_fac_accepted_date.py 47% 10-12
dissemination/searchlib/search_alns.py 37% 44-58 78-110 115-177 184-187
dissemination/searchlib/search_direct_funding.py 86% 21-22
dissemination/searchlib/search_findings.py 76% 18-24 34 36 38
dissemination/searchlib/search_general.py 96% 138
dissemination/searchlib/search_passthrough_name.py 35% 21-31
djangooidc/backends.py 78% 32 57-63
djangooidc/exceptions.py 66% 19 21 23 28
djangooidc/oidc.py 16% 32-35 45-51 64-70 92-149 153-199 203-226 230-275 280-281 286
djangooidc/views.py 81% 22 43 109-110 117
djangooidc/tests/common.py 97%
report_submission/forms.py 92% 35
report_submission/test_views.py 98% 835
report_submission/views.py 79% 97 250 274-275 280-281 321-491 494-504 563 599-601 609-610 613-615
report_submission/templatetags/get_attr.py 76% 8 11-14 18
support/admin.py 88% 76 79 84 91-97 100-102
support/cog_over.py 91% 29-32 92 116-120 156
support/test_admin_api.py 81% 23 147-148 237-238 317-318
support/test_cog_over.py 98% 174-175 264
support/management/commands/seed_cog_baseline.py 98% 20-21
support/models/cog_over.py 89% 103-104
tools/update_program_data.py 89% 96
users/admin.py 99% 27
users/auth.py 96% 58-59
users/models.py 96% 18 74-75
users/fixtures/user_fixtures.py 91%

Minimum allowed coverage is 85%

Generated by 🐒 cobertura-action against e00d9db

Labels: autogenerated (Automated pull request creation), automerge (Used for automated deployments)
Projects: None yet
2 participants