
Commit

Merge branch 'master' into PRWLR-6061-create-sharepoint-service
danibarranqueroo committed Feb 27, 2025
2 parents 18895d5 + 1180522 commit 171bd4d
Showing 77 changed files with 2,767 additions and 275 deletions.
26 changes: 25 additions & 1 deletion .env
@@ -3,7 +3,7 @@
# For production, it is recommended to use a secure method to store these variables and change the default secret keys.

#### Prowler UI Configuration ####
PROWLER_UI_VERSION="latest"
PROWLER_UI_VERSION="stable"
SITE_URL=http://localhost:3000
API_BASE_URL=http://prowler-api:8080/api/v1
NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
@@ -30,6 +30,30 @@ VALKEY_HOST=valkey
VALKEY_PORT=6379
VALKEY_DB=0

# API scan settings

# The path to the directory where scan output should be stored
DJANGO_TMP_OUTPUT_DIRECTORY = "/tmp/prowler_api_output"

# The maximum number of findings to process in a single batch
DJANGO_FINDINGS_BATCH_SIZE = 1000

# The AWS access key to be used when uploading scan output to an S3 bucket
# If left empty, default AWS credentials resolution behavior will be used
DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID=""

# The AWS secret key to be used when uploading scan output to an S3 bucket
DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY=""

# An optional AWS session token
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN=""

# The AWS region where your S3 bucket is located (e.g., "us-east-1")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION=""

# The name of the S3 bucket where scan output should be stored
DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET=""

# Django settings
DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1,prowler-api
DJANGO_BIND_ADDRESS=0.0.0.0
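
For reference, here is a minimal sketch of how these `DJANGO_OUTPUT_S3_*` variables could be consumed when uploading a report to S3; the function name and the boto3-based flow are illustrative assumptions, not the API's actual implementation:

```python
import os

import boto3  # assumed dependency for this sketch


def upload_scan_output(local_path: str, key: str) -> None:
    """Upload a compressed scan report to the configured S3 bucket (illustrative)."""
    bucket = os.environ["DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET"]

    # Empty strings are treated as "not set" so that boto3 falls back to its
    # default credential resolution chain (env vars, shared config, instance role).
    session = boto3.session.Session(
        aws_access_key_id=os.environ.get("DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID") or None,
        aws_secret_access_key=os.environ.get("DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY") or None,
        aws_session_token=os.environ.get("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN") or None,
        region_name=os.environ.get("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION") or None,
    )
    session.client("s3").upload_file(local_path, bucket, key)
```
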
4 changes: 4 additions & 0 deletions .github/labeler.yml
@@ -98,3 +98,7 @@ compliance:
- any-glob-to-any-file: "prowler/compliance/**"
- any-glob-to-any-file: "prowler/lib/outputs/compliance/**"
- any-glob-to-any-file: "tests/lib/outputs/compliance/**"

review-django-migrations:
- changed-files:
- any-glob-to-any-file: "api/src/backend/api/migrations/**"
2 changes: 2 additions & 0 deletions api/CHANGELOG.md
@@ -8,10 +8,12 @@ All notable changes to the **Prowler API** are documented in this file.

### Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906)
- Add API scan report system: all scans launched from the API now generate a compressed file with the report in OCSF, CSV, and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878).
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874)

### Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019).
- Changed `findings.uid` field type from `varchar(300)` to `text` [(#7048)](https://github.com/prowler-cloud/prowler/pull/7048).

---

63 changes: 63 additions & 0 deletions api/README.md
@@ -269,3 +269,66 @@ poetry shell
cd src/backend
pytest
```

# Custom commands

Django provides a way to create custom commands that can be run from the command line.

> These commands can be found in `prowler/api/src/backend/api/management/commands`.

To run a custom command, change to the `prowler/api/src/backend` directory and run:

```console
poetry shell
python manage.py <command_name>
```
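
As a reference, a minimal sketch of what such a command might look like, following the standard Django `BaseCommand` structure; the `hello` command and its `--times` option are hypothetical and only illustrate the layout of files under `api/management/commands`:

```python
# api/src/backend/api/management/commands/hello.py (hypothetical example)
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Print a greeting a given number of times."

    def add_arguments(self, parser):
        parser.add_argument("--times", type=int, default=1)

    def handle(self, *args, **options):
        for _ in range(options["times"]):
            self.stdout.write(self.style.SUCCESS("Hello from a custom command"))
```

It would then be invoked as `python manage.py hello --times 3` from the same directory.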

## Generate dummy data

```console
python manage.py findings --tenant <TENANT_ID> --findings <NUM_FINDINGS> --resources <NUM_RESOURCES> --batch <TRANSACTION_BATCH_SIZE> --alias <ALIAS>
```

This command creates, for a given tenant, a provider, a scan, and a set of related findings and resources.

> Scan progress and state are updated in real time.
> - 0-33%: Create resources.
> - 33-66%: Create findings.
> - 66%: Create resource-finding mapping.
>
> The last step is required to access the finding details, since the UI needs the mapping to display all the information.

### Example

```console
~/backend $ poetry run python manage.py findings --tenant fffb1893-3fc7-4623-a5d9-fae47da1c528 --findings 25000 --resources 1000 --batch 5000 --alias test-script

Starting data population
Tenant: fffb1893-3fc7-4623-a5d9-fae47da1c528
Alias: test-script
Resources: 1000
Findings: 25000
Batch size: 5000


Creating resources...
100%|███████████████████████| 1/1 [00:00<00:00, 7.72it/s]
Resources created successfully.


Creating findings...
100%|███████████████████████| 5/5 [00:05<00:00, 1.09s/it]
Findings created successfully.


Creating resource-finding mappings...
100%|███████████████████████| 5/5 [00:02<00:00, 1.81it/s]
Resource-finding mappings created successfully.


Successfully populated test data.
```
2 changes: 1 addition & 1 deletion api/docker-entrypoint.sh
@@ -28,7 +28,7 @@ start_prod_server() {

start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,deletion -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}

start_worker_beat() {
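
For context, a hedged sketch of how a task could be sent to the new `scan-reports` queue so that the worker command above consumes it; the task name, arguments, and body are illustrative, not the API's actual report task:

```python
from celery import shared_task


@shared_task(name="illustrative-generate-scan-report")
def generate_scan_report(scan_id: str) -> None:
    # Build the OCSF/CSV/HTML report for the given scan (details omitted in this sketch).
    ...


# Route the call explicitly to the queue added to the worker's -Q list above.
generate_scan_report.apply_async(kwargs={"scan_id": "example-scan-id"}, queue="scan-reports")
```
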
30 changes: 23 additions & 7 deletions api/poetry.lock

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions api/pyproject.toml
@@ -52,6 +52,7 @@ pytest-randomly = "3.15.0"
pytest-xdist = "3.6.1"
ruff = "0.5.0"
safety = "3.2.9"
tqdm = "4.67.1"
vulture = "2.14"

[tool.poetry.scripts]
45 changes: 27 additions & 18 deletions api/src/backend/api/decorators.py
@@ -7,7 +7,7 @@
from api.db_utils import POSTGRES_TENANT_VAR, SET_CONFIG_QUERY


def set_tenant(func):
def set_tenant(func=None, *, keep_tenant=False):
"""
Decorator to set the tenant context for a Celery task based on the provided tenant_id.
@@ -40,20 +40,29 @@ def some_task(arg1, **kwargs):
# The tenant context will be set before the task logic executes.
"""

@wraps(func)
@transaction.atomic
def wrapper(*args, **kwargs):
try:
tenant_id = kwargs.pop("tenant_id")
except KeyError:
raise KeyError("This task requires the tenant_id")
try:
uuid.UUID(tenant_id)
except ValueError:
raise ValidationError("Tenant ID must be a valid UUID")
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])

return func(*args, **kwargs)

return wrapper
def decorator(func):
@wraps(func)
@transaction.atomic
def wrapper(*args, **kwargs):
try:
if not keep_tenant:
tenant_id = kwargs.pop("tenant_id")
else:
tenant_id = kwargs["tenant_id"]
except KeyError:
raise KeyError("This task requires the tenant_id")
try:
uuid.UUID(tenant_id)
except ValueError:
raise ValidationError("Tenant ID must be a valid UUID")
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])

return func(*args, **kwargs)

return wrapper

if func is None:
return decorator
else:
return decorator(func)
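
To clarify the new optional-argument form, a brief usage sketch; the task names and bodies are hypothetical, but the decorator semantics follow the code above:

```python
from celery import shared_task

from api.decorators import set_tenant


@shared_task
@set_tenant
def delete_provider(provider_id: str):
    # tenant_id was popped from kwargs; only the database tenant context is set.
    ...


@shared_task
@set_tenant(keep_tenant=True)
def generate_report(scan_id: str, tenant_id: str):
    # tenant_id stays in kwargs, so the task body can also use it directly.
    ...
```

In both cases the caller still passes `tenant_id` as a keyword argument, e.g. `delete_provider.delay(provider_id="...", tenant_id="...")`.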