
Dataset test UI improvements #5766

Merged — 23 commits merged into main from LJ-348-dataset-test-ui-improvements on Feb 22, 2025

Conversation

galvana (Contributor) commented Feb 15, 2025

Closes LJ-348

Description Of Changes

This PR adds a Test logs section to the dataset test UI.
[Screenshot: test-logs]

Code Changes

  • Adds a new log destination (a "sink") that filters for logs that belong to test privacy requests. Each log entry for a given privacy request is stored in a list under the Redis key log_{privacy_request_id}
  • Adds a new GET /privacy_request/{id}/logs endpoint to retrieve the logs
  • Adds a new test logs component to the front-end
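
The per-request storage scheme described above (one Redis list per test privacy request, keyed `log_{privacy_request_id}`) can be sketched as follows. This is a minimal, hypothetical illustration: `FakeRedis`, `push_log`, and `get_logs` are illustrative names standing in for the PR's actual cache helpers, and the in-memory class mimics only the two Redis list commands needed here.

```python
import json


class FakeRedis:
    """In-memory stand-in for the Redis list commands used by the scheme.

    A real implementation would call redis-py's rpush/lrange instead.
    """

    def __init__(self):
        self._lists = {}

    def rpush(self, key, value):
        # Append to the end of the list stored at key, creating it if needed
        self._lists.setdefault(key, []).append(value)

    def lrange(self, key, start, end):
        # Redis uses -1 to mean "through the last element"
        items = self._lists.get(key, [])
        stop = len(items) if end == -1 else end + 1
        return items[start:stop]


def log_key(privacy_request_id: str) -> str:
    # Key scheme described in the PR: log_{privacy_request_id}
    return f"log_{privacy_request_id}"


def push_log(cache, privacy_request_id: str, entry: dict) -> None:
    # Each log entry is serialized and appended to the request's list
    cache.rpush(log_key(privacy_request_id), json.dumps(entry))


def get_logs(cache, privacy_request_id: str) -> list:
    # The GET endpoint would decode the whole list back into entries
    return [
        json.loads(raw)
        for raw in cache.lrange(log_key(privacy_request_id), 0, -1)
    ]
```

Because each request gets its own key, fetching logs for one test request never scans other requests' entries.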

Steps to Confirm

  1. Set FIDES__SECURITY__DSR_TESTING_TOOLS_ENABLED=true in your .env file
  2. Start Fides by running nox -s dev -- postgres (you might need to run nox -s build if you get an error)
  3. Create a new system called Postgres and add a Postgres integration to the postgres_example database
Host: host.docker.internal
Port: 6432
Username: postgres
Password: postgres
Database: postgres_example
  4. Create and attach the postgres_example_test_dataset.txt dataset
  5. Navigate to http://localhost:3000/systems/configure/postgres/test-datasets. (Fides Plus exposes a button for this from the integration config, but the direct URL lets us test in open-source Fides.)
  6. Select the default policy and set the email to [email protected]
  7. Click Run; the request should succeed and the logs should appear at the bottom

Pre-Merge Checklist

  • Issue requirements met
  • All CI pipelines succeeded
  • CHANGELOG.md updated
    • Add a db-migration label to the entry if your change includes a DB migration
    • Add a high-risk label to the entry if your change includes a high-risk change (i.e. potential for performance impact or unexpected regression) that should be flagged
  • Followup issues:
    • Followup issues created (include link)
    • No followup issues
  • Database migrations:
    • Ensure that your downrev is up to date with the latest revision on main
    • Ensure that your downgrade() migration is correct and works
      • If a downgrade migration is not possible for this change, please call this out in the PR description!
    • No migrations
  • Documentation:
    • Documentation complete, PR opened in fidesdocs
    • Documentation issue created in fidesdocs
    • If there are any new client scopes created as part of the pull request, remember to update public-facing documentation that references our scope registry
    • No documentation updates required

vercel bot commented Feb 15, 2025

1 Skipped Deployment
fides-plus-nightly — ⬜️ Ignored, updated Feb 22, 2025 1:25am (UTC)

@galvana galvana marked this pull request as ready for review February 20, 2025 22:33
galvana (Contributor, Author) left a comment:

Here are some self-review comments, let me know if you have any questions!

galvana (Contributor, Author) commented:

New component with color-coded logs and auto-scrolling as new logs are available

from fides.api.models.privacy_request import PrivacyRequest, PrivacyRequestStatus
from fides.api.models.privacy_request import PrivacyRequest
galvana (Contributor, Author) commented:

You'll notice a lot of these files just update the imports. I was running into cyclic dependency issues with some Pydantic schemas that were being defined under the models directory. It's a noisy change, but moving these schemas under the schema directory instead seemed like the right thing to do in the long term.

Comment on lines +2183 to +2186
with logger.contextualize(
    privacy_request_id=privacy_request.id,
    privacy_request_source=privacy_request.source.value,
):
galvana (Contributor, Author) commented:

This is another key change. There are instances where we can't use the @log_context decorator because we're not passing in any params of type PrivacyRequest that we can automatically extract the ID and source from. In those cases we use a manual with logger.contextualize context
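
The pattern above can be illustrated with a minimal sketch of what `contextualize` does. This is not loguru itself: `contextualize` and `log` below are hypothetical stand-ins built on `contextvars`, showing only the semantics relied on here, namely that fields bound inside the `with` block travel with every log record emitted in that block.

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Context-local storage for the currently bound log fields
_context: ContextVar[dict] = ContextVar("log_context", default={})


@contextmanager
def contextualize(**fields):
    """Bind extra fields for the duration of a block (loguru-style sketch)."""
    merged = {**_context.get(), **fields}
    token = _context.set(merged)
    try:
        yield
    finally:
        # Restore the previous context on exit, even if the block raised
        _context.reset(token)


def log(message: str) -> dict:
    # A real sink would serialize this record; here we just return it
    return {"message": message, "extra": dict(_context.get())}
```

A sink (like the Redis sink in this PR) can then inspect `record["extra"]` to decide whether a given log line belongs to a test privacy request.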

Comment on lines +301 to +305
with logger.contextualize(
    privacy_request_source=(
        privacy_request.source.value if privacy_request.source else None
    )
):
galvana (Contributor, Author) commented:

The diff here looks a bit weird, but I'm just wrapping this whole thing in a with logger.contextualize block

galvana (Contributor, Author) commented:

New push_encoded_object and get_decoded_list helpers on FidesopsCache

        self.cache = get_cache()

    def __call__(self, message: Message) -> None:
        """Write log message to Redis if conditions are met."""
galvana (Contributor, Author) commented:

Key change - This is what filters the test privacy request logs and writes them to Redis
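
The filtering described here can be sketched as follows. Note the hedges: loguru actually passes sinks a `Message` object whose `.record` attribute holds the structured data, and the real sink writes to Redis; this sketch takes a plain record dict and collects into an in-memory dict, and `TestRequestSink` and the `"dataset_test"` source value are illustrative, not the PR's exact names.

```python
import json


class TestRequestSink:
    """Sketch of a filtering sink: keep only records whose bound context
    marks them as belonging to a test privacy request."""

    def __init__(self):
        # Stands in for Redis lists keyed log_{privacy_request_id}
        self.written = {}

    def __call__(self, record: dict) -> None:
        extra = record.get("extra", {})
        request_id = extra.get("privacy_request_id")
        # Drop anything that is not a test privacy request log
        if not request_id or extra.get("privacy_request_source") != "dataset_test":
            return
        key = f"log_{request_id}"
        self.written.setdefault(key, []).append(
            json.dumps({"message": record["message"]})
        )
```

Because the filter reads fields bound via `logger.contextualize` (or the `@log_context` decorator), any log emitted inside a test request's context is captured without the call sites needing to know about the sink.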

Comment on lines +177 to +188
# Add Redis sink if Redis is enabled
if config.redis.enabled:
    redis_sink = RedisSink()
    handlers.extend(
        create_handler_dicts(
            level=config.logging.level,
            include_called_from=config.dev_mode,
            sink=redis_sink,
            serialize=True,  # Always serialize for Redis
            colorize=False,  # Redis doesn't need colorization
        )
    )
galvana (Contributor, Author) commented:

This is where we attach the new RedisSink to the existing logger configuration

        return

    # Ensure we have a Redis connection
    self._ensure_cache()
galvana (Contributor, Author) commented:

This was my attempt at lazy loading. I was having issues during server startup because the cache and the logging setup had some cyclic dependencies
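
The lazy-loading idea can be sketched like this (a hypothetical illustration, not the PR's code: `LazyCacheSink` and `cache_factory` are illustrative names). The point is that nothing connects to the cache at construction time, so the logger can be configured during startup before Redis is reachable; the connection is made on the first actual write.

```python
class LazyCacheSink:
    """Defer cache creation until the first write, breaking the
    startup-time cycle between logging setup and cache setup."""

    def __init__(self, cache_factory):
        self._cache_factory = cache_factory
        self._cache = None  # no connection yet

    def _ensure_cache(self):
        # Create the cache client on first use only
        if self._cache is None:
            self._cache = self._cache_factory()
        return self._cache

    def write(self, key: str, value: str) -> None:
        self._ensure_cache().setdefault(key, []).append(value)
```

Repeated writes reuse the same client; the factory runs exactly once.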

Comment on lines +65 to +70
log_entry = LogEntry(
    timestamp=record["time"].isoformat(),
    level=record["level"].name,
    module_info=module_info,
    message=record["message"],
)
galvana (Contributor, Author) commented:

Storing the logs as separate parts so we can format the timestamp and color the log components separately in the UI


codecov bot commented Feb 20, 2025

Codecov Report

Attention: Patch coverage is 90.65744% with 27 lines in your changes missing coverage. Please review.

Project coverage is 87.18%. Comparing base (0c225a1) to head (d7d4d9f).

File                                                  | Patch % | Missing lines
.../service/privacy_request/request_runner_service.py | 84.88%  | 9 Missing and 4 partials ⚠️
src/fides/api/task/execute_request_tasks.py           | 89.18%  | 3 Missing and 1 partial ⚠️
src/fides/api/util/logger.py                          | 91.17%  | 1 Missing and 2 partials ⚠️
.../api/api/v1/endpoints/privacy_request_endpoints.py | 86.66%  | 1 Missing and 1 partial ⚠️
...ce/connectors/query_configs/manual_query_config.py | 0.00%   | 2 Missing ⚠️
src/fides/service/dataset/dataset_service.py          | 87.50%  | 2 Missing ⚠️
src/fides/api/util/cache.py                           | 90.00%  | 0 Missing and 1 partial ⚠️

❌ Your patch check has failed because the patch coverage (90.65%) is below the target coverage (100.00%). You can increase the patch coverage or adjust the target coverage.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #5766      +/-   ##
==========================================
- Coverage   87.18%   87.18%   -0.01%     
==========================================
  Files         394      394              
  Lines       24439    24528      +89     
  Branches     2643     2655      +12     
==========================================
+ Hits        21307    21384      +77     
- Misses       2555     2563       +8     
- Partials      577      581       +4     


@galvana galvana requested a review from JadeCara February 21, 2025 16:18
JadeCara (Contributor) left a comment:

Legit!

@galvana galvana added the run unsafe ci checks Runs fides-related CI checks that require sensitive credentials label Feb 22, 2025
@galvana galvana merged commit 92c6866 into main Feb 22, 2025
40 of 43 checks passed
@galvana galvana deleted the LJ-348-dataset-test-ui-improvements branch February 22, 2025 07:38