Dataset test UI improvements #5766
Conversation
Here are some self-review comments, let me know if you have any questions!
New component with color-coded logs and auto-scrolling as new logs are available
from fides.api.models.privacy_request import PrivacyRequest, PrivacyRequestStatus
from fides.api.models.privacy_request import PrivacyRequest
You'll notice a lot of these files just update the imports. I was running into cyclic dependency issues with some Pydantic schemas that were being defined under the models directory. This is a somewhat noisy change, but moving these schemas under the schemas directory instead felt like the right thing to do in the long term.
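For illustration, the resulting import split might look like the lines below; the exact target module under the schemas package is an assumption, not taken from the diff.

# Illustrative only: the ORM model stays under models, while the status enum
# (a Pydantic/enum schema) now lives under the schemas package.
from fides.api.models.privacy_request import PrivacyRequest
from fides.api.schemas.privacy_request import PrivacyRequestStatus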
with logger.contextualize(
    privacy_request_id=privacy_request.id,
    privacy_request_source=privacy_request.source.value,
):
This is another key change. There are instances where we can't use the @log_context decorator because we're not passing in any params of type PrivacyRequest from which the ID and source can be extracted automatically. In those cases we use a manual with logger.contextualize context manager instead.
with logger.contextualize(
    privacy_request_source=(
        privacy_request.source.value if privacy_request.source else None
    )
):
The diff here looks a bit weird, but I'm just wrapping this whole block in a with logger.contextualize context manager.
src/fides/api/util/cache.py
New push_encoded_object and get_decoded_list helpers on FidesopsCache
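From the names, a plausible reading is "JSON-encode and push onto a Redis list" and "read that list back, decoding each entry". Here is a minimal sketch under that assumption; the class name, the use of a plain Redis list, and the signatures are guesses rather than the PR's actual code.

import json
from typing import Any, List

import redis


class FidesopsCacheSketch:
    """Illustrative subset of the cache wrapper; the real FidesopsCache differs."""

    def __init__(self, client: redis.Redis) -> None:
        self.client = client

    def push_encoded_object(self, key: str, obj: Any) -> None:
        # JSON-encode the object and append it to the Redis list stored at `key`.
        self.client.rpush(key, json.dumps(obj))

    def get_decoded_list(self, key: str) -> List[Any]:
        # Read every entry in the list at `key` and decode it back into Python objects.
        return [json.loads(item) for item in self.client.lrange(key, 0, -1)]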
        self.cache = get_cache()

    def __call__(self, message: Message) -> None:
        """Write log message to Redis if conditions are met."""
Key change - This is what filters the test privacy request logs and writes them to Redis
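To illustrate the filtering idea, here is a minimal, self-contained sketch of a callable loguru sink; it is not the PR's actual RedisSink. The log_{privacy_request_id} key format comes from the PR description, while the "Dataset test" source value, the bare redis client, and the redis_log_sink name are assumptions.

import json

import redis
from loguru import logger

# Stand-in client; fides uses its own configured connection via get_cache().
redis_client = redis.Redis()


def redis_log_sink(message) -> None:
    """Keep only logs emitted for dataset-test privacy requests and push them to Redis."""
    record = message.record
    extra = record["extra"]

    privacy_request_id = extra.get("privacy_request_id")
    source = extra.get("privacy_request_source")
    # "Dataset test" is an assumed source value for test runs.
    if not privacy_request_id or source != "Dataset test":
        return

    # Append a structured entry to the list keyed by log_{privacy_request_id}.
    redis_client.rpush(
        f"log_{privacy_request_id}",
        json.dumps(
            {
                "timestamp": record["time"].isoformat(),
                "level": record["level"].name,
                "message": record["message"],
            }
        ),
    )


# Any callable can be registered as a loguru sink.
logger.add(redis_log_sink)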
# Add Redis sink if Redis is enabled
if config.redis.enabled:
    redis_sink = RedisSink()
    handlers.extend(
        create_handler_dicts(
            level=config.logging.level,
            include_called_from=config.dev_mode,
            sink=redis_sink,
            serialize=True,  # Always serialize for Redis
            colorize=False,  # Redis doesn't need colorization
        )
    )
This is where we attach the new RedisSink to the existing logger configuration.
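For context, a stripped-down illustration of how handler dicts end up on the logger; the real create_handler_dicts helper builds richer dicts (format, filter, etc.), and the file-sink path here is made up.

from loguru import logger


def configure_logging(redis_sink) -> None:
    # Each dict is applied as one logger.add(**handler) call by loguru.
    handlers = [
        {"sink": "fides.log", "level": "INFO"},  # regular file sink
        {"sink": redis_sink, "level": "INFO", "serialize": True, "colorize": False},
    ]
    logger.configure(handlers=handlers)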
            return

        # Ensure we have a Redis connection
        self._ensure_cache()
This was my attempt at lazy loading. I was having issues during server startup because the cache and logging setup had a bit of a cyclic dependency.
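A sketch of the lazy-connection idea, with assumed names: deferring get_cache() until the sink first fires avoids touching the Redis connection while the logging configuration itself is still being set up.

from fides.api.util.cache import get_cache  # module shown in the diff above


class RedisSinkLazy:
    """Sketch of deferring the Redis connection until the sink is first called."""

    def __init__(self) -> None:
        # Don't call get_cache() here: logging is configured very early at startup,
        # before the cache is safe to initialize.
        self.cache = None

    def _ensure_cache(self) -> None:
        if self.cache is None:
            self.cache = get_cache()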
log_entry = LogEntry(
    timestamp=record["time"].isoformat(),
    level=record["level"].name,
    module_info=module_info,
    message=record["message"],
)
Storing the logs as separate parts so we can format the timestamp and color the log components separately in the UI
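For reference, a sketch of what the LogEntry container above could be, assuming a Pydantic model with plain string fields (the actual definition may differ):

from pydantic import BaseModel


class LogEntry(BaseModel):
    timestamp: str    # ISO-8601 string, so the UI can re-format it for display
    level: str        # "INFO", "ERROR", ... drives the color coding in the UI
    module_info: str  # where the log came from, rendered separately from the text
    message: str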
Codecov Report
Attention: Patch coverage is 90.65%.
❌ Your patch check has failed because the patch coverage (90.65%) is below the target coverage (100.00%). You can increase the patch coverage or adjust the target coverage.

Additional details and impacted files

@@            Coverage Diff             @@
##             main    #5766      +/-   ##
==========================================
- Coverage   87.18%   87.18%   -0.01%
==========================================
  Files         394      394
  Lines       24439    24528      +89
  Branches     2643     2655      +12
==========================================
+ Hits        21307    21384      +77
- Misses       2555     2563       +8
- Partials      577      581       +4

☔ View full report in Codecov by Sentry.
Legit!
Closes LJ-348

Description Of Changes
This PR adds a "Test logs" section to the dataset test UI.


Code Changes
- Logs for test privacy requests are written to Redis under the log_{privacy_request_id} cache key
- New GET /privacy_request/{id}/logs endpoint to retrieve the logs (a rough sketch follows below)
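A rough sketch of what such an endpoint could look like in FastAPI. The route's real prefix, auth dependencies, response model, and function name are not shown in this PR excerpt and are assumed here; get_decoded_list is the new cache helper discussed in the review thread above.

from fastapi import APIRouter

from fides.api.util.cache import get_cache

router = APIRouter()


@router.get("/privacy_request/{privacy_request_id}/logs")
def get_test_privacy_request_logs(privacy_request_id: str) -> list:
    """Return the structured log entries cached for a test privacy request."""
    cache = get_cache()
    return cache.get_decoded_list(f"log_{privacy_request_id}")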
Steps to Confirm
- Set FIDES__SECURITY__DSR_TESTING_TOOLS_ENABLED=true in your .env file
- Run nox -s dev -- postgres (you might need to run nox -s build if you get an error)
- Choose Postgres and add a Postgres integration to the postgres_example database
- Run a dataset test using [email protected] as the identity
Pre-Merge Checklist
- CHANGELOG.md updated
- downgrade() migration is correct and works