
Implement Accuracy Drop for Robustness and Bias Tests #1125

Open
chakravarthik27 opened this issue Sep 23, 2024 · 0 comments · May be fixed by #1129
Assignees
Labels
⭐ Feature Indicates new feature requests

Comments

@chakravarthik27 (Collaborator) commented:

This implementation involves comparing the ground truth vs. expected result and the ground truth vs. actual result, where the actual result is derived from a perturbed version of the original text. The goal is to measure the accuracy drop that occurs due to robustness and bias. For more information, refer to the LangTest robustness tests.
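The proposed metric can be sketched roughly as follows. This is a minimal illustration of the idea described above, not LangTest's actual API; the function names and label values are assumptions made for the example.

```python
# Illustrative sketch of the proposed accuracy-drop metric.
# Names here (accuracy, accuracy_drop) are hypothetical, not LangTest's API.

def accuracy(ground_truth, predictions):
    """Fraction of predictions that match the ground truth."""
    matches = sum(g == p for g, p in zip(ground_truth, predictions))
    return matches / len(ground_truth)

def accuracy_drop(ground_truth, expected_results, actual_results):
    """Accuracy on original text minus accuracy on perturbed text.

    expected_results: model outputs on the original (unperturbed) text
    actual_results:   model outputs on the perturbed text
    """
    return (accuracy(ground_truth, expected_results)
            - accuracy(ground_truth, actual_results))

# Example: one extra prediction flips under perturbation (e.g. uppercased input)
truth    = ["POS", "NEG", "POS", "NEG"]
expected = ["POS", "NEG", "POS", "POS"]   # 3/4 correct on original text
actual   = ["POS", "NEG", "NEG", "POS"]   # 2/4 correct on perturbed text
print(accuracy_drop(truth, expected, actual))  # 0.75 - 0.50 = 0.25
```

A larger drop indicates that the perturbation (robustness test) or demographic substitution (bias test) degrades the model more severely.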

@chakravarthik27 chakravarthik27 self-assigned this Sep 23, 2024
@chakravarthik27 chakravarthik27 added the ⭐ Feature Indicates new feature requests label Oct 4, 2024