
Model Hallucinations #30

Open
sshivaditya2019 opened this issue Nov 2, 2024 · 0 comments · May be fixed by #31

sshivaditya2019 (Collaborator) commented:

Model outputs often fail to use the provided context, producing hallucinations and other artifacts. This should not happen: the model should ground its answers in the relevant retrieved information and retrieve only what it actually needs.
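One possible mitigation, sketched below purely as an illustration (this is not the project's actual retrieval code, and all function and variable names here are hypothetical): score each candidate context chunk against the query and drop low-relevance chunks before assembling the prompt, so the model is handed only context it is likely to use. The sketch uses a simple token-overlap (Jaccard) score as a stand-in for whatever relevance measure the project adopts.

```python
# Illustrative sketch only: filter retrieved chunks by relevance to the query
# before they reach the prompt, so irrelevant context cannot seed hallucinations.

def relevance(query: str, chunk: str) -> float:
    """Jaccard overlap between the query's and chunk's token sets."""
    q, c = set(query.lower().split()), set(chunk.lower().split())
    return len(q & c) / len(q | c) if q | c else 0.0

def filter_context(query: str, chunks: list[str], threshold: float = 0.1) -> list[str]:
    """Keep only chunks whose relevance score clears the threshold."""
    return [ch for ch in chunks if relevance(query, ch) >= threshold]

chunks = [
    "The payment service retries failed webhooks three times.",
    "Our office dog is named Biscuit.",
]
kept = filter_context("how does the payment service retry webhooks", chunks)
# The off-topic second chunk is dropped; only the webhook chunk remains.
```

A real fix would likely swap the Jaccard score for embedding similarity or a reranker, but the shape is the same: gate context on relevance rather than passing everything retrieved into the prompt.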
