[Feature] Automatic Chat Template #947
base: main
Conversation
Just thinking out loud a bit... but could structured state (in some form -- not necessarily the rough implementation in my open PR) allow us to call the tokenizer's chat template directly? It might be arbitrarily hard to make that work, but again, just thinking out loud ;)
I believe so, yes. There may be a few edge cases (e.g. what to do when a 'system' prompt isn't available: some templates will error, others quietly prepend it to the first 'user' prompt), but that should be far more reliable than this entire approach.
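For context, the alternative being discussed here is rendering the tokenizer's own Jinja2 template via `apply_chat_template`. A minimal sketch of the 'system' edge case might look like the following; the model id is illustrative only, and which branch you hit depends entirely on that model's template:

```python
from transformers import AutoTokenizer

# A conversation that starts with a 'system' turn; how it is handled
# depends entirely on the model's own Jinja2 chat template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# Substitute any chat model you have access to; this id is just an example.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

try:
    text = tokenizer.apply_chat_template(
        messages,
        tokenize=False,              # return the rendered string, not token ids
        add_generation_prompt=True,  # append the assistant role header
    )
    print(text)
except Exception as err:
    # Some templates raise (via jinja2) when they see a role they do not
    # support; others silently fold the system text into the first user turn.
    print(f"Template rejected the conversation: {err}")
```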
Codecov Report

@@            Coverage Diff             @@
##             main     #947      +/-   ##
==========================================
- Coverage   56.45%   50.77%    -5.68%
==========================================
  Files          63       63
  Lines        4793     4823      +30
==========================================
- Hits         2706     2449     -257
- Misses       2087     2374     +287

☔ View full report in Codecov by Sentry.
Hi hudson, I would like to understand your approach. Would you mind pointing me to the relevant part of the code in your open PR, or giving me an example? Thank you very much. fan
Create a very basic function to extract a guidance.ChatTemplate from a given Transformers Tokenizer. This can be extended over time as we discover new and exciting tokenisers. However, this will have to be balanced against the probability that it can never be fully general: Transformers uses Jinja2 templates, which have all sorts of goodies like loops and branches.
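As a rough illustration of that extraction idea (not the PR's actual code; the registry and function names below are assumptions), the lookup could compare the tokenizer's raw chat_template string against a small registry of templates that guidance already knows how to model, falling back to a default when nothing matches:

```python
from transformers import PreTrainedTokenizerBase

# Hypothetical registry (placeholder names, not guidance's actual code):
# maps a known raw Jinja2 chat_template string to the ChatTemplate class
# that reproduces its behaviour.
KNOWN_CHAT_TEMPLATES: dict[str, type] = {
    # llama2_template_string: Llama2ChatTemplate,
    # chatml_template_string: ChatMLChatTemplate,
}


def chat_template_from_tokenizer(
    tokenizer: PreTrainedTokenizerBase,
    default: type,
) -> type:
    """Very basic lookup: exact-match the tokenizer's Jinja2 chat template.

    Jinja2 templates can contain loops and branches, so a fully general
    translation is unlikely; anything unrecognised falls back to `default`.
    """
    jinja_template = getattr(tokenizer, "chat_template", None)
    if jinja_template is None:
        return default
    return KNOWN_CHAT_TEMPLATES.get(jinja_template, default)
```

New tokenisers can then be supported over time by adding entries to the registry, which is the "extended over time" part of the description above.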