Decoding parameters (e.g., temperature) for Gemma-2? #64
Hello, how should I set the decoding parameters (e.g., temperature) for Gemma-2? My result is about 50.0, far from the reported benchmark of 76.

Comments
Hi, please refer to the parameters in this config file: https://github.com/tatsu-lab/alpaca_eval/blob/main/src/alpaca_eval/models_configs/gemma-2-9b-it-SimPO/configs.yaml
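For orientation, here is a minimal sketch of the general shape of an alpaca_eval model config and where the decoding parameters live. The key layout follows the usual alpaca_eval models_configs convention, but the values and paths below are illustrative assumptions, not the official gemma-2-9b-it-SimPO settings; the linked configs.yaml is authoritative.

```yaml
# Illustrative sketch only -- key names follow the typical alpaca_eval model-config layout,
# but the values here are placeholders, not the official gemma-2-9b-it-SimPO settings.
gemma-2-9b-it-SimPO:
  prompt_template: "gemma-2-9b-it-SimPO/prompt.txt"   # chat prompt template (assumed path)
  fn_completions: "huggingface_local_completions"     # backend used to generate model outputs
  completions_kwargs:
    model_name: "princeton-nlp/gemma-2-9b-it-SimPO"   # HF model id (assumed)
    max_new_tokens: 4096                              # placeholder value
    temperature: 0.7                                  # placeholder; check the linked file for the real value
    do_sample: true                                   # keep sampling enabled if the reference config does
  pretty_name: "Gemma-2-9b-it-SimPO"
```

Note that with a Hugging Face generation backend, dropping do_sample: true falls back to greedy decoding regardless of the temperature value, which by itself can shift the AlpacaEval score.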
Hi, I also met this problem; I only got the WR/LC as follows. Here is my evaluation config (Gemma-2-Aligned-simpo); the only difference is that I removed "do_sample: true". I reviewed your config and your conversation with the AlpacaEval author on GitHub, and now I'm quite confused. Thank you~
Maybe you used alpaca_eval_gpt4_turbo_fn. With that setting, the result is close to the one you reported.