When changing models mid conversation does LibreChat send the whole conversation history to the newly selected model? #3999
-
This comment suggests that the entire history of a conversation counts towards the token count. This means the cost of the next message sent in a conversation grows non-linearly with each additional message, which can spiral out of hand for really long conversations. @danny-avila, in this context, would a Token Counter be helpful?
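To illustrate the non-linear growth: if every request resends the full history, the cumulative tokens billed over a conversation grow roughly quadratically with the number of messages. A minimal sketch, where `estimate_tokens` is a hypothetical stand-in for a real tokenizer (roughly 4 characters per token), not LibreChat's actual counter:

```python
# Hypothetical token estimator (~4 characters per token); a real
# deployment would use the model's tokenizer (e.g. tiktoken).
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def cumulative_prompt_tokens(messages: list[str]) -> int:
    """Total prompt tokens billed across a conversation where each
    request resends the entire history accumulated so far."""
    total = 0
    history = 0
    for msg in messages:
        history += estimate_tokens(msg)
        total += history  # each request pays for the whole history so far
    return total

# 10 messages of ~150 tokens each: only ~1,500 tokens of content,
# but the sum of prompts billed is far larger.
messages = ["hello world " * 50] * 10
print(cumulative_prompt_tokens(messages))  # → 8250
```

Doubling the number of messages roughly quadruples the cumulative prompt tokens, which is why long conversations get expensive fast.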
Replies: 1 comment 1 reply
-
Of course, and there are many ways you can mitigate this. For example, you can use a preset to keep the context window intentionally at a limit you are comfortable with; otherwise, it will use the known limit for the model you have selected. For gpt-4o, this is a max of 128K tokens.
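A context-window limit implies trimming of the kind sketched below: drop the oldest messages until what remains fits the budget. This is a hedged illustration, not LibreChat's actual implementation; `estimate_tokens` is the same hypothetical ~4-characters-per-token helper as above.

```python
# Hypothetical token estimator (~4 characters per token).
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def trim_to_budget(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages that fit within max_tokens,
    dropping the oldest ones first."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):  # walk newest to oldest
        cost = estimate_tokens(msg)
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

# Three ~10-token messages with a 25-token budget: the oldest is dropped.
history = ["a" * 40, "b" * 40, "c" * 40]
print(trim_to_budget(history, 25))  # keeps the two newest messages
```

Capping the context this way keeps per-request cost bounded, at the price of the model no longer seeing the earliest messages.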