
How to enable o1? #651

Closed
LesnoyChelovek opened this issue Dec 18, 2024 · 1 comment
@LesnoyChelovek

I can't connect to the o1 models. The following errors appear:
```
Unknown encoding gpt-3.5-turbo. Plugins found: ['tiktoken_ext.openai_public']
Traceback (most recent call last):
  File "/home/ubuntu/chatgpt-telegram-bot/bot/openai_helper.py", line 672, in __count_tokens
    encoding = tiktoken.encoding_for_model(model)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tiktoken/model.py", line 103, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tiktoken/model.py", line 90, in encoding_name_for_model
    raise KeyError(
KeyError: 'Could not automatically map o1-mini to a tokeniser. Please use tiktoken.get_encoding to explicitly get the tokeniser you expect.'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/chatgpt-telegram-bot/bot/openai_helper.py", line 241, in __common_get_chat_response
    token_count = self.__count_tokens(self.conversations[chat_id])
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/bot/openai_helper.py", line 674, in __count_tokens
    encoding = tiktoken.get_encoding("gpt-3.5-turbo")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tiktoken/registry.py", line 68, in get_encoding
    raise ValueError(
ValueError: Unknown encoding gpt-3.5-turbo. Plugins found: ['tiktoken_ext.openai_public']

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ubuntu/chatgpt-telegram-bot/bot/telegram_bot.py", line 798, in prompt
    await wrap_with_indicator(update, context, _reply, constants.ChatAction.TYPING)
  File "/home/ubuntu/chatgpt-telegram-bot/bot/utils.py", line 100, in wrap_with_indicator
    await asyncio.wait_for(asyncio.shield(task), 4.5)
  File "/usr/lib/python3.12/asyncio/tasks.py", line 520, in wait_for
    return await fut
           ^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/telegram/ext/_application.py", line 1184, in __create_task_callback
    raise exception
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/telegram/ext/_application.py", line 1161, in __create_task_callback
    return await coroutine  # type: ignore[misc]
           ^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/bot/telegram_bot.py", line 770, in _reply
    response, total_tokens = await self.openai.get_chat_response(chat_id=chat_id, query=prompt)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/bot/openai_helper.py", line 145, in get_chat_response
    response = await self.__common_get_chat_response(chat_id, query)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/_asyncio.py", line 142, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/_asyncio.py", line 58, in call
    do = await self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/_asyncio.py", line 110, in iter
    result = await action(retry_state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/_asyncio.py", line 78, in inner
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/__init__.py", line 390, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/home/ubuntu/chatgpt-telegram-bot/venv/lib/python3.12/site-packages/tenacity/_asyncio.py", line 61, in call
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/chatgpt-telegram-bot/bot/openai_helper.py", line 292, in __common_get_chat_response
    raise Exception(f"⚠️ {localized_text('error', bot_language)}. ⚠️\n{str(e)}") from e
Exception: ⚠️ An error has occurred. ⚠️
```
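The chained traceback shows the underlying problem: this tiktoken release cannot map `o1-mini` to an encoding, and the fallback in `__count_tokens` then passes a model name (`"gpt-3.5-turbo"`) to `tiktoken.get_encoding()`, which only accepts encoding names such as `cl100k_base`. Below is a minimal sketch of a more defensive lookup; the helper name `safe_encoding_for` is illustrative, and the fix that later landed on the main branch may differ.

```python
import tiktoken

def safe_encoding_for(model: str) -> tiktoken.Encoding:
    """Resolve a tokenizer for `model`, tolerating models tiktoken doesn't know yet."""
    try:
        # Works when tiktoken already knows the model name.
        return tiktoken.encoding_for_model(model)
    except KeyError:
        # Unknown model: fall back to an explicit *encoding* name, never a model name.
        try:
            return tiktoken.get_encoding("o200k_base")   # used by o1/gpt-4o, if this tiktoken version has it
        except ValueError:
            return tiktoken.get_encoding("cl100k_base")  # available in older tiktoken releases

# Example: approximate token count for a prompt sent to o1-mini
print(len(safe_encoding_for("o1-mini").encode("Hello, o1!")))
```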

@n3d1117 (Owner) commented Dec 28, 2024

This should be fixed now in the main branch.

n3d1117 closed this as completed on Dec 28, 2024