TypeError: sequence item 1: expected str instance, NoneType found #1541
Comments
hi
(myenv) F:\for open interpreter>interpreter --version
Loading qwen2.5-coder:14b... Traceback (most recent call last):
I am facing the same issue, here are the reproducible steps:
Then choose uv version = 0.5.6 (installed with Homebrew). Never mind, I found the solution in another issue; run
Thanks for your reply.
Describe the bug
Loading qwen2.5-coder:14b...
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in run_code
File "C:\Python312\Scripts\interpreter.exe_main.py", line 7, in
File "C:\Python312\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 612, in main
start_terminal_interface(interpreter)
File "C:\Python312\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 471, in start_terminal_interface
interpreter = profile(
^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 64, in profile
return apply_profile(interpreter, profile, profile_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 148, in apply_profile
exec(profile["start_script"], scope, scope)
File "", line 1, in
File "C:\Python312\Lib\site-packages\interpreter\core\core.py", line 145, in local_setup
self = local_setup(self)
^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\terminal_interface\local_setup.py", line 314, in local_setup
interpreter.computer.ai.chat("ping")
File "C:\Python312\Lib\site-packages\interpreter\core\computer\ai\ai.py", line 134, in chat
for chunk in self.computer.interpreter.llm.run(messages):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\core\llm\llm.py", line 86, in run
self.load()
File "C:\Python312\Lib\site-packages\interpreter\core\llm\llm.py", line 397, in load
self.interpreter.computer.ai.chat("ping")
File "C:\Python312\Lib\site-packages\interpreter\core\computer\ai\ai.py", line 134, in chat
for chunk in self.computer.interpreter.llm.run(messages):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\core\llm\llm.py", line 322, in run
yield from run_tool_calling_llm(self, params)
File "C:\Python312\Lib\site-packages\interpreter\core\llm\run_tool_calling_llm.py", line 178, in run_tool_calling_llm
for chunk in llm.completions(**request_params):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\core\llm\llm.py", line 466, in fixed_litellm_completions
raise first_error # If all attempts fail, raise the first error
^^^^^^^^^^^^^^^^^
File "C:\Python312\Lib\site-packages\interpreter\core\llm\llm.py", line 443, in fixed_litellm_completions
yield from litellm.completion(**params)
File "C:\Python312\Lib\site-packages\litellm\llms\ollama.py", line 427, in ollama_completion_stream
raise e
File "C:\Python312\Lib\site-packages\litellm\llms\ollama.py", line 403, in ollama_completion_stream
response_content = "".join(content_chunks)
^^^^^^^^^^^^^^^^^^^^^^^
TypeError: sequence item 1: expected str instance, NoneType found
(langchain) C:\Windows\System32>
I want to use the Ollama model qwen2.5-coder:14b with Open Interpreter, but it fails with: TypeError: sequence item 1: expected str instance, NoneType found.
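For context, the crash happens while litellm joins the streamed chunk contents (the ollama.py line 403 frame in the traceback): if any chunk's content is None, str.join raises exactly this TypeError. The snippet below is a minimal illustration plus a defensive workaround; the chunk values and the filter are assumptions for illustration, not litellm's actual data or the upstream fix.

# Minimal sketch of the failure mode; these chunk values are hypothetical,
# not litellm's actual streaming output.
content_chunks = ["Hello", None, " world"]

try:
    response_content = "".join(content_chunks)  # same join as in the traceback
except TypeError as e:
    print(e)  # sequence item 1: expected str instance, NoneType found

# Assumed workaround: drop non-string chunks before joining.
response_content = "".join(c for c in content_chunks if isinstance(c, str))
print(response_content)  # Hello world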
Reproduce
1. Run interpreter --local
2. Select Ollama
3. Select the model qwen2.5-coder:14b
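For completeness, the same selection can be approximated through Open Interpreter's Python API instead of the interactive picker. This is a sketch based on the project's documented settings; the attribute names and the ollama/ model prefix are assumptions taken from the docs, not verified against 0.4.3.

# Sketch: approximate "interpreter --local" with Ollama and qwen2.5-coder:14b
# via the Python API. Attribute names follow the Open Interpreter docs for
# 0.4.x and are assumptions.
from interpreter import interpreter

interpreter.offline = True                           # stay fully local
interpreter.llm.model = "ollama/qwen2.5-coder:14b"   # litellm-style Ollama model id
interpreter.llm.api_base = "http://localhost:11434"  # default Ollama endpoint
interpreter.chat("ping")                             # should hit the same path as the local_setup "ping" check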
Expected behavior
I expect qwen2.5-coder:14b to load and run without errors.
Screenshots
Open Interpreter version
0.4.3
Python version
3.12.5
Operating System name and version
Windows 11
Additional context