openai.InternalServerError: Error code: 502 #43
Comments
Hi, have you tried other models?
Hmm, that's confusing. I thought this was using a local LLM; why would it use OpenAI's stuff?
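For context: chat-with-mlx reuses the OpenAI Python client but points it at a local server, which is why a purely local setup still raises openai.* errors. A minimal sketch of that pattern (the port, API key, and model name here are illustrative assumptions, not the project's actual values):

from openai import OpenAI

# The client talks to the local MLX server instead of api.openai.com.
# base_url, api_key, and model below are assumptions for illustration.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed-locally",  # placeholder; a local server typically ignores it
)

response = client.chat.completions.create(
    model="local-model",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)

So a 502 here means the local server, not OpenAI, answered with a bad-gateway error.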
Got the same error.
Very strange: I did not change anything, and now it works normally.
Hi everyone, the code is updated; you can try the manual installation now. I have not done the release yet, so you can't do
Hi guys! I got the same error too, and figured out that it happens when port 8080 is already in use by another program. I closed the other program and restarted chat-with-mlx. It works well now.
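If you suspect the same port clash, here is a quick self-contained check (a sketch; 8080 is the port mentioned above):

import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    # connect_ex returns 0 when something is already listening on the port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        return sock.connect_ex((host, port)) == 0

if port_in_use(8080):
    print("Port 8080 is busy; close the other program before starting chat-with-mlx.")
else:
    print("Port 8080 is free.")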
I have already loaded the model, and I still get this error:
Traceback (most recent call last):
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/queueing.py", line 495, in call_prediction
output = await route_utils.call_process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/route_utils.py", line 235, in call_process_api
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/blocks.py", line 1627, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/blocks.py", line 1185, in call_function
prediction = await utils.async_iteration(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 640, in asyncgen_wrapper
response = await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/chat_interface.py", line 490, in _stream_fn
first_response = await async_iteration(generator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 507, in anext
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 490, in run_sync_iterator_async
return next(iterator)
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/chat_with_mlx/app.py", line 166, in chatbot
response = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
return self._post(
^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1200, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502
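Note the repeated _request/_retry_request frames: the client already retried before surfacing the 502, so the local server is reachable on the port but failing to serve the request. If you want the app to fail gracefully while debugging, a hedged sketch of catching the error (base_url, port, and model name are assumptions):

import openai
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="unused")

try:
    response = client.chat.completions.create(
        model="local-model",  # hypothetical model identifier
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)
except openai.InternalServerError as err:
    # A 5xx from the local server: it is listening but could not serve the
    # request, e.g. the model is not actually loaded or the backend crashed.
    print(f"Local server error ({err.status_code}); check that the model finished loading.")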