openai.InternalServerError: Error code: 502 #43

Open
yjc11 opened this issue Mar 7, 2024 · 6 comments
Comments

@yjc11

yjc11 commented Mar 7, 2024

I have already loaded the model.
[screenshot showing the loaded model]

Traceback (most recent call last):
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/queueing.py", line 495, in call_prediction
output = await route_utils.call_process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/route_utils.py", line 235, in call_process_api
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/blocks.py", line 1627, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/blocks.py", line 1185, in call_function
prediction = await utils.async_iteration(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 640, in asyncgen_wrapper
response = await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/chat_interface.py", line 490, in _stream_fn
first_response = await async_iteration(generator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 507, in anext
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/gradio/utils.py", line 490, in run_sync_iterator_async
return next(iterator)
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/chat_with_mlx/app.py", line 166, in chatbot
response = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
return self._post(
^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1200, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/anaconda3/envs/mlx-chat/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502
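
Note: the traceback shows the openai package being used purely as an HTTP client against a local OpenAI-compatible endpoint, so a 502 here usually means the local server behind that endpoint is not responding. A minimal health-check sketch follows; the base_url, port, and dummy API key are assumptions about the default setup, not the project's exact configuration:

```python
# Health-check sketch (assumptions: the local OpenAI-compatible server runs at
# http://localhost:8080/v1 and ignores the API key; adjust to your setup).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

try:
    models = client.models.list()  # GET /v1/models against the local server
    print("server is up, models:", [m.id for m in models.data])
except Exception as exc:
    # If this also fails (connection error or 5xx), the MLX server behind the
    # port never started, or something else is occupying the port.
    print("local server not reachable:", exc)
```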

@qnguyen3
Owner

qnguyen3 commented Mar 8, 2024

Hi, have you tried other models?

@RedemptionC

Hmm, that's confusing. I thought this was using a local LLM, so why does it use the OpenAI client?
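
For context, as I understand it (not an official statement from the project): chat-with-mlx serves the local MLX model behind an OpenAI-compatible HTTP server, and the openai package is only the client that talks to that local endpoint, so no request leaves your machine. A rough sketch of what the call in chat_with_mlx/app.py amounts to; the base_url, port, and model name are assumptions:

```python
# Sketch only: the "OpenAI" client is pointed at the local server, so errors
# like openai.InternalServerError come from localhost, not from OpenAI.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",  # whatever model the local server has loaded
    messages=[{"role": "user", "content": "hello"}],
    stream=True,
)
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```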

@nealsun

nealsun commented Mar 11, 2024

I got the same error.

@yjc11
Author

yjc11 commented Mar 11, 2024

Very strange. I didn't do anything, and now it works normally.

@qnguyen3
Owner

Hi everyone, the code is updated. You can try the manual installation now; I haven't done the release yet, so you can't do pip install -U chat-with-mlx yet, but soon!

@undefinedv

Hi guys! I got the same error too, and figured out that it happens when port 8080 is already used by another program. I closed the other program and restarted chat-with-mlx, and it works well now.
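
If it helps anyone else, here is a small pre-flight check along those lines; a sketch assuming chat-with-mlx still uses port 8080 by default:

```python
# Check whether the port chat-with-mlx wants is already taken before launching it.
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        # connect_ex returns 0 when something is already listening on the port
        return sock.connect_ex((host, port)) == 0

if port_in_use(8080):
    print("port 8080 is already in use - close that program before starting chat-with-mlx")
else:
    print("port 8080 is free")
```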
