
show error #62

Open · liuyf90 opened this issue Mar 14, 2024 · 4 comments

liuyf90 commented Mar 14, 2024

I installed the package with pip on my M2 MacBook Air, but after entering a message and clicking submit, the following error is reported in the background:

```
❯ chat-with-mlx -h
You try to use a model that was created with version 2.4.0.dev0, however, your version is 2.4.0. This might cause unexpected behavior or errors. In that case, try to update to the latest version.

Running on local URL: http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/queueing.py", line 495, in call_prediction
    output = await route_utils.call_process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/route_utils.py", line 235, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/blocks.py", line 1627, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/blocks.py", line 1185, in call_function
    prediction = await utils.async_iteration(iterator)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
    return await iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 640, in asyncgen_wrapper
    response = await iterator.__anext__()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/chat_interface.py", line 490, in _stream_fn
    first_response = await async_iteration(generator)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
    return await iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 507, in __anext__
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 490, in run_sync_iterator_async
    return next(iterator)
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/chat_with_mlx/app.py", line 166, in chatbot
    response = client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
    return self._post(
           ^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 503
```
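For context: per the traceback, the failing call in chat_with_mlx/app.py (line 166) is a standard OpenAI-style chat completion, so the 503 comes from whatever server the client is pointed at, not from OpenAI itself. Below is a minimal sketch of that calling pattern, assuming a local OpenAI-compatible server; the base URL, placeholder key, and model id are illustrative, not taken from app.py:

```python
from openai import OpenAI

# Assumed setup: the app uses the openai client only to talk to a
# local OpenAI-compatible server, so the key is a placeholder and
# the base URL points at localhost (port 8080 is an assumption).
client = OpenAI(api_key="EMPTY", base_url="http://127.0.0.1:8080/v1")

# Mirrors the client.chat.completions.create(...) call in the
# traceback; it raises openai.InternalServerError (503) when the
# server is reachable but cannot serve the requested model.
stream = client.chat.completions.create(
    model="mlx-community/example-model",  # hypothetical model id
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```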

makevin23 commented

Try replacing `EMPTY` with your OpenAI API key on line 41 of app.py.
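For clarity, a sketch of what that edit would look like, assuming line 41 of app.py is where the OpenAI client is constructed (the actual code isn't shown in this thread, and the base URL below is hypothetical):

```python
from openai import OpenAI

# Before (hypothetical line 41): placeholder key for a local server
client = OpenAI(api_key="EMPTY", base_url="http://127.0.0.1:8080/v1")

# After: a real OpenAI API key in place of the placeholder
client = OpenAI(api_key="sk-...", base_url="http://127.0.0.1:8080/v1")
```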

liuyf90 (Author) commented Mar 20, 2024

> Try replacing `EMPTY` with your OpenAI API key on line 41 of app.py.

[screenshot: new error message]

It shows this error now.

qnguyen3 (Owner) commented

Hi @liuyf90, if you are chatting with a file, you need to specify whether you are chatting with a PDF or a YouTube video. Secondly, make sure your model is loaded.

WhiteNotWhite commented

> Hi @liuyf90, if you are chatting with a file, you need to specify whether you are chatting with a PDF or a YouTube video. Secondly, make sure your model is loaded.

> Try replacing `EMPTY` with your OpenAI API key on line 41 of app.py.

Hi @qnguyen3, why do we need to set an OpenAI API key for a local deployment?
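One plausible explanation, hedged since the maintainer doesn't answer this in the thread: the openai package is used here purely as an HTTP client for the local model server, and the client requires some non-empty api_key string even when the server ignores it, which is why `EMPTY` works as a placeholder. A quick way to verify the local server is reachable, with the port an assumption:

```python
from openai import OpenAI

# For a local OpenAI-compatible server, the key only needs to be a
# non-empty string; "EMPTY" is a common placeholder. Port 8080 is
# an assumption, not taken from app.py.
client = OpenAI(api_key="EMPTY", base_url="http://127.0.0.1:8080/v1")

# Fails fast with a connection or server error if nothing is serving.
print(client.models.list())
```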
