Please follow the issue template to update the title and description of your issue.
I followed your configuration:
Step 1: Create a file named `docker-compose.yml` and fill in the following content.
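A minimal sketch of what such a compose file could look like, assuming the Higress all-in-one image and the official NextChat image (`yidadaa/chatgpt-next-web`); the image tags, port mappings, and environment variable names other than `DEFAULT_AI_SERVICE` are assumptions, not the original configuration:

```yaml
services:
  higress:
    image: higress-registry.cn-hangzhou.cr.aliyuncs.com/higress/all-in-one:latest
    environment:
      - DEFAULT_AI_SERVICE=qwen        # which LLM backend the AI Proxy plugin targets by default
      - DASHSCOPE_API_KEY=sk-xxxxxxxx  # placeholder for your Tongyi Qwen (DashScope) API key
    ports:
      - "8080:8080"   # gateway entry that NextChat talks to
      - "8001:8001"   # Higress console (plugin configuration UI)
  nextchat:
    image: yidadaa/chatgpt-next-web:latest
    environment:
      - BASE_URL=http://higress:8080   # route NextChat's OpenAI-style requests through the gateway
      - OPENAI_API_KEY=unused          # any non-empty value; real credentials live on the gateway side
    ports:
      - "3000:3000"
```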
Step 2: Run the following command in the command line to start the docker compose project.
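Assuming Docker with the Compose plugin is installed:

```bash
# Run from the directory that contains docker-compose.yml;
# -d starts the containers in the background.
docker compose up -d
```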
Step 3: Access http://localhost:3000/ in your browser to open the NextChat page.
Step 4: Click the model settings button at the far right of the conversation input box toolbar, switch the model to `gpt-4-turbo`, and close the window. Because Higress's AI Proxy plugin (you can access http://localhost:8001 to view the plugin configuration) maps `gpt-4-turbo` to `qwen-max`, the model actually serving your requests here is `qwen-max`.
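As a rough illustration, the plugin's model mapping could look like the following (the field names follow the AI Proxy plugin's configuration schema; the token value is a placeholder):

```yaml
provider:
  type: qwen                     # serve OpenAI-style requests with Tongyi Qwen
  apiTokens:
    - "YOUR_DASHSCOPE_API_KEY"   # placeholder
  modelMapping:
    'gpt-4-turbo': "qwen-max"    # requests for gpt-4-turbo are actually answered by qwen-max
```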
Done! Now you can have a conversation with AI.
If you follow the above instructions, by default you will be conversing with Tongyi Qwen (通义千问). If you want to switch to OpenAI ChatGPT, change `DEFAULT_AI_SERVICE=qwen` in the file to `DEFAULT_AI_SERVICE=openai`, add OpenAI's API key configuration, and then restart the docker compose project, as sketched below.
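An illustrative sketch of the environment change (the OpenAI key variable name here is a hypothetical placeholder, not confirmed from the original file):

```yaml
environment:
  - DEFAULT_AI_SERVICE=openai    # was: DEFAULT_AI_SERVICE=qwen
  - OPENAI_API_KEY=sk-xxxxxxxx   # hypothetical variable name; supply your real OpenAI API key
```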