
Custom LLM integration with Voice Agent #1034

You might want to check how you are setting your configuration for the custom model.

Our Docs give an example here: https://developers.deepgram.com/docs/voice-agent-llm-models#passing-a-custom-llm-through-a-cloud-provider

There is also a working demo of doing this with Azure OpenAI, which might help you: https://github.com/deepgram-devs/voice-agent-azure-open-ai-services/blob/1ad0eee7d99cff521da0de9abaf247ad24a4e1a6/client.py#L41

That said, if you are running your LLM locally, this may be a bit more challenging, as I have not tried that setup yet.
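To make the docs link above concrete, here is a minimal sketch of the kind of Settings message you would send over the agent WebSocket to point the "think" step at a custom LLM. The exact field names, the placeholder ngrok URL, the model name, and the bearer token are assumptions for illustration; check the linked docs page for the authoritative schema.

```python
import json

# Illustrative Settings message for the Voice Agent WebSocket.
# The nesting (agent -> think -> provider/endpoint), the URL, and the
# model name below are placeholders, not a verified schema.
settings = {
    "type": "Settings",
    "agent": {
        "think": {
            "provider": {
                "type": "open_ai",           # OpenAI-compatible custom LLM
                "model": "my-custom-model",  # hypothetical model name
            },
            "endpoint": {
                "url": "https://example.ngrok.app/chat",   # your public endpoint
                "headers": {"authorization": "Bearer <your-key>"},
            },
        }
    },
}

# Serialized as JSON before sending over the WebSocket:
payload = json.dumps(settings)
```

The linked client.py shows the same idea against Azure OpenAI, with the endpoint URL and headers swapped for Azure's values.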

I've set up my custom LLM REST endpoint at /chat on localhost, exposed it publicly using ngrok, and configured the agent:
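For a setup like the one described above, a sketch of what that local /chat endpoint might look like follows, assuming the agent posts OpenAI-style chat-completion requests and expects the same format back (an assumption, not confirmed by the thread). The echo-style reply, port, and response fields are placeholders; a real server would invoke the local LLM instead.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_completion(reply_text, model="my-local-model"):
    """Shape the response like an OpenAI chat completion -- assumed here
    to be the format the agent expects from a custom endpoint."""
    return {
        "id": "chatcmpl-local",
        "object": "chat.completion",
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply_text},
            "finish_reason": "stop",
        }],
    }

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/chat":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Stub: echo the last user message. A real server would run
        # the local LLM here instead.
        last_user = next(
            (m["content"] for m in reversed(request.get("messages", []))
             if m.get("role") == "user"),
            "",
        )
        body = json.dumps(build_completion(f"You said: {last_user}")).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve locally before exposing via ngrok:
# HTTPServer(("127.0.0.1", 8000), ChatHandler).serve_forever()
```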

Replies: 4 comments

Answer selected by deepgram-community