
feat: add Groq support as alternative LLM provider #19


Open

droongta-groq wants to merge 2 commits into main

Conversation


@droongta-groq commented Jan 25, 2025

Description

This PR adds support for using Groq as an alternative LLM provider alongside OpenAI. Users can now choose between OpenAI and Groq by setting an environment variable.

Changes

  • Add Groq SDK integration
  • Add environment variable support for model selection (MODEL_PROVIDER); see the sketch after this list
  • Add type definitions for environment configuration
  • Update documentation with Groq setup instructions
  • Add model configuration for both providers
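
For illustration, a minimal sketch of what the provider switch could look like, assuming the openai and groq-sdk packages; the file name, function name, and API-key variable names are placeholders, not the PR's actual code:

```ts
// llmClient.ts (hypothetical) — selects the chat client from MODEL_PROVIDER.
// Only MODEL_PROVIDER is named in this PR; everything else here is illustrative.
import OpenAI from "openai";
import Groq from "groq-sdk";

type ModelProvider = "openai" | "groq";

// OpenAI stays the default when MODEL_PROVIDER is unset, preserving existing behavior.
const provider: ModelProvider =
  (process.env.MODEL_PROVIDER as ModelProvider) ?? "openai";

export function getLLMClient(): OpenAI | Groq {
  if (provider === "groq") {
    // groq-sdk mirrors the OpenAI chat.completions interface, so call sites change little.
    return new Groq({ apiKey: process.env.GROQ_API_KEY });
  }
  return new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
}
```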

Testing

  • Tested with OpenAI provider
  • Tested with Groq provider
  • Verified environment variable switching
  • Tested deployment with both providers

Additional Notes

The implementation maintains backward compatibility while adding the flexibility to choose between LLM providers. The default provider remains OpenAI if not specified.
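
A rough idea of the environment typing such a change might add (the exact shape in the PR may differ; variable names other than MODEL_PROVIDER are assumptions):

```ts
// env.d.ts (hypothetical) — types the environment configuration.
declare namespace NodeJS {
  interface ProcessEnv {
    /** "openai" (default when unset) or "groq" */
    MODEL_PROVIDER?: "openai" | "groq";
    OPENAI_API_KEY?: string;
    GROQ_API_KEY?: string;
  }
}
```

Keeping MODEL_PROVIDER optional is what lets existing deployments continue to work unchanged.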

Related Issues

N/A

- Add Groq SDK integration
- Add environment variable support for model selection
- Update documentation with Groq setup instructions
- Add type definitions for environment variables

vercel bot commented Jan 25, 2025

Someone is attempting to deploy a commit to the browserbase Team on Vercel.

A member of the Team first needs to authorize it.

@MrlolDev

Maybe change Llama 3.3 to DeepSeek?

@droongta-groq (Author)

Changed Llama to DeepSeek :)

@dikkietrom

add openrouter pls
