LiteLLM as a service #1048
-
Hello, we are currently looking for an AI abstraction layer that includes features such as user management and rate limiting. We have heard that LiteLLM is a great fit for our needs. However, our team is not familiar with the Python stack. Can these same features be accessed via a Docker container?
-
Hey @sfiodorov Does the Docker image not work? https://github.com/BerriAI/litellm/blob/main/Dockerfile
Here are the docs: https://docs.litellm.ai/docs/proxy/quick_start#quick-start-docker-image-github-container-registry
If you're looking for a hosted solution, I'd recommend checking out Helicone.
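For context, the proxy in that Docker image exposes an OpenAI-compatible HTTP API, so clients don't need any Python on their side. Below is a minimal sketch of calling a locally running container with the OpenAI Python SDK, assuming the proxy listens on port 4000 as in the linked quick start; the API key and model name are placeholders for whatever you configure in the proxy's config.

```python
# Sketch: talk to a LiteLLM proxy container running locally on port 4000.
# Assumptions: port, key, and model alias are placeholders from the quick start,
# not values guaranteed by this thread.
import openai

client = openai.OpenAI(
    api_key="sk-1234",               # a LiteLLM virtual/master key, not an upstream provider key
    base_url="http://0.0.0.0:4000",  # the OpenAI-compatible endpoint exposed by the container
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whichever model alias is configured in the proxy's config.yaml
    messages=[{"role": "user", "content": "Hello from behind the proxy"}],
)
print(response.choices[0].message.content)
```

Since it is plain HTTP, the same call works from any language or tool (curl, TypeScript, etc.), which is the point of running it as a container.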
-
Hi @sfiodorov, we'd love to help. Here's a link to our calendar if you'd like to meet and discuss this: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
-
Hey, that sounds good.
Hey @sfiodorov, we now support client budget limits: https://docs.litellm.ai/docs/proxy/virtual_keys#set-budgets
Let me know if this is what you were looking for. I'm also happy to hop on a call to discuss.
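For reference, a minimal sketch of what that budget flow looks like against a running proxy, assuming a local container on port 4000, a placeholder master key, and the `/key/generate` endpoint with a `max_budget` field as described on the linked virtual-keys page (check the docs for your proxy version):

```python
# Sketch: generate a virtual key with a spend cap via the LiteLLM proxy's key API.
# Assumptions: host/port, master key, model name, and budget value are placeholders;
# endpoint and field names follow the virtual_keys docs linked above.
import requests

resp = requests.post(
    "http://0.0.0.0:4000/key/generate",
    headers={"Authorization": "Bearer sk-1234"},  # proxy master key (placeholder)
    json={
        "models": ["gpt-3.5-turbo"],  # models this key is allowed to call
        "max_budget": 10,             # spend cap for this key
    },
)
print(resp.json())  # returns the generated virtual key to hand out to a client team
```

Each team can then be given its own budgeted key, without writing any Python themselves.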
Attaching my socials so we can set up a dedicated support channel for your team: