diff --git a/docs/reference/providers/backend.md b/docs/reference/providers/backend.md
index 17436a4..6024066 100644
--- a/docs/reference/providers/backend.md
+++ b/docs/reference/providers/backend.md
@@ -12,6 +12,7 @@ Currently, we have a total of 8 backends available:
 - [Google Gemini](https://ai.google.dev/docs/gemini_api_overview)
 - [Hugging Face](https://huggingface.co)
 - [LocalAI](https://github.com/go-skynet/LocalAI)
+- [Oracle Cloud Infrastructure (OCI) Generative AI](https://www.oracle.com/artificial-intelligence/generative-ai/generative-ai-service/)
 - FakeAI
 
 ## OpenAI
@@ -149,6 +150,23 @@ LocalAI is a local model, which is an OpenAI compatible API. It uses llama.cpp a
 k8sgpt analyze --explain --backend localai
 ```
 
+## Oracle Cloud Infrastructure (OCI) Generative AI
+
+[Oracle Cloud Infrastructure (OCI)](https://www.oracle.com/cloud/) Generative AI is a fully managed OCI service that provides a set of state-of-the-art, customizable large language models.
+K8sgpt can be configured to use ready-to-use pretrained models, or your own fine-tuned custom models trained on your own data and hosted on dedicated AI clusters.
+
+To authenticate with OCI, create an [OCI SDK/CLI](https://docs.oracle.com/en-us/iaas/Content/API/Concepts/sdkconfig.htm) `config` file in your home directory's `.oci/` directory.
+
+Next, configure the OCI backend for a given model within an OCI compartment:
+```bash
+k8sgpt auth add --backend oci --model <model_ocid> --compartmentId <compartment_ocid>
+```
+
+Analyze using the OCI backend:
+```bash
+k8sgpt analyze --explain --backend oci
+```
+
 ## FakeAI
 
 FakeAI or the NoOpAiProvider might be useful in situations where you need to test a new feature or simulate the behaviour of an AI based-system without actually invoking it. It can help you with local development, testing and troubleshooting.
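
To make the authentication step in the new OCI section more concrete, here is a minimal, hypothetical sketch of the `~/.oci/config` file it refers to. It assumes the standard OCI SDK `DEFAULT` profile keys (`user`, `fingerprint`, `tenancy`, `region`, `key_file`); every OCID, fingerprint, region, and path below is a placeholder to be replaced with your own values.

```bash
# Hypothetical sketch: create a minimal OCI SDK/CLI config file.
# All values are placeholders; substitute your own user/tenancy OCIDs,
# API key fingerprint, region, and private key path.
mkdir -p ~/.oci

cat > ~/.oci/config <<'EOF'
[DEFAULT]
user=ocid1.user.oc1..<unique_user_id>
fingerprint=<api_key_fingerprint>
tenancy=ocid1.tenancy.oc1..<unique_tenancy_id>
region=us-ashburn-1
key_file=~/.oci/oci_api_key.pem
EOF

# Keep the config file private, as OCI tooling expects.
chmod 600 ~/.oci/config
```

After running the `k8sgpt auth add` command from the section above, `k8sgpt auth list` should show `oci` among the configured providers, which is a quick way to confirm the backend was registered before running an analysis.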