Can we run Kaito on self-managed k8s (k3s, minikube, etc.) running on a Windows PC with an Nvidia GPU? #876
Unanswered
ajmal-yazdani asked this question in Q&A
Replies: 1 comment
-
Yes, you can. Install the workspace controller by following this guide.
If you have already installed another node provisioning controller that supports the Karpenter-core APIs, the gpu-provisioner installation steps can be skipped.
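As a rough sketch, installing the workspace controller from the in-repo Helm chart typically looks like the following; the repository URL, chart path, and namespace are assumptions based on the Kaito repository layout and may differ between versions, so defer to the official installation guide:

```shell
# Clone the Kaito repository (URL and layout may differ; see the official guide).
git clone https://github.com/kaito-project/kaito.git
cd kaito

# Install the workspace controller from the in-repo Helm chart.
# If another Karpenter-core-compatible node provisioner is already running
# in the cluster, the gpu-provisioner chart can be skipped entirely.
helm install workspace ./charts/kaito/workspace \
  --namespace kaito-workspace --create-namespace
```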
-
The comment below suggests that it's quite possible, but how? Is there an article explaining how to do this?
#452
Fei-Guo (Collaborator, Author) on Aug 14, 2024
Hey all, just a quick question: will this feature enhancement include self-hosted Kubernetes? I checked a few places but wasn't sure, so I figured this could be the right place to ask whether it's being considered.
The consideration is that some in the self-hosting community, home labs, companies, etc., need the LLMs to run locally.
You can run Kaito in self-managed k8s if you have already added GPU nodes to the cluster (with the proper GPU driver and k8s device plugin installed). In this case, you can just add those nodes as preferredNodes in the Resource spec of the Kaito workspace CR. Kaito will skip provisioning GPU nodes and run the inference workload on the existing nodes.
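For illustration, a Workspace CR pinned to existing GPU nodes might look like the sketch below. The node name, label selector, and preset are placeholders, and the exact API version and field names should be checked against the Kaito version in use:

```yaml
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-falcon-7b
resource:
  labelSelector:
    matchLabels:
      apps: falcon-7b
  # Existing GPU nodes (GPU driver + k8s device plugin already installed).
  # When preferredNodes are supplied, Kaito skips GPU node provisioning
  # and schedules the inference workload onto these nodes.
  preferredNodes:
    - gpu-node-1        # placeholder node name
inference:
  preset:
    name: falcon-7b     # placeholder model preset
```

With a CR like this applied, no node auto-provisioner (gpu-provisioner or Karpenter) is exercised, which is what makes the self-managed k3s/minikube scenario work.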