
On Mac, run in container ...
if podman is configured with krunkit, or if Docker is installed.

Signed-off-by: Daniel J Walsh <[email protected]>
rhatdan committed Oct 11, 2024
1 parent 2066a8a commit f024055
Showing 2 changed files with 34 additions and 2 deletions.
7 changes: 6 additions & 1 deletion docs/ramalama.1.md
@@ -26,6 +26,11 @@ behaviour.

RamaLama supports multiple AI model registry types called transports. Supported transports:

Note:

On Macs with Arm support and Podman, the Podman machine must be
configured to use the krunkit VM Type. This allows the Mac's GPU to be
used within the VM.
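
The probe behind this note is the one the new `ai_support_in_vm()` helper in `ramalama/cli.py` runs (see the second file below). As a minimal illustrative sketch, not part of this commit, the same check could be done directly with the standard library:

```python
import subprocess


def podman_vm_type() -> str:
    """Return the VMType reported for the configured Podman machine, e.g. "krunkit"."""
    out = subprocess.run(
        ["podman", "machine", "list", "--format", "{{ .VMType }}"],
        capture_output=True,
        text=True,
        check=True,
    )
    return out.stdout.strip()


# "krunkit" indicates the Podman machine can expose the Mac's GPU inside the VM.
print(podman_vm_type() == "krunkit")
```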

## TRANSPORTS

@@ -117,7 +122,7 @@ store AI Models in the specified directory (default rootless: `$HOME/.local/shar


## SEE ALSO
**[podman(1)](https://github.com/containers/podman/blob/main/docs/podman.1.md)**
**[podman(1)](https://github.com/containers/podman/blob/main/docs/podman.1.md)**, **docker(1)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
29 changes: 28 additions & 1 deletion ramalama/cli.py
@@ -30,14 +30,41 @@ class HelpException(Exception):
    pass


def ai_support_in_vm():
    conman = container_manager()
    if conman == "":
        return False

    if conman == "podman":
        conman_args = [conman, "machine", "list", "--format", "{{ .VMType }}"]
        try:
            output = run_cmd(conman_args).stdout.decode("utf-8").strip()
            if output == "krunkit":
                return True
        except subprocess.CalledProcessError:
            pass
        perror(
            """\
Warning: podman needs to be configured to use krunkit for AI Workloads,
running without containers
"""
        )
        return False
    # Assume this is running with Docker and return true
    return True
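
For illustration only (not part of the commit), the new helper can be called directly: it returns True for a krunkit-backed Podman machine or for Docker, returns False when no container manager is found, and prints the warning above before returning False when Podman is found without krunkit.

```python
# Hypothetical usage sketch; assumes the ramalama package is importable.
from ramalama.cli import ai_support_in_vm

if ai_support_in_vm():
    print("GPU-capable VM or Docker detected; models will run in a container")
else:
    print("no krunkit VM detected; models will run directly on the host")
```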


def use_container():
    transport = os.getenv("RAMALAMA_IN_CONTAINER")
    if transport:
        return transport.lower() == "true"

    if in_container() or sys.platform == "darwin":
    if in_container():
        return False

    if sys.platform == "darwin":
        return ai_support_in_vm()

    return True
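
The RAMALAMA_IN_CONTAINER environment variable still takes precedence over all of this detection, so the new macOS path can be bypassed entirely. A hypothetical override, not part of the commit:

```python
import os
from ramalama.cli import use_container

# Force RamaLama to skip containers even when podman/krunkit or Docker is available;
# the env var is checked first in use_container(), so no container manager is probed.
os.environ["RAMALAMA_IN_CONTAINER"] = "false"
print(use_container())  # False
```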


