add support for ramalama rm model
Signed-off-by: Daniel J Walsh <[email protected]>
rhatdan committed Aug 22, 2024
1 parent fc9bef0 commit 80db23a
Showing 4 changed files with 34 additions and 0 deletions.
16 changes: 16 additions & 0 deletions docs/source/markdown/ramalama-rm.1.md
@@ -0,0 +1,16 @@
% ramalama-rm 1

## NAME
ramalama-rm - Remove specified AI Model from system.

## SYNOPSIS
**ramalama rm** [*options*] *model*

## DESCRIPTION
Remove specified AI Model from system.

## SEE ALSO
**[ramalama(1)](ramalama.1.md)**

## HISTORY
Aug 2024, Originally compiled by Dan Walsh <[email protected]>
1 change: 1 addition & 0 deletions docs/source/markdown/ramalama.1.md
@@ -19,6 +19,7 @@ Ramalama : The goal of ramalama is to make AI even more boring.
| [ramalama-pull(1)](ramalama-pull.1.md) | Pull AI Model from registry to local storage |
| [ramalama-push(1)](ramalama-push.1.md) | Push specified AI Model (OCI-only at present) |
| [ramalama-run(1)](ramalama-run.1.md) | Run a chatbot on AI Model. |
| [ramalama-rm(1)](ramalama-rm.1.md) | Remove an AI Model. |
| [ramalama-serve(1)](ramalama-serve.1.md)| Serve local AI Model as an API Service. |

## CONFIGURATION FILES
11 changes: 11 additions & 0 deletions ramalama/cli.py
@@ -23,6 +23,7 @@ def usage():
    print("  list              List models")
    print("  pull MODEL        Pull a model")
    print("  push MODEL TARGET Push a model to target")
    print("  rm MODEL          Remove a model")
    print("  run MODEL         Run a model")
    print("  serve MODEL       Serve a model")
    sys.exit(1)
@@ -216,6 +217,15 @@ def push_cli(store, args, port):
f"Unsupported repository type for model: {model}")


def rm_cli(store, args, port):
    if len(args) < 1:
        usage()

    model = args.pop(0).replace("://", "/")
    model = f"{store}/models/{model}"
    exec_cmd(["rm", model])


def run_cli(store, args, port):
    if len(args) < 1:
        usage()
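The heart of `rm_cli` is a path normalization: the transport separator in the model reference (e.g. `ollama://`) is folded into an ordinary directory component under the store, and the resulting path is what gets removed. A minimal sketch of that mapping, with a hypothetical store path and an illustrative helper name:

```python
def model_store_path(store: str, model: str) -> str:
    # Fold the transport separator ("://") into a plain "/" so the
    # model reference maps onto a filesystem path under the store.
    normalized = model.replace("://", "/")
    return f"{store}/models/{normalized}"


# Hypothetical store location; ramalama's actual store path may differ.
print(model_store_path("/var/lib/ramalama", "ollama://ben1t0/tiny-llm:latest"))
# -> /var/lib/ramalama/models/ollama/ben1t0/tiny-llm:latest
```

Note that `exec_cmd(["rm", model])` then hands this path to the system `rm`, so a model is "removed" simply by deleting its file under the store.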
@@ -259,4 +269,5 @@ def in_container():
funcDict["pull"] = pull_cli
funcDict["push"] = push_cli
funcDict["run"] = run_cli
funcDict["rm"] = rm_cli
funcDict["serve"] = serve_cli
6 changes: 6 additions & 0 deletions test/ci.sh
@@ -56,6 +56,12 @@ main() {
  ./${binfile} list | grep tiny-vicuna-1b
  ./${binfile} list | grep NAME
  ./${binfile} list | grep oci://quay.io/mmortari/gguf-py-example/v1/example.gguf
  ./${binfile} rm ollama://ben1t0/tiny-llm:latest
  if ./${binfile} list | grep ben1t0/tiny-llm; then
    exit 1
  else
    exit 0
  fi
  # ramalama list | grep granite-code
  # ramalama rm granite-code
}
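The CI check relies on grep's exit status: grep exits non-zero when the pattern is absent, so the `if` inverts into a pass/fail signal. A standalone sketch of the same absence check, using hypothetical sample output in place of the real `./${binfile} list` invocation:

```shell
# list_output stands in for the real `list` command's output (hypothetical data).
list_output="NAME
tiny-vicuna-1b"

# Fail if the removed model still shows up in the listing.
if echo "$list_output" | grep -q "ben1t0/tiny-llm"; then
  echo "model still present"
  exit 1
else
  echo "model removed"
fi
```

`grep -q` suppresses output and only reports via exit status, which keeps CI logs clean; the committed test prints matches instead, which is also fine for debugging.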
