Bump and fix nextjs-ollama-llm-ui (#347856)
drupol authored Oct 12, 2024
Merge commit afd96ba (parents 7971a49 and f385d94)
Showing 4 changed files with 8 additions and 905 deletions.
nixos/modules/services/web-apps/nextjs-ollama-llm-ui.nix (3 changes: 2 additions, 1 deletion)
@@ -52,7 +52,7 @@ in

     ollamaUrl = lib.mkOption {
       type = lib.types.str;
-      default = "127.0.0.1:11434";
+      default = "http://127.0.0.1:11434";
       example = "https://ollama.example.org";
       description = ''
         The address (including host and port) under which we can access the Ollama backend server.
@@ -79,6 +79,7 @@ in
       serviceConfig = {
         ExecStart = "${lib.getExe nextjs-ollama-llm-ui}";
         DynamicUser = true;
+        CacheDirectory = "nextjs-ollama-llm-ui";
       };
     };
   };
(Diffs for the remaining 3 changed files are collapsed and not shown here.)
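
For reference, a minimal NixOS configuration sketch using this module is shown below. Only ollamaUrl appears in the diff above; the enable toggle and the exact option path are assumptions based on the usual shape of NixOS service modules, and the URL simply restates the new default with its explicit http:// scheme.

    # Hypothetical usage sketch, not part of this commit: enable the UI and
    # point it at a locally running Ollama instance.
    {
      services.nextjs-ollama-llm-ui = {
        enable = true;                          # assumed standard enable toggle
        ollamaUrl = "http://127.0.0.1:11434";   # new default, scheme now included
      };
    }

On the CacheDirectory addition: because the unit runs with DynamicUser = true, it has essentially no pre-existing writable locations on the filesystem, so CacheDirectory = "nextjs-ollama-llm-ui" has systemd create /var/cache/nextjs-ollama-llm-ui and hand it to the service as a writable cache path.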
