
How to run ollama container via systemd on system startup #805

Open
elvismdev opened this issue Jan 30, 2025 · 3 comments
Comments

@elvismdev
I'm trying to run the ollama container automatically on system startup using systemd on my Jetson Orin Nano. While I can start the container manually using:

jetson-containers run --name ollama $(autotag ollama)

I've tried creating a systemd service, but I'm encountering issues. The service keeps restarting or fails to start properly. I've tried several approaches:

  1. Direct command in systemd service
  2. Running with --detach flag
  3. Using a wrapper script
  4. Removing -it flags and running in non-interactive mode

Current issues:

  • Service starts but immediately exits
  • Container either doesn't start or keeps restarting
  • Getting "the input device is not a TTY" errors when using -it flags
  • Service shows "Deactivated successfully" right after starting

Could you provide guidance on the correct way to set this up as a systemd service that starts on boot?
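For concreteness, this is the general shape of the unit I've been attempting (paths and names are illustrative; it assumes the container was already created once with `docker run --name ollama ...`, so the service only has to start and stop it):

```ini
# /etc/systemd/system/ollama.service -- sketch of one attempt, not a working config.
[Unit]
Description=Ollama Server via jetson-containers
Requires=docker.service
After=docker.service network-online.target

[Service]
# oneshot + RemainAfterExit: "docker start" returns immediately once the
# container is running, and systemd should consider the service active.
Type=oneshot
RemainAfterExit=yes
ExecStart=/usr/bin/docker start ollama
ExecStop=/usr/bin/docker stop ollama

[Install]
WantedBy=multi-user.target
```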

Environment:

  • Device: Jetson Orin Nano
  • L4T Version: 36.4.3
  • JetPack Version: 6.2
  • CUDA Version: 12.6

Logs:
When running manually (works):

jetson-containers run --name ollama $(autotag ollama)
# Successfully starts and runs

When running as service (fails):

systemd[1]: Started Ollama Server via jetson-containers.
systemd[1]: ollama.service: Deactivated successfully.
systemd[1]: ollama.service: Scheduled restart job

Any help would be greatly appreciated!

@dusty-nv
Owner

Hey @elvismdev, when you transition to deployment, I would just start using the docker run command that jetson-containers spits out (it prints this before it invokes it), though I don't think that alone is necessarily the cause. I remember some folks on the forum asking about this: https://forums.developer.nvidia.com/t/start-a-docker-container-at-startup-in-jetson-nano/265760

My suggestion was going to be to use the docker restart/persistence flags or docker-compose instead, and yeah, that's what the thread concluded with. The updates to jetson-containers and jetson-ai-lab are moving to docker-compose for managing deployments like this better; I haven't gotten to the reboot/persistence part yet though.
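As a rough sketch of the compose approach (the service name, image tag, and volume here are placeholders, not the actual jetson-containers compose file):

```yaml
# docker-compose.yml -- sketch only; the real jetson-containers setup may differ.
services:
  ollama:
    image: dustynv/ollama:r36.4.0   # placeholder tag; use whatever autotag resolves
    runtime: nvidia
    network_mode: host
    restart: unless-stopped
    volumes:
      - /home/jetson/jetson-containers/data:/data
```

With `restart: unless-stopped`, running `docker compose up -d` once is enough for the container to come back after a reboot, as long as the Docker daemon itself is enabled at boot.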

@elvismdev
Author

elvismdev commented Jan 30, 2025

@dusty-nv Thanks for the response! I've tried using the direct docker run command with the --restart unless-stopped flag as suggested:

docker run -d \
    --runtime nvidia \
    --restart unless-stopped \
    --network host \
    --shm-size=8g \
    --volume /tmp/argus_socket:/tmp/argus_socket \
    --volume /etc/enctune.conf:/etc/enctune.conf \
    --volume /etc/nv_tegra_release:/etc/nv_tegra_release \
    --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model \
    --volume /var/run/dbus:/var/run/dbus \
    --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket \
    --volume /var/run/docker.sock:/var/run/docker.sock \
    --volume /home/jetson/jetson-containers/data:/data \
    -v /etc/localtime:/etc/localtime:ro \
    -v /etc/timezone:/etc/timezone:ro \
    --device /dev/snd \
    --device /dev/bus/usb \
    --device /dev/i2c-0 \
    --device /dev/i2c-1 \
    --device /dev/i2c-2 \
    --device /dev/i2c-4 \
    --device /dev/i2c-5 \
    --device /dev/i2c-7 \
    -v /run/jtop.sock:/run/jtop.sock \
    --name ollama \
    $(autotag ollama)

However, the container keeps restarting in a loop. The interesting part is that when I run jetson-containers run --name ollama $(autotag ollama) manually, it works perfectly and stays running.

Could there be something in the container's configuration or environment that's causing it to exit when run with the direct docker run command? Any insights on what might be causing this behavior would be greatly appreciated.
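In case it helps with diagnosis, these are the checks I'd run against the restart loop (container name assumed to be `ollama`; the explicit `ollama serve` command at the end is a guess at the intended long-running process, not something confirmed from the image):

```shell
# Why did the container exit? Check its output and exit code:
docker logs ollama
docker inspect --format '{{.State.ExitCode}}' ollama

# If the image's default CMD is an interactive shell, a detached run with no
# TTY exits immediately, and --restart unless-stopped then loops it forever.
# Passing an explicit long-running command may avoid that (command is an
# assumption):
docker run -d --restart unless-stopped --name ollama \
    $(autotag ollama) ollama serve
```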

@tokk-nv
Collaborator

tokk-nv commented Jan 31, 2025

Alternatively, rather than using the container, you can install Ollama natively on Jetson with the official installer:
https://www.jetson-ai-lab.com/tutorial_ollama.html#1-native-install
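As a sketch of that path (assuming the installer behaves as it does on other Linux systems, where it also creates and enables a systemd `ollama.service`, which sidesteps the container question entirely):

```shell
# Official installer, as linked in the tutorial above:
curl -fsSL https://ollama.com/install.sh | sh

# On systemd hosts, the installer should have set up a native service;
# check it and make sure it starts on boot:
systemctl status ollama
sudo systemctl enable --now ollama
```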
