This repo is a Docker container for running SpikeGPT by ridgerchu on an NVIDIA GPU. Its base is the NVIDIA PyTorch container, which runs Ubuntu.
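The image is built from NVIDIA's PyTorch container on NGC. As a sketch only (the tag below is an assumption; the repo's Dockerfile pins the actual release), the base-image line looks like:

FROM nvcr.io/nvidia/pytorch:23.05-py3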
First, install the NVIDIA Container Toolkit on the host and configure Docker to use the NVIDIA runtime:

sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
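The configure step registers the NVIDIA runtime in Docker's daemon configuration. If you want to verify it, /etc/docker/daemon.json should now contain an entry roughly like this (a sketch; your file may carry other settings as well):

{
  "runtimes": {
    "nvidia": {
      "args": [],
      "path": "nvidia-container-runtime"
    }
  }
}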
Note: Installing the NVIDIA Container Toolkit does not install the appropriate NVIDIA drivers on your host machine for you. Be sure you can run nvidia-smi on the host machine with no errors.
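For a quick host-side check (plain nvidia-smi works as well; this just narrows the output to the relevant fields):

nvidia-smi --query-gpu=name,driver_version --format=csv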
Then verify that Docker can reach the GPU by running nvidia-smi inside a base CUDA container:

sudo docker run --rm --runtime=nvidia --gpus all nvidia/cuda:11.6.2-base-ubuntu20.04 nvidia-smi
Your output should be similar to the following...
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.51.06    Driver Version: 450.51.06    CUDA Version: 11.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            On   | 00000000:00:1E.0 Off |                    0 |
| N/A   34C    P8     9W /  70W |      0MiB / 15109MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
Be sure to include the appropriate model weights as described in the SpikeGPT project README and copy the folder into the SpikeGPT-container directory:
cd SpikeGPT-container
cp -R SpikeGPT .
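After the copy, the build context should look roughly like the layout below (a sketch; the checkpoint filename is a placeholder for whichever weights you downloaded):

SpikeGPT-container/
  Dockerfile
  run.bash
  SpikeGPT/
    run.py
    <model-weights>.pth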
docker build -t spikegpt .
./run.bash
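run.bash is provided by the repo; check the script for the exact flags it passes, but the docker run call it wraps is roughly equivalent to the following (a sketch assuming an interactive container with all GPUs exposed):

docker run --rm -it --gpus all spikegpt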
Once in the container, simply launch the run script:
python run.py
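If you want to confirm that PyTorch can see the GPU inside the container before generating text, a quick one-liner:

python -c "import torch; print(torch.cuda.is_available())"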