This guide shows how to install Ollama on a host with an older Nvidia GPU (e.g. a P1000, M40 or P40).

This guide works for me: I have done it several times, both on bare metal and in VMs running on Proxmox, with a single P1000 (4GB), a dual-P1000 setup, a single P40 (24GB) and a combined P1000 & P40 setup.

Things done in this guide

  • Install necessary software
  • Install older Nvidia drivers (the most recent release that still works with the older Nvidia K/M/P series)
  • Install Ollama
  • Allow Ollama to listen on all IP addresses
  • Change Ollama model/llm directory

To start, install a clean DietPi host (x86_64).

Install necessary software

apt install gcc g++ make initramfs-tools linux-headers-$(uname -r) libc-dev libc6-dev lshw pciutils zstd hashcat python3 python3-full pipx -y
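Before installing the driver it is worth confirming that the GPU is visible on the PCI bus and that the installed headers match the running kernel. This quick sanity check is not part of the original steps, just a suggestion:

```shell
# Confirm the Nvidia card shows up on the PCI bus
lspci | grep -i nvidia

# The headers package must match the running kernel exactly
uname -r
dpkg -l "linux-headers-$(uname -r)"
```

If `lspci` shows no Nvidia device, check the card seating (or, in a Proxmox VM, the PCI passthrough configuration) before continuing.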

Install Nvidia drivers

First we need to disable the Nouveau kernel driver, as recommended by the NVIDIA developer zone.

nano /etc/modprobe.d/blacklist-nouveau.conf

Enter the following lines

blacklist nouveau

options nouveau modeset=0

Save with CTRL+S and exit with CTRL+X
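If you prefer a non-interactive setup (for example when scripting the install), the same file can be written in one step; the path is the standard modprobe.d location used above:

```shell
# Create the Nouveau blacklist file in one shot (run as root)
mkdir -p /etc/modprobe.d
printf 'blacklist nouveau\noptions nouveau modeset=0\n' \
  > /etc/modprobe.d/blacklist-nouveau.conf

# Show the result
cat /etc/modprobe.d/blacklist-nouveau.conf
```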

Now regenerate the initramfs so the blacklist takes effect

update-initramfs -u

After it completes, reboot.

Log in to the host/server.
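After logging back in, you can confirm that Nouveau is no longer loaded before running the Nvidia installer:

```shell
# Should report that nouveau is not loaded if the blacklist worked
lsmod | grep nouveau || echo "nouveau not loaded"
```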

Download the Nvidia 535.183.01 drivers

wget https://us.download.nvidia.com/XFree86/Linux-x86_64/535.183.01/NVIDIA-Linux-x86_64-535.183.01.run

Make the downloaded file executable

chmod +x NVIDIA-Linux-x86_64-535.183.01.run

Run the downloaded driver installation

./NVIDIA-Linux-x86_64-535.183.01.run

Follow the instructions
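Once the installer finishes, `nvidia-smi` is the quickest way to confirm the driver sees the card(s); the exact output depends on your GPUs:

```shell
# List each GPU with the driver version and the VRAM it reports
nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv
```

With the 535.183.01 driver installed you should see your P1000/M40/P40 listed here; if the command fails, the driver did not load.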

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh
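The install script sets up a systemd service; a quick check that the binary and service are in place:

```shell
# Verify the installed binary and the service state
ollama --version
systemctl is-active ollama.service
```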

Changes to Ollama service

To allow Ollama to listen on all IP addresses, we need to add an environment variable to the service unit.

Edit Ollama service

systemctl edit ollama.service

Add the following lines after the first two comment lines

[Service]

Environment="OLLAMA_HOST=0.0.0.0:11434"

Restart the Ollama service

systemctl restart ollama.service
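`systemctl edit` simply writes a drop-in override file for you; if you are scripting the setup, creating that file directly is equivalent (the path below is the standard systemd override location):

```shell
# Non-interactive equivalent of `systemctl edit ollama.service`
mkdir -p /etc/systemd/system/ollama.service.d
printf '[Service]\nEnvironment="OLLAMA_HOST=0.0.0.0:11434"\n' \
  > /etc/systemd/system/ollama.service.d/override.conf
```

After writing the file by hand, run `systemctl daemon-reload` followed by `systemctl restart ollama.service` so systemd picks up the override.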

Change the Ollama model/llm directory

In my case I use /mnt/llm, which points to an extra disk.

Set Ollama as owner of the new location

chown -R ollama:ollama /mnt/llm

Edit Ollama service

systemctl edit ollama.service

Add the following line to the [Service] section

Environment="OLLAMA_MODELS=/mnt/llm"

Restart the Ollama service

systemctl restart ollama.service
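To confirm the new directory is actually in use, pull any model and check that files appear under /mnt/llm (the model name below is just an example; any model will do):

```shell
# Pull a small model as a test
ollama pull qwen2.5:0.5b

# The model data should now live on the extra disk
ls /mnt/llm/blobs /mnt/llm/manifests
```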
