Welcome to my (technical) blog

This place will mainly function as my memory. Just writing down steps I do to get things done so I don't forget in case of a crash or re-install.

Why in English?

Well ... all Dutch speak English ;-)

This guide will let you install Ollama with an older Nvidia GPU (e.g. P1000, M40 or P40).

This guide works for me. I have done it several times on bare metal and in VMs running on Proxmox: with a single P1000 (4GB), a dual P1000 setup, a single P40 (24GB), and a P1000 & P40 setup.

Things done in thi...

If your /tmp directory on tmpfs is too small or too big, resulting in unexpected errors when installing or extracting software, you can resize it.

nano /etc/fstab

Change the line that mentions the /tmp directory, then reboot.
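As a sketch, the /tmp entry in /etc/fstab could look like this (the 8G size is just an example; pick something that fits your RAM and workload):

```shell
# /etc/fstab — example tmpfs entry for /tmp (size value is an assumption, adjust to taste)
tmpfs   /tmp   tmpfs   defaults,noatime,size=8G   0   0
```

If you want to try a size without rebooting first, tmpfs can also be resized live with `mount -o remount,size=8G /tmp`.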

Original article

It is possible to limit the maximum power usage (default 250W) of the Nvidia Tesla P40.

Lowering the maximum power usage saves power, but at the cost of some performance.

Enable persistence mode

nvidia-smi -pm ENABLED

Set the power limit to 140 watts

nvidia-smi -pl 140
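Put together, and assuming the P40 is GPU 0, the sequence looks like this (run as root; the value you set must stay within the Min/Max Power Limit that nvidia-smi reports for the card):

```shell
# Enable persistence mode so the driver keeps the card initialized
nvidia-smi -pm ENABLED
# Check the allowed power limit range first
nvidia-smi -q -d POWER
# Cap GPU 0 at 140 W
nvidia-smi -i 0 -pl 140
```

Note that the power limit does not survive a reboot; to make it permanent you could run these commands from a systemd unit or a @reboot cron job.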

140 watt seems to b...

The following instructions let you change the location where Ollama stores its model/LLM blobs. This is especially handy when running Ollama in a VM on Proxmox and you only want to back up the VM boot disk with all settings and NOT the big disk that stores the models. The default location on linux i...
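A common way to change the location is the OLLAMA_MODELS environment variable on the systemd service; a sketch, assuming the big disk is mounted at /mnt/ollama-models (the path is an example):

```shell
# Create a drop-in override for the ollama service
sudo systemctl edit ollama.service
# In the editor, add (path is an example):
#   [Service]
#   Environment="OLLAMA_MODELS=/mnt/ollama-models"
# Then reload and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Make sure the user the service runs as (typically `ollama`) can read and write the new location, e.g. `sudo chown -R ollama:ollama /mnt/ollama-models`, or the service will fail to load models.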