Quick Install
Install Ollama on Linux with a single command:
curl -fsSL https://ollama.com/install.sh | sh
This script automatically detects your system and installs the appropriate version.
Manual Installation
If upgrading from a prior version, remove the old libraries first:
sudo rm -rf /usr/lib/ollama
Download and Extract
Download the Ollama package for your architecture:
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tar.zst \
| sudo tar x -C /usr
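To confirm the archive extracted where expected, you can list the installed files (the paths below assume extraction under /usr as above; the check is written to degrade gracefully if the files are not there):

```shell
# The binary lands in /usr/bin and the bundled libraries in /usr/lib/ollama.
ls -l /usr/bin/ollama /usr/lib/ollama 2>/dev/null \
  || echo "Ollama files not found under /usr"
```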
Start Ollama
In a terminal, start the Ollama server:
ollama serve
Verify Installation
In another terminal, verify Ollama is running:
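One way to check, assuming the install completed, is the version flag; the snippet below is guarded so it degrades gracefully if the binary is not on PATH yet:

```shell
# Print the installed Ollama version if the binary is available.
if command -v ollama >/dev/null 2>&1; then
  ollama -v
else
  echo "ollama is not on PATH yet"
fi
```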
GPU Support
AMD GPU Installation
If you have an AMD GPU, download and extract the ROCm package:
curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tar.zst \
| sudo tar x -C /usr
While AMD has contributed the amdgpu driver upstream to the Linux kernel, the version may be older and not support all ROCm features. For best support, install the latest driver from AMD's official site.
NVIDIA GPU Setup
Ollama automatically detects NVIDIA GPUs with CUDA support.
Requirements:
NVIDIA GPU with compute capability 5.0+
Driver version 531 or newer
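A quick way to check both requirements is to query nvidia-smi; the compute_cap query field is an assumption here in that it requires a reasonably recent driver, so the check is guarded:

```shell
# Report GPU name, compute capability, and driver version, if a driver is present.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,compute_cap,driver_version --format=csv,noheader
else
  echo "nvidia-smi not found; install the NVIDIA driver first"
fi
```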
Install CUDA drivers:
Download and install the CUDA drivers from NVIDIA's website, then verify the installation:
nvidia-smi
This should display your GPU details.
Running as a System Service
Recommended for production use and automatic startup.
Create Ollama User
Create a dedicated user and group for Ollama:
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
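After these steps, you can confirm the account exists and inspect its groups (a fresh login may be needed before your own user's new group membership takes effect):

```shell
# Show the dedicated service account and its groups, if it has been created.
if getent passwd ollama >/dev/null 2>&1; then
  id ollama
else
  echo "ollama user not created yet"
fi
```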
Create Service File
Create /etc/systemd/system/ollama.service:
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target
Enable and Start Service
Reload systemd and enable the service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
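systemctl can confirm both boot-time enablement and the current state; the snippet is written to print a message instead of failing on systems where the unit is not installed:

```shell
# Both commands report success once the unit is installed and started.
systemctl is-enabled ollama 2>/dev/null || echo "unit not installed"
systemctl is-active ollama 2>/dev/null || echo "service not running"
```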
Verify Service Status
Check that Ollama is running:
sudo systemctl status ollama
Configuration
Environment Variables
Customize Ollama by editing the systemd service:
sudo systemctl edit ollama
Or create an override file at /etc/systemd/system/ollama.service.d/override.conf:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_DEBUG=1"
Environment="OLLAMA_MODELS=/mnt/models"
Common Environment Variables
Variable                  Description                  Default
OLLAMA_HOST               Server bind address          127.0.0.1:11434
OLLAMA_MODELS             Model storage location       /usr/share/ollama/.ollama/models
OLLAMA_DEBUG              Enable debug logging         0
OLLAMA_NUM_PARALLEL       Max parallel requests        1
CUDA_VISIBLE_DEVICES      Select specific NVIDIA GPUs  All GPUs
HSA_OVERRIDE_GFX_VERSION  AMD GPU override             Auto-detect
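To confirm a running server picked up a new bind address, you can query its HTTP API; /api/version returns the server version as JSON (adjust host and port to match your OLLAMA_HOST; the fallback message below is just for a graceful failure):

```shell
# A reachable server answers with JSON such as {"version":"..."}.
curl -fsS http://127.0.0.1:11434/api/version 2>/dev/null \
  || echo "server not reachable on 127.0.0.1:11434"
```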
Installing Specific Versions
Use the OLLAMA_VERSION environment variable to install a specific version:
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
Find available versions on the releases page.
Updates
Update Ollama by running the install script again:
curl -fsSL https://ollama.com/install.sh | sh
Or manually download the latest version:
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tar.zst \
| sudo tar x -C /usr
sudo systemctl restart ollama
Logs and Debugging
View Logs
For the systemd service, follow the logs in real time:
journalctl -u ollama --no-pager --follow
Manual Server Logs
If running ollama serve manually, logs appear in the terminal.
Enable Debug Logging
sudo systemctl edit ollama
Add:
[Service]
Environment="OLLAMA_DEBUG=1"
Then restart:
sudo systemctl daemon-reload
sudo systemctl restart ollama
Troubleshooting
NVIDIA GPU Not Detected
If Ollama doesn’t detect your NVIDIA GPU:
Verify drivers are loaded:
nvidia-smi
If the GPU is still not detected, reload the nvidia_uvm driver:
sudo rmmod nvidia_uvm
sudo modprobe nvidia_uvm
Check for errors in system logs:
sudo dmesg | grep -i nvidia
AMD GPU Issues
For AMD GPU troubleshooting:
Check device permissions:
ls -l /dev/kfd /dev/dri
Ensure the ollama user is in the video and render groups:
sudo usermod -a -G video,render ollama
Enable debug logging:
AMD_LOG_LEVEL=3 ollama serve
Check for driver errors:
sudo dmesg | grep -i amdgpu
Temp Directory Issues
If your system has noexec on /tmp, set an alternate temporary directory:
sudo systemctl edit ollama
Add:
[Service]
Environment="OLLAMA_TMPDIR=/usr/share/ollama/tmp"
Make sure the directory exists and is writable by the ollama user before restarting the service.
Uninstallation
Stop and Disable Service
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
Remove Binaries and Libraries
sudo rm $(which ollama)
sudo rm -r $(which ollama | tr 'bin' 'lib')
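The tr step above is a per-character translation rather than a string replacement: it maps b to l, leaves i alone, and maps n to b, which happens to turn the binary path into the library path:

```shell
# Demonstrate how the library directory is derived from the binary path.
echo /usr/bin/ollama | tr 'bin' 'lib'
# -> /usr/lib/ollama
```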
Remove User Data
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
Next Steps
GPU Configuration Learn about GPU support and optimization
API Reference Integrate Ollama into your applications
Model Library Browse available models
Docker Setup Run Ollama in containers