Ollama can't bind port 11434 — already in use
Error: listen tcp 127.0.0.1:11434: bind: address already in use
By Fredoline Eruo · Last verified May 7, 2026
Cause
Ollama defaults to port 11434. Conflicts happen when:
- Another Ollama instance is already running (e.g. the systemd service plus a manually started ollama serve)
- LM Studio's local server is running on the same port (rare, but happens)
- A previous Ollama process hung instead of exiting and still holds the listening socket
- A Docker container has the port published
Solution
1. Find what's holding the port:
# Linux / macOS
lsof -i :11434
# or
ss -tulpn | grep 11434
# Windows (PowerShell)
Get-NetTCPConnection -LocalPort 11434
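If neither lsof nor ss is installed, you can probe the port directly. This is a sketch that assumes bash (the /dev/tcp redirection is a bash feature, not POSIX sh): a successful connect means something is already listening.

```shell
# Returns success (0) if something accepts a TCP connection on the given port.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null   # subshell closes fd 3 on exit
}

if port_in_use 11434; then
  echo "port 11434 is busy"
else
  echo "port 11434 is free"
fi
```

This only tells you whether the port is taken, not by what, so fall back to the commands above when you need the owning PID.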
2. If it's an old Ollama process, kill it cleanly:
# Linux: stop the systemd service
sudo systemctl stop ollama
# macOS: kill the LaunchAgent
launchctl bootout gui/$(id -u) /Library/LaunchAgents/com.ollama.ollama.plist 2>/dev/null
pkill -f ollama
# Then restart
ollama serve
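The stop sequence above can be wrapped in one reusable function, a sketch for Linux/macOS: send SIGTERM first, and escalate to SIGKILL only if the process survives a short grace period. pkill/pgrep -f match the full command line, so unrelated processes are untouched.

```shell
# Stop any stray `ollama serve` process: TERM first, KILL as a last resort.
stop_ollama() {
  pkill -TERM -f 'ollama serve' 2>/dev/null
  for _ in 1 2 3 4 5; do
    pgrep -f 'ollama serve' >/dev/null 2>&1 || return 0   # gone: done
    sleep 1
  done
  pkill -KILL -f 'ollama serve' 2>/dev/null   # still alive after 5s: force it
  return 0
}

stop_ollama
```

Remember to stop the systemd service or LaunchAgent first as shown above, or the supervisor will simply restart the process you killed.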
3. Or run on a different port (keep 127.0.0.1 unless you need access from other machines):
OLLAMA_HOST=127.0.0.1:11435 ollama serve
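Note that the ollama CLI client reads OLLAMA_HOST too, so clients must be pointed at the new port as well. A sketch, using port 11435 from the example above; exporting the variable once covers both server and CLI:

```shell
export OLLAMA_HOST=127.0.0.1:11435

ollama serve &                         # server binds 11435 instead of 11434
ollama list                            # CLI talks to 11435 automatically
curl http://127.0.0.1:11435/api/tags   # REST clients use the new port directly
```

Without the export, the client defaults back to 11434 and reports that it cannot connect.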
4. If it's Docker:
docker ps --filter publish=11434
docker stop <container>
5. Production deployment: prefer running Ollama as a systemd service with an environment file that pins the host and port. See the Linux local AI guide.
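A minimal sketch of that setup. The drop-in directory is the standard systemd override location; the env-file path /etc/ollama/ollama.env is our choice for illustration, not something Ollama requires:

```shell
sudo mkdir -p /etc/systemd/system/ollama.service.d /etc/ollama

# Drop-in override: load the environment file into the service.
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
EnvironmentFile=/etc/ollama/ollama.env
EOF

# Pin the bind address and port in one place.
echo 'OLLAMA_HOST=127.0.0.1:11434' | sudo tee /etc/ollama/ollama.env >/dev/null

sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Keeping the address in one file means a port change is a one-line edit plus a restart, instead of hunting through shell profiles.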
Did this fix it?
If your case was different, email support@runlocalai.co with what you saw and we'll update the page. If it worked but took different commands on your platform, we want to know that too.