Enchanted
Native iOS / macOS Ollama client. Beautiful SwiftUI, talks to your home Ollama server.
Editorial verdict: “Best mobile Ollama client. Native SwiftUI; works against your home Ollama server.”
Compatibility at a glance
Which runtime + OS combos this app works against — the source of truth for "will it run on my setup?"
What it is
Enchanted is a native SwiftUI client for Ollama. Point it at the URL of your home or lab Ollama server and you get a polished chat UI on iPhone, iPad, and Mac, with voice input, multi-chat, and iCloud sync across devices. The 'iMessage-quality UX for local LLMs' pick.
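Under the hood, clients like Enchanted talk to the Ollama HTTP API (default port 11434). A minimal sketch of the kind of chat request involved — the server address and model name here are hypothetical placeholders, not anything Enchanted ships with:

```python
import json

# Hypothetical values — substitute your own server address and a model you've pulled.
OLLAMA_URL = "http://192.168.1.50:11434"  # Ollama listens on port 11434 by default
MODEL = "llama3"

# Ollama's /api/chat endpoint takes a model name and a list of chat messages.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello from my phone"}],
    "stream": False,  # request one JSON response instead of a token stream
}
body = json.dumps(payload)
endpoint = f"{OLLAMA_URL}/api/chat"

# Sending it requires a reachable server (LAN or VPN), e.g.:
#   req = urllib.request.Request(endpoint, data=body.encode(),
#                                headers={"Content-Type": "application/json"})
#   urllib.request.urlopen(req)
```

This is also why the "reachable server" caveat below matters: the app is only the front end, and every request crosses the network to wherever Ollama is running.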
✓ Strengths
- Native SwiftUI — feels Apple-platform-quality
- Voice input on iPhone is well-implemented
- Free and open-source
△ Caveats
- Needs a reachable server (LAN or VPN)
- Apple-platform only
About the Mobile app category
iOS / Android app that talks to your local model server.
Where to go from here
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit a benchmark — it powers the model and hardware pages.