Maid
Android Ollama client + on-device fallback for small models. Cross-platform Flutter.
Editorial verdict: “Best cross-platform Android-friendly Ollama client. Falls back to on-device for tiny models.”
Compatibility at a glance
Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"
What it is
Maid is a Flutter app that lets you chat against a remote Ollama server (over LAN or VPN) OR run small models directly on the phone via an embedded llama.cpp. Cross-platform: Android, iOS, Linux, Windows, macOS. The rare 'Ollama client AND tiny on-device model' combo.
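In remote mode, a client like Maid ultimately speaks Ollama's HTTP API (default port 11434). A minimal sketch of a non-streaming chat request, in Python; the LAN host address and the model name "llama3.2" are assumptions, substitute your own:

```python
import json
import urllib.request

# Hypothetical LAN address of your Ollama server, as a phone client would target.
OLLAMA_URL = "http://192.168.1.50:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a non-streaming payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one JSON response instead of a token stream
    }

payload = build_chat_request("llama3.2", "Why is the sky blue?")

# Sending it requires a reachable Ollama server, so the call is left commented:
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])
```

On-device mode bypasses this HTTP layer entirely and calls the embedded llama.cpp runtime directly, which is why it is limited to models small enough for phone RAM.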
✓ Strengths
- True cross-platform (Android included)
- On-device + remote-server modes
- Open source (MIT)
△ Caveats
- Flutter UI doesn't feel as polished as Enchanted on iOS
- On-device mode is only viable for tiny models
About the Mobile app category
iOS / Android app that talks to your local model server.
Where to go from here
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit a benchmark; results feed the model + hardware pages.