Maid

Fully offline

Android Ollama client + on-device fallback for small models. Cross-platform Flutter.

Editorial verdict: "Best cross-platform, Android-friendly Ollama client. Falls back to on-device inference for tiny models."

Mobile app
Free
MIT
4.0 / 5
GitHub ★ 2,000

Compatibility at a glance

Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"

§ Runtimes supported
ollama · llama-cpp
§ OS / platform
android · ios · linux · macos · windows
§ Hardware + model hint
Minimum VRAM
2 GB
Recommended starter model
Llama 3.2 3B Q4 (on-device) or any (server)
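The 2 GB minimum lines up with back-of-envelope math for a 3B-parameter model at 4-bit (Q4) quantization. A rough sketch (the 0.5 GB overhead figure is an assumption covering KV cache and runtime buffers, not a measured value):

```python
# Rough memory estimate for a 3B-parameter model at Q4 quantization.
params = 3e9
bytes_per_param = 0.5                        # 4 bits = 0.5 bytes
weights_gb = params * bytes_per_param / 1e9  # ~1.5 GB of weights
overhead_gb = 0.5                            # assumed KV cache + runtime buffers
total_gb = weights_gb + overhead_gb
print(f"~{total_gb:.1f} GB")                 # ~2.0 GB, matching the stated minimum
```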

What it is

Maid is a Flutter app that lets you chat against a remote Ollama server (LAN / VPN) or run small models directly on the phone via an embedded llama.cpp. Cross-platform: Android, iOS, Linux, Windows, macOS. It's the rare 'Ollama client AND tiny on-device model' combo.
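In remote mode, a client like Maid talks to Ollama's REST API (by default on port 11434), e.g. the `/api/chat` endpoint. A minimal sketch of the request body such a client sends; the LAN address and model name are examples, not values from Maid itself:

```python
import json

# Ollama serves its API at http://<host>:11434 by default.
OLLAMA_HOST = "http://192.168.1.50:11434"  # example LAN address

def build_chat_request(model, user_message):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # request a single JSON reply instead of a token stream
    }

payload = build_chat_request("llama3.2:3b", "Hello from my phone")
print(json.dumps(payload, indent=2))
# Sending it requires a reachable server, e.g.:
#   requests.post(f"{OLLAMA_HOST}/api/chat", json=payload)
```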

✓ Strengths

  • True cross-platform (Android included)
  • On-device and remote-server modes
  • Open-source (MIT)

△ Caveats

  • Flutter UI doesn't feel as polished as Enchanted on iOS
  • On-device mode only viable for tiny models

About the Mobile app category

iOS / Android app that talks to your local model server.