Brave's built-in AI assistant. Configurable to talk to local Ollama out of the box.
Editorial verdict: “Best built-in browser AI for Brave users. Local mode is a checkbox, not a hack.”
Which runtime + OS combos this app works with. Source of truth for "will it run on my setup?"
Leo is Brave Browser's built-in AI. Defaults to Brave's hosted LLM, but the settings let you point it at any local Ollama instance — flip a switch, set the URL, done. The fastest 'browser AI that respects local' setup if you already use Brave.
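Before flipping that switch, it helps to confirm the local Ollama instance is actually reachable. A minimal sketch, assuming Ollama's default port (11434) and its OpenAI-compatible chat endpoint — the exact path Leo expects in its "Bring your own model" settings may differ by Brave version:

```python
import urllib.request
import urllib.error

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default listen address

def chat_endpoint(base: str = OLLAMA_BASE) -> str:
    # Ollama exposes an OpenAI-compatible chat completions route;
    # this is typically the URL you paste into Leo's settings.
    return base.rstrip("/") + "/v1/chat/completions"

def ollama_is_up(base: str = OLLAMA_BASE, timeout: float = 2.0) -> bool:
    # A plain GET to the root returns 200 when the server is running.
    try:
        with urllib.request.urlopen(base, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("endpoint to paste into Leo:", chat_endpoint())
    print("ollama reachable:", ollama_is_up())
```

If `ollama_is_up()` prints `False`, start the server first (`ollama serve`) and make sure a model is pulled before pointing Leo at it.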
Built-in browser assistant that uses your local model.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.