Aider
Terminal coding agent that edits files via your local model. Git-aware, surgical, fast.
Editorial verdict: “Best terminal-native coding agent for local models. Qwen 2.5 Coder 32B is its sweet spot.”
What it is
Aider runs in your terminal, reads your code, and proposes edits as git diffs. It works against any OpenAI-compatible endpoint, including Ollama and llama.cpp running locally. The killer feature is the surgical edit format: aider gets the model to emit small, targeted diffs that almost always apply cleanly, even with mid-tier local models like Qwen 2.5 Coder 32B.
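For example, pointing aider at a local Ollama server looks roughly like this. The `OLLAMA_API_BASE` variable and the `ollama/` model prefix follow aider's documented convention at the time of writing; the model tag is illustrative, so verify the flags against your installed version:

```shell
# Point aider at a local Ollama server (sketch; check aider's docs
# for your version). Model tag shown is illustrative.
export OLLAMA_API_BASE=http://127.0.0.1:11434
aider --model ollama/qwen2.5-coder:32b
```

From there, aider proposes diffs in the terminal and commits each accepted edit to git, so any change can be reverted with a normal `git revert`.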
✓ Strengths
- Terminal-native: no IDE lock-in
- Git-aware: every edit is a commit you can revert
- Excellent edit format that minimizes apply failures
△ Caveats
- Steeper learning curve than IDE-integrated agents
- Needs a 32B-class model for production-quality edits
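The "surgical edit" idea above can be sketched in a few lines. This is not aider's actual implementation or block syntax, just a minimal illustration of why exact-match search/replace edits apply reliably: the model only has to reproduce a small, unique snippet of the file, not regenerate the whole thing.

```python
# Minimal sketch of applying a search/replace-style edit.
# The format and helper below are illustrative, not aider's real code.

def apply_edit(source: str, search: str, replace: str) -> str:
    """Apply one targeted edit. Requiring the search text to match
    exactly once is what makes small edits safe to auto-apply."""
    if source.count(search) != 1:
        raise ValueError("search block must match exactly once")
    return source.replace(search, replace, 1)

before = "def greet(name):\n    print('hi', name)\n"
after = apply_edit(
    before,
    search="    print('hi', name)\n",
    replace="    print(f'hello, {name}!')\n",
)
```

If the search text matches zero times or more than once, the edit is rejected instead of being applied in the wrong place, which is why this style degrades gracefully with weaker models.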
About the Coding agent category
Editor-integrated or CLI agent that edits code via your model.
Runtimes this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio.