Khoj
Self-hosted AI assistant for your notes, emails, and docs. Web, mobile, and desktop clients, all local-first.
Editorial verdict: “Best 'AI second brain' app. Self-hosted, local-first, works against Obsidian.”
Compatibility at a glance
Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"
What it is
Khoj is a self-hosted AI second brain. It indexes your notes (Obsidian, Org-mode, Markdown), emails, browser history, and PDFs, then lets you chat with them. Local-first: it runs against Ollama or llama.cpp by default and falls back to cloud models only if you opt in. Cross-platform (web, iOS, Android, desktop). Niche but well-executed.
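The "local by default, cloud only on opt-in" behavior described above is worth making concrete. A minimal sketch of that fallback policy, not Khoj's actual code: the `Backend`, `chat`, and backend names here are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Backend:
    """A chat backend: a name plus a prompt -> answer callable (may raise)."""
    name: str
    ask: Callable[[str], str]

def chat(prompt: str, local: Backend, cloud: Optional[Backend] = None,
         cloud_opt_in: bool = False) -> str:
    """Local-first: always try the local backend; use the cloud backend
    only if one is configured AND the user explicitly opted in."""
    try:
        return local.ask(prompt)
    except Exception:
        if cloud is not None and cloud_opt_in:
            return cloud.ask(prompt)
        raise  # no silent cloud fallback

# Toy backends standing in for real servers (e.g. Ollama locally).
ollama = Backend("ollama", lambda p: f"[local] {p}")
cloud = Backend("cloud", lambda p: f"[cloud] {p}")

print(chat("summarize my notes", ollama, cloud))  # prints: [local] summarize my notes
```

With the local backend down and `cloud_opt_in=False`, the call raises rather than leaking the prompt to a remote service; that is the privacy posture the listing is pointing at.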
✓ Strengths
- Genuinely cross-platform (the mobile apps work)
- Strong Obsidian integration
- Self-host + cloud sync is well-designed
△ Caveats
- Initial indexing of a large corpus is slow
- Some niche features (image generation) require cloud
About the RAG app category
Document retrieval + chat, fully offline-capable.
Where to go from here
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit a benchmark — it powers the model and hardware pages.