Continue

Hybrid (offline or cloud)

Open-source autocomplete + chat for VS Code and JetBrains. Local-model-first.

Editorial verdict: “Best Copilot replacement that defaults to local. Configurable; pair with Qwen 2.5 Coder.”

Coding agent
Free
Apache-2.0
4.5 / 5
GitHub ★ 21,000

Compatibility at a glance

Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"

§ Runtimes supported
ollama · openai-compat · anthropic · openai
§ OS / platform
macos · linux · windows
§ Hardware + model hint
Minimum VRAM
12 GB
Recommended starter model
Qwen 2.5 Coder 7B Q4_K_M (FIM) + 32B for chat

What it is

Continue is the open-source rival to Cursor and Copilot. It does autocomplete (FIM with a local code-completion model), inline chat, and edit-via-prompt — all configurable to use Ollama, llama.cpp, or any OpenAI-compatible endpoint. The default config nudges you toward local; cloud is an option, not the default.

✓ Strengths

  • +Single extension covers autocomplete + chat + edit
  • +Excellent local-model defaults
  • +JetBrains support is on par with VS Code

△ Caveats

  • Autocomplete latency on slower hardware is noticeable (~150-300ms)
  • Configuration is YAML-heavy
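
To make the “YAML-heavy” point concrete, here is a minimal sketch of a local-first setup along the lines of what Continue’s `config.yaml` accepts — two Ollama-served models split by role, matching the recommended Qwen 2.5 Coder pairing above. Field names and role values are illustrative and may differ between Continue versions; check the current schema before copying.

```yaml
# ~/.continue/config.yaml — illustrative sketch, not a verified schema
name: Local Assistant
version: 1.0.0

models:
  # Small, fast model for fill-in-the-middle autocomplete
  - name: Qwen 2.5 Coder 7B
    provider: ollama
    model: qwen2.5-coder:7b
    roles:
      - autocomplete

  # Larger model for chat and edit-via-prompt
  - name: Qwen 2.5 Coder 32B
    provider: ollama
    model: qwen2.5-coder:32b
    roles:
      - chat
      - edit
```

The role split is the key idea: autocomplete needs low latency, so it gets the quantized 7B model, while chat and edits tolerate slower, higher-quality responses from the 32B model.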