RUNLOCALAI · v38

Operator-grade instrument for local-AI hardware intelligence. Hand-written verdicts. Real benchmarks. Reproducible commands.

OP · Fredoline Eruo


Codeium (with local backend)

Fully offline

Codeium's self-hosted enterprise backend lets the popular IDE plugin run fully on your own hardware.

Editorial verdict: “Best 'enterprise Copilot' replacement when self-hosting is mandatory. Paid tier.”

Editor plugin
Paid
Proprietary
★ 4.0 / 5
↗ Homepage

Compatibility at a glance

Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"

§ Runtimes supported
custom
§ OS / platform
macOS · Linux · Windows
§ Hardware + model hint
Minimum VRAM
24 GB
Recommended starter model
Codeium's own fine-tuned model (ships with self-host tier)
→ Build the rest of the stack with /stack-builder
→ Pick a GPU for this app

What it is

Codeium is a popular cloud autocomplete service. Its enterprise tier ships a self-hosted backend that you run on your own GPUs. The IDE plugins (VS Code, JetBrains, Neovim, etc.) are identical to cloud mode but point at your server. Useful when a team wants the Codeium UX with no data leaving its network.
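The "same plugin, different endpoint" model above usually comes down to a single client-side setting. A hedged sketch of what that looks like in VS Code's settings.json (JSONC, so comments are allowed) — the key names here are illustrative assumptions, not Codeium's documented schema; the self-host docs ship the real ones:

```jsonc
{
  // Hypothetical setting name — consult Codeium's self-host docs for the real key.
  // Point the plugin at your on-prem backend instead of Codeium's cloud:
  "codeium.enterprisePortalUrl": "https://codeium.internal.example.com"
  // Everything else (completion UI, keybindings) behaves exactly as in cloud mode.
}
```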

✓ Strengths

  • Polished IDE plugins across VS Code / JetBrains / Vim / Neovim
  • Self-host mode is well supported (not a feature flag)
  • Strong autocomplete quality

△ Caveats

  • Self-hosting is available only on the paid enterprise tier
  • Closed source, so your team can't audit it

About the Editor plugin category

Plugin for VS Code, JetBrains, Vim, etc.

§ Other editor plugin apps
Smart Connections (Obsidian)

Best local semantic search for personal notes. Foundational layer for Obsidian RAG.

Obsidian Copilot

Best Obsidian plugin for local LLM in your notes. Pair with Smart Connections for RAG.

Where to go from here

Stack Builder →

Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.

Back to /apps →

The full directory — filter by category, runtime, OS, privacy posture, or VRAM.

Runtimes (/tools) →

What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.

Community benchmarks →

Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.