Continue.dev
Continue.dev is an open-source extension that adds AI coding assistance to your existing editor.
Overview
| Attribute | Value |
|---|---|
| Type | Extension |
| Platforms | VS Code, JetBrains |
| Open Source | Yes (Apache 2.0) |
| Best For | Local models, existing editor |
Key Features
- Tab Completions — Inline suggestions
- Chat — Conversational coding
- Local Model Support — Ollama, LM Studio
- 100% Local Option — No data leaves your machine
- BYOK — Any OpenAI-compatible endpoint
Access Model
Continue.dev is open source. You can run it with your own provider keys or pair it with a local model stack, which makes it one of the clearest choices for teams that want explicit infrastructure control.
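For the BYOK route, Continue's configuration has accepted model entries with an `apiBase` pointing at any OpenAI-compatible endpoint. A minimal sketch of a `config.json` fragment, where the endpoint URL, model name, and key are placeholders and field names may vary by Continue version:

```json
{
  "models": [
    {
      "title": "My OpenAI-compatible model",
      "provider": "openai",
      "model": "your-model-name",
      "apiBase": "https://your-endpoint.example.com/v1",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```

Because the endpoint only needs to speak the OpenAI API shape, the same entry works for hosted providers and for self-hosted gateways.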
Privacy
| Setting | Value |
|---|---|
| Local Mode | 100% local possible |
| Training | Never |
| Retention | None |
| Jurisdiction | None (local) |
This is the most private option — nothing leaves your machine.
Model Options
Local (Ollama):
- DeepSeek V3.2
- Qwen3 Coder 32B
- Llama 4
Cloud (BYOK):
- Any OpenAI-compatible API
- Anthropic, OpenAI, Google, xAI, etc.
Getting Started
- Install from the VS Code Marketplace or the JetBrains Marketplace
- Install Ollama for local models
- Run `ollama pull deepseek-coder-v2` or similar
- Configure Continue to use the local model
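The last step, pointing Continue at the pulled model, happens in its config file. A minimal sketch of a `config.json` models entry for Ollama (the title is arbitrary and field names may vary by Continue version):

```json
{
  "models": [
    {
      "title": "DeepSeek Coder (local)",
      "provider": "ollama",
      "model": "deepseek-coder-v2"
    }
  ]
}
```

The `model` value should match the tag you pulled with `ollama pull`.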
100% Local Setup
```sh
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull a coding model
ollama pull qwen3-coder:32b

# Configure Continue to use localhost:11434
```
- Use `Cmd+L` to open chat
- Local models require 8GB+ VRAM (24GB+ for best results)
- Combine with cloud models for complex tasks
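Before pointing Continue at `localhost:11434`, it can help to confirm Ollama's API is actually reachable. A minimal check, assuming the default port and that `curl` is installed (`/api/tags` is Ollama's endpoint for listing pulled models):

```shell
# Probe Ollama's default API port; /api/tags lists pulled models
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable on localhost:11434"
fi
```

If the probe fails, start the server with `ollama serve` and re-run the check.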