An autonomous AI pair programmer for VS Code
- Chat with an AI coding assistant directly in the sidebar or panel.
- Generate, edit & refactor files – workspace edits applied automatically.
- Multi-step task planner (analysis → design → code → tests → docs).
- Inline diff & safety guardrails (blocks destructive commands and edits to sensitive files).
- Works with multiple LLM back-ends: Ollama, OpenAI, or your own via plugin.
- Fetches remote package READMEs to give the model extra context.
- GitHub integration for PR diff summaries and automated review comments.
- Rich task & status webviews – cancel/retry individual steps.
- Ensure you have Node.js ≥ 18 and VS Code ≥ 1.85.
- Install from the VSIX in the repo root:

  ```shell
  code --install-extension code-companion-0.1.0.vsix
  ```

- Reload VS Code and look for the code-companion icon in the Activity Bar.
- Open any workspace.
- Launch the Chat view (icon in Activity Bar).
- Type a natural-language request, e.g. `Add dark-mode toggle`.
- Review the generated execution plan, edit it if necessary, then run it.
- Code Companion applies the changes – review the inline decorations and accept the edits.
Set these in your user or workspace settings.json:
| Setting | Default | Description |
|---|---|---|
| `codeCompanion.llmProvider` | `ollama` | Selects the active LLM provider (`ollama` or `openai`). |
| `codeCompanion.confirmChanges` | `true` | Ask before applying edits. Markdown files are applied directly without a diff. |
| `codeCompanion.enableAutoSave` | `true` | Automatically save documents after edits. |
| `codeCompanion.openAi.apiKey` | `""` | API key used when `llmProvider` is `openai`. |
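For example, a workspace `.vscode/settings.json` that switches to OpenAI and skips the confirmation prompt might look like this (all values are illustrative; the key is a placeholder):

```json
{
  "codeCompanion.llmProvider": "openai",
  "codeCompanion.confirmChanges": false,
  "codeCompanion.enableAutoSave": true,
  "codeCompanion.openAi.apiKey": "<your-api-key>"
}
```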
```mermaid
graph TD;
    User["VS Code UI"] -->|Commands & prompts| Manager;
    Manager --> Planner;
    Planner --> Executor;
    Executor -->|Reads| ContextAnalyzer;
    Executor -->|LLM calls| LLMProvider;
    LLMProvider --> Ollama & OpenAI;
    Executor -->|Edits| Workspace;
    Executor -->|Safety| SafetyValidator;
    Manager --> UIWebviews;
    ContextAnalyzer --> Git & RemoteDocs;
```
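The Planner → Executor flow in the diagram can be sketched roughly as follows. All names and shapes here are illustrative, not the extension's actual API (the real implementations live in `src/agent/planner.ts` and `src/agent/executor.ts`):

```typescript
// Illustrative sketch of the planner/executor pipeline: the planner turns a
// prompt into ordered steps, and the executor runs them one by one so the
// status webview can report (and cancel/retry) individual steps.
interface Step {
  description: string;
  run: () => string;
}

class Planner {
  // Turn a natural-language prompt into an ordered list of steps.
  plan(prompt: string): Step[] {
    return [
      { description: `analyze: ${prompt}`, run: () => "analysis done" },
      { description: `code: ${prompt}`, run: () => "edits applied" },
    ];
  }
}

class Executor {
  // Run each step in order, collecting results for the UI.
  execute(steps: Step[]): string[] {
    return steps.map((s) => s.run());
  }
}

const steps = new Planner().plan("add dark-mode toggle");
const results = new Executor().execute(steps);
console.log(results); // ["analysis done", "edits applied"]
```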
```
📄 .eslintrc.json
📄 DEVELOPMENT_SETUP.md
📄 IMPLEMENTATION_PLAN.md
📄 LICENSE
📄 README.md
📄 README.md.codecompanion
📄 README.md.codecompanion-1752349197779
📄 README.md.codecompanion-1752349886654
📄 TECHNICAL_ARCHITECTURE.md
📄 code-companion-0.1.0.vsix
📄 package-lock.json
📄 package.json
📁 resources
  📄 icon.svg
📁 src
  📁 agent
    📄 context.ts
    📄 executor.ts
    📄 planner.ts
    📄 remotedocs.ts
    📄 safety.ts
  📄 extension.ts
  📁 git
    📄 github.ts
  📁 llm
    📄 ollama.ts
    📄 openai.ts
    📄 provider.ts
  📄 manager.ts
  📄 types.ts
  📁 ui
    📄 chat.ts
    📄 status.ts
    📄 tasks.ts
📄 tsconfig.json
```
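The `src/llm/provider.ts` layer suggests a common provider abstraction that `ollama.ts` and `openai.ts` implement, and that a plugin back-end would too. A hypothetical shape (not the actual interface) could look like:

```typescript
// Hypothetical provider abstraction; the real interface in src/llm/provider.ts
// may differ. Each back-end (ollama.ts, openai.ts, or a plugin) implements it.
interface LLMProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// A stub provider: useful in tests or as a starting point for a plugin.
class EchoProvider implements LLMProvider {
  name = "echo";
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

async function main() {
  const provider: LLMProvider = new EchoProvider();
  const reply = await provider.complete("hello");
  console.log(reply); // "echo: hello"
}
main();
```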
```shell
# install deps
npm install
# launch the extension host
code . --extensionDevelopmentPath=.
```

npm scripts (from `package.json`):

- `vscode:prepublish`: `npm run compile`
- `compile`: `tsc -p ./`
- `watch`: `tsc -watch -p ./`
- `pretest`: `npm run compile && npm run lint`
- `lint`: `eslint src --ext ts`
- `test`: `node ./out/test/runTest.js`
- axios
- ws
- node-fetch
- diff
- glob
- chalk
- @octokit/rest
- @types/vscode
- @types/node
- @typescript-eslint/eslint-plugin
- @typescript-eslint/parser
- eslint
- typescript
- @types/diff
- Better inline diff/merge UX.
- Context caching & chunking for large workspaces.
- Additional LLM providers (Anthropic, Gemini).
- Auto-generated unit & integration tests.
PRs and issues are welcome! Please read DEVELOPMENT_SETUP.md first.
MIT © 2025 code-companion Authors
Updated by CodeCompanion on 2025-07-12T20:01:58.678Z