
openforge

A minimal, self-hosted, browser-based AI pair-programmer for any folder.


Point openforge at a folder. Get a three-pane editor in your browser: file tree, Monaco editor, AI chat. The chat can read your files, write changes, and ask follow-ups. No SaaS account, no per-seat pricing, no telemetry. You own the OS, you own the code, you own the API key.

Quick start

```sh
git clone https://github.com/krish9219/openforge
cd openforge
cp .env.example .env        # add OPENAI_API_KEY, optionally set WORKSPACE_DIR
npm install
npm run dev
# open http://localhost:3000
```

The included workspace/ folder has a hello.py file so you can try the AI immediately. Replace it with your own project by editing WORKSPACE_DIR in .env.

What you get

  • Three-pane UI — file tree on the left, Monaco editor in the middle, AI chat on the right.
  • Real file ops — clicking a file opens it; ⌘S saves it; AI edits write to disk.
  • Tool-using chat — the assistant can list_files, read_file, write_file. It explains what it's about to do before doing it.
  • Sandboxed — every file access is resolved against WORKSPACE_DIR and rejected if it tries to escape via `..`. Path-traversal attempts surface as visible errors.
  • Multi-language editor — Monaco gives you syntax highlighting for TS, Python, Go, Rust, Java, JSON, Markdown, and more.
  • Streaming responses — text and tool calls stream in as the model produces them. No spinner-of-doom.
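
The NDJSON stream can be sketched as below. This is an illustration only: the event shapes (`text`, `tool_call`, `tool_result`) are assumptions for the example, not openforge's documented wire format — the transparent part is simply that each line is one standalone JSON object.

```typescript
// Minimal NDJSON parsing sketch. The ChatEvent shapes are illustrative
// assumptions, not openforge's actual event schema.
type ChatEvent =
  | { type: "text"; delta: string }
  | { type: "tool_call"; name: string; args: unknown }
  | { type: "tool_result"; name: string; ok: boolean };

// One JSON object per line; blank lines (e.g. a trailing newline) are skipped.
function parseNdjson(chunk: string): ChatEvent[] {
  return chunk
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line) as ChatEvent);
}

const sample =
  '{"type":"tool_call","name":"read_file","args":{"path":"hello.py"}}\n' +
  '{"type":"text","delta":"Here is the file."}\n';

const events = parseNdjson(sample);
console.log(events.length); // 2
```

Because each event arrives on its own line, the UI can render a tool call the moment its line closes, without waiting for the rest of the response.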

How it works

```mermaid
flowchart LR
    B[browser<br/>Monaco + chat] -->|HTTP| F[/api/files<br/>GET tree / GET file / PUT file/]
    B -->|HTTP POST stream| C[/api/chat NDJSON/]
    C --> A[agent loop<br/>lib/llm.ts]
    A -->|list_files| W[lib/workspace.ts<br/>sandboxed]
    A -->|read_file| W
    A -->|write_file| W
    W --> D[(WORKSPACE_DIR<br/>your folder)]
    A -->|response| B
```

Every tool call is sandboxed by lib/workspace.ts::safe(path), which resolves the input against WORKSPACE_DIR and rejects anything that escapes. Tool iterations are capped at 8 per turn to prevent loops. The chat stream is NDJSON — one event per line — so the UI renders tool calls and tool results as they happen.
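
The idea behind `safe()` can be sketched like this. It is illustrative only — the real check lives in `lib/workspace.ts` and may differ in detail — but the core move is the same: resolve first, then compare against the root.

```typescript
// Sketch of the sandbox idea behind lib/workspace.ts::safe(path).
// The root below is a hypothetical example, not a real default.
import path from "node:path";

const WORKSPACE_DIR = path.resolve("/srv/workspace");

function safe(requested: string): string {
  const resolved = path.resolve(WORKSPACE_DIR, requested);
  // Inside the sandbox means: the root itself, or root plus a separator.
  // The separator check also blocks sibling dirs like /srv/workspace-evil.
  if (
    resolved !== WORKSPACE_DIR &&
    !resolved.startsWith(WORKSPACE_DIR + path.sep)
  ) {
    throw new Error(`path escapes workspace: ${requested}`);
  }
  return resolved;
}

console.log(safe("src/index.ts")); // /srv/workspace/src/index.ts (POSIX)

let escaped = false;
try {
  safe("../../etc/passwd"); // resolves to /etc/passwd -> rejected
} catch {
  escaped = true;
}
console.log(escaped); // true
```

Resolving before comparing is what defeats `..` tricks: by the time the check runs, the path is absolute and normalized, so there is nothing left to smuggle.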

vs. the alternatives

|  | openforge | Cursor | Continue | Cline | Aider |
|---|---|---|---|---|---|
| Self-hosted on a server | yes | no (desktop app) | no (IDE extension) | no (IDE extension) | yes (CLI) |
| Browser-based UI | yes | no | no | no | no |
| Inline ghost-text completion | no | yes | yes | no | no |
| Codebase indexing / embeddings | no | yes | yes | yes | partial |
| Multi-file edit via chat | yes | yes | yes | yes | yes |
| Visible tool calls | yes | hidden | yes | yes | no |
| LOC you must read to debug | ~800 | proprietary | thousands | thousands | ~10k |
| Best for | self-host on a VPS, read on iPad | desktop dev | VS Code / JetBrains users | VS Code agentic users | terminal lovers |

If you want the best AI coding experience and don't care about self-hosting, use Cursor or Continue. If you want to read every line of the thing that's editing your files and run it on a Raspberry Pi, this is for you.

What this is NOT

I want to be specific about scope, because "open-source Cursor" sets expectations openforge does not meet:

  • No inline ghost-text completion (the Tab-to-accept thing). The chat panel proposes edits; you don't get autocomplete while typing.
  • No semantic codebase indexing. The AI sees only what it explicitly reads via read_file. It does not embed your repo.
  • No multi-cursor / extensions / debugger. This is Monaco with a chat sidebar, not VS Code.
  • One project at a time. WORKSPACE_DIR is global, set at startup.

If you want autocomplete-grade integration today, look at Continue.dev (VS Code/JetBrains extension) or Aider (terminal). openforge is for people who want a browser-based editor they can self-host on a server and reach from any device.

Configuration

| Env var | Default | Notes |
|---|---|---|
| `WORKSPACE_DIR` | `./workspace` | The folder the editor and AI can touch. Use an absolute path on a server. |
| `OPENAI_API_KEY` | (required) | Until I add Anthropic support to the chat route, OpenAI is mandatory. |
| `MODEL` | `gpt-4o-mini` | Any chat-completions model that supports tools. |
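
The defaults above might be read along these lines — a sketch of the shape, not openforge's exact code:

```typescript
// Sketch: loading the env vars from the table above with their defaults.
// Function name and return shape are illustrative assumptions.
function loadConfig(env: Record<string, string | undefined>) {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) throw new Error("OPENAI_API_KEY is required");
  return {
    workspaceDir: env.WORKSPACE_DIR ?? "./workspace",
    model: env.MODEL ?? "gpt-4o-mini",
    apiKey,
  };
}

const cfg = loadConfig({ OPENAI_API_KEY: "sk-test" });
console.log(cfg.model);        // gpt-4o-mini
console.log(cfg.workspaceDir); // ./workspace
```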

Security notes

  • The path sandbox is the only thing standing between an LLM hallucination and your filesystem. Read lib/workspace.ts before pointing this at anything important.
  • There is no authentication. Run on localhost, behind a VPN, or behind HTTP basic auth at the proxy.
  • The /api/chat route accepts arbitrary message history. Don't expose this publicly without rate-limiting.
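
For context, the kind of Basic-auth check a reverse proxy (or a small middleware) would perform in front of openforge looks like the sketch below. This is hypothetical — the app itself ships no authentication, and the function here is illustrative, not part of the codebase:

```typescript
// Hypothetical Basic-auth check, as a proxy in front of openforge
// would perform it. Not part of openforge itself.
function checkBasicAuth(
  header: string | undefined,
  user: string,
  pass: string
): boolean {
  if (!header?.startsWith("Basic ")) return false;
  // The header carries base64("user:pass") after the "Basic " prefix.
  const decoded = Buffer.from(header.slice(6), "base64").toString("utf8");
  return decoded === `${user}:${pass}`;
}

const good = "Basic " + Buffer.from("admin:s3cret").toString("base64");
console.log(checkBasicAuth(good, "admin", "s3cret"));      // true
console.log(checkBasicAuth(undefined, "admin", "s3cret")); // false
```

In practice you would let nginx or Caddy do this at the proxy layer and never expose the Node process directly.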

FAQ

Why not just use Cursor? If Cursor works for you, use it. openforge exists for people who want to run an AI editor on a server they control, reach it from any device through a browser, and read every line of code that touches their files.

Why no inline completion? Building reliable inline ghost-text requires model-side support (FIM completion APIs), token-level streaming, and careful UX. The chat-driven panel is the 80% solution in 800 lines.

Why no codebase embeddings? Adding embeddings means another service, another index, and another set of failure modes. The model only sees what it explicitly reads via read_file — that's transparent and debuggable.

Can the AI escape the sandbox? It cannot — safe() resolves every path against WORKSPACE_DIR and rejects anything that escapes via `..`. The tests cover this (workspace.test.ts). If you find a way, please report it as a security issue.

Anthropic / local models support? OpenAI today. Adding Anthropic is ~30 lines in lib/llm.ts — open an issue and I'll point you at the spot.

Multi-folder support? Not yet. WORKSPACE_DIR is global. A folder switcher in the sidebar is a clean addition if you want to PR it.

Tests

```sh
npm test
```

Tests cover the path sandbox, tree listing, and file read/write. They run without an API key.

Contributing

See CONTRIBUTING.md. Security: see SECURITY.md. The sandbox in lib/workspace.ts is the single most important file — extra scrutiny there.


License

MIT — see LICENSE.
