Open source memory orchestration for context driven coding workflows.
Neural graph with real context content — each node is a context file, each color family is a memory area.
Codex Memory is a local memory service that reads markdown contexts, consolidates long-term operational memory with a selectable engine (`ollama` or a deterministic `algorithm`), and exposes a visual interface for inspecting memory relationships.
The project runs in three modes:
- Desktop app with Electron
- Web GUI mode
- Daemon mode for scheduled memory refresh
Context files are easy to write but hard to keep synchronized over time. Codex Memory addresses this by:
- Consolidating multiple `context_*.md` files into one operational memory file
- Keeping memory updated through the daemon or manual sync
- Providing a neural-style graph to inspect links between context nodes
- Exposing local APIs for automation and integration
- Selectable memory engine per session: `ollama` or deterministic `algorithm`
- Automatic or manual `AGENT_MEMORY.md` refresh with a stable header and no timestamp line
- Fast GUI startup without blocking consolidation on boot
- Context CRUD from GUI and API
- Graph based context visualization with hover metadata
- Persisted neuron graph snapshot for faster visualization loading
- Multi language UI with automatic locale detection
- Desktop icon and branding from project assets
- Canonical decisions with contradiction tracking and source traceability
- Rolling `AGENT_MEMORY.md` snapshots (latest 10)
`context_1.md` is the foundation context of the memory system and should be written and maintained by a human.
Why this matters:
- It anchors long-term project rules and non-negotiable decisions.
- The graph engine treats it as the foundation neuron and keeps it central in visualization and linkage.
- Other contexts should reference it so algorithmic linking can preserve a consistent memory backbone.
- Keeping this file human-curated reduces drift and prevents accidental loss of core project intent.
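As an illustration, a human-curated `context_1.md` might look like the sketch below. The headings and entries are hypothetical, not a required schema; the point is that it records the rules and decisions the rest of the memory should anchor to.

```markdown
# Project Foundation

## Non-negotiable decisions
- Memory consolidation must never delete human-written context files.
- The GUI binds to localhost only; no remote exposure by default.

## Long-term rules
- All contexts are plain markdown files named context_*.md.
- AGENT_MEMORY.md is generated output; edit contexts, not the output.
```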
- Node.js 18+
- Ollama installed and available in `PATH` (only if you run with `MEMORY_ENGINE=ollama`)
Install dependencies and create the config file:

```shell
npm install
copy config\ai_paths.json.example config\ai_paths.json
```

Edit `config/ai_paths.json` and update `baseRootPath` to your absolute path:

```json
{
  "baseRootPath": "C:/Users/your_username/Documents"
}
```

Desktop app:

```shell
npm start
```

Web GUI mode:

```shell
node server.js --mode gui
```

Daemon mode:

```shell
node server.js --mode daemon --refresh-sec 300
```

One-time run:

```shell
node server.js --mode daemon --once
```

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_MODEL` | `qwen2.5:3b` | Ollama model to use |
| `MEMORY_ENGINE` | `ollama` | Memory engine (`ollama` or `algorithm`) |
| `OLLAMA_HOST` | `http://127.0.0.1:11434` | Ollama endpoint |
| `OLLAMA_TIMEOUT_SEC` | `300` | Ollama call timeout (seconds) |
| `OLLAMA_CONTEXT_MAX_CHARS_PER_FILE` | `3500` | Max characters per context file |
| `OLLAMA_CONTEXT_MAX_TOTAL_CHARS` | `22000` | Max total characters sent |
| `DAEMON_REFRESH_SEC` | `300` | Daemon refresh interval (seconds) |
| `GUI_HOST` | `127.0.0.1` | GUI host |
| `GUI_PORT` | `4173` | GUI port |
When `MEMORY_ENGINE=algorithm` is set, Ollama is not loaded during the session.
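For example, a session can pin its engine and endpoint entirely through the environment. The values below are the documented defaults, shown explicitly for illustration:

```shell
# Deterministic engine: no Ollama process is contacted.
MEMORY_ENGINE=algorithm node server.js --mode gui

# Ollama engine with an explicit host, model, and timeout:
MEMORY_ENGINE=ollama \
OLLAMA_HOST=http://127.0.0.1:11434 \
OLLAMA_MODEL=qwen2.5:3b \
OLLAMA_TIMEOUT_SEC=300 \
node server.js --mode daemon --once
```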
Run the memory tests:

```shell
npm run test:memory
```

| Method | Route | Description |
|---|---|---|
| GET | `/api/status` | Daemon and engine status |
| POST | `/api/sync` | Manual sync |
| POST | `/api/memory/force` | Force consolidation |
| POST | `/api/daemon/start` | Start daemon |
| POST | `/api/daemon/stop` | Stop daemon |
| POST | `/api/daemon/restart` | Restart daemon |
| GET | `/api/contexts` | List contexts |
| POST | `/api/contexts` | Create context |
| GET | `/api/contexts/:name` | Read context |
| PUT | `/api/contexts/:name` | Update context |
| DELETE | `/api/contexts/:name` | Delete context |
| GET | `/api/memory` | Read consolidated memory |
| GET | `/api/graph` | Neuron graph |
Windows: fully supported via `npm start`, `setup_codex_memory.bat`, and the PowerShell scripts in `scripts/`.

Linux / macOS: run the server directly with Node.js:

```shell
node server.js --mode gui
node server.js --mode daemon --refresh-sec 300
```

The Electron desktop mode and the `.bat`/`.ps1` autostart scripts are Windows-only. All core features (GUI, daemon, API, memory consolidation) work on any platform that runs Node.js 18+.
Source available — free for non-commercial use with attribution. Commercial use requires prior written permission from the author. See LICENSE for full terms.
| Language | README | Installation | Technical |
|---|---|---|---|
| 🇧🇷 Português | docs/pt-BR/README.md | INSTALL.md | TECHNICAL.md |
| 🇺🇸 English | docs/en-US/README.md | INSTALL.md | TECHNICAL.md |
| 🇪🇸 Español | docs/es-ES/README.md | INSTALL.md | TECHNICAL.md |
