Agentic AI is at the Peak of Inflated Expectations on the Gartner Hype Cycle for Agentic AI, 2026. MARS is a research-grade runtime that puts engineering discipline behind the hype: lifecycle management, trust, governance, federated discovery, and goal-driven loops – grounded in papers written before "agent" became a buzzword again.
MARS is a Python asyncio platform for building production-grade multi-agent systems with LLM integration.
Gartner's 2026 analysis identifies agentic AI as the fastest-advancing technology on the curve, moving toward the peak on the back of:
- Autonomous / semi-autonomous agents that perceive, decide, and act
- Multi-agent collaboration with clear task delegation
- AI Trust, Risk, and Security Management (AI TRiSM) as a foundational requirement
- The need for robust infrastructure, not just model access, to realise real business value

MARS addresses all four directly – not as an afterthought, but as core architecture inherited from research dating back to 1997.
- Run an async multi-agent runtime with `Platform`, `AMS`, `MessageBus`, `Directory`, group rooms, and direct message routing across federated platforms.
- Spawn `LLMAgent`, `ReactiveAgent`, `ProactiveAgent`, and `EchoBot` instances with lifecycle hooks, BDI-style `role`/`goal`/`behaviour` fields, and session memory.
- Execute lifecycle FSM presets with `StateMachine` + `StrategyEngine`, including goal-driven loops and shared proactive tick scheduling in `Agent`.
- Connect to Anthropic Claude, GitHub Models / Copilot, or local Ollama – run multiple models in parallel, chat with each, and keep agents running persistently.
- Route work by skill: `LLMAgent` exposes `list_skills` / `use_skill`, while service agents advertise capabilities and can answer with messages or artifacts.
- Run `mars-server` as a headless TCP/REST/WebSocket backend with per-client focus, audit logging, and a browser UI.
- Auto-spawn five free service agents on server start: `clock`, `screenshot`, `profiler`, `status`, and `bridge.copilot@1`.
- Exchange text and binary artifacts with `/artifact list`, `/artifact get`, and `/artifact send`.
- Use the three-pane CLI with per-agent chat history, room chat, activity feed, visual mode badges (⚡ reactive / ⏰ proactive), and MXTY auto-naming.
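The messaging primitives above can be pictured with a minimal asyncio sketch. `MiniBus` and its methods are invented names for illustration only – this is not the MARS `MessageBus` API, which is documented in ARCHITECTURE.md:

```python
import asyncio

class MiniBus:
    """Toy stand-in for a message bus: async unicast and broadcast via per-agent queues."""

    def __init__(self) -> None:
        self.inboxes: dict[str, asyncio.Queue] = {}

    def register(self, agent_id: str) -> asyncio.Queue:
        """Give an agent its own inbox queue."""
        self.inboxes[agent_id] = asyncio.Queue()
        return self.inboxes[agent_id]

    async def unicast(self, to: str, msg: dict) -> None:
        await self.inboxes[to].put(msg)

    async def broadcast(self, sender: str, msg: dict) -> None:
        """Deliver to every registered agent except the sender."""
        for agent_id, inbox in self.inboxes.items():
            if agent_id != sender:
                await inbox.put(msg)

async def demo() -> tuple[dict, dict]:
    bus = MiniBus()
    inbox_a = bus.register("alice")
    inbox_b = bus.register("bob")
    await bus.unicast("bob", {"from": "alice", "text": "hi"})
    await bus.broadcast("bob", {"from": "bob", "text": "hello all"})
    return await inbox_a.get(), await inbox_b.get()

a_msg, b_msg = asyncio.run(demo())
print(a_msg)  # {'from': 'bob', 'text': 'hello all'}
print(b_msg)  # {'from': 'alice', 'text': 'hi'}
```

The real runtime layers lifecycle management, directories, and trust on top of a delivery core like this.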
Future work focuses on turning the runtime into business-shaped capabilities. The research papers define what each item means; delivery order is driven by the business model behind it.
| Area | Feature | Paper origin |
|---|---|---|
| Trading | DIDF/DSDF specialist agent demo – hierarchical counterparty matching at runtime | Counterparty Matching |
| Location | Context-threaded federated retrieval – route queries by location + capability | Location-Aware Agent Retrieval |
| Federation | Multi-MARS network – authenticated clusters forming a global agent fabric | LARS |
| Governance | Formal Regelrahmen (rule-framework) enforcement – hard policy boundaries with proof of compliance | Patient Technology |
| Self-organisation | Dynamic coalition formation – agents self-assemble teams for complex goals | EMIKA |
- SETUP.md – prerequisites, install, configure providers, start the server.
- USER.md – CLI walkthrough: spawn agents, chat, scopes, artifacts, trust.
- ARCHITECTURE.md – runtime stack, service marketplace, scopes, game-theory FSM, providers.
- BUILD.md – building, running tests, packaging a wheel, Git LFS.
- AGENTS.md – service-agent catalogue and wire protocol.
- CONTRIBUTING.md – project layout, coding conventions, how to add providers and tests.
After `pip install -e ".[dev]"` (see SETUP.md):

```bash
# Standalone CLI (in-process platform, no server needed)
python -m mars.cli.main --provider mock                    # offline test agent
python -m mars.cli.main --provider github-models           # free LLM (needs GITHUB_TOKEN)
python -m mars.cli.main --provider ollama                  # local Ollama – no API key, no limits
python -m mars.cli.main --provider ollama github-models    # both at once
python -m mars.cli.main --provider anthropic --model claude-3-5-sonnet-20241022  # Anthropic Claude (needs ANTHROPIC_API_KEY)
```

```bash
# Server + client (recommended for multi-client / persistent runs)

# Terminal 1 – start the headless server (auto-spawns free service agents)
python -m mars.srv.main                                    # echo bots only
python -m mars.srv.main --provider github-models           # + free GitHub Models LLM
python -m mars.srv.main --provider ollama                  # + local llama3.2 (unlimited, no key)
python -m mars.srv.main --provider ollama github-models    # + one agent per provider

# Terminal 2 – connect a CLI client
python -m mars.cli.main --connect localhost:7432
```

The `pip install -e ".[dev]"` step also registers `mars` and `mars-server` console scripts. They work whenever your Python `Scripts/` (Windows) or `bin/` (Linux/macOS) directory is on `PATH` – but `python -m …` always works regardless of `PATH` configuration.

No API key? Use `--provider ollama` with Ollama installed locally – completely free, no rate limits. See SETUP.md for the full Ollama setup guide.
The server exposes TCP 7432 (JSON-line for CLI + agents), HTTP 7433 (REST), and WebSocket 7434 (browser UI). See USER.md for the full command cheat sheet.
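A JSON-line protocol is easy to speak from any client language: one JSON object per newline-terminated line, with a buffer to hold partial reads. The helpers below are a generic sketch, and the field names (`type`, `to`, `text`) are assumptions for illustration – AGENTS.md documents the actual wire protocol:

```python
import json

def encode_line(msg: dict) -> bytes:
    """Serialize one message as a newline-terminated JSON line."""
    return (json.dumps(msg, separators=(",", ":")) + "\n").encode("utf-8")

def decode_lines(buffer: bytes) -> tuple[list[dict], bytes]:
    """Split a receive buffer into complete messages plus the unread tail."""
    msgs = []
    while b"\n" in buffer:
        line, buffer = buffer.split(b"\n", 1)
        if line.strip():
            msgs.append(json.loads(line))
    return msgs, buffer

# Round-trip: two complete messages arriving in one TCP read, plus a partial third
raw = (
    encode_line({"type": "chat", "to": "clock", "text": "time?"})
    + encode_line({"type": "ping"})
    + b'{"type":"par'
)
msgs, rest = decode_lines(raw)
print(len(msgs), rest)  # 2 b'{"type":"par'
```

Pair this framing with `asyncio.open_connection("localhost", 7432)` and a read loop to build a minimal headless client.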
Gartner (2026) places agentic AI at the Peak of Inflated Expectations, warning that sustainable value requires:
- Alignment with real business objectives – not demos for their own sake
- Foundational infrastructure – lifecycle, messaging, trust, governance
- AI TRiSM – trust, risk, and security built in from the start
- Realistic scoping – iterative loops with defined exit criteria, not open-ended autonomy
MARS implements each of these:
| Gartner requirement | MARS feature |
|---|---|
| Agent lifecycle management | AMS – INITIALIZING → ACTIVE → SUSPENDED → TERMINATED |
| Multi-agent communication | MessageBus – async unicast / broadcast / multicast |
| Trust & security | TrustManager – per-vendor trust (TRUSTED / NEUTRAL / SUSPICIOUS / BLOCKED) |
| Federated discovery | GlobalDirectory + DomainDirectory – two-level capability routing across domains and clusters |
| Goal-driven loops | StateMachine + LoopContext – build → review → fix → done, with exit conditions |
| Real business alignment | LLM agents with send_message, list_agents, and artifact tools – wired to real providers |
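The goal-driven loop row is essentially a tiny FSM with an explicit exit condition and an iteration cap – Gartner's "defined exit criteria" made concrete. A self-contained sketch, not the actual `StateMachine`/`LoopContext` API:

```python
def run_loop(review, fix, max_iters=5):
    """build -> review -> fix -> ... -> done, with a hard iteration cap."""
    state, iters = "build", 0
    artifact = "draft-v0"
    trace = [state]
    while state != "done" and iters < max_iters:
        if state == "build":
            state = "review"
        elif state == "review":
            # Exit condition: the reviewer approves the current artifact
            state = "done" if review(artifact) else "fix"
        elif state == "fix":
            artifact = fix(artifact)
            iters += 1  # each fix counts toward the cap
            state = "review"
        trace.append(state)
    return artifact, trace

# Approve once the artifact has been fixed twice (v0 -> v1 -> v2)
artifact, trace = run_loop(
    review=lambda a: a.endswith("v2"),
    fix=lambda a: f"draft-v{int(a.rsplit('v', 1)[1]) + 1}",
)
print(artifact)  # draft-v2
print(trace)     # ['build', 'review', 'fix', 'review', 'fix', 'review', 'done']
```

The iteration cap guarantees termination even when the review never passes – the loop degrades to a bounded retry rather than open-ended autonomy.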
The design is grounded in work by the author predating modern LLMs:
| Paper | Contribution to MARS |
|---|---|
| Intelligent Mobile Agents in the Intra/Internet (Nopper, 1997) | Foundational diploma thesis: mobile agent theory, transport, lifecycle, platform-independence – the direct predecessor of LARS and the entire MARS runtime model |
| LARS – Living Agents Runtime System (Nopper, 2000) | Platform-independence, lifecycle FSM (INITIALIZING → ACTIVE → SUSPENDED → MIGRATED → TERMINATED), clustering, XML messaging |
| EMIKA (Müller, Eymann, Nopper et al., 2004) | Real-time sensor-to-agent middleware, self-organising coordination, Ist/Soll-Zustand (actual vs. target state) loop |
| Agent-Based Counterparty Matching (Lohmann, Nopper, Henning, 1998) | Hierarchical DIDF/DSDF discovery, specialist agents over a shared runtime |
| Location-Aware Agent Retrieval (Nopper, Kammerer, 2000) | Context-threaded, federated multi-source agent retrieval |
| Patient Technology (Müller, Nopper et al., ~2003) | Self-organisation requires an explicit governance framework (Regelrahmen) |
- Axelrod, R. (1984). The Evolution of Cooperation. Basic Books.
- Nopper, N. (1997). Intelligent Mobile Agents in the Intra/Internet. Diploma thesis, Hochschule Furtwangen University (HFU).
- Lohmann, Nopper, Henning (1998). Agent-Based Counterparty Matching in Financial Markets.
- Nopper, Kammerer (2000). Location-Aware Agent Retrieval.
- Nopper, N. (2000). LARS – Living Agents Runtime System. AgentLink.
- Müller, Nopper et al. (~2003). Patient Technology – Self-Organisation and Governance.
- Müller, Eymann, Nopper et al. (2004). EMIKA – Agent-Based Middleware for Real-Time Sensor Systems.
- Shoham & Leyton-Brown (2009). Multiagent Systems. Cambridge University Press.
- Gartner, Hype Cycle for Agentic AI, 2026 – https://www.gartner.com/en/articles/hype-cycle-for-agentic-ai