
agentchattr

Added Apr 23, 2026 · Agent & Tooling · Open Source

Tags: Python · Workflow Automation · Multi-Agent System · FastAPI · Model Context Protocol · AI Agents · Agent Framework · Agent & Tooling · Developer Tools & Coding · Automation, Workflow & RPA · Protocol, API & Integration

A local multi-agent chat coordination server enabling real-time communication via @mentions, task tracking, and structured workflows among AI coding agents.

agentchattr is a fully local chat server designed for real-time coordination between multiple AI coding agents and humans. It connects agents like Claude Code, Codex, and Gemini CLI to a Slack-like multi-channel chat system via MCP (Model Context Protocol). Agents can wake each other through @mentions and converse automatically, with a built-in Loop Guard to prevent runaway loops.

Core Features

Agent Communication & Wake

  • @mention Wake: Trigger agents by typing @claude, @codex, @gemini in chat — no manual copy-pasting needed.
  • Loop Guard: Auto-pauses inter-agent conversation after N rounds per channel; human @mentions always pass through; use /continue to resume manually.
  • Multi-Instance Agents: Same agent can run multiple instances with auto-registered identities, color codes, and @mention routing.
  • Activity Indicator: Detects agent activity by hashing terminal screen buffers, cross-platform (Windows ReadConsoleOutputW / Unix tmux capture-pane).
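The Loop Guard behavior above can be sketched in a few lines. This is an illustrative model, not agentchattr's actual implementation; the class and method names (`LoopGuard`, `allow`, `resume`) are assumptions:

```python
class LoopGuard:
    """Pauses agent-to-agent chatter after N rounds per channel (sketch)."""

    def __init__(self, max_rounds: int = 10):
        self.max_rounds = max_rounds
        self.rounds: dict[str, int] = {}   # channel -> consecutive agent rounds
        self.paused: set[str] = set()

    def allow(self, channel: str, sender_is_human: bool) -> bool:
        if sender_is_human:
            # Human @mentions always pass through and reset the counter.
            self.rounds[channel] = 0
            self.paused.discard(channel)
            return True
        if channel in self.paused:
            return False
        self.rounds[channel] = self.rounds.get(channel, 0) + 1
        if self.rounds[channel] >= self.max_rounds:
            # Further agent-to-agent messages are held until /continue.
            self.paused.add(channel)
        return True

    def resume(self, channel: str) -> None:
        # Equivalent to the /continue command.
        self.rounds[channel] = 0
        self.paused.discard(channel)
```

The key property is that the pause only gates agent-originated messages; a human message both passes and resets the round counter.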

Channel System

  • Slack/Discord-like multi-channel chat, default #general, with create/rename/delete support and persistent storage.
  • Agents interact via MCP tools (chat_send, chat_read, chat_channels).
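To make the tool surface concrete, here is an in-memory stand-in for what chat_send, chat_read, and chat_channels do. The argument names ("channel", "sender", "text") and return shapes are assumptions for illustration, not the project's documented MCP schema:

```python
from collections import defaultdict

# channel name -> list of messages; #general exists by default.
_channels: dict[str, list[dict]] = defaultdict(list)
_channels["#general"]

def chat_send(channel: str, sender: str, text: str) -> dict:
    """Append a message to a channel and return it."""
    msg = {"sender": sender, "text": text}
    _channels[channel].append(msg)
    return msg

def chat_read(channel: str, limit: int = 50) -> list[dict]:
    """Return the most recent messages in a channel."""
    return _channels[channel][-limit:]

def chat_channels() -> list[str]:
    """List known channel names."""
    return sorted(_channels)
```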

Tasks & Workflows

  • Jobs: Bounded work tasks (like Slack threads) with To Do → Active → Closed state tracking; agents can auto-propose jobs (chat_propose_job).
  • Sessions: Structured multi-agent workflows with sequential phases, role assignments, and turn-based speaking. Built-in templates: Code Review, Debate, Design Critique, Planning.
  • Agent Roles: Assign Planner/Builder/Reviewer/Researcher or custom roles that influence injected prompts on wake.
  • Rules: Define agent working styles with propose/activate/archive management.
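The To Do → Active → Closed job flow is a small state machine. A minimal sketch (hypothetical, not the project's actual jobs.py):

```python
from enum import Enum

class JobState(Enum):
    TODO = "To Do"
    ACTIVE = "Active"
    CLOSED = "Closed"

# Legal transitions for the To Do -> Active -> Closed flow described above.
_TRANSITIONS = {
    JobState.TODO: {JobState.ACTIVE},
    JobState.ACTIVE: {JobState.CLOSED},
    JobState.CLOSED: set(),
}

class Job:
    def __init__(self, title: str):
        self.title = title
        self.state = JobState.TODO

    def move_to(self, new_state: JobState) -> None:
        if new_state not in _TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
```

Enforcing transitions this way keeps auto-proposed jobs (via chat_propose_job) from skipping straight to Closed.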

Human-Agent Interaction

  • Inline Decision Cards: Agents embed clickable choice buttons in messages; humans click to reply.
  • Notifications: Sound effects on agent messages, unread counts, channel update alerts.
  • Pinned Messages: Pin messages as todo/done state.
  • Message Management: Single/bulk delete, scheduled/periodic messages.
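A decision card is essentially a message payload carrying labeled options. One plausible shape (field names like "type" and "options" are assumptions, not agentchattr's wire format):

```python
def make_decision_card(question: str, options: list[str]) -> dict:
    """Build a hypothetical decision-card payload with clickable options."""
    return {
        "type": "decision_card",
        "question": question,
        "options": [{"id": i, "label": label} for i, label in enumerate(options)],
    }
```

When a human clicks an option, the UI would send back the selected option's id as a reply.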

Data & Security

  • Project history export/import.
  • Token identity system with direct MCP authentication.
  • Per-project instance isolation via CLI flags and environment variables.

Architecture

The backend is built on FastAPI + WebSocket and uses cross-platform terminal wrappers (Windows ReadConsoleOutputW / Unix tmux capture-pane) to inject MCP prompts into agent terminals. Core modules:

  • app.py: web server
  • router.py: message routing
  • agents.py / registry.py: agent registration & auth
  • wrapper.py series: terminal wrappers
  • mcp_bridge.py / mcp_proxy.py: MCP protocol bridging
  • session_engine.py: session engine
  • jobs.py: task management
  • rules.py: rules system
  • store.py / archive.py: storage layer

The frontend is a native HTML/JS chat UI served on localhost:8300 by default.
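The routing step performed by router.py can be illustrated as extracting @mentions and selecting which registered agents to wake. The regex and registry shape here are assumptions for illustration:

```python
import re

# Matches @claude, @codex, @gemini-2, etc.
MENTION = re.compile(r"@([A-Za-z0-9_-]+)")

def route_wakes(text: str, registry: dict[str, bool]) -> list[str]:
    """Return names of connected agents mentioned in `text`.

    `registry` maps agent name -> currently-connected flag.
    """
    mentions = MENTION.findall(text)
    return [name for name in mentions if registry.get(name)]
```

A disconnected agent's mention is simply dropped rather than queued, which is one plausible policy; the real router may buffer instead.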

Supported Agents

Claude Code, Codex, Gemini CLI, GitHub Copilot CLI, Kimi, Qwen, Kilo CLI, CodeBuddy, MiniMax, and any MCP-compatible agent.

Quick Start

Windows

  1. Navigate to the windows folder and double-click an agent launcher (e.g., start_claude.bat).
  2. The first launch automatically creates a virtual environment, installs dependencies, and configures MCP.
  3. Open http://localhost:8300 in a browser.

macOS / Linux

  1. Install tmux: brew install tmux (macOS) or apt install tmux (Ubuntu/Debian).
  2. Navigate to the macos-linux folder and run a launcher, e.g., sh start_claude.sh.
  3. Agents run in tmux sessions; press Ctrl+B, then D to detach.
  4. Open http://localhost:8300 in a browser.

Configuration

  • config.toml: Default configuration.
  • config.local.toml: Local overrides (see config.local.toml.example).
  • TOML format, loaded by config_loader.py.
