Cloud-native persistent autonomous AI agent runtime in Rust, featuring multi-channel access, 13-layer self-learning, self-distillation, self-growing code (Cambium), and Witness verification system.
TEMM1E is a cloud-native persistent autonomous AI agent runtime written in Rust, designed with the principle "deploy once, stays up forever, grows itself."
At the cognitive layer, TEMM1E implements classifier + blueprint matching for zero-overhead intent recognition, TaskGraph topological task decomposition, 7-priority context budget management, self-correction engine, and verifiable completion condition engine. Its 13-layer self-learning system covers decaying episodic memory, cross-task strategy learning, reproducible flow recipes (Blueprints), user profile modeling (Tem Anima), and more, driven by a unified value function V(a,t) = Q × R × U.
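The unified value function V(a,t) = Q × R × U can be sketched as a simple product score. This is a minimal illustration, not TEMM1E's actual API: the field names (`quality`, `recency`, `utility`) and the selection logic are assumptions for the example.

```rust
/// Hypothetical sketch of the unified value function V(a,t) = Q × R × U.
/// Field names are illustrative, not TEMM1E's internals.
struct Strategy {
    quality: f64, // Q: historical success rate of the strategy
    recency: f64, // R: decay factor favoring recently validated knowledge
    utility: f64, // U: estimated usefulness for the current task
}

impl Strategy {
    /// V(a,t) = Q × R × U — higher scores win during strategy selection.
    fn value(&self) -> f64 {
        self.quality * self.recency * self.utility
    }
}

fn main() {
    let a = Strategy { quality: 0.9, recency: 0.5, utility: 0.8 };
    let b = Strategy { quality: 0.6, recency: 1.0, utility: 0.9 };
    // A multiplicative score means any weak factor drags the whole value
    // down: the fresher strategy b wins despite lower raw quality.
    println!("V(a) = {:.3}, V(b) = {:.3}", a.value(), b.value());
}
```

Because the factors multiply rather than add, a stale or low-utility strategy cannot be rescued by a high score on another axis.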
For self-evolution, the Eigen-Tune mechanism collects training pairs from conversations and promotes them through statistical gating (Wilson 99% confidence interval + SPRT + CUSUM), achieving self-distillation fine-tuning at zero additional LLM cost. Tem Cambium lets the agent autonomously write Rust code and integrate it once it passes compilation, lint, and tests, using a four-layer architecture (Heartwood → Cambium → Bark → Rings) to isolate safety boundaries.
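The Wilson lower bound is the standard way to gate on a success rate without being fooled by small samples; the sketch below shows the kind of 99%-confidence check Eigen-Tune could apply before promoting a training pair. The function and threshold here are illustrative assumptions, not TEMM1E's actual gating code.

```rust
/// Wilson score lower bound — a conservative estimate of a true success
/// rate given `successes` out of `trials`. A promotion gate can require
/// this lower bound (not the raw rate) to clear a threshold.
fn wilson_lower_bound(successes: u32, trials: u32, z: f64) -> f64 {
    if trials == 0 {
        return 0.0;
    }
    let n = trials as f64;
    let p = successes as f64 / n;
    let z2 = z * z;
    let denom = 1.0 + z2 / n;
    let center = p + z2 / (2.0 * n);
    let margin = z * ((p * (1.0 - p) + z2 / (4.0 * n)) / n).sqrt();
    (center - margin) / denom
}

fn main() {
    let z99 = 2.576; // z-score for a two-sided 99% confidence interval
    // 48/50 successes: raw rate 0.96, but the 99% lower bound is ~0.82,
    // so a gate at e.g. 0.8 passes only well-attested pairs.
    println!("{:.3}", wilson_lower_bound(48, 50, z99));
    // 3/3 successes: raw rate 1.0, lower bound ~0.31 — too few trials.
    println!("{:.3}", wilson_lower_bound(3, 3, z99));
}
```

Gating on the interval's lower bound rather than the observed rate is what keeps a lucky 3-for-3 sample from being promoted alongside a proven 48-for-50 one.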
For safety verification, the Witness system (enabled by default since v5.5.0) implements agent behavior auditing through pre-committed machine-checkable contracts, independent verification, and hash-chained SQLite Ledger, achieving an 88.9% lie detection rate across 1,800 simulated trajectories.
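A hash-chained ledger makes after-the-fact tampering detectable: each entry's hash covers the previous entry's hash, so rewriting any record breaks every link after it. The sketch below shows the principle only; the real Witness Ledger lives in SQLite and presumably uses a cryptographic hash, while std's `DefaultHasher` stands in here just to keep the example dependency-free.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// One link in a hash chain: the entry's hash covers both the event
/// and the previous entry's hash. (Illustrative, not TEMM1E's schema.)
struct LedgerEntry {
    prev_hash: u64,
    event: String,
    hash: u64,
}

fn append(chain: &mut Vec<LedgerEntry>, event: &str) {
    let prev_hash = chain.last().map_or(0, |e| e.hash);
    let mut h = DefaultHasher::new();
    prev_hash.hash(&mut h);
    event.hash(&mut h);
    chain.push(LedgerEntry { prev_hash, event: event.to_string(), hash: h.finish() });
}

/// Recompute every link; any tampered entry breaks the chain from
/// that point onward.
fn verify(chain: &[LedgerEntry]) -> bool {
    let mut prev = 0u64;
    chain.iter().all(|e| {
        let mut h = DefaultHasher::new();
        prev.hash(&mut h);
        e.event.hash(&mut h);
        let ok = e.prev_hash == prev && e.hash == h.finish();
        prev = e.hash;
        ok
    })
}

fn main() {
    let mut chain = Vec::new();
    append(&mut chain, "contract: delete no files");
    append(&mut chain, "action: read src/main.rs");
    assert!(verify(&chain));
    chain[0].event = "contract: (rewritten)".to_string(); // tamper
    assert!(!verify(&chain)); // verification now fails
}
```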
Interaction capabilities span Telegram, Discord, WhatsApp, Slack, TUI, and Web Dashboard. The tool layer supports Vision Browser (CDP screenshot → visual analysis → click), Tem Prowl (100+ service logins), Tem Gaze (desktop visual control), and full computer control via Shell/Git/MCP.
Reliability features include 4-layer panic resilience, provider circuit breakers, channel reconnection, in-memory backend failover, and Watchdog self-monitoring. Deployment supports S3/R2 object storage, OpenTelemetry observability, multi-tenant Workspace isolation, OAuth, horizontal scaling Orchestrator, with Terraform configuration included.
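A provider circuit breaker is a small state machine: after enough consecutive failures it "opens" and rejects calls immediately, then lets a probe through once a cooldown elapses. The sketch below shows that pattern in miniature; the struct, thresholds, and half-open behavior are generic illustrations, not TEMM1E's implementation.

```rust
use std::time::{Duration, Instant};

enum State {
    Closed,
    Open(Instant),
}

/// Minimal circuit breaker: `threshold` consecutive failures open the
/// breaker; after `cooldown`, one probe call is allowed through.
struct CircuitBreaker {
    state: State,
    failures: u32,
    threshold: u32,
    cooldown: Duration,
}

impl CircuitBreaker {
    fn new(threshold: u32, cooldown: Duration) -> Self {
        Self { state: State::Closed, failures: 0, threshold, cooldown }
    }

    /// Should we attempt a call to this provider right now?
    fn allow(&mut self) -> bool {
        match self.state {
            State::Closed => true,
            State::Open(since) => {
                if since.elapsed() >= self.cooldown {
                    // Half-open: reset and let a probe request through.
                    self.state = State::Closed;
                    self.failures = 0;
                    true
                } else {
                    false // fail fast instead of hammering a dead provider
                }
            }
        }
    }

    fn record(&mut self, success: bool) {
        if success {
            self.failures = 0;
        } else {
            self.failures += 1;
            if self.failures >= self.threshold {
                self.state = State::Open(Instant::now());
            }
        }
    }
}

fn main() {
    let mut cb = CircuitBreaker::new(3, Duration::from_secs(30));
    for _ in 0..3 {
        cb.record(false); // three consecutive provider failures
    }
    assert!(!cb.allow()); // breaker is open: callers fail fast
}
```

Failing fast while a provider is down is what lets the runtime fall back to another provider (or queue work) instead of stalling on timeouts.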
The project comprises ~160K lines of Rust code across 25 crates with 2,838 tests, currently at v5.5.4 under the MIT license, supporting Windows/macOS/Linux (including ARM64).
Installation
Pre-built binary (30-second install):

```sh
curl -sSfL https://raw.githubusercontent.com/temm1e-labs/temm1e/main/install.sh | sh
temm1e tui
```
Supports macOS (Intel/Apple Silicon) and Linux (x86_64/ARM64), with automatic SHA-256 verification.
Build from source:

```sh
git clone https://github.com/temm1e-labs/temm1e.git && cd temm1e
cargo build --release
./target/release/temm1e tui
```
Requires Rust 1.82+ and Chrome/Chromium.
Docker: Dockerfile and docker-compose.yml included in the repository.
Windows: Download .exe from Releases or build from source; first-class support since v5.4.5.
Key Configuration
Server mode is configured via TELEGRAM_BOT_TOKEN and DISCORD_BOT_TOKEN environment variables. Core configuration resides in ~/.temm1e/config.toml, supporting session budget hard caps (max_spend_usd), Witness toggle, Eigen-Tune dual opt-in, and Cambium runtime switching. Default feature flags: telegram, discord, browser, mcp, codex-oauth, desktop-control, tui. Optional: slack, whatsapp, whatsapp-web, postgres, openssl-vendored.
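A `~/.temm1e/config.toml` covering the options above might look like the sketch below. Only `max_spend_usd` is named in the text; every section and key name here is an illustrative guess, not a documented schema.

```toml
# Hypothetical ~/.temm1e/config.toml sketch. Key names other than
# max_spend_usd are assumptions, not documented configuration keys.

[session]
max_spend_usd = 5.0   # hard budget cap per session (documented option)

[witness]
enabled = true        # illustrative: Witness is on by default since v5.5.0

[eigen_tune]
opt_in = true         # illustrative: stands in for the dual opt-in

[cambium]
enabled = false       # illustrative runtime switch for self-growing code
```

Check the repository's shipped example config for the authoritative key names before relying on any of these.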
Unconfirmed Information
Independent website, external research paper links (arXiv etc.), Hugging Face model page, actual production user count, brand rename timeline (SkyClaw → TEMM1E), Slack channel full availability, and WhatsApp Web integration maturity remain unconfirmed from the reviewed pages.