A Rust-based, Web3-native AI Agent Framework featuring an embedded EVM wallet, MCP and A2A protocol support, and compatibility with OpenAI and Ollama backends.
Machi is a Rust-based, Web3/Web4.0-native AI agent framework (v0.8.1), dual-licensed under Apache-2.0 and MIT.
Core Architecture#
The framework is built around these core concepts:
- Agent: Self-contained unit with LLM provider, instructions, tools, and optional sub-agents
- Runner: Stateless execution engine driving the ReAct loop (think → act → observe → repeat)
- Tool / DynTool: Callable capability units for agents
- ChatProvider: Trait abstracting LLM backends
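To see how these concepts compose, here is a simplified, std-only sketch; it is not machi's actual API (the `ChatProvider` and `Tool` traits, the `Agent` struct, and the `CALL` convention below are all hypothetical), just an illustration of a stateless runner driving a think → act → observe loop over a provider and a tool registry:

```rust
use std::collections::HashMap;

// Hypothetical stand-in for machi's ChatProvider trait: any LLM
// backend that can turn accumulated context into a reply.
trait ChatProvider {
    fn chat(&self, prompt: &str) -> String;
}

// Hypothetical stand-in for machi's Tool: a callable capability.
trait Tool {
    fn call(&self, input: &str) -> String;
}

// Self-contained unit: instructions, a provider, and named tools.
struct Agent<'a> {
    instructions: String,
    provider: &'a dyn ChatProvider,
    tools: HashMap<String, Box<dyn Tool>>,
}

// Stateless runner driving a greatly simplified ReAct loop:
// ask the model; if it requests a tool, execute it and feed the
// observation back; otherwise return the reply as the final answer.
fn run(agent: &Agent, input: &str, max_steps: usize) -> String {
    let mut context = format!("{}\nUser: {}", agent.instructions, input);
    for _ in 0..max_steps {
        let reply = agent.provider.chat(&context);
        // Convention for this sketch only: "CALL <tool> <args>"
        // signals a tool invocation.
        if let Some(rest) = reply.strip_prefix("CALL ") {
            let (name, args) = rest.split_once(' ').unwrap_or((rest, ""));
            if let Some(tool) = agent.tools.get(name) {
                let observation = tool.call(args);
                context.push_str(&format!("\nObservation: {}", observation));
                continue;
            }
        }
        return reply; // no tool call: final answer
    }
    String::from("step limit reached")
}
```

The real framework layers async execution, sub-agents, and structured tool schemas on top of this shape, but the division of labor (Agent holds configuration, Runner owns the loop, ChatProvider hides the backend) is the same.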
Key Features#
LLM Backend Support
- OpenAI API (GPT-4o, etc.)
- Ollama local models
- Extensible via ChatProvider trait
Web3 Capabilities
- EVM wallet integration (via alloy/kobe-evm)
- x402 protocol support (402-based HTTP & EVM payment flows)
Protocol Support
- MCP (Model Context Protocol) server integration
- A2A (Agent-to-Agent) inter-agent communication
Tools & Extensions
- Built-in toolkit: filesystem, shell, web search
- `#[tool]` procedural macro for custom tool development
- JSON Schema structured output
- SQLite session persistence
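To illustrate the pattern behind the `#[tool]` macro, the sketch below hand-writes what such a macro conceptually generates: a plain function, plus metadata and argument parsing, wired into a dynamically dispatchable object. This is not the macro's actual expansion; the `Tool` trait and all names here are hypothetical:

```rust
// A plain function we want to expose as an agent tool.
fn add(a: i64, b: i64) -> i64 {
    a + b
}

// Hypothetical trait standing in for machi's Tool / DynTool abstraction.
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    fn call(&self, args: &str) -> Result<String, String>;
}

// What a #[tool]-style macro would write for you: a wrapper type that
// parses string arguments, invokes the function, and stringifies output.
struct AddTool;

impl Tool for AddTool {
    fn name(&self) -> &str {
        "add"
    }
    fn description(&self) -> &str {
        "Add two integers. Args: \"a,b\""
    }
    fn call(&self, args: &str) -> Result<String, String> {
        let (a, b) = args.split_once(',').ok_or("expected \"a,b\"")?;
        let a: i64 = a.trim().parse().map_err(|e| format!("{e}"))?;
        let b: i64 = b.trim().parse().map_err(|e| format!("{e}"))?;
        Ok(add(a, b).to_string())
    }
}
```

In the real framework the macro would additionally derive a JSON Schema from the function signature so the LLM knows how to call the tool; this sketch keeps only the dispatch mechanics.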
Quick Start#
```toml
[dependencies]
machi = "0.8"
```
```rust
use std::sync::Arc;
use machi::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    // Wrap the OpenAI backend (configured from environment variables).
    let provider: SharedChatProvider = Arc::new(OpenAI::from_env()?);

    // Build an agent with instructions, a model, and the shared provider.
    let agent = Agent::new("assistant")
        .instructions("You are a helpful assistant.")
        .model("gpt-4o-mini")
        .provider(provider);

    // Run a single turn and print the agent's final output.
    let result = agent
        .run("What is the capital of France?", RunConfig::default())
        .await?;
    println!("{}", result.output);
    Ok(())
}
```
Feature Flags#
- `openai` / `ollama`: LLM backends
- `wallet`: EVM wallet functionality
- `mcp` / `a2a`: Protocol support
- `x402`: Payment protocol
- `toolkit`: Built-in tool collection
- `memory-sqlite`: Session persistence
- `schema`: Structured output
- `full`: All features (default)
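Since `full` is the default, a leaner build disables default features and opts back in to only what is needed. For example, a sketch of a minimal OpenAI-plus-tools configuration (feature names taken from the list above; exact Cargo syntax otherwise standard):

```toml
[dependencies]
machi = { version = "0.8", default-features = false, features = [
    "openai",
    "toolkit",
    "memory-sqlite",
] }
```

This keeps the wallet, protocol, and payment dependencies out of the build entirely.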
Repository Structure#
The workspace contains: `machi` (core library), `machi-auto` (CLI), and `machi-derive` (procedural macros).