
TinyAgent

Added: Feb 25, 2026
Category: Agent & Tooling
License type: Open Source

Tags: Python · Workflow Automation · Rust · LLMs · AI Agents · Agent Framework · SDK · Agent & Tooling · Model & Inference Framework · Developer Tools & Coding · Automation, Workflow & RPA

A small, modular agent framework for building LLM-powered applications in Python with streaming and optional Rust bindings.

Overview

TinyAgent is a lightweight, modular Python agent framework designed to simplify building LLM-powered applications. Inspired by smolagents, it follows a minimal abstraction and zero-boilerplate design philosophy, allowing developers to transform Python functions into AI-callable tools via a simple @tool decorator.

Core Features

Streaming-first Architecture

  • Native streaming support for all LLM interactions
  • Subscribe to agent events for real-time UI updates
  • Complete event lifecycle management (Agent/Turn/Message/ToolExecution)

Tool System

  • Zero boilerplate: the @tool decorator turns plain Python functions into tools
  • Automatic multi-step reasoning and tool orchestration
  • Parallel tool calling support (v1.2.5+) via asyncio.gather()
  • Structured output support
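The parallel tool-calling bullet above maps onto plain asyncio.gather(): independent tool calls requested in a single LLM turn run concurrently rather than sequentially. A minimal, framework-independent sketch of the pattern (the tool functions here are hypothetical stand-ins, not TinyAgent APIs):

```python
import asyncio

# Hypothetical stand-ins for two independent tools requested in one LLM turn.
async def fetch_weather(city: str) -> str:
    await asyncio.sleep(0.01)  # simulate I/O latency
    return f"{city}: sunny"

async def fetch_population(city: str) -> str:
    await asyncio.sleep(0.01)
    return f"{city}: 2.1M"

async def run_tool_calls():
    # gather() awaits both calls concurrently instead of one after the other,
    # and returns their results in call order.
    return await asyncio.gather(
        fetch_weather("Paris"),
        fetch_population("Paris"),
    )

results = asyncio.run(run_tool_calls())
print(results)  # ['Paris: sunny', 'Paris: 2.1M']
```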

Dual Provider Paths

  • Pure Python Path: Default implementation via stream_openrouter and similar functions
  • Rust Binding Path: Optional high-performance implementation via PyO3 for SSE parsing and JSON deserialization

Provider Compatibility

  • Compatible with any OpenAI-compatible /chat/completions endpoint
  • Default support for OpenRouter
  • Supports OpenAI, Anthropic (via OpenRouter), MiniMax, Chutes (local models)

Cost Optimization

  • Anthropic-style prompt caching support (cache breakpoints)
  • Reduced token costs and latency for repeated context
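Anthropic-style caching marks a stable prompt prefix with a cache breakpoint so the provider can reuse it across calls. A framework-independent sketch of attaching such a marker to a message list (the cache_control field follows Anthropic's convention; TinyAgent's internal representation may differ):

```python
def add_cache_breakpoint(messages: list[dict]) -> list[dict]:
    """Return a copy of `messages` with the last one marked cacheable.

    On providers that honor Anthropic-style breakpoints, everything up to
    and including the marked message is cached and reused on repeat calls,
    cutting token cost and latency for the stable prefix.
    """
    marked = [dict(m) for m in messages]  # shallow copies; input untouched
    marked[-1]["cache_control"] = {"type": "ephemeral"}
    return marked

prompt = [{"role": "system", "content": "Long, stable instructions..."}]
cached = add_cache_breakpoint(prompt)
print(cached[-1]["cache_control"])  # {'type': 'ephemeral'}
```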

Architecture

tinyagent/
├── agent.py              # Agent class
├── agent_loop.py         # Core agent execution loop
├── agent_tool_execution.py  # Tool execution helpers
├── agent_types.py        # Type definitions
├── caching.py            # Prompt caching utilities
├── openrouter_provider.py   # OpenRouter integration
├── alchemy_provider.py   # Rust-based provider
└── proxy.py              # Proxy server integration

Rust Binding Architecture

Python (async)             Rust (Tokio)
─────────────────          ─────────────────────────
stream_alchemy_*()  ──>    alchemy_llm::stream()
                            │
AlchemyStreamResponse       ├─ SSE parse + deserialize
  .__anext__()       <──    ├─ event_to_py_value()
  (asyncio.to_thread)       └─ mpsc channel -> Python
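The bridge in the diagram — a background producer feeding events through a channel, consumed on the Python side via asyncio.to_thread() so the event loop never blocks — can be sketched generically in pure Python (this mimics the shape of the binding, not its actual code):

```python
import asyncio
import queue
import threading

def produce(ch: queue.Queue) -> None:
    # Stands in for the Rust/Tokio side pushing parsed SSE events into an
    # mpsc-style channel; None signals end of stream.
    for event in ("text", "thinking", None):
        ch.put(event)

async def consume() -> list:
    ch: queue.Queue = queue.Queue()
    threading.Thread(target=produce, args=(ch,), daemon=True).start()
    events = []
    # Each blocking ch.get() runs in a worker thread, mirroring how
    # the stream's __anext__ delegates to asyncio.to_thread so the
    # asyncio event loop stays responsive.
    while (event := await asyncio.to_thread(ch.get)) is not None:
        events.append(event)
    return events

events = asyncio.run(consume())
print(events)  # ['text', 'thinking']
```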

Installation & Usage

Installation

# Recommended (faster)
uv pip install tiny_agent_os

# Or using pip
pip install tiny_agent_os

Environment Configuration

export OPENAI_API_KEY=your_openrouter_key_here
export OPENAI_BASE_URL=https://openrouter.ai/api/v1

Quick Start

from tinyagent import ReactAgent, tool

@tool
def add_numbers(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

agent = ReactAgent(tools=[add_numbers])
result = agent.run("What is 5 plus 3?")
print(result)  # Output: 8

Multi-step Reasoning Example

from tinyagent import tool, ReactAgent

@tool
def multiply(a: float, b: float) -> float:
    """Multiply two numbers together."""
    return a * b

@tool
def divide(a: float, b: float) -> float:
    """Divide the first number by the second."""
    return a / b

agent = ReactAgent(tools=[multiply, divide])
result = agent.run("What is 12 times 5, then divided by 3?")
# → 20 (auto-calls multiply(12, 5) → 60, then divide(60, 3) → 20)

Event Subscription

def on_event(event):
    print(f"Event: {event.type}")

# Attach the callback to an existing agent; call unsubscribe() to detach it.
unsubscribe = agent.subscribe(on_event)

Core Classes & Configuration

  • Agent: Base agent class
  • ReactAgent: Agent implementing ReAct (Reasoning + Acting) pattern
  • AgentOptions: Agent configuration options
  • AgentTool: Custom tool class

Key Configuration Options

  • enable_prompt_caching: Enable prompt caching
  • model: Specify model (e.g., anthropic/claude-3.5-sonnet)
  • verbose=True: Enable debug logging
  • session_id: Session identifier

Current Status

  • Version: v1.2.5
  • Stage: Beta (API may change between minor versions)
  • Primary Languages: Python (85.1%), Rust (14.9%)
  • License: MIT License

Important Notes

  1. The project is in Beta and is not recommended for critical production environments
  2. Rust bindings currently support only openai-completions and minimax-completions
  3. Image chunk support is incomplete (text and thinking chunks are available)
  4. The website claims a Business Source License while the GitHub repository is MIT-licensed; the GitHub license takes precedence
