
SimpleLLMFunc

Added: Feb 25, 2026
Category: Agent & Tooling · Open Source
Tags: Python · Workflow Automation · Large Language Models · Multimodal · AI Agents · Agent Framework · SDK · Agent & Tooling · Model & Inference Framework · Developer Tools & Coding · Protocol, API & Integration

A lightweight yet complete framework for building LLM and Agent applications. Decorators turn function signatures and docstrings into prompts, giving you type-safe LLM-powered functions without writing a function body. Features multi-provider support, multimodal I/O, tool calling, streaming, API key load balancing, and Langfuse observability integration.

SimpleLLMFunc is a Python framework for LLM/Agent application development, built on the principles of "LLM as Function / Prompt as Code / Code as Doc". It is developed by Jingzhe Ni and released under the MIT license.

Key Features

Decorator-Driven Development

  • @llm_function — Transforms async functions into LLM-powered functions with automatic prompt building, API calls, and response parsing
  • @llm_chat — Builds conversational agents with streaming and tool calling support
  • @tool — Registers async functions as LLM-callable tools with multimodal returns
  • @tui — Ready-to-use Textual terminal UI

Type Safety & Async-Native

  • Python type hints + Pydantic models for type correctness
  • Fully async design with native asyncio support for high-concurrency scenarios
  • IDE-friendly with code completion and type checking
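The idea behind type-safe returns can be illustrated in plain Python: the function's return annotation tells the framework what shape to parse the model's raw reply into. The snippet below is a hypothetical sketch using a stdlib dataclass in place of a Pydantic model, not SimpleLLMFunc's actual parsing code:

```python
import json
from dataclasses import dataclass
from typing import get_type_hints

# Illustrative sketch only: a structured return type declared as the
# function's return annotation drives parsing of the model's JSON reply.
@dataclass
class Sentiment:
    label: str
    score: float

def parse_reply(func, raw: str):
    # Look up the declared return type and build an instance from JSON.
    return_type = get_type_hints(func)["return"]
    return return_type(**json.loads(raw))

def classify(text: str) -> Sentiment:
    """Stub: in the framework, the body would be empty and the docstring
    would serve as the prompt."""

reply = parse_reply(classify, '{"label": "positive", "score": 0.97}')
```

Because the parsed object carries real types, IDEs can complete `reply.label` and type checkers can flag misuse.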

Multimodal & Multi-Provider

  • Supports Text, ImgUrl, ImgPath multimodal input/output
  • Compatible with OpenAI, Deepseek, Anthropic Claude, Volcengine Ark, Baidu Qianfan, Ollama, vLLM, and more
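Providers are configured in a JSON file such as the `provider.json` used in the Quick Start below. The shape shown here is an assumption inferred from the `[provider][model]` indexing pattern, not the library's authoritative schema; consult the SimpleLLMFunc documentation for the exact fields:

```json
{
  "your_provider": {
    "model": {
      "api_keys": ["sk-..."],
      "base_url": "https://api.example.com/v1",
      "model_name": "your-model-id"
    }
  }
}
```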

Production-Ready Capabilities

  • API Key Load Balancing: Multi-key configuration with min-heap algorithm for lowest-load selection
  • Rate Limiting: Token Bucket algorithm to prevent API throttling
  • Structured Logging: Automatic trace_id tracking
  • Observability: Langfuse integration
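The min-heap key-selection strategy can be sketched generically: each heap entry pairs a key with its current in-flight load, so the least-loaded key is always at the top. The `KeyPool` class and its method names are hypothetical, not SimpleLLMFunc's API:

```python
import heapq

class KeyPool:
    """Sketch of min-heap load balancing across API keys (illustrative)."""

    def __init__(self, keys):
        # Every key starts with zero in-flight requests.
        self._heap = [(0, k) for k in keys]
        heapq.heapify(self._heap)

    def acquire(self):
        # Pop the lowest-load key, bump its load, push it back.
        load, key = heapq.heappop(self._heap)
        heapq.heappush(self._heap, (load + 1, key))
        return key

    def release(self, key):
        # O(n) rebuild is fine for a sketch; a real pool would index entries.
        self._heap = [(l - 1 if k == key else l, k) for l, k in self._heap]
        heapq.heapify(self._heap)

pool = KeyPool(["key-a", "key-b"])
first = pool.acquire()   # lowest-load key
second = pool.acquire()  # the other key, since the first now has load 1
```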
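The token-bucket rate limiter can likewise be illustrated in a few lines. This is a generic sketch of the algorithm (capacity tokens maximum, refilled at a fixed rate; a call proceeds only if a token remains), not the framework's implementation:

```python
import time

class TokenBucket:
    """Generic token-bucket sketch: allow() consumes one token per call."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # bucket never holds more than this
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

An async caller would sleep and retry (or await a condition) when `allow()` returns False instead of hammering the provider and tripping its throttling.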

Built-in Tools

  • PyRepl: Subprocess-based persistent Python REPL
  • SelfReference: Persistent Agent memory contract
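The subprocess-based persistent REPL idea behind PyRepl can be miniaturized as follows: a child Python process keeps a single namespace alive, so state persists across separate code submissions. Everything here (the sentinel protocol, the `MiniRepl` name) is a hypothetical sketch, not the framework's real tool:

```python
import subprocess
import sys

# Child program: read code until a sentinel line, exec it in one shared
# namespace, then emit a done marker. Naive protocol for illustration only.
CHILD = r'''
import sys
ns = {}
buf = []
for line in sys.stdin:
    if line.rstrip("\n") == "__END__":
        exec("".join(buf), ns)
        print("__DONE__", flush=True)
        buf = []
    else:
        buf.append(line)
'''

class MiniRepl:
    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-u", "-c", CHILD],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
        )

    def run(self, code: str) -> str:
        # Send a code block plus sentinel; collect output up to the marker.
        self.proc.stdin.write(code + "\n__END__\n")
        self.proc.stdin.flush()
        out = []
        for line in self.proc.stdout:
            if line.rstrip("\n") == "__DONE__":
                break
            out.append(line)
        return "".join(out)

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()
```

Because the child process outlives each call, a variable defined in one `run()` is still visible in the next.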

Typical Use Cases

  • Data Processing & Analysis: entity extraction, text classification, structured data extraction
  • Intelligent Assistants: agents that search, calculate, and call APIs
  • Batch Data Processing: high-concurrency calls with async + asyncio.gather
  • Multimodal Content: image analysis, image-text comparison
  • Rapid Prototyping: functional composition for quick business-logic validation
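The batch-processing pattern is plain asyncio: fire all calls concurrently with `asyncio.gather`, which preserves input order in its results. The stub coroutine below stands in for an `@llm_function`-decorated function so the pattern runs offline; it is not the framework's code:

```python
import asyncio

async def classify_sentiment(text: str) -> str:
    # Hypothetical stand-in for an LLM call: sleep simulates network latency.
    await asyncio.sleep(0.01)
    return "positive" if "great" in text else "neutral"

async def classify_batch(texts):
    # All calls start concurrently; results come back in input order.
    return await asyncio.gather(*(classify_sentiment(t) for t in texts))

results = asyncio.run(classify_batch(["great phone", "ok phone", "great deal"]))
```

With real LLM calls the wall-clock time approaches that of the single slowest request rather than the sum of all requests.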

Quick Start

pip install SimpleLLMFunc

import asyncio
from SimpleLLMFunc import llm_function, OpenAICompatible

llm = OpenAICompatible.load_from_json_file("provider.json")["your_provider"]["model"]

@llm_function(llm_interface=llm)
async def classify_sentiment(text: str) -> str:
    """
    Analyze the sentiment tendency of the text.
    Args:
        text: Text to analyze
    Returns:
        Sentiment classification: 'positive', 'negative', or 'neutral'
    """
    pass  # Prompt as Code: the docstring is the prompt, no body needed

print(asyncio.run(classify_sentiment("This product is amazing!")))
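The "Prompt as Code" mechanism above can be sketched in plain Python: a decorator reads the wrapped function's signature and docstring and assembles a prompt from them. This is a hypothetical illustration with a pluggable `llm_call` stub, not SimpleLLMFunc's actual internals:

```python
import inspect
from functools import wraps

def llm_function_sketch(llm_call):
    # Illustrative only: llm_call is any callable(prompt: str) -> str
    # standing in for a real model client.
    def decorator(func):
        sig = inspect.signature(func)
        doc = inspect.getdoc(func) or ""

        @wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            # The docstring becomes the task description; the signature
            # supplies the argument values and expected return type.
            prompt = (
                f"Task description:\n{doc}\n\n"
                f"Expected return type: {sig.return_annotation}\n"
                f"Arguments: {dict(bound.arguments)}"
            )
            return llm_call(prompt)  # real framework would parse/validate here
        return wrapper
    return decorator

# Toy model client so the sketch runs offline.
fake_model = lambda prompt: "positive" if "amazing" in prompt else "neutral"

@llm_function_sketch(fake_model)
def classify_sentiment(text: str) -> str:
    """Classify sentiment as 'positive', 'negative', or 'neutral'."""

result = classify_sentiment("This product is amazing!")
```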

Requirements

  • Python >=3.12, <4.0

Related Projects

  • SimpleManus: an Agent framework by the same author, built on SimpleLLMFunc
