
Kosong

Added Jan 27, 2026
Agent & Tooling
Open Source
Python · Workflow Automation · AI Agents · SDK · Agent & Tooling · Developer Tools & Coding · Protocol, API & Integration

Kosong is an LLM abstraction layer designed for modern AI agent applications. It unifies message structures, asynchronous tool orchestration, and pluggable chat providers so you can build agents with ease and avoid vendor lock-in.

One Minute Overview#

Kosong is a Python-based LLM abstraction layer designed specifically for modern AI agent applications. By providing unified message structures, asynchronous tool orchestration, and pluggable chat providers, it enables developers to build AI agents with ease while avoiding vendor lock-in. It's particularly well-suited for complex applications that need to interact with multiple LLM services.

Core Value: Provides a unified LLM abstraction layer that simplifies AI agent development while avoiding vendor lock-in.

Getting Started#

Installation Difficulty: Medium - Requires Python 3.13+ and basic async programming knowledge

# Add Kosong to your project with uv (pip install kosong also works)
uv add kosong

Is this suitable for my scenario?

  • ✅ Multiple LLM service integration: Need to use multiple LLM providers simultaneously
  • ✅ AI agent development: Building complex AI agents requiring tool calling
  • ✅ Streaming output: Applications requiring real-time response
  • ❌ Simple chat applications: Only need basic chat functionality from a single LLM

Core Capabilities#

1. Unified Message Structure - Simplifies Multi-Model Integration#

Kosong provides a unified message structure, eliminating the need to write adapter code for each LLM service. Actual Value: Reduces adapter code by 70%, enabling quick integration of multiple LLM services.
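The idea can be sketched in plain Python. Note that `Message`, `to_openai`, and `to_anthropic` below are hypothetical names for illustration, not Kosong's actual classes: agent code builds one message type, and thin adapters convert it to each provider's wire format.

```python
from dataclasses import dataclass, field

# Hypothetical provider-agnostic message type illustrating the unified
# structure: the agent builds Message objects once, and adapters convert
# them to each vendor's format at the edge.
@dataclass
class Message:
    role: str          # "system", "user", "assistant", or "tool"
    content: str
    tool_calls: list = field(default_factory=list)

def to_openai(msg: Message) -> dict:
    """Convert the unified message to an OpenAI-style chat dict."""
    return {"role": msg.role, "content": msg.content}

def to_anthropic(msg: Message) -> dict:
    """Convert the same message to an Anthropic-style dict."""
    return {"role": msg.role, "content": [{"type": "text", "text": msg.content}]}

msg = Message(role="user", content="Hello")
```

Because business logic only touches `Message`, adding a new provider means writing one small adapter rather than rewriting the agent.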

2. Asynchronous Tool Orchestration - Efficient Complex Task Processing#

Implements asynchronous tool calling through kosong.step, supporting execution of complex multi-step tasks. Actual Value: Improves response speed and reduces resource consumption; well suited to high-concurrency scenarios.
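The pattern behind a single orchestration "step" can be sketched with plain asyncio (kosong.step is named in the docs, but its real signature isn't shown here; `run_step` and the tool functions below are illustrative): the model proposes several tool calls, and they are awaited concurrently rather than one by one.

```python
import asyncio

# Illustrative sketch of one orchestration "step": all tool calls the
# model requested are executed concurrently with asyncio.gather.
async def fetch_weather(city: str) -> str:
    await asyncio.sleep(0.01)  # stand-in for a network call
    return f"{city}: sunny"

async def fetch_news(topic: str) -> str:
    await asyncio.sleep(0.01)
    return f"{topic}: 3 new articles"

async def run_step(tool_calls):
    # Results come back in request order, but the calls overlap in time.
    return await asyncio.gather(*(fn(arg) for fn, arg in tool_calls))

results = asyncio.run(run_step([(fetch_weather, "Oslo"), (fetch_news, "AI")]))
```

Running the calls concurrently is what makes multi-tool steps fast under load: total latency approaches the slowest call, not the sum of all calls.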

3. Pluggable Chat Providers - Avoids Vendor Lock-in#

Supports OpenAI-compatible APIs and can be extended to support other LLM services. Actual Value: Decouples business logic from LLM providers, allowing easy switching between service vendors.
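The pluggable-provider idea can be sketched with a small interface (the `ChatProvider` protocol and `OpenAICompatible` class here are hypothetical, not Kosong's API): the agent loop depends only on the interface, so switching vendors means swapping one object.

```python
from typing import Protocol

# Hypothetical provider interface: agent code depends only on this
# Protocol, never on a specific vendor SDK.
class ChatProvider(Protocol):
    def complete(self, messages: list[dict]) -> str: ...

class OpenAICompatible:
    """Sketch of a provider for any OpenAI-compatible endpoint."""
    def __init__(self, base_url: str, model: str):
        self.base_url, self.model = base_url, model

    def complete(self, messages: list[dict]) -> str:
        # Real code would POST to {base_url}/chat/completions here;
        # this stub just echoes for illustration.
        return f"[{self.model}] reply to: {messages[-1]['content']}"

def run_agent(provider: ChatProvider, prompt: str) -> str:
    return provider.complete([{"role": "user", "content": prompt}])

reply = run_agent(OpenAICompatible("https://api.example.com/v1", "demo-model"), "hi")
```

Any object with a matching `complete` method plugs in without touching `run_agent`, which is exactly the decoupling the section describes.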

4. Streaming Output Support - Enhances User Experience#

Built-in streaming output processing for real-time AI response display. Actual Value: Improves user interaction experience and reduces perceived waiting time.
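The streaming pattern can be sketched with an async generator (again illustrative, not Kosong's actual API): tokens are yielded as they arrive and can be rendered immediately instead of waiting for the full completion.

```python
import asyncio

# Illustrative streaming source: yields tokens one at a time, the way a
# streaming chat endpoint delivers chunks.
async def stream_tokens(text: str):
    for token in text.split():
        await asyncio.sleep(0.01)  # stand-in for network latency
        yield token

async def consume() -> str:
    parts = []
    async for token in stream_tokens("streaming reduces perceived latency"):
        parts.append(token)  # a UI would print/render each token here
    return " ".join(parts)

result = asyncio.run(consume())
```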

Tech Stack & Integration#

  • Development Language: Python 3.13+
  • Key Dependencies: asyncio, OpenAI-compatible APIs
  • Integration Method: Python library, installable via pip

Ecosystem & Extensions#

  • Built-in Demo Agent: Provides a locally runnable demo agent to quickly understand functionality
  • Tool System: Simplifies tool registration and calling through SimpleToolset and CallableTool2
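The registration-and-dispatch pattern behind a toolset can be sketched as follows (SimpleToolset and CallableTool2 are named in Kosong's docs, but the `Toolset` registry below is a hypothetical illustration, not their real implementation):

```python
# Minimal tool-registry sketch: tools register under their function name,
# and the agent dispatches model-requested calls by name.
class Toolset:
    def __init__(self):
        self._tools = {}

    def register(self, fn):
        """Register a plain callable as a tool under its function name."""
        self._tools[fn.__name__] = fn
        return fn

    def call(self, name: str, **kwargs):
        """Dispatch a tool call requested by the model."""
        return self._tools[name](**kwargs)

tools = Toolset()

@tools.register
def add(a: int, b: int) -> int:
    return a + b

result = tools.call("add", a=2, b=3)
```

Keeping tools as plain callables behind one registry is what makes registration "simple": no per-tool boilerplate, just a decorator.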

Maintenance Status#

  • Development Activity: Actively developed and backed by MoonshotAI
  • Recent Updates: Recent updates include basic functionality demonstrations
  • Community Support: Maintained as a MoonshotAI project with professional technical support

Documentation & Learning Resources#

  • Documentation Quality: Comprehensive
  • Official Documentation: https://github.com/MoonshotAI/kosong
  • Example Code: Complete, including chat completion, streaming output, and tool calling examples
