A Python framework implementing the Model Context Protocol (MCP), providing a client-server architecture for LLM tool orchestration, integrated with LangChain and ChromaDB for RAG development.
## Overview
MCP Agent Orchestrator is a professional-grade Python framework designed to standardize the interaction between Large Language Models and external tools or knowledge bases. It fully implements the Model Context Protocol (MCP) using a decoupled client-server architecture communicating via StdIO.
## Core Capabilities

### 1. MCP Protocol Implementation
- Complete implementation of Model Context Protocol specification
- Decoupled client-server design with StdIO-based inter-process communication
- Automatic tool schema conversion supporting OpenAI-compatible function calling
### 2. Intelligent Client Bridging (client.py, rag_agent.py)
- Async lifecycle management using AsyncExitStack
- Persistent session state and multi-turn reasoning loops
- Automatic conversion from `list_tools` results to the OpenAI `tools` schema
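The `list_tools`-to-schema conversion can be sketched as follows. This is a simplified illustration using plain dicts; the actual client.py works with MCP SDK tool objects, and the helper name here is hypothetical:

```python
# Sketch: map MCP tool definitions (name, description, inputSchema)
# to the OpenAI function-calling `tools` schema.

def mcp_tools_to_openai(mcp_tools):
    """Convert MCP tool descriptions into OpenAI `tools` entries."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP's inputSchema is already JSON Schema, which is what
                # OpenAI expects under `parameters`.
                "parameters": t.get("inputSchema",
                                    {"type": "object", "properties": {}}),
            },
        }
        for t in mcp_tools
    ]

tools = mcp_tools_to_openai([
    {"name": "get_weather",
     "description": "Query current weather",
     "inputSchema": {"type": "object",
                     "properties": {"city": {"type": "string"}},
                     "required": ["city"]}},
])
print(tools[0]["function"]["name"])
```

The resulting list can be passed directly as the `tools` argument of an OpenAI-compatible chat-completions call.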
### 3. RAG Knowledge Server (rag_server.py)
- Data Ingestion: Supports PDF and TXT formats (using LangChain)
- Vector Database: Persistent storage via ChromaDB
- Search Optimization: MMR (Maximal Marginal Relevance) algorithm for diverse retrieval
- Embedding Models: Integrated HuggingFace transformer models
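MMR trades off relevance against redundancy when picking retrieved chunks. The project uses LangChain's built-in MMR search; the selection rule itself can be sketched in pure Python (a toy illustration, not the project's code):

```python
# Sketch of Maximal Marginal Relevance (MMR) selection:
#   score(d) = lam * sim(query, d) - (1 - lam) * max sim(d, already_selected)
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def mmr(query_vec, doc_vecs, k=2, lam=0.3):
    """Return indices of k docs balancing relevance and diversity."""
    selected, candidates = [], list(range(len(doc_vecs)))
    while candidates and len(selected) < k:
        def score(i):
            rel = cosine(query_vec, doc_vecs[i])
            red = max((cosine(doc_vecs[i], doc_vecs[j]) for j in selected),
                      default=0.0)
            return lam * rel - (1 - lam) * red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Two near-duplicate relevant docs and one distinct doc: after picking the
# most relevant doc, MMR prefers the distinct one over the near-duplicate.
docs = [[1.0, 0.0], [0.99, 0.01], [0.0, 1.0]]
print(mmr([1.0, 0.0], docs, k=2, lam=0.3))  # → [0, 2]
```

With plain similarity search the second pick would be the near-duplicate (index 1); MMR's redundancy penalty is what surfaces index 2 instead.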
### 4. Weather Service Server (server.py)
- Integration with external REST API (WeatherAPI)
- Data normalization and formatting for LLM consumption
- Async request handling using httpx
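The normalization step can be sketched as below. Field names follow WeatherAPI's documented response shape (`location`, `current`, `condition`); the formatting in the project's server.py may differ:

```python
# Sketch: normalize a WeatherAPI-style JSON payload into a compact
# single-line string suitable for LLM consumption.

def format_weather(payload: dict) -> str:
    loc = payload["location"]
    cur = payload["current"]
    return (
        f"{loc['name']}, {loc['country']}: "
        f"{cur['condition']['text']}, "
        f"{cur['temp_c']}°C, humidity {cur['humidity']}%"
    )

sample = {
    "location": {"name": "Beijing", "country": "China"},
    "current": {"temp_c": 21.0, "humidity": 40,
                "condition": {"text": "Sunny"}},
}
print(format_weather(sample))  # → Beijing, China: Sunny, 21.0°C, humidity 40%
```

In the server, this formatter would be applied to the JSON body returned by an async `httpx.AsyncClient.get(...)` call before handing the result back over MCP.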
## Implementation Variants
The project offers three implementation paths:
- Standard Edition: mcp_rag_agent/mcp_agent
- GraphRAG Edition: mcp_rag_agent_graphrag
- LangChain Edition: mcp_rag_langchain
## Typical Use Cases
- Intelligent Conversational Systems: Building AI assistants with tool-calling capabilities
- Knowledge Base Q&A: Intelligent Q&A systems based on private documents
- Real-time Data Integration: LLM integration with real-time external APIs (e.g., weather queries)
- Multi-modal Information Retrieval: Hybrid retrieval systems combining vector search and LLM reasoning
## System Architecture
```
├── client.py       # Standard MCP client implementation
├── rag_agent.py    # RAG operation-specific agent
├── server.py       # Weather service MCP server
├── rag_server.py   # RAG knowledge base MCP server
├── test.py         # LLM API connection test
└── data/
    ├── rag_db/     # Vector storage persistence directory
    └── text.txt    # Sample knowledge base source file
```
## Installation & Quick Start

### Requirements
- Python 3.10+
- Virtual environment recommended
### Dependency Installation
```bash
pip install mcp langchain langchain-community langchain-openai chromadb httpx python-dotenv openai
```
### Environment Configuration
Create a .env file:
```
API_KEY=your_llm_api_key
BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
MODEL=qwen-plus
EMBED_MODEL=sentence-transformers/all-MiniLM-L6-v2
```
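At startup the agents read these variables via python-dotenv (listed in the dependencies above). For illustration, here is a simplified sketch of what `load_dotenv()` does with the file contents; the real library also handles quoting, comments, and `export` prefixes, and writes into `os.environ`:

```python
# Sketch: parse KEY=VALUE lines from a .env file into a config mapping.

def parse_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

ENV_TEXT = """\
API_KEY=your_llm_api_key
BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
MODEL=qwen-plus
EMBED_MODEL=sentence-transformers/all-MiniLM-L6-v2
"""

config = parse_env(ENV_TEXT)
print(config["MODEL"])  # → qwen-plus
```

`API_KEY`, `BASE_URL`, and `MODEL` are then passed to an OpenAI-compatible client, while `EMBED_MODEL` selects the HuggingFace embedding model used by the RAG server.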
### Quick Start

```bash
# Run weather agent
python client.py server.py

# Run RAG agent
python rag_agent.py --server_script rag_server.py
```
## Key Features
- Standardized Protocol: Follows MCP specification ensuring tool-model interoperability
- Modular Design: Decoupled client-server architecture, easy to extend
- Production Ready: Complete lifecycle management and error handling
- Flexible Deployment: Supports multiple LLM backends (Qwen/DashScope and other OpenAI-compatible interfaces)
## Developer Information
- Developer: Haohao-end
- Primary Language: Python (100%)
- Commit History: 57 commits, actively maintained