
SalmAlm

Added Feb 25, 2026
Category: Agent & Tooling | Open Source
Tags: Python · Workflow Automation · Large Language Models · LangChain · FastAPI · Model Context Protocol · RAG · AI Agents · LiteLLM · Web Application · Agent & Tooling · Model & Inference Framework · Developer Tools & Coding · Automation, Workflow & RPA · Protocol, API & Integration

A self-hosted personal AI gateway integrating 6 LLM providers, 3-tier auto-routing, and a persistent memory system, enabling personal life automation and dev assistance via 62 built-in tools.

SalmAlm (삶앎) is an open-source Personal AI Gateway project designed to consolidate fragmented AI capabilities into a local environment with the philosophy of "Your Entire AI Life in One pip install".

Core Capabilities

AI Engine: Supports 6 AI providers (Anthropic, OpenAI, Google, xAI, DeepSeek, local LLMs) with 3-tier auto-routing (simple→Haiku, moderate→Sonnet, complex→Opus/GPT-5) for cost optimization; the project claims 83% cost savings. Also features 4-level extended-thinking depth control and cross-provider failover.
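The tiered routing above can be sketched as a small classifier-plus-lookup. This is a hypothetical illustration: the tier heuristics and model names are placeholders, not SalmAlm's actual routing logic.

```python
# Illustrative 3-tier complexity routing; the heuristics and model
# identifiers below are assumptions, not SalmAlm's real implementation.
TIERS = {
    "simple": "claude-haiku",
    "moderate": "claude-sonnet",
    "complex": "claude-opus",
}

def classify(prompt: str) -> str:
    """Toy complexity heuristic: keyword check first, then length."""
    if any(k in prompt.lower() for k in ("design", "prove", "refactor", "plan")):
        return "complex"
    if len(prompt) < 80 and "?" not in prompt:
        return "simple"
    return "moderate"

def route(prompt: str) -> str:
    """Return the model the prompt should be sent to."""
    return TIERS[classify(prompt)]
```

A real router would likely weigh token count, tool requirements, and conversation history rather than surface keywords, but the shape (classify, then map tier to model) is the same.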

Memory System: Uses a 2-layer file architecture (MEMORY.md for long-term memory plus memory/YYYY-MM-DD.md daily logs) combined with TF-IDF-based RAG for automatic context recall and curation. Supports full-file cosine-similarity search with Korean jamo decomposition.
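The recall mechanism boils down to scoring memory files against a query with TF-IDF cosine similarity. A minimal pure-Python sketch of that scoring (the jamo-decomposition preprocessing step is omitted, and this is not the project's actual code):

```python
import math
from collections import Counter

def tfidf_vectors(docs: list[str]) -> list[dict[str, float]]:
    """Build sparse TF-IDF vectors for whitespace-tokenized documents."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # Document frequency: in how many docs each term appears
    df = Counter(t for toks in tokenized for t in set(toks))
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({
            t: (count / len(toks)) * math.log(n / df[t])
            for t, count in tf.items()
        })
    return vectors

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

In practice the daily logs would be the documents and the current conversation turn the query; the highest-scoring files get injected into context.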

Tool Ecosystem: 62 built-in tools covering shell execution, file I/O, web search, Python eval, image generation, TTS/STT, browser automation, RAG search, cron jobs, system monitoring, and more. Supports dynamic loading (0-12 additional tools on demand) and risk-tiered management, plus a built-in MCP marketplace for installing and managing Model Context Protocol tool servers.
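Risk-tiered tool management pairs naturally with the secure-by-default posture described later: dangerous tools exist in the registry but are not exposed unless opted in. A hypothetical sketch (the tier names and gating policy are assumptions, not SalmAlm's actual design):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    run: Callable[..., str]
    risk: str = "safe"  # illustrative tiers: "safe" | "caution" | "dangerous"

class ToolRegistry:
    """Registry that hides dangerous tools unless explicitly enabled."""

    def __init__(self, allow_dangerous: bool = False):
        self.allow_dangerous = allow_dangerous  # off by default
        self._tools: dict[str, Tool] = {}

    def register(self, tool: Tool) -> None:
        self._tools[tool.name] = tool

    def available(self) -> list[str]:
        """List tool names the model is allowed to call right now."""
        return [t.name for t in self._tools.values()
                if t.risk != "dangerous" or self.allow_dangerous]
```

Dynamic loading then amounts to registering or exposing extra tools only when the current task calls for them, keeping the default tool surface small.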

Unique Features

  • Self-Evolving Prompts: AI automatically learns personality rules from conversations (FIFO mechanism, max 20 entries)
  • Shadow Mode: Learns user communication style and replies on their behalf when absent
  • Dead Man's Switch: Automatically executes preset actions (email/commands) after N days of inactivity
  • Life Dashboard: Unified view for managing expenses, habits, calendar, mood, and routines
  • A/B Split Response: Dual-model parallel comparison for the same question
  • Time Capsule: Schedule encrypted messages to your future self
  • Inter-Agent Communication: HMAC-SHA256 signed communication between SalmAlm instances
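The self-evolving prompt mechanism above (FIFO, max 20 entries) maps directly onto a bounded queue: new learned rules push out the oldest. A minimal sketch with an assumed API (the class and method names are illustrative):

```python
from collections import deque

class PersonalityRules:
    """FIFO store of learned personality rules, capped at 20 entries.

    Mirrors the self-evolving prompt mechanism described above;
    the interface here is a hypothetical illustration.
    """

    def __init__(self, max_rules: int = 20):
        self._rules: deque[str] = deque(maxlen=max_rules)

    def learn(self, rule: str) -> None:
        """Append a rule; at capacity, the oldest rule is evicted."""
        self._rules.append(rule)

    def as_prompt(self) -> str:
        """Render the current rules as a system-prompt fragment."""
        return "\n".join(f"- {r}" for r in self._rules)
```

`deque(maxlen=20)` gives the FIFO eviction for free: the 21st `learn()` silently drops the first rule.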
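HMAC-SHA256 message signing between instances, as in the inter-agent communication feature, can be shown with the standard library alone. The envelope shape is an assumption; the signing primitive is the standard `hmac` pattern:

```python
import hashlib
import hmac
import json

def sign_message(payload: dict, shared_key: bytes) -> dict:
    """Attach an HMAC-SHA256 signature over the canonical JSON payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(shared_key, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_message(envelope: dict, shared_key: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    body = json.dumps(envelope["payload"], sort_keys=True).encode()
    expected = hmac.new(shared_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["sig"])
```

`sort_keys=True` canonicalizes the JSON so both instances hash identical bytes, and `hmac.compare_digest` avoids leaking the signature through timing differences.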

Security Mechanisms

  • Encrypted Vault: PBKDF2-200K key derivation + AES-256-GCM encryption
  • Secure by Default: Dangerous features (Shell/eval) disabled by default
  • Built-in SSRF/CSRF protection, CSP content security policy, audit logging
  • 150+ security test coverage

Deployment & Access

  • Installation: pipx install salmalm (or via a venv)
  • Startup: salmalm --open, with default access at http://localhost:18800
  • Access channels: Web UI (SSE streaming), Telegram (polling + webhook), Discord (bot)
  • Local LLM: supports Ollama, LM Studio, vLLM

Project Metrics

  • Python files: 192 | Lines of code: ~52,760 | Tests: 1,908 passed
  • Docstring coverage: 99% | Return type hints: 81%
  • Commits: 801+ | Releases: 39
  • License: MIT
