Reflective memory for AI agents — semantic storage with multi-format ingestion, intent tracking, and cross-session context persistence.
# keep (keep-skill)

## Tagline
Reflective memory for AI agents — a semantic storage and retrieval system.
## Core Capabilities

### Semantic Storage & Retrieval
- Vector Search: Retrieve by meaning, not keywords, powered by vector embeddings
- Multi-format Support: URL, text files, PDF, HTML, Office docs, audio, images
- Auto Summary & Tags: Generate summaries, embeddings, and extract tags on ingestion
- Content-addressed IDs: Identical text generates the same ID (`%` prefix), enabling deduplication
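Content addressing can be pictured as hashing the normalized text, so two copies of the same note collapse to one ID. A minimal sketch (the hash choice, normalization, and `%`-prefixed short form here are illustrative assumptions, not keep's actual scheme):

```python
import hashlib

def content_id(text: str) -> str:
    # Normalize whitespace so trivially different copies collapse to one ID
    normalized = " ".join(text.split())
    digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
    # Hypothetical "%"-prefixed short form; keep's real ID format may differ
    return "%" + digest[:12]

# Identical content yields the same ID, enabling deduplication on ingestion
assert content_id("Rate limit is 100 req/min") == content_id("Rate  limit is 100 req/min")
assert content_id("Rate limit is 100 req/min") != content_id("Rate limit is 200 req/min")
```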
### Reflective Memory Mechanism
- Contextual Feedback: Auto-associate open commitments, past experiences, and lessons learned on retrieval
- Intent Tracking: The `keep now` command records current work with versioned persistence
- Version History: Complete version chain for each note, traceable via `@V{1}` syntax
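The versioned persistence behind `keep now` can be pictured as an append-only chain where `@V{1}` addresses one step back from the latest. A minimal sketch (the class and method names here are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field

@dataclass
class VersionedNote:
    # Each update appends a new version instead of overwriting the old one
    versions: list = field(default_factory=list)

    def update(self, text: str) -> None:
        self.versions.append(text)

    def at(self, offset: int = 0) -> str:
        # offset=0 is the latest version, offset=1 the previous one (cf. @V{1})
        return self.versions[-1 - offset]

now = VersionedNote()
now.update("Debugging auth flow")
now.update("Fixed token refresh; writing tests")
assert now.at(0) == "Fixed token refresh; writing tests"
assert now.at(1) == "Debugging auth flow"
```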
### Tags & Organization
- Structured Tags: Support dimensions like `project`, `topic`, `type`, `status`
- Precise Filtering + Semantic Discovery: Tags for exact filtering, search for fuzzy discovery
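The two retrieval modes compose naturally: filter exactly by tags first, then rank the survivors semantically. A toy sketch of that pipeline (word overlap stands in for real embedding similarity, and the data layout is invented for illustration):

```python
notes = [
    {"text": "Rate limit is 100 req/min", "tags": {"topic": "api", "project": "myapp"}},
    {"text": "Deploy runs nightly", "tags": {"topic": "ops", "project": "myapp"}},
]

def find(query: str, tags: dict) -> list:
    # Step 1: exact tag filtering narrows the candidate pool
    pool = [n for n in notes if all(n["tags"].get(k) == v for k, v in tags.items())]
    # Step 2: fuzzy ranking (word overlap stands in for embedding similarity)
    qwords = set(query.lower().split())
    return sorted(pool, key=lambda n: -len(qwords & set(n["text"].lower().split())))

hits = find("what is the rate limit", {"project": "myapp"})
assert hits[0]["text"].startswith("Rate limit")
```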
### Document Analysis
- Decomposition Analysis: `analyze` splits documents into independently retrievable segments
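Decomposition can be pictured as splitting a document at heading boundaries so each segment can be embedded and retrieved on its own. A simplified sketch (the real `analyze` command may choose boundaries more intelligently):

```python
import re

def decompose(markdown: str) -> list:
    # Split before each heading; every segment becomes independently retrievable
    parts = re.split(r"(?m)^(?=#+ )", markdown)
    return [p.strip() for p in parts if p.strip()]

doc = "# Intro\nOverview text.\n# API\nRate limit is 100 req/min.\n"
segments = decompose(doc)
assert len(segments) == 2
assert segments[1].startswith("# API")
```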
## Architecture

### Storage Layer
- Vector Store: ChromaDB (embedding storage & similarity search)
- Metadata & Versions: SQLite (document metadata, tags, version history)
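The division of labor (embeddings in ChromaDB, everything else in SQLite) could look roughly like this on the SQLite side. An illustrative schema only; keep's actual tables are internal and may differ:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE documents (id TEXT, version INTEGER, summary TEXT,
                        PRIMARY KEY (id, version));
CREATE TABLE tags (doc_id TEXT, key TEXT, value TEXT);
""")
# Two versions of one note; the vector store would hold the embeddings separately
conn.execute("INSERT INTO documents VALUES ('doc:1', 1, 'Rate limit is 100 req/min')")
conn.execute("INSERT INTO documents VALUES ('doc:1', 2, 'Rate limit raised to 200 req/min')")
conn.execute("INSERT INTO tags VALUES ('doc:1', 'topic', 'api')")

# Metadata queries (latest version, tag lookups) never touch the vector store
latest = conn.execute(
    "SELECT summary FROM documents WHERE id='doc:1' ORDER BY version DESC LIMIT 1"
).fetchone()[0]
assert "200" in latest
```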
### Embedding & Summary Backends
- Local Models: MLX (macOS Apple Silicon), Ollama (auto-detected)
- Cloud APIs: OpenAI, Google Gemini, Voyage (embeddings), Anthropic (summaries)
- Hosted Service: keepnotes.ai
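Backend auto-detection can be thought of as a preference chain: prefer local model servers, then fall back to whichever cloud API key is configured. A hedged sketch (the ordering and detection logic here are invented for illustration, not keep's documented behavior):

```python
import os

def pick_embedding_backend(available: set) -> str:
    # Hypothetical preference order: local backends first, then cloud by key presence
    if "mlx" in available:
        return "mlx"        # macOS Apple Silicon
    if "ollama" in available:
        return "ollama"     # local server
    for env_var, name in [("OPENAI_API_KEY", "openai"),
                          ("GEMINI_API_KEY", "gemini"),
                          ("VOYAGE_API_KEY", "voyage")]:
        if os.environ.get(env_var):
            return name
    raise RuntimeError("no embedding backend available")

# Local backends win even when cloud keys are configured
assert pick_embedding_backend({"mlx", "ollama"}) == "mlx"
```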
### Integration Layer
- Skill Prompt: Embed reflection guidance in system prompts
- Hooks: Inject `keep now` context at session start, prompt submission, and session end
- MCP: stdio server with 9 tools, MCP-protocol compatible
- LangChain: LangGraph BaseStore, retriever, tools, middleware
## Installation & Quick Start

```bash
# Recommended: using uv
uv tool install keep-skill

# Or using pip
pip install keep-skill

# Configure an API key (not needed with local Ollama)
export OPENAI_API_KEY=...

# Basic usage
keep put "Rate limit is 100 req/min" -t topic=api
keep find "what's the rate limit?"
keep now "Debugging auth flow"
keep list --tag project=myapp
```
## Python API

```python
from keep import Keeper

kp = Keeper()

# Store
kp.put(uri="file:///path/to/doc.md", tags={"project": "myapp"})
kp.put("Rate limit is 100 req/min", tags={"topic": "api"})

# Semantic search
results = kp.find("rate limit", limit=5)
for r in results:
    print(f"[{r.score:.2f}] {r.summary}")

# Version history
prev = kp.get_version("doc:1", offset=1)
```
## AI Tool Integration
Auto-detected and installed on first run:
- Claude Code: CLAUDE.md reflection prompts + settings.json hooks
- VS Code Copilot: Auto-reads Claude Code hooks
- Kiro: Practice prompts + agent hooks
- OpenAI Codex: AGENTS.md prompts
- OpenClaw: Prompts + plugin + daily cron job for deep reflection
Set `KEEP_NO_SETUP=1` to skip auto-installation.
## Use Cases
- AI Agent Memory: Persistent semantic memory for AI coding assistants
- Personal Knowledge Management: Store notes, docs, URLs with semantic retrieval
- Team Context Continuity: Track work progress with `keep now`; auto-restore across sessions
- Document Knowledge Extraction: Decompose long documents into independently retrievable parts
- Reflective Practice: Guide AI agents through structured reflection with `SKILL.md`