A local-first AI agent toolkit for Node.js and the browser. It provides composable Agent, Task, and semantic memory modules, works with local inference backends such as Ollama and llama.cpp, and emphasizes explicit control and declarative definitions.
## Key Features
Agent Smith is an agent development framework built around two principles: local-first operation and explicit control. Unlike heavily abstracted frameworks, it gives developers direct control over inference parameters and tool definitions.
### Core Capabilities
- Think: Connect to language model servers for inference queries
- Work: Manage long-running workflows with multiple tasks using tools
- Remember: Integrate semantic memory storage for data persistence and retrieval
- Interact: Support interactive dialogue with users
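The "Remember" capability maps to the `@agent-smith/smem` package, whose API is not shown here. As a conceptual sketch only (toy vectors and helper names are illustrative, not the library's API), semantic retrieval amounts to ranking stored snippets by the similarity of their embedding vectors:

```typescript
// Conceptual sketch of semantic recall: rank stored snippets by cosine
// similarity to a query embedding. The vectors here are hand-written toys;
// a real setup would obtain them from an embedding model.
type MemoryEntry = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function recall(store: MemoryEntry[], query: number[], topK = 2): string[] {
  return [...store]
    .sort((x, y) => cosine(y.embedding, query) - cosine(x.embedding, query))
    .slice(0, topK)
    .map((e) => e.text);
}

const store: MemoryEntry[] = [
  { text: "The user prefers dark mode", embedding: [0.9, 0.1, 0.0] },
  { text: "Server runs on port 8080", embedding: [0.0, 0.2, 0.9] },
  { text: "UI theme was changed last week", embedding: [0.8, 0.3, 0.1] },
];

// Query vector pointing in the "UI / theme" direction.
console.log(recall(store, [1, 0, 0]));
// -> [ 'The user prefers dark mode', 'UI theme was changed last week' ]
```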
### Design Philosophy
- Composable: each package has a narrow responsibility and can be used independently or combined with the others
- Declarative: express business logic directly rather than writing low-level control flow
- Explicit: no hidden magic; behavior stays under the user's control
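To make "composable" and "declarative" concrete, here is a minimal self-contained sketch, not the `@agent-smith/task` API: tasks are declared as plain objects, and a generic combinator owns the control flow.

```typescript
// Illustrative only: this mirrors the declarative style described above
// but is NOT the @agent-smith/task API. A task is data (a name plus a
// run function); composition is handled by a generic helper.
type Task<I, O> = { name: string; run: (input: I) => O };

// Compose two tasks into a pipeline; the control flow lives here, once.
function pipe<A, B, C>(first: Task<A, B>, second: Task<B, C>): Task<A, C> {
  return {
    name: `${first.name} -> ${second.name}`,
    run: (input) => second.run(first.run(input)),
  };
}

const extract: Task<string, string[]> = {
  name: "extract-words",
  run: (text) => text.split(/\s+/),
};

const count: Task<string[], number> = {
  name: "count-words",
  run: (words) => words.length,
};

const pipeline = pipe(extract, count);
console.log(pipeline.name);                          // "extract-words -> count-words"
console.log(pipeline.run("explicit over implicit")); // 3
```

Because each task is an independent value, the same definition can be reused alone, recomposed, or unit-tested without the runner.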
## Core Package Structure
| Package | Description | Node.js | Browser |
|---|---|---|---|
| @agent-smith/cli | Terminal client | ✅ | ❌ |
| @agent-smith/agent | Core Agent implementation | ✅ | ✅ |
| @agent-smith/task | Task definition and execution | ✅ | ✅ |
| @agent-smith/smem | Semantic memory | ✅ | ❌ |
| @agent-smith/tfm | Prompt templates | ✅ | ✅ |
| @agent-smith/apicli | API client | ✅ | ✅ |
| @agent-smith/wscli | WebSocket client | ✅ | ✅ |
## Use Cases
- CLI Tool Development: Built-in plugins for Git, Vision, etc., suitable for building developer productivity tools
- Local AI Applications: Building AI assistants with memory capabilities in fully offline or privacy-sensitive scenarios
- Full-stack Agents: Unified front-end and back-end Agent definitions, supporting direct browser calls or Node backend execution
## CLI Tool Capabilities
- Simple inference query: `lm q "your question"`
- Vision/multimodal: `lm vision img1.jpg img2.jpg "Compare images"`
- Git assistant: `lm commit` (auto-generates a commit message)
- Plugin system: supports Inference, Vision, Code, Git, and Filesystem plugins
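The plugin list above suggests a command-dispatch pattern. As an illustration only (the real `@agent-smith/cli` plugin interface may differ; all names below are hypothetical), a plugin can be modeled as a set of named command handlers in a registry:

```typescript
// Hypothetical sketch of a CLI plugin registry; NOT the actual
// @agent-smith/cli interface. Each plugin registers named commands
// that the terminal client dispatches by name.
type Command = (args: string[]) => string;

class PluginRegistry {
  private commands = new Map<string, Command>();

  register(name: string, cmd: Command): void {
    this.commands.set(name, cmd);
  }

  dispatch(name: string, args: string[]): string {
    const cmd = this.commands.get(name);
    if (!cmd) throw new Error(`unknown command: ${name}`);
    return cmd(args);
  }
}

const registry = new PluginRegistry();
// A toy "q" command standing in for an inference plugin.
registry.register("q", (args) => `query received: ${args.join(" ")}`);

console.log(registry.dispatch("q", ["your", "question"]));
// -> "query received: your question"
```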
## Code Example
```ts
import { Agent } from "@agent-smith/agent";
import { Lm } from "@locallm/api";

const lm = new Lm({
  providerType: "openai",
  serverUrl: "http://localhost:8080/v1",
  apiKey: "",
  onToken: (t) => process.stdout.write(t),
});

const agent = new Agent(lm);
await agent.run("List the planets",
  { temperature: 0.5, max_tokens: 4096, model: { name: "qwen4b" } },
  { verbose: true, system: "You are a helpful assistant" }
);
```
## Dependency Ecosystem
- Locallm: For unified local model API calls
- Modprompt: Prompt template management
- Echo: Go-based server backend