
Agent Smith

Added: Feb 26, 2026
Category: Agent & Tooling
License: Open Source

Tags: TypeScript · Node.js · Workflow Automation · LLM · AI Agents · Agent Framework · SDK · Agent & Tooling · Model & Inference Framework · Developer Tools & Coding · Protocol, API & Integration

A local-first AI agent toolkit for Node.js and the browser. It provides composable Agent, Task, and semantic memory modules, works with local inference backends such as Ollama and Llama.cpp, and emphasizes explicit control and declarative definitions.

Key Features#

Agent Smith is an agent development framework built around two principles: local-first operation and explicit control. Unlike heavily abstracted frameworks, it gives developers direct control over inference parameters and tool definitions.

Core Capabilities#

  • Think: Connect to language model servers for inference queries
  • Work: Manage long-running workflows with multiple tasks using tools
  • Remember: Integrate semantic memory storage for data persistence and retrieval
  • Interact: Support interactive dialogue with users
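The "Remember" capability can be pictured with a minimal, self-contained sketch: an in-memory store that ranks entries by cosine similarity to a query. Everything here (the `embed` stub, the `SemanticStore` class) is an illustrative assumption for the concept, not the `@agent-smith/smem` API, which would use real model embeddings rather than the toy character-frequency vectors below.

```ts
// Conceptual sketch of semantic memory: store text with a vector,
// retrieve the closest entries by cosine similarity.
type MemoryItem = { text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Toy "embedding": a 26-bucket character-frequency vector.
// Real systems use embeddings from a model.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i]++;
  }
  return v;
}

class SemanticStore {
  private items: MemoryItem[] = [];
  add(text: string): void {
    this.items.push({ text, vector: embed(text) });
  }
  query(text: string, k = 1): string[] {
    const qv = embed(text);
    return [...this.items]
      .sort((x, y) => cosine(qv, y.vector) - cosine(qv, x.vector))
      .slice(0, k)
      .map((m) => m.text);
  }
}

const store = new SemanticStore();
store.add("the user prefers dark mode");
store.add("deploy runs on fridays");
console.log(store.query("what mode does the user prefer?", 1)[0]);
// → "the user prefers dark mode"
```

The point of the sketch is the retrieval interface, not the scoring function: an agent adds facts as it works and queries the store before answering.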

Design Philosophy#

  • Composable: packages each have a limited responsibility and can be used independently or together
  • Declarative: Focus on business logic expression rather than low-level control flow
  • Explicit: No hidden magic, behavior under user control
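A short sketch of what "declarative" and "composable" mean in practice: a task is plain data (a name plus a `run` function), so tasks can be inspected and piped together without hidden control flow. The `TaskSpec` and `pipe` names here are hypothetical illustrations of the philosophy, not the actual `@agent-smith/task` API.

```ts
// A task as a declarative value: nothing runs until you call run().
interface TaskSpec<I, O> {
  name: string;
  run: (input: I) => Promise<O>;
}

// Compose two tasks into one; the result is itself a TaskSpec.
function pipe<A, B, C>(first: TaskSpec<A, B>, second: TaskSpec<B, C>): TaskSpec<A, C> {
  return {
    name: `${first.name} -> ${second.name}`,
    run: async (input) => second.run(await first.run(input)),
  };
}

const extract: TaskSpec<string, string[]> = {
  name: "extract-words",
  run: async (text) => text.split(/\s+/).filter(Boolean),
};

const count: TaskSpec<string[], number> = {
  name: "count-words",
  run: async (words) => words.length,
};

const pipeline = pipe(extract, count);
const n = await pipeline.run("list the planets");
console.log(pipeline.name, n); // "extract-words -> count-words" 3
```

Because the composed pipeline is still a value, it can be registered, logged, or passed to an agent like any other task.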

Core Package Structure#

| Package | Description |
| --- | --- |
| @agent-smith/cli | Terminal client |
| @agent-smith/agent | Core Agent implementation |
| @agent-smith/task | Task definition and execution |
| @agent-smith/smem | Semantic memory |
| @agent-smith/tfm | Prompt templates |
| @agent-smith/apicli | API client |
| @agent-smith/wscli | WebSocket client |

Use Cases#

  1. CLI Tool Development: Built-in plugins for Git, Vision, etc., suitable for building developer productivity tools
  2. Local AI Applications: Building AI assistants with memory capabilities in fully offline or privacy-sensitive scenarios
  3. Full-stack Agents: Unified front-end and back-end Agent definitions, supporting direct browser calls or Node backend execution
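Use case 3 can be sketched as follows: define the agent once and inject the inference transport, so the same definition runs in the browser (e.g. `fetch` against a local server) or in a Node backend. The `Transport` and `defineAgent` names are illustrative assumptions, not the framework's API; a stub transport stands in for a real model call.

```ts
// One agent definition, shared by front end and back end;
// only the transport (how prompts reach a model) differs per environment.
type Transport = (prompt: string) => Promise<string>;

interface AgentDef {
  system: string;
  ask: (transport: Transport, question: string) => Promise<string>;
}

function defineAgent(system: string): AgentDef {
  return {
    system,
    ask: (transport, question) => transport(`${system}\n\nUser: ${question}`),
  };
}

// Stub transport for demonstration; in the browser this would be a fetch()
// to a local inference server, in Node an HTTP client for Ollama/Llama.cpp.
const echoTransport: Transport = async (prompt) => `echo: ${prompt.split("\n").pop()}`;

const assistant = defineAgent("You are a helpful assistant");
const reply = await assistant.ask(echoTransport, "List the planets");
console.log(reply); // "echo: User: List the planets"
```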

CLI Tool Capabilities#

  • Simple inference query: `lm q "your question"`
  • Vision/multimodal: `lm vision img1.jpg img2.jpg "Compare images"`
  • Git assistant: `lm commit` (auto-generates a commit message)
  • Plugin system: supports Inference, Vision, Code, Git, and Filesystem plugins

Code Example#

```ts
import { Agent } from "@agent-smith/agent";
import { Lm } from "@locallm/api";

// Connect to a local OpenAI-compatible inference server
const lm = new Lm({
    providerType: "openai",
    serverUrl: "http://localhost:8080/v1",
    apiKey: "",
    onToken: (t) => process.stdout.write(t), // stream tokens as they arrive
});

const agent = new Agent(lm);
await agent.run("List the planets",
    { temperature: 0.5, max_tokens: 4096, model: { name: "qwen4b" } },
    { verbose: true, system: "You are a helpful assistant" }
);
```

Dependency Ecosystem#

  • Locallm: unified API for calling local model backends
  • Modprompt: prompt template management
  • Echo: Go-based server backend
