Open-source generative UI toolkit for React with streaming components and MCP integration, enabling AI agents to render and interact with React components directly.
## Overview

Tambo AI is a React SDK focused on generative UI. It addresses a core problem: enabling LLMs not just to output text, but to render rich React components directly and interact with them.
## Core Philosophy

- Developers define standard React components and describe their props with Zod schemas
- The Tambo agent selects a component and generates schema-compliant props based on user intent
- The frontend receives the props as a real-time stream via the SDK and renders the component
## Core Capabilities

### React SDK (`@tambo-ai/react`)
Provides hooks and providers for thread management, streaming, and component rendering:

- `TamboProvider`: Global context provider for the API key, component registration, tool registration, and user authentication
- `useTambo()`: Access the message list and streaming state
- `useTamboThreadInput()`: Manage input state and submission logic
- `useTamboSuggestions()`: Get AI-generated next-step suggestions
### Component Model

**Generative Components**

Rendered once, with results appearing instantly as the conversation progresses:
```tsx
const components: TamboComponent[] = [
  {
    name: "Graph",
    description: "Displays data as charts using the Recharts library",
    component: Graph,
    propsSchema: z.object({
      data: z.array(z.object({ name: z.string(), value: z.number() })),
      type: z.enum(["line", "bar", "pie"]),
    }),
  },
];
```
**Interactable Components**

Persistent components with unique IDs that support updates across sessions:
```tsx
const InteractableNote = withInteractable(Note, {
  componentName: "Note",
  description: "A note supporting title, content, and color modifications",
  propsSchema: z.object({
    title: z.string(),
    content: z.string(),
    color: z.enum(["white", "yellow", "blue", "green"]).optional(),
  }),
});
```
### Built-in Agent

- Runs LLM conversation loops without requiring an external agent framework
- Bring Your Own Key (BYOK) support
- Optional integration with LangChain or Mastra
### Streaming Infrastructure

- Real-time streaming of props as the LLM generates them
- Support for cancellation, error recovery, and reconnection
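Because props arrive incrementally while the LLM is still generating, registered components should render gracefully from partial props rather than waiting for the final payload. A minimal sketch of that defensive pattern (the `usableRows` helper is illustrative, not part of the SDK):

```typescript
// During streaming, props may arrive with fields missing or arrays half-filled.
type PartialGraphProps = {
  data?: { name?: string; value?: number }[];
  type?: "line" | "bar" | "pie";
};

// Keep only rows that are complete enough to plot. A component can render
// this subset on every streaming update instead of waiting for the end.
function usableRows(props: PartialGraphProps): { name: string; value: number }[] {
  return (props.data ?? []).filter(
    (row): row is { name: string; value: number } =>
      typeof row.name === "string" && typeof row.value === "number",
  );
}

const rows = usableRows({
  type: "bar",
  data: [{ name: "Q1", value: 120 }, { name: "Q2" }], // Q2 still streaming in
});
console.log(rows); // only the complete Q1 row
```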
### MCP Integration

Full Model Context Protocol (MCP) support for connecting to external systems and tools:
```tsx
import { MCPTransport } from "@tambo-ai/react/mcp";

const mcpServers = [
  {
    name: "filesystem",
    url: "http://localhost:8261/mcp",
    transport: MCPTransport.HTTP,
  },
];
```
### Local Tools
Define browser-side tools (DOM manipulation, authenticated fetch, React state access):
```tsx
const tools: TamboTool[] = [
  {
    name: "getWeather",
    description: "Fetches weather for a location",
    tool: async (params: { location: string }) =>
      fetch(`/api/weather?q=${encodeURIComponent(params.location)}`).then((r) => r.json()),
    inputSchema: z.object({ location: z.string() }),
    outputSchema: z.object({
      temperature: z.number(),
      condition: z.string(),
      location: z.string(),
    }),
  },
];
```
## Use Cases

- AI-powered data visualization: a user says "plot last quarter's sales trend" and a line chart renders directly
- Dynamic interactive components: update Kanban tasks or modify a shopping cart through conversation
- External data-connected assistants: connect to databases or APIs via the MCP protocol and display the results in the UI
## Quick Start

### CLI Initialization

```bash
npm create tambo-app my-tambo-app  # auto-initializes git + tambo setup
cd my-tambo-app
npm run dev
```
### Template Options
- AI Chat with Generative UI: Chat interface + dynamic component generation
- AI Analytics Dashboard: AI-powered analytics dashboard
### Provider Configuration Example

```tsx
<TamboProvider
  apiKey={process.env.NEXT_PUBLIC_TAMBO_API_KEY!}
  userKey={currentUserId}
  components={components}
  tools={tools}
  mcpServers={mcpServers}
  contextHelpers={{
    selectedItems: () => ({
      key: "selectedItems",
      value: selectedItems.map((i) => i.name).join(", "),
    }),
    currentPage: () => ({ key: "page", value: window.location.pathname }),
  }}
>
  <Chat />
  <InteractableNote id="note-1" title="My Note" content="Start writing..." />
</TamboProvider>
```
## Authentication

- `userKey`: for server-side or other trusted environments
- `userToken`: for client-side OAuth; supports Auth0, Clerk, Supabase, WorkOS, and other major providers
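On the client-side path, the provider's access token is passed as `userToken` in place of `userKey`. A minimal sketch, where `useAuthToken()` is a hypothetical stand-in for whatever hook your auth library exposes (e.g. Auth0's or Clerk's), not part of the SDK:

```tsx
// Client-side auth sketch: pass the OAuth access token as userToken.
// useAuthToken() is a placeholder for your auth provider's token hook.
const userToken = useAuthToken();

<TamboProvider apiKey={process.env.NEXT_PUBLIC_TAMBO_API_KEY!} userToken={userToken}>
  <Chat />
</TamboProvider>;
```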
## Deployment Options

- Tambo Cloud: a managed backend handling conversation state and agent orchestration
- Self-hosted: run the same backend via Docker on your own infrastructure
## LLM Provider Support
- OpenAI
- Anthropic
- Cerebras
- Google Gemini
- Mistral
- Other OpenAI-compatible providers