
SDK Overview

Three ways to embed Amodal in your product:

  • @amodalai/react — Drop-in React components: AmodalProvider, AmodalChat, AmodalAction, and hooks. Talks to a running runtime server over HTTP/SSE.
  • @amodalai/react/widget — Standalone chat widget with SSE streaming, theming, and callbacks. No React required on the host page.
  • @amodalai/runtime — The server-side engine. Use createAgent() to embed the agent runtime directly in your Node.js application.
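The widget entry point is meant for host pages without React, so setup reduces to an options object plus a mount call. The sketch below shows the shape of such a configuration; every option name, the callback signature, and the mount export are illustrative assumptions, not the documented widget API — check the widget reference for the real names.

```typescript
// Options for the standalone chat widget. All names below are assumptions
// for illustration, not the documented @amodalai/react/widget API.
const widgetOptions = {
  runtimeUrl: 'http://localhost:3847',  // the running runtime server
  appId: 'my-app',
  theme: { primaryColor: '#4f46e5' },   // theming hook (assumed shape)
  onMessage: (text: string) => {        // per-message callback (assumed)
    console.log('assistant:', text)
  },
}

// In a bundled host page the mount might then look like (hypothetical export):
//   import { mount } from '@amodalai/react/widget'
//   mount('#amodal-chat', widgetOptions)
```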

React (client-side)

npm install @amodalai/react

import { AmodalProvider, AmodalChat } from '@amodalai/react'
 
function App() {
  return (
    <AmodalProvider runtimeUrl="http://localhost:3847" appId="my-app">
      <AmodalChat />
    </AmodalProvider>
  )
}

Your React app calls a running runtime server. See the React SDK reference for the full component API.

Server-side runtime

For server-side embedding — your own Express/Fastify/Hono/Next.js route handlers running the agent in-process — use createAgent() from @amodalai/runtime:

npm install @amodalai/runtime

import { createAgent } from '@amodalai/runtime'
 
const agent = await createAgent({
  repoPath: './my-agent',
  provider: 'anthropic',
  apiKey: process.env.ANTHROPIC_API_KEY,
})
 
// In your Express route — send SSE headers before streaming:
app.post('/api/chat', async (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  })
  const session = await agent.createSession({ userId: req.user.id })
  for await (const event of session.stream(req.body.message)) {
    res.write(`data: ${JSON.stringify(event)}\n\n`)
  }
  res.end()
})
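On the browser side, the stream this route emits can be read with fetch and a small frame parser. This is a generic SSE-consumption sketch, not an SDK API — the endpoint path matches the route above, but the parser and its handling of events are mine:

```typescript
// Split an SSE text buffer into parsed `data:` payloads plus the
// still-incomplete tail. Generic SSE parsing — not part of the SDK.
function parseSSE(buffer: string): { events: unknown[]; rest: string } {
  const events: unknown[] = []
  const frames = buffer.split('\n\n')
  const rest = frames.pop() ?? '' // last piece may be an unfinished frame
  for (const frame of frames) {
    for (const line of frame.split('\n')) {
      if (line.startsWith('data: ')) events.push(JSON.parse(line.slice(6)))
    }
  }
  return { events, rest }
}

// Hypothetical consumption of the /api/chat route above:
async function readChat(message: string) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message }),
  })
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let buffer = ''
  for (;;) {
    const { value, done } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    const parsed = parseSSE(buffer)
    buffer = parsed.rest
    for (const event of parsed.events) console.log(event)
  }
}
```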

The agent you create owns its own tool registry, provider, and store backend — you bring your own database (Postgres pool or PGLite for dev), your own auth, your own framework. Amodal stays invisible to your end users.

What you get

  • State-machine agent loop — see State Machine for the full architecture.
  • Multi-provider support — Anthropic, OpenAI, Google, DeepSeek, Groq, Mistral, xAI via the Vercel AI SDK. Provider failover chains built in.
  • Tool system — store tools, connection tools with ACL enforcement, custom tools (handler.ts files), MCP tools, and admin file tools.
  • Sub-agent dispatch — dispatch_task spawns a write-enabled sub-agent with its own context.
  • Context compaction + loop detection — long agent runs stay coherent without blowing the budget.
  • Store backends — PGLite for dev, Postgres for production, both via Drizzle ORM. Bring your own via the storeBackend injection.
  • SSE streaming — every session.stream() call yields typed SSE events (init, text_delta, tool_call_start, tool_call_result, done).
  • MCP support — discover tools from connected MCP servers.
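The event names listed above (init, text_delta, tool_call_start, tool_call_result, done) lend themselves to a discriminated union on the consuming side. Only the type names come from the list; the payload fields below are assumptions for illustration, not the SDK's exact shapes:

```typescript
// Discriminated union over the documented event names.
// Payload fields are illustrative assumptions, not the SDK's exact shapes.
type AmodalEvent =
  | { type: 'init'; sessionId: string }
  | { type: 'text_delta'; text: string }
  | { type: 'tool_call_start'; tool: string }
  | { type: 'tool_call_result'; tool: string; result: unknown }
  | { type: 'done' }

// Exhaustive switch: the compiler flags any event type left unhandled.
function render(event: AmodalEvent): string {
  switch (event.type) {
    case 'init':
      return `session ${event.sessionId} started`
    case 'text_delta':
      return event.text
    case 'tool_call_start':
      return `[calling ${event.tool}]`
    case 'tool_call_result':
      return `[${event.tool} returned]`
    case 'done':
      return ''
  }
}
```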

When to use which

  • You want to talk to your agent from your own web app — React SDK
  • You want a chat widget on a marketing page or third-party site — Chat widget
  • You're building a vertical SaaS and want the agent as the core of your product — createAgent() runtime
  • You want to run amodal dev locally and iterate on agent config — CLI (no SDK code needed)