# Charger — A Coding Agent
Build a production-ready coding agent with sandboxed execution in under 10 minutes.
> **Tip:** Copy this guide into Claude Code and let it build the project for you.
We're going to build a production-ready coding agent that can read, write, and execute code in a sandboxed environment using Daytona, all in about 10 minutes.
For the complete working example, see Charger.
## What We're Building
By the end of this guide, you'll have a fully functional coding agent that can:
- Execute shell commands in an isolated sandbox
- Read and write files
- Search codebases with grep and glob patterns
- Search the web for documentation
- Track tasks across multi-step workflows
The stack:
- kernl — Agent framework
- Daytona — Secure sandboxed environments
- Hono — API server
- Vercel AI SDK — Streaming chat UI
## Prerequisites
- Node.js 20+
- A Daytona API key
- An Anthropic API key
- A Parallel AI API key (for web search)
- PostgreSQL database (for thread persistence)
## 1. Project Setup
Start by scaffolding a new kernl project:
```bash
pnpm create kernl@latest charger
cd charger
```

Install the kernl CLI and add the toolkits from the marketplace:
```bash
npm install -g @kernl-sdk/cli
kernl add toolkit daytona
kernl add toolkit parallel
```

This creates:
```
charger/
├── src/
│   ├── agents/
│   │   └── jarvis.ts      # Example agent (we'll replace this)
│   ├── toolkits/
│   │   ├── daytona/       # Sandbox filesystem, process, git
│   │   └── parallel/      # Web search via MCP
│   └── index.ts
├── kernl.json             # CLI configuration
├── package.json
└── tsconfig.json
```

## 2. The Agent
Here's what we're building — an agent with five toolkits:
```typescript
// src/agents/charger.ts
import { Agent } from "kernl";
import { anthropic } from "@kernl-sdk/ai/anthropic";
import { fs, process, git, type SandboxContext } from "@/toolkits/daytona";
import { web } from "@/toolkits/parallel";
import { todo, type TodoContext } from "@/toolkits/todo";

// Intersection of all toolkit contexts - ensures type safety across tools
type ChargerContext = SandboxContext & TodoContext;

export const charger = new Agent<ChargerContext>({
  id: "charger", // Used in API routes: /agents/charger/stream
  name: "Charger",
  description: "A coding assistant with sandbox access",
  model: anthropic("claude-opus-4-5"),
  instructions: `You are Charger, a coding assistant with access to a sandbox.
Use the todo tools to track multi-step tasks. Search the web when you need
documentation or examples.`,
  toolkits: [fs, process, git, web, todo],
});
```

The toolkits give the agent these capabilities:
| Toolkit | Tools | Source |
|---|---|---|
| fs | read, write, edit, list, mkdir, rm, mv, find, grep | Daytona (marketplace) |
| process | exec, kill | Daytona (marketplace) |
| git | clone, status, commit, push, pull | Daytona (marketplace) |
| web | search, extract | Parallel (marketplace) |
| todo | read, write | Custom (we'll build this) |
The `ChargerContext` type is the intersection of all toolkit contexts — this ensures type safety when tools access shared state.
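To see why the intersection matters, here is a minimal sketch using simplified local interfaces (not the real kernl types): one context object must be assignable to every toolkit's context at once.

```typescript
// Illustrative sketch: simplified stand-ins for the toolkit contexts.
interface SandboxContext {
  sandboxId?: string;
}

interface TodoContext {
  threadId?: string;
}

// The agent's context is the intersection — it satisfies both toolkits.
type ChargerContext = SandboxContext & TodoContext;

// A sandbox tool and a todo tool can each accept the same object,
// and the compiler checks both sets of fields.
function describeSandbox(ctx: SandboxContext): string {
  return ctx.sandboxId ?? "no sandbox yet";
}

function todoKey(ctx: TodoContext): string {
  return ctx.threadId ?? "default";
}

const ctx: ChargerContext = { sandboxId: "sbx-123", threadId: "t-1" };
console.log(describeSandbox(ctx)); // "sbx-123"
console.log(todoKey(ctx)); // "t-1"
```

If a toolkit is added whose context demands a new field, `ChargerContext` grows automatically and the compiler flags any call site that doesn't supply it.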
## 3. The Toolkits
### Daytona Toolkit
The Daytona toolkit uses lazy sandbox provisioning — sandboxes are created on first tool use, not upfront. Here's what a tool looks like:
```typescript
// src/toolkits/daytona/fs.ts
import { z } from "zod";
import { tool, type Context } from "kernl";
import { getSandbox, type SandboxContext } from "./client";

const read = tool({
  id: "fs_read",
  description: "Read the contents of a file",
  parameters: z.object({
    path: z.string().describe("Path to the file"),
  }),
  execute: async (ctx: Context<SandboxContext>, { path }) => {
    const sandbox = await getSandbox(ctx); // creates or reuses the sandbox for this thread
    const buffer = await sandbox.fs.downloadFile(path);
    return buffer.toString("utf-8");
  },
});
```

The magic is in `getSandbox(ctx)` — it lazily provisions sandboxes and stores the ID in context:
```typescript
// src/toolkits/daytona/client.ts
import { Daytona, Sandbox, SandboxState } from "@daytonaio/sdk";
import type { Context } from "kernl";

export const daytona = new Daytona();

// Git credentials for private repo operations (clone, push, pull)
export interface GitCredentials {
  username: string; // For GitHub: "x-access-token"
  token: string; // Personal access token or app token
}

// Extends the thread context - fields are persisted across requests
export interface SandboxContext {
  sandboxId?: string;
  git?: GitCredentials;
  env?: Record<string, string>;
}

// Helper for git tools to get credentials from context
export function getGitCredentials(ctx: Context<SandboxContext>) {
  const { git } = ctx.context;
  return git ? { username: git.username, password: git.token } : {};
}

export async function getSandbox(
  ctx: Context<SandboxContext>,
): Promise<Sandbox> {
  const { context } = ctx;

  // Reuse existing sandbox if we have one
  if (context.sandboxId) {
    const sandbox = await daytona.get(context.sandboxId);
    if (sandbox.state !== SandboxState.STARTED) {
      await sandbox.start();
    }
    return sandbox;
  }

  // First tool call - create a new sandbox
  const sandbox = await daytona.create({
    language: "typescript",
    autoStopInterval: 30,
  });

  // Store ID in context - persisted with the thread
  context.sandboxId = sandbox.id;
  return sandbox;
}
```

Because context is persisted with the thread, the sandbox survives across HTTP requests — the entire conversation shares one sandbox.
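The reuse logic can be sketched in isolation. This is a toy model (no Daytona SDK; all names are ours): a fake `createSandbox` plus a context field standing in for the persisted thread state.

```typescript
// Toy model of lazy provisioning: the first call creates a "sandbox"
// and records its ID in the context; later calls with the same context
// reuse it instead of creating a new one.
interface SandboxContext {
  sandboxId?: string;
}

interface FakeSandbox {
  id: string;
}

let nextId = 0;
const registry = new Map<string, FakeSandbox>();

function createSandbox(): FakeSandbox {
  const sb = { id: `sbx-${++nextId}` };
  registry.set(sb.id, sb);
  return sb;
}

function getSandbox(context: SandboxContext): FakeSandbox {
  if (context.sandboxId) {
    // Reuse: every tool call in the thread lands in the same sandbox
    return registry.get(context.sandboxId)!;
  }
  const sb = createSandbox(); // first tool call provisions
  context.sandboxId = sb.id; // would be persisted with the thread
  return sb;
}

const context: SandboxContext = {};
const first = getSandbox(context);
const second = getSandbox(context);
console.log(first.id === second.id); // true
```

A fresh context (a new thread) gets a fresh sandbox; an existing context keeps returning the one it already provisioned.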
For private repo access, pass git credentials in the stream call:
```typescript
agent.stream(input, {
  threadId: tid,
  context: {
    git: { username: "x-access-token", token: "ghp_..." },
  },
});
```

Passing credentials through context keeps them out of the LLM's view entirely. The model never sees the token — it just calls `git_clone(url, path)` and the tool injects auth server-side. This prevents accidental leakage in responses or logs.
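The injection step itself is small. A hedged sketch (the helper name is ours, not kernl's) using Node's standard `URL`: the tool rewrites the clone URL server-side, so the token exists only in the process that runs git, never in the model's transcript.

```typescript
// Hypothetical helper: build an authenticated clone URL from context
// credentials. The model only ever supplies the plain repo URL.
interface GitCredentials {
  username: string; // e.g. "x-access-token" for GitHub
  token: string;
}

function authenticatedUrl(repoUrl: string, git?: GitCredentials): string {
  if (!git) return repoUrl; // public repo: clone anonymously
  const u = new URL(repoUrl);
  u.username = git.username;
  u.password = git.token; // injected server-side only
  return u.toString();
}

const url = authenticatedUrl("https://github.com/acme/private.git", {
  username: "x-access-token",
  token: "ghp_example",
});
console.log(url);
// https://x-access-token:ghp_example@github.com/acme/private.git
```

The tool result returned to the model should of course echo the plain URL, not the authenticated one.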
> **Diff UI:** The `fs_edit` tool returns a `diff` object with `before`, `after`, and line stats. The agentic-chatbot template uses this to render a side-by-side diff view for code changes.
### Adding Web Search
The Parallel toolkit uses kernl's MCPToolkit to connect to an external MCP server:
```typescript
// src/toolkits/parallel/index.ts
import { MCPToolkit, MCPServerStreamableHttp } from "kernl";

// MCPToolkit wraps any MCP server as a kernl toolkit
export const web = new MCPToolkit({
  id: "web_search",
  description: "Real-time web search via Parallel AI",
  // Streamable HTTP transport for MCP servers
  server: new MCPServerStreamableHttp({
    url: "https://search-mcp.parallel.ai/mcp",
    requestInit: {
      headers: {
        Authorization: `Bearer ${process.env.PARALLEL_API_KEY}`,
      },
    },
  }),
});
```

### Adding Task Tracking
Let's build a simple todo toolkit from scratch. This shows how to create tools with in-memory state:
```typescript
// src/toolkits/todo/index.ts
import { z } from "zod";
import { tool, Toolkit, type Context } from "kernl";

interface Todo {
  id: string;
  content: string;
  status: "pending" | "in_progress" | "completed";
  priority: "high" | "medium" | "low";
}

const store = new Map<string, Todo[]>();

export interface TodoContext {
  threadId?: string;
}

const TodoSchema = z.object({
  id: z.string().describe("Unique identifier"),
  content: z.string().describe("Task description"),
  status: z.enum(["pending", "in_progress", "completed"]),
  priority: z.enum(["high", "medium", "low"]),
});

const read = tool({
  id: "todo_read",
  description: "Read the current todo list",
  parameters: z.object({}),
  execute: async (ctx: Context<TodoContext>) => {
    const key = ctx.context.threadId ?? "default";
    const todos = store.get(key) ?? [];
    const pending = todos.filter((t) => t.status !== "completed").length;
    return { message: `${pending} todos remaining`, todos };
  },
});

const write = tool({
  id: "todo_write",
  description: "Update the todo list",
  parameters: z.object({
    todos: z.array(TodoSchema).describe("The complete updated todo list"),
  }),
  execute: async (ctx: Context<TodoContext>, { todos }) => {
    const key = ctx.context.threadId ?? "default";
    store.set(key, todos);
    const pending = todos.filter((t) => t.status !== "completed").length;
    return { message: `${pending} todos remaining`, todos };
  },
});

export const todo = new Toolkit<TodoContext>({
  id: "todo",
  description: "Task management: create, update, and track todos",
  tools: [read, write],
});
```

## 4. The API Server
To connect the agentic-chatbot UI, you need an API server with these minimum endpoints:
| Method | Endpoint | Description |
|---|---|---|
| POST | /agents/{agentId}/stream | Stream agent response (SSE) |
| GET | /agents | List available agents |
| GET | /threads | List threads (for history) |
| GET | /threads/{tid} | Get thread metadata |
| GET | /threads/{tid}/messages | Get thread messages |
For the full API specification including request/response schemas, see the OpenAPI spec or the Charger microproject for a complete implementation.
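To make the contract concrete, here is a small client-side sketch. The helper names are illustrative (not part of kernl); they only construct the URL and `fetch` init object for the stream endpoint and a thread lookup.

```typescript
// Illustrative client helpers for the endpoints above.
const BASE = "http://localhost:3001";

// POST /agents/{agentId}/stream — body carries the thread ID and message
function streamRequest(agentId: string, tid: string, message: unknown) {
  return {
    url: `${BASE}/agents/${agentId}/stream`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ tid, message }),
    },
  };
}

// GET /threads/{tid}/messages — message history for one thread
function threadMessagesUrl(tid: string): string {
  return `${BASE}/threads/${tid}/messages`;
}

const req = streamRequest("charger", "t-1", { role: "user", parts: [] });
console.log(req.url); // http://localhost:3001/agents/charger/stream
console.log(threadMessagesUrl("t-1")); // http://localhost:3001/threads/t-1/messages
```

A real client would pass `req.url` and `req.init` to `fetch` and read the SSE body from the response.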
Install the dependencies:
```bash
pnpm add hono @hono/node-server @hono/zod-validator @kernl-sdk/ai @kernl-sdk/pg ai
```

Create the following structure:
```
src/
├── agents/
│   └── charger.ts
├── api/
│   ├── agents.ts
│   └── threads.ts
├── toolkits/
│   ├── daytona/
│   ├── parallel/
│   └── todo/
├── app.ts
└── index.ts
```

### Agents Endpoints
```typescript
// src/api/agents.ts
import { Hono } from "hono";
import { Kernl } from "kernl";
import { zValidator } from "@hono/zod-validator";
import { createUIMessageStreamResponse, type UIMessage } from "ai";
import { UIMessageCodec, toUIMessageStream } from "@kernl-sdk/ai";
import { z } from "zod";

const StreamBody = z.object({
  tid: z.string().min(1),
  message: z.record(z.string(), z.unknown()),
});

type Variables = { kernl: Kernl };

export const agents = new Hono<{ Variables: Variables }>();

// GET /agents - List all registered agents
agents.get("/", async (c) => {
  const kernl = c.get("kernl");
  const list = kernl.agents.list();
  return c.json({
    agents: list.map((a) => ({
      id: a.id,
      name: a.name,
      description: a.description,
      model: {
        provider: a.model.provider,
        modelId: a.model.modelId,
      },
    })),
  });
});

// POST /agents/:id/stream - Stream agent response
agents.post("/:id/stream", zValidator("json", StreamBody), async (c) => {
  const kernl = c.get("kernl");
  const id = c.req.param("id");

  const agent = kernl.agents.get(id);
  if (!agent) {
    return c.json({ error: "Agent not found" }, 404);
  }

  const { tid, message } = c.req.valid("json");

  // Decode the AI SDK UIMessage into kernl's internal format
  const input = await UIMessageCodec.decode(message as unknown as UIMessage);

  // Auto-create thread on first message (frontend generates the UUID)
  const existing = await agent.threads.get(tid);
  if (!existing) {
    await agent.threads.create({ tid });
  }

  // agent.stream() returns a kernl stream
  const stream = agent.stream(input, { threadId: tid });

  // Convert to AI SDK's UIMessageStream format for the frontend
  return createUIMessageStreamResponse({
    stream: toUIMessageStream(stream),
  });
});
```

### Threads Endpoints
```typescript
// src/api/threads.ts
import { Hono } from "hono";
import { Kernl } from "kernl";
import { zValidator } from "@hono/zod-validator";
import { historyToUIMessages } from "@kernl-sdk/ai";
import { z } from "zod";

const ListQuery = z.object({
  agent_id: z.string().optional(),
  limit: z.coerce.number().int().positive().optional(),
});

type Variables = { kernl: Kernl };

export const threads = new Hono<{ Variables: Variables }>();

threads.get("/", zValidator("query", ListQuery), async (c) => {
  const kernl = c.get("kernl");
  const { agent_id, limit } = c.req.valid("query");
  const page = await kernl.threads.list({ agentId: agent_id, limit });
  const list = await page.collect();
  return c.json({ threads: list, count: list.length });
});

threads.get("/:tid", async (c) => {
  const kernl = c.get("kernl");
  const tid = c.req.param("tid");

  // Fetch thread with message history (newest first from DB)
  const thread = await kernl.threads.get(tid, { history: { limit: 50 } });
  if (!thread) {
    return c.json({ error: "Thread not found" }, 404);
  }

  return c.json({
    ...thread,
    // Convert kernl messages to AI SDK UIMessage format, oldest first for display
    history: historyToUIMessages((thread.history ?? []).reverse()),
  });
});

threads.delete("/:tid", async (c) => {
  const kernl = c.get("kernl");
  await kernl.threads.delete(c.req.param("tid"));
  return c.json({ success: true });
});
```

### App Builder
```typescript
// src/app.ts
import { Hono } from "hono";
import { cors } from "hono/cors";
import { Kernl } from "kernl";
import { postgres } from "@kernl-sdk/pg";
import { charger } from "@/agents/charger";
import { agents } from "@/api/agents";
import { threads } from "@/api/threads";

type Variables = { kernl: Kernl };

export function build(): Hono<{ Variables: Variables }> {
  const kernl = new Kernl({
    storage: { db: postgres({ url: process.env.DATABASE_URL! }) },
  });
  kernl.register(charger);

  const app = new Hono<{ Variables: Variables }>();

  app.use("/*", cors({ origin: ["http://localhost:3000"], credentials: true }));
  app.use("/*", async (c, next) => {
    c.set("kernl", kernl);
    await next();
  });

  app.route("/agents", agents);
  app.route("/threads", threads);

  return app;
}
```

### Entry Point
```typescript
// src/index.ts
import { serve } from "@hono/node-server";
import { build } from "./app";

const PORT = process.env.PORT ?? "3001";
const app = build();

console.log(`Server running at http://localhost:${PORT}`);
serve({ fetch: app.fetch, port: parseInt(PORT, 10) });
```

### Environment Variables
Create `.env`:

```
ANTHROPIC_API_KEY=sk-ant-...
DAYTONA_API_KEY=...
PARALLEL_API_KEY=...
DATABASE_URL="postgres://postgres:postgres@localhost:5432/postgres"
```

Start the server:

```bash
pnpm dev
```

## 5. The Chatbot UI
Restructure as a monorepo with Turbo:
```
charger/
├── server/               # Move existing code here
│   ├── src/
│   ├── package.json
│   └── ...
├── web/                  # Frontend
├── package.json          # Root package.json
├── pnpm-workspace.yaml
└── turbo.json
```

Create `pnpm-workspace.yaml`:
```yaml
packages:
  - "server"
  - "web"
```

Create `turbo.json`:
```json
{
  "$schema": "https://turborepo.com/schema.json",
  "ui": "tui",
  "tasks": {
    "dev": {
      "cache": false,
      "persistent": true
    },
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**", ".next/**"]
    }
  }
}
```

Create the root `package.json`:
```json
{
  "name": "charger",
  "private": true,
  "packageManager": "pnpm@10.19.0",
  "scripts": {
    "build": "turbo run build",
    "dev": "turbo run dev",
    "lint": "turbo run lint",
    "format": "prettier --write \"**/*.{ts,tsx,md}\"",
    "check-types": "turbo run check-types"
  },
  "devDependencies": {
    "prettier": "^3.7.4",
    "turbo": "^2.7.5",
    "typescript": "5.9.2"
  },
  "engines": {
    "node": ">=18"
  }
}
```

Clone the agentic-chatbot template:
```bash
git clone https://github.com/kernl-sdk/agentic-chatbot.git web
rm -rf web/.git
```

Update the default agent ID from `jarvis` to `charger`:

```bash
# BSD/macOS sed; on Linux, use: sed -i 's/jarvis/charger/g'
find web/src -type f -name "*.tsx" -exec sed -i '' 's/jarvis/charger/g' {} +
```

Configure `web/.env.local`:
```
NEXT_PUBLIC_API_BASE_URL=http://localhost:3001
```

Install and run:
```bash
pnpm install
pnpm dev
```

This starts both the server (port 3001) and the frontend (port 3000) in parallel.
## Full Example
See the Charger microproject for the complete implementation.
## API Specification
For the complete API specification, see the OpenAPI spec in the agentic-chatbot template.
Key implementation notes:
- **Thread auto-creation:** The `/agents/:id/stream` endpoint creates the thread if it doesn't exist. The frontend generates a UUID for new conversations.
- **UIMessage format:** Messages use the Vercel AI SDK UIMessage format. The `historyToUIMessages` helper from `@kernl-sdk/ai` converts kernl's internal format.
- **SSE streaming:** Use `createUIMessageStreamResponse` and `toUIMessageStream` from the AI SDK to format the response correctly.
- **CORS:** The server must allow requests from the frontend origin (`http://localhost:3000` in development).
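Putting the first two notes together, starting a new conversation from the client side might look like this sketch (the message shape follows the AI SDK UIMessage format; the endpoint path comes from the table above):

```typescript
// Sketch: the frontend mints the thread UUID; the server auto-creates
// the thread on the first POST to /agents/charger/stream.
import { randomUUID } from "node:crypto";

const tid = randomUUID();

const body = {
  tid,
  message: {
    id: randomUUID(),
    role: "user",
    parts: [{ type: "text", text: "Clone the repo and run the tests" }],
  },
};

// POST this body to /agents/charger/stream; the response is an SSE
// stream that an AI SDK chat UI can consume directly.
console.log(body.tid.length); // 36 — a standard UUID string
```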