Local AI Agent Runtime

Paean Claw

Your personal AI agent, running locally. 477 lines of code. Any LLM provider, MCP tools, web PWA + Telegram. Ultra-minimal, fully hackable, local-first.

$ bunx paeanclaw

or install globally

npm install -g paeanclaw
bun install -g paeanclaw

Radically Minimal

477
Lines of Code
5
Source Files
2
Runtime Deps
~20ms
Bun Startup

Any LLM Provider

Works with any OpenAI-compatible API — OpenAI, Claude, Gemini, Ollama, DeepSeek, Paean AI, or your own endpoint.
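Since the agent only assumes the OpenAI-compatible chat-completions shape, swapping providers comes down to changing the base URL, key, and model name. A minimal sketch of that idea (the endpoint, key, and model below are illustrative, not taken from the source):

```typescript
// Build a provider-agnostic chat-completions request. Any
// OpenAI-compatible server (OpenAI, Ollama, DeepSeek, a custom
// endpoint, ...) accepts this same payload shape.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[]
): { url: string; init: RequestInit } {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Example: point the same code at a hypothetical local Ollama server.
const req = buildChatRequest(
  "http://localhost:11434/v1", // illustrative local endpoint
  "unused",                    // local servers typically ignore the key
  "llama3",
  [{ role: "user", content: "Hello" }]
);
console.log(req.url);
```

The only provider-specific parts live in configuration; the request-building code never changes.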

MCP Tool Ecosystem

Connect to any MCP server for filesystem access, web search, API integrations, and custom tool calling. Full Model Context Protocol support built in.
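MCP is layered on JSON-RPC 2.0, so a tool invocation is ultimately a `tools/call` request on the wire. A sketch of just the message construction (the real client in `src/mcp.ts` presumably also handles transport and responses; the tool name and arguments here are illustrative):

```typescript
// An MCP tool invocation is a JSON-RPC 2.0 request with method
// "tools/call" and the tool's name plus arguments in params.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params: Record<string, unknown>;
}

let nextId = 0;

function buildToolCall(
  name: string,
  args: Record<string, unknown>
): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id: ++nextId,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// e.g. ask a filesystem MCP server to read a file
// (tool name is illustrative, not from the source)
const msg = buildToolCall("read_file", { path: "/tmp/notes.txt" });
console.log(JSON.stringify(msg));
```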

PWA + Telegram

Access your agent from any device via the installable web PWA, or interact through Telegram — in private and group chats.

Bun-First Runtime

Built-in SQLite via bun:sqlite eliminates all native dependencies. Zero compile step, ~20ms startup, roughly twice as fast to start as Node.js.

Local-First Data

All data is stored in a local SQLite database. No cloud lock-in, no telemetry, and no data leaves your machine unless you choose to send it.

AI-Hackable

The entire source fits in a single LLM context window. Fork it, customize with AI assistance, and make it yours.

Architecture

src/index.ts     ~140 lines
src/agent.ts     ~130 lines
src/store.ts     ~90 lines
src/mcp.ts       ~60 lines
src/telegram.ts  ~60 lines

API

POST /api/chat
GET  /api/conversations
GET  /api/messages
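The three endpoints are plain HTTP, so any client works. A sketch of building requests against them (the port, query parameter, and JSON field names are assumptions; the source documents only the methods and paths):

```typescript
// Minimal request builders for the agent's HTTP API.
// Everything beyond the paths (port, field names) is illustrative.
const BASE = "http://localhost:3000"; // hypothetical default port

function chatRequest(conversationId: string, text: string): Request {
  return new Request(`${BASE}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ conversationId, text }),
  });
}

function conversationsRequest(): Request {
  return new Request(`${BASE}/api/conversations`);
}

function messagesRequest(conversationId: string): Request {
  // query parameter name is an assumption
  return new Request(
    `${BASE}/api/messages?conversation=${encodeURIComponent(conversationId)}`
  );
}

const req = chatRequest("default", "Hello");
console.log(req.method, req.url);
```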
