# Sandboxing AI agents, 100x faster

> Cloudflare's Dynamic Worker Loader spins up V8 sandboxes in milliseconds, making per-request AI agent isolation 100x faster than containers.

Published: 2026-03-26
URL: https://daniliants.com/insights/sandboxing-ai-agents-100x-faster/
Tags: cloudflare-workers, ai-agents, sandboxing, code-mode, mcp, llm, javascript, infrastructure

---

## Summary

Cloudflare has released Dynamic Worker Loader in open beta, allowing Workers to spin up fully isolated V8-based sandboxes at runtime in a few milliseconds, using only a few megabytes of memory -- roughly 100x faster and 10x-100x more memory-efficient than containers. The model is simple: the LLM writes JavaScript/TypeScript, a fresh isolate runs it, and the isolate is discarded. This makes per-request sandboxing economically viable at consumer scale, something containers cannot match.
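
The create-run-discard lifecycle can be sketched outside Cloudflare using Node's built-in `vm` module as a stand-in for an isolate. This is only an analogy for the pattern, not the Worker Loader API, and the generated snippet is invented for illustration:

```typescript
import vm from "node:vm";

// One fresh sandbox per request: create, run, discard.
// NOTE: Node's vm module is NOT a security boundary -- it only illustrates
// the lifecycle. On Cloudflare, each run gets a real V8 isolate.
function runInFreshSandbox(code: string): string {
  const sandbox = { result: "" };                  // the only globals the code sees
  vm.createContext(sandbox);                       // contextify in place
  vm.runInContext(code, sandbox, { timeout: 50 }); // hard CPU cap per run
  return sandbox.result;                           // sandbox is discarded after return
}

// Hypothetical LLM-generated snippet handling one request.
const generated = `result = [1, 2, 3].map((n) => n * n).join(",");`;
console.log(runInFreshSandbox(generated)); // "1,4,9"
```

Because nothing survives the call, a bug or exploit in one request's code has no state to poison for the next -- the property that sandbox reuse in warm container pools erodes.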

## Key Insight

- **The core problem with containers for AI agents:** hundreds of milliseconds to boot, hundreds of megabytes per instance, forcing warm-pool management and tempting sandbox reuse that erodes security. At one agent per user (or more), this doesn't scale.
- **The V8 isolate alternative:** Dynamic Worker Loader exposes the same isolate mechanism that has powered Workers for eight years. A few ms to start, a few MB of RAM, no per-sandbox concurrency limits, no rate limits on creation. It scales to millions of requests per second, with every request running a fresh isolated Worker.
- **Token efficiency gain:** Cloudflare's own MCP server exposes the entire Cloudflare API through just 2 tools in under 1,000 tokens by having the LLM write TypeScript against a typed API instead of navigating hundreds of individual tool definitions. They previously demonstrated an 81% token reduction by switching from flat MCP tool calls to Code Mode.
- **TypeScript as the agent API surface:** TypeScript interfaces describe APIs far more concisely than OpenAPI specs (the article shows side-by-side: a ChatRoom interface fits in ~15 lines of TypeScript vs. 60+ lines of OpenAPI YAML). LLMs are trained heavily on TypeScript and will generate it reliably.
- **Security architecture:** Cloudflare patches V8 security issues to production within hours (faster than Chrome ships), runs a custom second-layer sandbox, uses hardware memory-protection-key (MPK) extensions, and applies Spectre mitigations. Using Dynamic Workers inherits all of this.
- **Pricing:** $0.002 per unique Worker loaded per day (waived during beta), plus standard CPU time and invocation costs. Effectively negligible vs. LLM inference costs.
- **Supporting libraries released alongside:**
  - `@cloudflare/codemode` -- wraps `DynamicWorkerExecutor` for Code Mode use cases
  - `@cloudflare/worker-bundler` -- bundles npm dependencies at runtime so the agent can use third-party libraries inside the sandbox
  - `@cloudflare/shell` -- virtual filesystem (SQLite + R2 backed) with typed state methods to minimise RPC round-trips in agentic file editing
- **Real-world validation:** Zite (LLM-generated CRUD apps) reports millions of daily executions and says Dynamic Workers outperformed all benchmarked alternatives on speed and library support.
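
The conciseness claim about TypeScript as an API surface is easy to see in code. A hypothetical `ChatRoom` interface (names invented here, not taken from the article's example) documents itself in a handful of lines, and the same interface is what the LLM-generated code is type-checked against:

```typescript
// Hypothetical API surface an agent programs against.
// The interface itself is the documentation the LLM reads --
// no separate OpenAPI spec needed.
interface ChatRoom {
  send(user: string, text: string): void;
  history(): { user: string; text: string }[];
}

// Minimal in-memory implementation, for illustration only.
class InMemoryChatRoom implements ChatRoom {
  private messages: { user: string; text: string }[] = [];
  send(user: string, text: string): void {
    this.messages.push({ user, text });
  }
  history(): { user: string; text: string }[] {
    return [...this.messages]; // defensive copy
  }
}

const room = new InMemoryChatRoom();
room.send("alice", "hello");
console.log(room.history().length); // 1
```

Expressing the same two operations in OpenAPI YAML requires paths, operation IDs, request bodies, and response schemas -- the 4x size gap the article measures.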
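
The two-tool shape of a Code Mode server can be sketched as below. Tool names and the executor stub are invented; Cloudflare's actual MCP server may differ, but the structure is the point -- one tool serves a typed API description, one executes the code the LLM wrote:

```typescript
// Sketch of the two-tool Code Mode pattern (tool names are hypothetical).
type Tool = {
  name: string;
  run: (input: string) => Promise<string>;
};

// Tool 1: hand the LLM a TypeScript API description instead of
// hundreds of individual tool schemas.
const describeApi: Tool = {
  name: "describe_api",
  run: async () => `interface Accounts { list(): Promise<string[]> }`,
};

// Tool 2: execute the TypeScript the LLM wrote. Here the executor is a
// stub; on Workers it would hand the code to a fresh dynamic isolate.
const executeCode: Tool = {
  name: "execute_code",
  run: async (code) => `ran ${code.length} chars in a fresh isolate`,
};

const tools: Tool[] = [describeApi, executeCode];
console.log(tools.length); // 2 tools, regardless of how large the API is
```

The token saving comes from the description being one compact interface the model reads once, rather than N tool definitions it must carry in every prompt.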
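
The "negligible vs. inference" pricing claim is easy to bound with back-of-the-envelope arithmetic. The rate is from the article; the usage figure is invented:

```typescript
const pricePerUniqueWorkerPerDay = 0.002; // USD, from the article (waived in beta)
const uniqueWorkersPerDay = 10_000;       // hypothetical: 10k distinct sandboxes/day

// Base sandbox cost, before standard CPU time and invocation charges.
const dailyCost = pricePerUniqueWorkerPerDay * uniqueWorkersPerDay;
console.log(dailyCost.toFixed(2)); // "20.00" USD/day
```

Twenty dollars a day for ten thousand distinct sandboxes is dwarfed by the LLM inference bill those same ten thousand agents would generate.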