Cloudflare announced four new infrastructure primitives on April 13 designed to move AI agents from local demos to production workloads running across its global network. The release covers compute, storage, OS-level isolation, and persistence, collectively positioning the company as a full-stack platform for autonomous agent deployment.
Dynamic Workers: Isolate-Based Compute at 100x Container Speed
The centerpiece is Dynamic Workers, an isolate-based runtime built to execute AI-generated code. When an agent needs to call an API, transform data, or chain tool calls, a Dynamic Worker spins up in milliseconds, runs the JavaScript, and disappears.
Cloudflare claims 100x the speed of traditional containers at a fraction of the cost, with scaling to millions of concurrent executions and no warm-up required. The pitch is direct: running each agent in its own container is expensive, which is why today’s agentic tools are mostly limited to coding assistants for engineers who can justify the cost. Dynamic Workers are designed to make per-agent compute cheap enough for mass deployment.
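The announcement doesn't publish the Dynamic Workers API, but the spin-up/run/discard lifecycle it describes can be loosely illustrated with Node's built-in `vm` module (an analogy only; real Workers isolates are V8 isolates, not `vm` contexts, and every name below is illustrative):

```typescript
import * as vm from "node:vm";

// Illustrative isolate lifecycle: create a fresh context, run untrusted
// agent-generated JavaScript against it, then let the context be discarded.
function runAgentCode(source: string, input: unknown): unknown {
  const context = vm.createContext({ input }); // fresh, empty global scope
  const result = vm.runInContext(source, context, { timeout: 50 }); // ms budget
  return result; // nothing persists after the context is garbage-collected
}

// Agent-generated snippet: transform data without touching host globals.
const out = runAgentCode("input.map((x) => x * 2)", [1, 2, 3]);
console.log(out); // [ 2, 4, 6 ]
```

The point of the pattern is that each execution starts from a clean scope and leaves no state behind, which is what makes millions of concurrent, short-lived executions tractable.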
Artifacts: Git Storage for Agent-Generated Code
Artifacts is a git-compatible storage primitive for the code and files that agents produce. It supports creation of tens of millions of repositories, forking from any remote source, and access from any standard git client.
The problem Cloudflare is targeting: traditional version-control platforms struggle to maintain scale and uptime under autonomous workloads that generate code far faster than human developers do.
Sandboxes Hit General Availability
For tasks that need a full operating system, Cloudflare’s Sandboxes are now generally available. Each Sandbox is a persistent, isolated Linux environment with shell access, a filesystem, and background process support. Agents can clone repositories, install packages, run builds, and iterate with the same feedback loop a human developer gets.
Sandboxes complement Dynamic Workers. Most agent tasks need only the fast, lightweight isolate runtime; when an agent needs to install Python packages, run a build pipeline, or iterate on a codebase, it gets a full Linux box instead.
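The routing logic implied here, lightweight tasks to an isolate and OS-dependent tasks to a sandbox, reduces to a simple dispatch. A sketch (the type and function names are hypothetical, not Cloudflare's API):

```typescript
// Hypothetical routing: pure JavaScript work runs in a Dynamic Worker
// isolate; anything needing a shell or filesystem gets a Sandbox.
type Runtime = "dynamic-worker" | "sandbox";

interface AgentTask {
  needsShell: boolean;      // e.g. pip install, build pipelines
  needsFilesystem: boolean; // e.g. cloning and editing a repo
}

function pickRuntime(task: AgentTask): Runtime {
  return task.needsShell || task.needsFilesystem ? "sandbox" : "dynamic-worker";
}

console.log(pickRuntime({ needsShell: false, needsFilesystem: false })); // dynamic-worker
console.log(pickRuntime({ needsShell: true, needsFilesystem: true }));   // sandbox
```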
Think: Persistence for Long-Running Tasks
Think is a new framework within Cloudflare’s Agents SDK that addresses the disconnect between short-lived agents and long-lived tasks. It adds support for multi-step workflows that persist across sessions, moving beyond the single-prompt-response pattern.
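The announcement doesn't show Think's API, but its core pattern, a multi-step workflow that checkpoints progress so it can resume in a later session, can be sketched with a plain file-backed checkpoint (every identifier here is illustrative, not the Agents SDK's):

```typescript
import * as fs from "node:fs";

// Illustrative checkpointed workflow: each completed step is written to
// disk, so a restarted process resumes where the last session stopped.
type State = { step: number; results: string[] };

const CHECKPOINT = "workflow-state.json";

function loadState(): State {
  if (fs.existsSync(CHECKPOINT)) {
    return JSON.parse(fs.readFileSync(CHECKPOINT, "utf8"));
  }
  return { step: 0, results: [] }; // first session: start from scratch
}

function runWorkflow(steps: Array<(s: State) => string>): State {
  const state = loadState();
  for (let i = state.step; i < steps.length; i++) {
    state.results.push(steps[i](state)); // do this step's work
    state.step = i + 1;
    fs.writeFileSync(CHECKPOINT, JSON.stringify(state)); // persist progress
  }
  return state;
}

// A three-step task: killing and restarting the process between steps
// resumes from the checkpoint rather than from the first prompt.
const final = runWorkflow([
  () => "fetched data",
  () => "transformed data",
  () => "published report",
]);
console.log(final.step); // 3
```

Whatever Think's real storage layer looks like, this is the shape of the problem it targets: the workflow's position must outlive any single agent session.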
Model Flexibility Post-Replicate Acquisition
Following its acquisition of Replicate, Cloudflare expanded its model catalog to include OpenAI’s GPT-5.4 alongside open-source models, all accessible through a unified API. Switching providers requires changing a single line of code.
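The announcement doesn't describe the unified API's shape, but the "single line of code" claim implies the model is just a string identifier behind a common interface, roughly like this sketch (all identifiers hypothetical; the provider functions are stand-ins for real network calls):

```typescript
// Hypothetical unified-catalog client: the provider is resolved from the
// model string's prefix, so swapping providers is a one-line change.
type ChatFn = (prompt: string) => string;

const providers: Record<string, ChatFn> = {
  openai: (p) => `[openai reply to: ${p}]`,     // stand-in for a real call
  "open-source": (p) => `[oss reply to: ${p}]`, // stand-in for a real call
};

function run(model: string, prompt: string): string {
  const provider = model.split("/")[0]; // e.g. "openai/gpt-5.4"
  const fn = providers[provider];
  if (!fn) throw new Error(`unknown provider: ${provider}`);
  return fn(prompt);
}

// Switching providers means changing only this line:
const model = "openai/gpt-5.4"; // or "open-source/some-model"
console.log(run(model, "Summarize the release."));
```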
The Infrastructure Race
“The way people build software is fundamentally changing. We are entering a world where agents are the ones writing and executing code,” Cloudflare CEO Matthew Prince said in the press release. “We’ve spent nine years building the foundation for this with Cloudflare Workers. Today, we are making Cloudflare the definitive platform for the agentic web.”
Rohan Varma from OpenAI’s Codex team validated the positioning: “Cloud agents are quickly becoming a foundational building block for how work gets done, and with Cloudflare, we’re making it dramatically easier for developers to deploy production-ready agents powered by GPT-5.4 and Codex to run real enterprise workloads at scale,” according to the same press release.
The announcement lands during a week of rapid agent infrastructure buildout. Microsoft open-sourced its Agent Governance Toolkit on April 14. AWS previewed Agent Registry for Bedrock AgentCore. Anthropic launched Claude Managed Agents on April 8. Each targets a different layer of the stack. Cloudflare’s bet is that compute, storage, and runtime will be the bottleneck, not governance or model access, as agents scale from thousands to millions of concurrent instances.