Open Source
Explore the latest AI open-source projects from GitHub and HuggingFace.
Cloudflare Agents is an open-source SDK for building persistent, stateful AI agents that run on Cloudflare's global edge network. Powered by Durable Objects, each agent maintains its own state, storage, and lifecycle, with built-in support for real-time communication, scheduling, AI model calls, MCP integration, workflows, and more. SDK v0.5.0, released in February 2026, introduced a rewritten @cloudflare/ai-chat module and a new Rust-powered Infire inference engine optimized for edge performance.

## Persistent Stateful Architecture

Unlike traditional serverless functions, which are stateless by design, Cloudflare Agents provide persistent execution environments where state syncs across connected clients and survives restarts. Each agent is an isolated Durable Object with its own storage, identity, and lifecycle. The system hibernates agents when they are inactive, enabling millions of agents to run cost-effectively across Cloudflare's global network. This architecture addresses the fundamental limitation of serverless AI: maintaining conversation context, user preferences, and task progress across interactions without external database overhead.

## Infire: Rust-Powered Edge Inference

The v0.5.0 release introduced Infire, a custom-built inference engine written in Rust and designed specifically for edge deployment. Infire optimizes resource allocation at Cloudflare's network edge, reducing latency for AI model calls. Combined with Durable Objects for state management, this creates a vertically integrated execution layer where compute, state, and inference coexist at the edge.

## Real-Time Communication and Streaming

Agents support WebSocket-based bidirectional communication out of the box. The @cloudflare/ai-chat package provides message persistence with resumable streaming, meaning users can disconnect and reconnect without losing conversation progress. Server-Sent Events are also supported for simpler streaming use cases.
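The persist-and-rehydrate pattern behind this architecture can be illustrated with a minimal, self-contained sketch. This is not the real `agents` SDK API: the `ChatAgent` class and the `durableStorage` map are hypothetical stand-ins for a Durable Object and its attached storage.

```typescript
// Conceptual sketch of the persistent-agent pattern: every state write
// goes to durable storage, so a hibernated agent can be rehydrated with
// its full conversation context. All names here are hypothetical; the
// real SDK backs this with Durable Objects, not an in-memory map.

type Message = { role: "user" | "assistant"; content: string };

// Stand-in for a Durable Object's persistent storage.
const durableStorage = new Map<string, Message[]>();

class ChatAgent {
  private messages: Message[];

  constructor(private id: string) {
    // Rehydrate state on wake-up, as if resuming from hibernation.
    this.messages = durableStorage.get(id) ?? [];
  }

  addMessage(msg: Message): void {
    this.messages.push(msg);
    // Persist after every mutation so state survives restarts.
    durableStorage.set(this.id, [...this.messages]);
  }

  history(): Message[] {
    return this.messages;
  }
}

// Simulate a restart: the first instance is discarded, and a second one
// constructed with the same id still sees the earlier message.
let agent = new ChatAgent("user-42");
agent.addMessage({ role: "user", content: "hello" });
agent = new ChatAgent("user-42"); // "rehydrated" instance
console.log(agent.history().length); // 1
```

The key design point the sketch captures is that the agent instance itself is disposable; identity and state live in storage keyed by the agent's id, which is what lets the platform hibernate inactive agents at scale.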
## MCP Server and Client Support

Agents can act as MCP servers, exposing tools to other AI systems, or as MCP clients connecting to external tool providers. This dual MCP capability makes Cloudflare Agents interoperable with the growing MCP ecosystem, including integration with Claude, GPT, and other LLM platforms.

## Workflow Orchestration

The SDK includes durable multi-step workflow support with human-approval loops. Tasks that require human review can pause execution, wait for approval, and resume later. This is critical for production AI deployments where automated actions need oversight before execution.

## Developer Experience

Getting started requires a single command: `npm create cloudflare@latest` with the agents-starter template. No API keys are required by default, since the template uses Workers AI. The SDK supports calling any AI model provider, including Workers AI, OpenAI, Anthropic, and Gemini. React hooks (`useAgent` and `useAgentChat`) are provided for frontend integration, and a Hono middleware package enables server-side framework integration.

## Production Scale

Cloudflare's infrastructure enables agents to scale to tens of millions of instances globally. The MIT-licensed SDK is already used by over 2,300 projects, with 66+ contributors and 137 releases demonstrating active maintenance and community adoption.
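The pause-for-approval loop described under Workflow Orchestration can be sketched in a few lines. This is a conceptual illustration, not the SDK's workflow API: `ApprovalGate` and `deployWorkflow` are hypothetical names, and where this sketch holds a pending Promise in memory, the real system would persist the suspended step durably.

```typescript
// Conceptual sketch of a workflow step that pauses for human approval
// and resumes later. All names are hypothetical stand-ins.

type Decision = "approved" | "rejected";

class ApprovalGate {
  private resolve?: (d: Decision) => void;

  // Called by the workflow: execution suspends here until a human decides.
  waitForApproval(): Promise<Decision> {
    return new Promise((res) => (this.resolve = res));
  }

  // Called from a review UI or API to resume the paused workflow.
  decide(decision: Decision): void {
    this.resolve?.(decision);
  }
}

async function deployWorkflow(gate: ApprovalGate): Promise<string> {
  const plan = "delete 3 stale records";        // step 1: propose an action
  const decision = await gate.waitForApproval(); // step 2: pause for review
  return decision === "approved"                 // step 3: act or abort
    ? `executed: ${plan}`
    : "cancelled";
}

const gate = new ApprovalGate();
const run = deployWorkflow(gate); // workflow is now suspended mid-step
gate.decide("approved");          // a human approves; the workflow resumes
run.then((result) => console.log(result)); // "executed: delete 3 stale records"
```

The useful property is that the proposed action is computed before the pause, so the reviewer sees exactly what will happen, and nothing executes until the gate resolves.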