
A developer runs a self-hosted AI agent using Moltworker on Cloudflare’s edge platform—demonstrating how AI agents can operate securely without relying on local hardware. Image Source: ChatGPT-5.2
Cloudflare Introduces Moltworker, Enabling Self-Hosted AI Agents Without Dedicated Hardware
Global interest in self-hosted AI agents surged this week as developers rushed to buy Mac minis to run Moltbot, an open-source, self-hosted personal AI agent designed to operate entirely on a user’s own hardware. But Cloudflare believes there’s another path.
In a new engineering blog post, Cloudflare introduced Moltworker, an adaptation of Moltbot that allows the AI agent to run on Cloudflare’s Developer Platform instead of requiring dedicated local machines. The project demonstrates how self-hosted AI agents can operate securely, scalably, and closer to users — without the need for physical hardware.
Editor’s note: As of January 30, 2026, Moltbot has been renamed to OpenClaw.
Key Takeaways: Cloudflare Moltworker and Self-Hosted AI Agents
Moltworker enables the Moltbot (OpenClaw) AI agent to run on Cloudflare Workers instead of local hardware.
Cloudflare combines Workers, Sandboxes, AI Gateway, Browser Rendering, R2 storage, and Zero Trust Access to support AI agents.
Moltworker is released as an open-source proof of concept, not a commercial Cloudflare product.
The project demonstrates how self-hosted AI agents can run securely at the edge.
Developers can deploy and manage AI agents without purchasing dedicated machines.
What Is Moltbot (OpenClaw) — and Why Cloudflare Built Moltworker
Moltbot (now OpenClaw) is an open-source, self-hosted AI agent designed to function as a personal assistant. It can help manage tasks such as finances, scheduling, and social media, and it integrates with popular chat platforms and AI models. Traditionally, Moltbot runs on a user’s own machine — often requiring dedicated hardware like a Mac mini to operate continuously.
Cloudflare’s Moltworker project asks a simple question: What if you could run a self-hosted AI agent online, without buying new hardware, while still retaining control and security?
Moltworker answers that by adapting Moltbot to run on Cloudflare’s global edge infrastructure.
How Moltworker Runs a Self-Hosted AI Agent on Cloudflare Workers
Moltworker acts as a middleware Worker and set of adapted scripts that allow Moltbot to run on Cloudflare’s Sandbox SDK and Developer Platform APIs, rather than on dedicated local hardware.
At its core, Moltworker is a combination of:
An entrypoint Worker that routes API requests to the agent's isolated environment, with both protected by Cloudflare Access
A Sandbox container that runs the standard Moltbot Gateway runtime and its integrations, managed through an administration UI
Supporting Cloudflare services — including R2 for persistent storage, Browser Rendering for web automation, AI Gateway for model access, and Zero Trust Access for security
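The entrypoint Worker's job in this architecture is ordinary request routing. A minimal sketch of the idea follows; the route names, responses, and the notion of a single admin path are illustrative assumptions, not Moltworker's actual code:

```typescript
// Hypothetical entrypoint Worker: sends traffic either to the admin UI
// or to the sandboxed agent runtime. Paths and bindings are invented
// for illustration.
type Route = "admin" | "agent";

// The routing decision is kept as a pure function so it is easy to test.
export function routeFor(pathname: string): Route {
  return pathname.startsWith("/admin") ? "admin" : "agent";
}

export default {
  async fetch(request: Request, env: unknown): Promise<Response> {
    // Cloudflare Access sits in front of this Worker, so by the time a
    // request arrives here it already carries a signed Access JWT.
    const url = new URL(request.url);
    if (routeFor(url.pathname) === "admin") {
      return new Response("admin UI placeholder");
    }
    // Everything else would be forwarded to the sandboxed agent runtime.
    return new Response("agent placeholder");
  },
};
```

Because Access handles authentication before the Worker runs, the Worker itself can stay focused on routing rather than identity.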
Cloudflare says recent improvements in Node.js compatibility within Workers were critical to making Moltworker possible. APIs that previously required workarounds are now supported natively, allowing more complex applications — including AI agents — to run closer to users at the edge.
The company points to internal experiments showing that nearly all of the 1,000 most popular npm packages now run successfully on Workers, reinforcing the platform’s growing maturity and production readiness.
What makes Moltworker notable is not just what it does, but why it is now feasible. Cloudflare notes that expanded Node.js compatibility, combined with tighter integration across its Developer Platform, made it possible to run more application logic directly on Workers. Paired with Sandboxes for secure code execution, Browser Rendering for web automation, and R2 for persistent storage, these capabilities provided the building blocks needed to adapt Moltbot in a way that would have been difficult or impractical on the platform just a few years ago.
AI Gateway: Managing Models, Costs, and Reliability
Moltworker integrates with Cloudflare AI Gateway, which acts as a proxy between AI agents and model providers. AI Gateway provides:
Centralized cost tracking and analytics
All AI requests routed through AI Gateway are logged and metered in one place, giving developers visibility into usage patterns, request volume, and model-level costs across providers.
Support for Bring Your Own Key (BYOK)
Developers can securely store and manage their own AI provider keys within Cloudflare, avoiding the need to pass credentials in plaintext with each request while maintaining control over which models are used.
Unified Billing without provider credential management
As an alternative to BYOK, developers can use Cloudflare's Unified Billing to access supported AI models directly, paying through Cloudflare credits instead of managing individual provider accounts or API keys.
While Moltbot supports multiple AI providers, Cloudflare highlights the advantage of AI Gateway’s flexibility: developers can switch models or providers without redeploying the agent, and configure fallbacks for reliability.
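In practice, switching providers through AI Gateway is largely a URL change: requests go to Cloudflare's documented gateway endpoint pattern instead of directly to the provider. A hedged sketch, where the account name, gateway name, model, and API shape are placeholders:

```typescript
// AI Gateway routes provider calls through a single Cloudflare endpoint.
// The URL pattern below follows Cloudflare's documented format; the
// account and gateway names are placeholders.
export function gatewayUrl(accountId: string, gateway: string, provider: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gateway}/${provider}`;
}

// Illustrative (not executed here): an OpenAI-compatible chat call sent
// through the gateway. Swapping providers means changing the URL segment,
// not redeploying the agent.
export async function chat(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(`${gatewayUrl("my-account", "my-gateway", "openai")}/chat/completions`, {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data: any = await res.json();
  return data.choices[0].message.content;
}
```

With BYOK or Unified Billing configured on the gateway, the per-request credential handling can be reduced or removed entirely.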
Sandboxes: Running Untrusted Code Securely
A critical requirement for AI agents is the ability to execute untrusted code safely. Moltworker relies on Cloudflare Sandboxes, which provide isolated environments designed specifically for secure code execution.
Built on top of Cloudflare Containers, the Sandbox SDK abstracts away much of the underlying complexity. Instead of managing container lifecycles, networking, file systems, and background processes directly, developers interact with a simplified, developer-friendly API from within Workers applications.
For Moltworker, this replaces the need to run Docker locally while preserving strong isolation guarantees. The agent can execute commands, manage files, and run background processes securely, without exposing the host system — a capability that has become increasingly important as AI agents take on more autonomous, unsupervised tasks.
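The interaction pattern can be pictured with a simplified stand-in for the sandbox interface. The real Sandbox SDK (`@cloudflare/sandbox`) exposes a richer API; the interface below is a deliberately minimal approximation for illustration only:

```typescript
// Simplified stand-in for a sandbox handle. The actual Sandbox SDK API
// is richer; this structural type exists only to illustrate the pattern.
interface Sandbox {
  exec(command: string): Promise<{ exitCode: number; stdout: string }>;
}

// Run a command inside the isolated environment and surface its output,
// failing loudly if the command did not succeed.
export async function runInSandbox(sandbox: Sandbox, command: string): Promise<string> {
  const result = await sandbox.exec(command);
  if (result.exitCode !== 0) {
    throw new Error(`command failed in sandbox: ${command}`);
  }
  return result.stdout;
}
```

The point of the abstraction is that the agent's tooling never touches the host: every command runs against an isolated container the Worker addresses through an API.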
Persistent Storage with R2
When AI agents run on a local machine or traditional virtual private server (VPS), persistent storage is built in and available by default, allowing data to be retained without additional configuration.
Because containers are temporary by design, Moltworker uses Cloudflare R2 for persistent storage. By mounting an R2 bucket directly into the Sandbox environment, Moltbot can retain conversation history, memory files, and other stateful data across restarts.
This approach preserves one of the main benefits of local deployment — persistent memory — while running the agent in the cloud.
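Moltworker achieves this by mounting the bucket as a filesystem, but the same durability guarantee can be pictured with R2's object API directly. In the sketch below, the key layout and binding shape are illustrative assumptions; in a real Worker, `bucket` would be an R2 bucket binding such as `env.MEMORY`:

```typescript
// Minimal structural stand-in for an R2 bucket binding, so the
// persistence pattern can be shown without Cloudflare-specific imports.
interface Bucket {
  put(key: string, value: string): Promise<unknown>;
  get(key: string): Promise<{ text(): Promise<string> } | null>;
}

// Persist agent memory under a per-agent key (layout is hypothetical).
export async function saveMemory(bucket: Bucket, agentId: string, data: string): Promise<void> {
  await bucket.put(`agents/${agentId}/memory.json`, data);
}

// Restore memory after a restart; null means this agent has no state yet.
export async function loadMemory(bucket: Bucket, agentId: string): Promise<string | null> {
  const obj = await bucket.get(`agents/${agentId}/memory.json`);
  return obj ? await obj.text() : null;
}
```

Because the container can be torn down at any time, anything the agent must remember lives in R2, not on the container's disk.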
Browser Rendering for Web Automation
AI agents depend heavily on interacting with the web — and not the clean, structured kind. Many real-world tasks require navigating dynamic websites, handling forms, clicking through interfaces, capturing visual output, and extracting information from pages that were never designed for automation.
Moltbot relies on browser automation to perform these tasks, using Chromium to navigate websites, fill out forms, take screenshots, and interact with web-based tools as a human would. In a traditional setup, this would require running a full browser instance inside a container — adding complexity, resource overhead, and operational friction.
Moltworker takes a different approach. Instead of running Chromium directly inside the Sandbox container, it uses Cloudflare Browser Rendering, an API-driven service that provides programmatic access to headless browser instances running across Cloudflare’s edge network. Developers can control these browsers using familiar automation frameworks such as Puppeteer and Playwright, allowing Moltbot to perform web tasks without managing browser infrastructure itself. Cloudflare also supports MCP (Model Context Protocol), enabling tighter integration with AI-driven workflows.
To make this work, Cloudflare implemented two key adaptations. First, Moltworker creates a lightweight Chrome DevTools Protocol (CDP) proxy, which routes browser commands from the Sandbox container through the Moltbot Worker to Cloudflare’s Browser Rendering service using standard Puppeteer APIs. Second, a Browser Rendering “skill” is injected into the Moltbot runtime when the Sandbox starts.
From Moltbot’s perspective, this setup behaves as if a local browser is available. It connects to what appears to be a local CDP endpoint and executes browser actions normally — even though the browser itself is running elsewhere on Cloudflare’s global network. The result is simpler architecture, reduced resource overhead, and scalable browser automation that remains transparent to the AI agent.
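The CDP proxy amounts to endpoint indirection: the agent connects to what it believes is a local DevTools endpoint, and the proxy maps that connection onto a remote Browser Rendering session. A toy illustration of the rewriting step follows; the endpoint formats and hostnames here are invented for illustration and are not Moltworker's actual wire format:

```typescript
// Toy illustration of CDP-proxy indirection: requests aimed at a "local"
// Chrome DevTools endpoint are redirected to a hypothetical remote
// browser session. All endpoint shapes are invented for this sketch.
export function rewriteCdpEndpoint(localUrl: string, sessionId: string): string {
  const url = new URL(localUrl);
  // Anything targeting the local DevTools host gets mapped onto the
  // remote session; other endpoints pass through untouched.
  if (url.hostname === "localhost" || url.hostname === "127.0.0.1") {
    return `wss://browser.example/cdp/${sessionId}${url.pathname}`;
  }
  return localUrl;
}
```

The agent's Puppeteer client never needs to know the rewrite happened, which is exactly why the setup stays transparent.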
Zero Trust Access for Authentication and Observability
To secure Moltworker’s APIs and administration interface, Cloudflare uses Zero Trust Access, which replaces the need to build custom authentication and identity systems from scratch. Instead of embedding login logic into the application, developers define who can access specific endpoints using policy-based controls, such as identity providers, device posture, or user attributes.
Once Access is in place, Cloudflare handles authentication automatically and attaches a signed JSON Web Token (JWT) to every request that reaches the application. Moltworker can validate this token to confirm the request came through Cloudflare Access — protecting the system from unauthorized or spoofed traffic without adding complexity to the application itself.
Beyond security, Zero Trust Access also provides visibility into usage. Developers can see who is accessing Moltworker, when, and how often, making it easier to monitor behavior, audit access, and manage operational risk as AI agents move from experimentation into real-world use.
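Cloudflare Access delivers its token in the `Cf-Access-Jwt-Assertion` header. The sketch below decodes that token's claims to show their shape; note that decoding is not verification, and a real deployment must verify the token's signature against the Access public keys before trusting any claim:

```typescript
// Decode the payload of a Cloudflare Access JWT to inspect its claims.
// WARNING: this only decodes; production code must first verify the
// signature against Cloudflare Access's published public keys.
export interface AccessClaims {
  aud?: string[] | string; // audience tag(s) of the Access application
  email?: string;          // authenticated user's identity
  exp?: number;            // expiry as a Unix timestamp
}

export function decodeAccessJwt(token: string): AccessClaims {
  const parts = token.split(".");
  if (parts.length !== 3) throw new Error("not a JWT");
  // Buffer is available in Node and, via Node.js compatibility, in Workers.
  const payload = Buffer.from(parts[1], "base64url").toString("utf8");
  return JSON.parse(payload) as AccessClaims;
}
```

Validating the token on every request lets Moltworker reject any traffic that did not pass through Cloudflare Access, even if someone discovers the origin's address.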
Moltworker in Action: Real-World AI Agent Tasks
Cloudflare demonstrated Moltworker through a Slack-connected AI agent performing tasks such as:
Finding travel routes in Google Maps, navigating step-by-step directions, and capturing screenshots directly into Slack
Browsing the web and retaining memory across interactions, allowing the agent to recall prior requests and context
Recommending local food options, including searching for nearby Asian cuisine and presenting results visually
Generating videos from browser sessions, downloading assets and using tools like ffmpeg to compile captured frames into a video
These examples illustrate how AI agents can execute multi-step, real-world tasks that combine web navigation, automation, memory, and media generation — all without requiring local hardware or dedicated machines.
Open Source and Experimental by Design
Cloudflare has open-sourced Moltworker and made the full implementation available on GitHub, allowing developers to deploy and run their own instance of Moltbot on Cloudflare Workers rather than on local hardware. The project is available at:
https://github.com/cloudflare/moltworker
The repository includes a detailed README that walks users through the full setup process, including configuring Workers, Sandboxes, storage, browser automation, and AI model access. To deploy Moltworker, users need a Cloudflare account and a minimum $5 USD Workers paid plan, which is required to access Sandbox Containers. Other services used in the project — such as AI Gateway, R2, and Browser Rendering — are either free to start or offer generous free tiers suitable for experimentation and small-scale use.
Importantly, Cloudflare positions Moltworker as a proof of concept rather than a supported Cloudflare product. The goal is not to offer a turnkey commercial service, but to demonstrate how far the Cloudflare Developer Platform has evolved — particularly its ability to run AI agents, execute untrusted code securely, automate the web, and maintain persistent state while operating at global scale.
By open-sourcing the project, Cloudflare is inviting developers to explore, extend, and adapt the architecture for their own use cases. The company has indicated it will monitor the repository for a period of time to support community engagement and is also considering contributing Cloudflare-specific capabilities upstream to the broader Moltbot ecosystem.
Q&A: What Moltworker Signals About the Future of AI Agents
Q: What problem is Cloudflare trying to solve with Moltworker?
A: Cloudflare is addressing a growing friction point in AI agent adoption: the need for dedicated local hardware. Moltworker shows how developers can run self-hosted AI agents online, while maintaining control, security, and observability — without buying and maintaining physical machines.
Q: How is Moltworker different from running an AI agent on a local computer or VPS?
A: Instead of relying on a single device, Moltworker runs AI agents across Cloudflare’s global infrastructure. This allows agents to operate closer to users, benefit from built-in security and scaling, and integrate directly with services like AI Gateway, Browser Rendering, and R2 storage.
Q: Why does Cloudflare emphasize that Moltworker is a proof of concept?
A: Cloudflare is not positioning Moltworker as a finished product. Instead, it serves as a reference architecture that demonstrates how modern AI agents can be built using Cloudflare’s Developer Platform — leaving room for developers to adapt, extend, or integrate the approach into their own systems.
Q: Who is this project most relevant for?
A: Moltworker is most relevant for developers, platform teams, and organizations experimenting with AI agents who want more control than fully managed SaaS tools offer — without taking on the operational burden of running everything locally.
What This Means: Why Cloudflare’s AI Agent Experiment Matters
Moltworker isn’t just a technical demo — it reflects a broader shift in how AI agents are expected to operate. As agents become more autonomous and more deeply integrated into daily workflows, questions around control, security, cost, and deployment are becoming just as important as model performance.
Cloudflare’s experiment suggests that AI agents don’t need to live exclusively on personal machines or centralized SaaS platforms. By running agents at the edge, closer to users and data, platforms can offer a middle ground: flexibility without full operational burden, and autonomy without isolation.
For developers, this approach lowers the barrier to experimenting with AI agents at scale. For enterprises, it hints at a future where agents can be deployed with stronger governance, observability, and security guarantees. And for the broader AI ecosystem, it signals that infrastructure providers are increasingly shaping how AI agents run — not just what models they use.
Moltworker shows that the next evolution of AI agents may be defined less by where models live, and more by where execution, trust, and control intersect.
Sources:
Cloudflare Engineering Blog — “Introducing Moltworker: A Self-Hosted AI Agent, Minus the Minis”
https://blog.cloudflare.com/moltworker-self-hosted-ai-agent/
OpenClaw (formerly Moltbot) — Official GitHub Repository
https://github.com/openclaw/openclaw
Cloudflare Moltworker — Open-Source Implementation
https://github.com/cloudflare/moltworker
Cloudflare Moltworker — Project README and Deployment Guide
https://github.com/cloudflare/moltworker/blob/main/README.md
Cloudflare Engineering Blog — “A Year of Improving Node.js Compatibility in Workers”
https://blog.cloudflare.com/nodejs-workers-2025/
Cloudflare Browser Rendering — Developer Documentation
https://developers.cloudflare.com/browser-rendering/
Works on Workers — Node.js Compatibility Experiment Results
https://worksonworkers.southpolesteve.workers.dev/
Cloudflare Sandboxes (Sandbox SDK) — Official Documentation
https://sandbox.cloudflare.com/
Cloudflare R2 — Object Storage Documentation
https://developers.cloudflare.com/r2/
Cloudflare AI Gateway — Secrets Store and BYOK Changelog
https://developers.cloudflare.com/changelog/2025-08-25-secrets-store-ai-gateway/
Cloudflare AI Gateway — Unified Billing Overview
https://developers.cloudflare.com/ai-gateway/features/unified-billing/
Cloudflare AI Gateway — Supported Model Providers
https://developers.cloudflare.com/ai-gateway/usage/providers/
Cloudflare Workers — Platform Pricing and Plans
https://developers.cloudflare.com/workers/platform/pricing/
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.
