The Internet awoke this week to a flood of people buying Mac minis to run Moltbot (previously Clawdbot), an open-source, self-hosted AI agent designed to act as a personal assistant. Moltbot runs in the background on a user's own hardware, has a large and growing list of integrations for chat applications, AI models, and other popular tools, and can be controlled remotely. Moltbot can help you with your finances and social media, and organize your day, all through your favorite messaging app.
But what if you don't want to buy new dedicated hardware? And what if you could still run your Moltbot efficiently and securely online? Meet Moltworker, a middleware Worker and a set of adapted scripts that let you run Moltbot on Cloudflare's Sandbox SDK and our Developer Platform APIs.
A personal assistant on Cloudflare: how does that work?
Node.js compatibility on Cloudflare Workers is better than ever. Where in the past we had to mock APIs to get some packages working, those APIs are now supported natively by the Workers Runtime.
This has changed how we can build tools on Cloudflare Workers. When we first implemented Playwright, a popular framework for web testing and automation that runs on Browser Rendering, we had to rely on memfs. This was bad because not only is memfs a hack and an external dependency, it also forced us to drift away from the official Playwright codebase. Fortunately, with improved Node.js compatibility, we were able to start using node:fs natively, reducing complexity and improving maintainability, which makes upgrading to the latest versions of Playwright easy.
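As an illustrative sketch (not taken from the Playwright codebase), this is the kind of code that used to require a userland shim like memfs and can now run against the native node:fs module, assuming the nodejs_compat compatibility flag is enabled:

```typescript
// With nodejs_compat, Workers expose a virtual filesystem through the native
// node:fs module, so file round trips no longer need memfs. Path below is
// illustrative.
import { writeFileSync, readFileSync } from 'node:fs';

export function roundTrip(path: string, contents: string): string {
  writeFileSync(path, contents); // persist to the (virtual) filesystem
  return readFileSync(path, 'utf8'); // read it straight back
}
```

The same calls work unchanged under plain Node.js, which is exactly what lets a project track its upstream codebase instead of maintaining a fork.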
The list of Node.js APIs we support natively keeps growing. The blog post “A year of improving Node.js compatibility in Cloudflare Workers” provides an overview of where we are and what we're doing.
We measure this progress, too. We recently ran an experiment where we took the 1,000 most popular NPM packages, installed them, and let AI loose to try to run them in Cloudflare Workers, Ralph Wiggum as a “software engineer” style, and the results were surprisingly good. Excluding the packages that are build tools, CLI tools, or browser-only and therefore don't apply, only 15 packages genuinely didn't work. That's 1.5%.
Here's a graphic of our Node.js API support over time:
We put together a page with the results of our internal experiment on npm package support here, so you can check for yourself.
Moltbot doesn't strictly require much of Workers' Node.js compatibility, because most of its code runs in a container anyway, but we thought it was important to highlight how far we've come in supporting so many packages with native APIs. That's because when starting a new AI agent application from scratch, we can actually run a lot of the logic in Workers, closer to the user.
The other important part of the story is that the list of products and APIs on our Developer Platform has grown to the point where anyone can build and run any kind of application, even the most complex and demanding ones, on Cloudflare. And once launched, every application running on our Developer Platform immediately benefits from our secure and scalable global network.
These products and services gave us the ingredients we needed to get started. First, we have Sandboxes, where you can run untrusted code securely in isolated environments, providing a place to run the service. Next, we have Browser Rendering, where you can programmatically control and interact with headless browser instances. And finally, R2, where you can store objects persistently. With these building blocks available, we could begin adapting Moltbot.
How we adapted Moltbot to run on us
Moltbot on Workers, or Moltworker, is a combination of an entrypoint Worker, which acts as an API router and a proxy between our APIs and the isolated environment, both protected by Cloudflare Access. It also provides an administration UI and connects to the Sandbox container where the standard Moltbot Gateway runtime and its integrations run, using R2 for persistent storage.
High-level architecture diagram of Moltworker.
Let's dive in further.
Cloudflare AI Gateway acts as a proxy between your AI applications and any popular AI provider, and gives our customers centralized visibility and control over the requests going through it.
Recently, we announced support for Bring Your Own Key (BYOK), where instead of passing your provider secrets in plain text with every request, we centrally manage the secrets for you and can use them with your gateway configuration.
An even better option, where you don't have to manage AI providers' secrets at all, is Unified Billing. In this case, you top up your account with credits and use AI Gateway with any of the supported providers directly; Cloudflare gets charged, and we deduct credits from your account.
To make Moltbot use AI Gateway, we first create a new gateway instance, then enable the Anthropic provider for it, then either add our Claude key or purchase credits to use Unified Billing. After that, all we need to do is set the ANTHROPIC_BASE_URL environment variable so Moltbot uses the AI Gateway endpoint. That's it, no code changes necessary.
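To make the wiring concrete, here's a small sketch of how that endpoint is put together. The account ID and gateway name are placeholders; the URL shape follows AI Gateway's documented provider-endpoint format:

```typescript
// Build the AI Gateway endpoint for the Anthropic provider. Pointing
// ANTHROPIC_BASE_URL at this value is a configuration change, not a code
// change; accountId and gatewayName below are placeholders.
export function anthropicGatewayUrl(accountId: string, gatewayName: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayName}/anthropic`;
}

// e.g. ANTHROPIC_BASE_URL=https://gateway.ai.cloudflare.com/v1/<account-id>/<gateway-name>/anthropic
```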
Once Moltbot starts using AI Gateway, you have full visibility into costs and access to logs and analytics that help you understand how your AI agent is using the AI providers.
Note that Anthropic is just one option; Moltbot supports other AI providers, and so does AI Gateway. The advantage of using AI Gateway is that if a better model comes along from any provider, you don't have to swap keys in your AI agent configuration and redeploy; you simply change the model in your gateway configuration. What's more, you can specify model or provider fallbacks to handle request failures and ensure reliability.
Last year, we anticipated the growing need for AI agents to run untrusted code securely in isolated environments, and we announced the Sandbox SDK. This SDK is built on top of Cloudflare Containers, but it provides a simple API for executing commands, managing files, running background processes, and exposing services, all from your Workers applications.
In short, instead of having to deal with the lower-level Container APIs, the Sandbox SDK gives you developer-friendly APIs for secure code execution and handles the complexity of container lifecycle, networking, file systems, and process management, letting you focus on building your application logic with just a few lines of TypeScript. Here's an example:
```typescript
import { getSandbox } from '@cloudflare/sandbox';
export { Sandbox } from '@cloudflare/sandbox';

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const sandbox = getSandbox(env.Sandbox, 'user-123');

    // Create a project structure
    await sandbox.mkdir('/workspace/project/src', { recursive: true });

    // Check the Node version
    const version = await sandbox.exec('node -v');

    // Run some Python code
    const ctx = await sandbox.createCodeContext({ language: 'python' });
    await sandbox.runCode('import math; radius = 5', { context: ctx });
    const result = await sandbox.runCode('math.pi * radius ** 2', { context: ctx });

    return Response.json({ version, result });
  },
};
```

This fits Moltbot like a glove. Instead of running Docker on your local Mac mini, we run Docker on Containers, use the Sandbox SDK to issue commands into the isolated environment, and use callbacks to our entrypoint Worker, effectively establishing a two-way communication channel between the two systems.
R2 for persistent storage
The benefit of running things on your local computer or VPS is that you get persistent storage for free. Containers, however, are inherently ephemeral, meaning data generated inside them is lost upon deletion. Fear not, though: the Sandbox SDK provides sandbox.mountBucket(), which you can use to automatically and efficiently mount your R2 bucket as a filesystem partition when the container starts.
Once we have a local directory that's guaranteed to outlive the container lifecycle, Moltbot can use it to store session memory files, conversations, and other assets that need to persist.
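Here's a hedged sketch of what that setup could look like. The bucket name, mount path, and state directory are illustrative, and the exact mountBucket() signature is an assumption based on the SDK's documented shape:

```typescript
// Minimal structural type for the slice of the Sandbox SDK used here; the
// real object comes from @cloudflare/sandbox, and the exact mountBucket()
// signature is an assumption.
interface SandboxLike {
  mountBucket(bucketName: string, mountPath: string): Promise<void>;
}

// Join a mount point and a relative state directory into one absolute path,
// normalizing stray slashes.
export function persistentPath(mount: string, rel: string): string {
  return `${mount.replace(/\/+$/, '')}/${rel.replace(/^\/+/, '')}`;
}

// Mount the R2 bucket when the container starts, and return the directory
// the agent should use for session memory and other persistent state.
// Bucket name and mount path below are illustrative.
export async function setupPersistence(sandbox: SandboxLike): Promise<string> {
  await sandbox.mountBucket('moltbot-data', '/data');
  return persistentPath('/data/', 'moltbot/state');
}
```

Anything written under the mounted path then survives container restarts, which is exactly the property the session memory files need.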
Browser Rendering for browser automation
AI agents rely heavily on browsing the often not-so-structured web. Moltbot uses dedicated Chromium instances to perform actions, navigate the web, fill out forms, take snapshots, and handle tasks that require a web browser. Sure, we could run Chromium on Sandboxes too, but what if we could simplify things and use an API instead?
With Cloudflare's Browser Rendering, you can programmatically control and interact with headless browser instances running at scale on our edge network. We support Puppeteer, Stagehand, Playwright, and other popular packages so that developers can onboard with minimal code changes. We even support MCP for AI.
To get Browser Rendering working with Moltbot, we do two things:
First, we create a thin CDP proxy (CDP is the protocol that allows instrumenting Chromium-based browsers) from the Sandbox container to the Moltbot Worker, and back to Browser Rendering using the Puppeteer APIs.
Then, we inject a Browser Rendering skill into the runtime when the Sandbox starts.
From the Moltbot runtime's perspective, it has a local CDP port it can connect to and perform browser tasks.
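To illustrate one moving part of such a proxy (the function name and proxy host are hypothetical): CDP clients typically discover the browser's WebSocket endpoint by fetching /json/version from the CDP port and dialing the webSocketDebuggerUrl they find there, so a thin proxy has to rewrite that URL to point back at an address the container can actually reach:

```typescript
// Rewrite the webSocketDebuggerUrl in a CDP /json/version response so the
// client's WebSocket connection flows back through the proxy instead of to
// an unreachable internal socket. proxyHost is a placeholder for the
// proxy-facing host:port.
export function rewriteDebuggerUrl(
  version: { webSocketDebuggerUrl: string },
  proxyHost: string,
): { webSocketDebuggerUrl: string } {
  const url = new URL(version.webSocketDebuggerUrl);
  url.host = proxyHost; // keep the /devtools/... path, swap the endpoint
  return { ...version, webSocketDebuggerUrl: url.toString() };
}
```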
Zero Trust Access for authentication policies
Next up, we want to protect our APIs and Admin UI from unauthorized access. Doing authentication from scratch is hard, and it's the kind of wheel you don't want to reinvent or have to deal with. Zero Trust Access makes it incredibly easy to protect your application by defining specific policies and login methods for the endpoints.
Zero Trust Access login methods configuration for the Moltworker application.
Once the endpoints are protected, Cloudflare handles authentication for you and automatically includes a JWT token with every request to your origin endpoints. You can then validate that JWT for extra safety, to ensure the request came from Access and not from a malicious third party.
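As a minimal sketch of that validation, the claims check might look like the code below. This only decodes the token's payload; a real deployment must also verify the signature against your Access team's public signing keys, typically with a JWT library:

```typescript
// Decode the payload of the Access JWT (sent in the Cf-Access-Jwt-Assertion
// header) and check the audience (your Access application's AUD tag) and
// expiry. Signature verification is deliberately omitted here and must be
// added in production.
export function checkAccessClaims(jwt: string, expectedAud: string, nowSeconds: number): boolean {
  const parts = jwt.split('.');
  if (parts.length !== 3) return false;
  const payload = JSON.parse(Buffer.from(parts[1], 'base64url').toString('utf8'));
  const audiences: string[] = Array.isArray(payload.aud) ? payload.aud : [payload.aud];
  return audiences.includes(expectedAud) && typeof payload.exp === 'number' && payload.exp > nowSeconds;
}
```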
As with AI Gateway, once all your APIs are behind Access, you get great observability into who your users are and what they're doing with your Moltbot instance.
Demo time. We put up a Slack instance where we could play with our own instance of Moltbot on Workers. Here are some of the fun things we've done with it.
We hate bad news.
Here's a chat session where we ask Moltbot to find the shortest route between Cloudflare in London and Cloudflare in Lisbon using Google Maps and post a screenshot in a Slack channel. It goes through a sequence of steps using Browser Rendering to navigate Google Maps and does a pretty good job at it. Also look at Moltbot's memory in action when we ask it a second time.
We're in the mood for some Asian food today, so let's put Moltbot to work to help.
We eat with our eyes too.
Let's get more creative and ask Moltbot to create a video where it browses our developer documentation. As you can see, it downloads and runs ffmpeg to generate the video from the frames it captured in the browser.
We open-sourced our implementation and made it available at https://github.com/cloudflare/moltworker, so you can deploy and run your own Moltbot on top of Workers today.
The README guides you through the steps needed to set everything up. You will need a Cloudflare account and at minimum a $5 USD Workers paid plan subscription to use Sandbox Containers, but all the other products are either free to use, like AI Gateway, or have generous free tiers you can use to get started and keep running for as long as you want within reasonable limits.
Note that Moltworker is a proof of concept, not a Cloudflare product. Our goal is to showcase some of the most exciting features of our Developer Platform that can be used to run AI agents and unsupervised code efficiently and securely, with great observability, while taking advantage of our global network.
Feel free to contribute to or fork our GitHub repository; we'll keep an eye on it for a while for support. We're also considering contributing upstream to the official project with Cloudflare skills in parallel.
We hope you enjoyed this experiment, and that we managed to convince you that Cloudflare is the right place to run your AI applications and agents. We've been working relentlessly to anticipate the future and launch features like the Agents SDK, which you can use to build your first agent in minutes; Sandboxes, where you can run arbitrary code in an isolated environment without worrying about the lifecycle of a container; and AI Search, Cloudflare's managed vector-based search service, to name a few.
Cloudflare now offers a complete toolkit for AI development: inference, storage APIs, databases, durable execution for stateful workflows, and built-in AI capabilities. Together, these building blocks make it possible to build and run even the most demanding AI applications on our global edge network.
If you're excited about AI and want to help us build the next generation of products and APIs, we're hiring.



