NanoClaw stays small. The runtime scales around it.
NanoClaw is a lightweight personal AI agent: one Node.js process, under 4,000 lines of code, container-level isolation. Run it in a cloud runtime so the agent stays minimal while the environment, credentials, and scheduled tasks survive across runs.
NanoClaw in the Cloud
A local-first agent, lifted to a cloud runtime
NanoClaw was built to run on your own machine with container isolation. Spinup keeps that shape and moves it off the laptop, so the agent still runs when your machine is asleep.
Container isolation you do not operate
NanoClaw already isolates each group of agents inside a Linux container with its own filesystem and process space. Spinup keeps that boundary and moves it to the cloud, so one environment per agent is the default rather than an extra step.
Projected credentials for the Claude Agent SDK
NanoClaw runs on the Claude Agent SDK. Spinup projects your Anthropic credentials into the environment under `ANTHROPIC_API_KEY`, so the agent reaches Claude the same way it does locally.
Scheduled tasks survive across runs
NanoClaw supports scheduled task execution and per-group memory through CLAUDE.md files. Spinup's per-agent environment preserves both between runs, along with the SQLite message store and any files the agent generates.
One stable endpoint for messaging skills
NanoClaw's messaging skills (WhatsApp built in; Telegram, Slack, Discord, Signal, and email via extensions) deliver into the agent's environment. Spinup exposes one stable HTTPS endpoint per agent, so inbound messages do not have to relay through your laptop.
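As a minimal sketch, a receiver behind that endpoint could look like the following. The `/inbound` path and port 3000 are illustrative assumptions, not Spinup's actual routing:

```typescript
import { createServer } from "node:http";

// Hypothetical inbound-message receiver: accepts POSTs on /inbound
// and would hand the payload to the agent process.
const server = createServer((req, res) => {
  if (req.method === "POST" && req.url === "/inbound") {
    let body = "";
    req.on("data", (chunk: Buffer) => {
      body += chunk;
    });
    req.on("end", () => {
      // Hand `body` off to the agent process here.
      res.writeHead(200);
      res.end("ok");
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

server.listen(3000);
```

In this shape, the cloud runtime owns TLS and the stable hostname; the process only ever sees plain HTTP on a local port.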
The Spinup Angle
A personal agent, run without a personal machine
NanoClaw was designed as a local-first agent: small enough to read in one sitting, secured by container isolation on your own machine. That design works until you want the agent to run while your laptop is asleep, receive inbound messages from anywhere, or share one agent across a small team.
Spinup keeps the architecture NanoClaw already favors: isolated environments per group of agents, filesystem boundaries, and explicit capability grants. It moves that model off your machine and into a per-agent environment with projected Anthropic credentials, persistent state, and a public HTTPS endpoint.
The Node.js process still does the real work. The runtime keeps the environment around it stable, so scheduled tasks fire, CLAUDE.md memory persists, and skills stay installed across restarts.
FAQ
Common questions about running NanoClaw on Spinup
How does NanoClaw's container isolation map to Spinup's environment model?
NanoClaw already isolates each group of agents inside a Linux container (Docker, or Apple Container on macOS) with its own filesystem, IPC namespace, and process space. Spinup's environment model is the same abstraction at a higher level: one isolated environment per agent, with explicit mounts and controlled network access. The mental model carries over.
Where do the Anthropic credentials come from?
NanoClaw is built on the Claude Agent SDK. You submit your Anthropic API key once through Spinup's secrets model. Spinup projects it into the environment as `ANTHROPIC_API_KEY`, which is what the SDK expects. The key never lives in a config file on a specific machine.
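Since the SDK resolves the key from the environment, the agent side only needs to confirm it is present at startup. `resolveAnthropicKey` below is a hypothetical helper for illustration, not part of NanoClaw or the SDK:

```typescript
// Hypothetical startup check: fail fast if the projected
// credential is missing from the environment.
function resolveAnthropicKey(env: Record<string, string | undefined>): string {
  const key = env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error("ANTHROPIC_API_KEY is not set in this environment");
  }
  return key;
}
```

In the agent process this would be called with `process.env`; in a Spinup environment the variable is already projected, so the check passes without any local config file.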
Can NanoClaw's scheduled tasks run while my laptop is off?
Yes. Spinup environments are not tied to your local machine. Scheduled work runs against the cloud environment and the agent state persists between runs. When the agent wakes up, the files, SQLite message store, and CLAUDE.md memory are still there.
What happens if I move from NanoClaw to another harness?
The agent, environment, and secrets stay. Only the harness layer changes. Spinup is designed around swappable harnesses so your team can compare NanoClaw against Claude Code, OpenClaw, or another harness without rebuilding the setup.
Related
The runtime should stay bigger than any one harness
Early access
Bring NanoClaw to a cloud runtime without losing its simplicity.
Join the early-access waitlist if this is the runtime shape your team has been missing.
