ZeroClaw
Browser-based agent with a Strands-compatible loop. Runs WebLLM locally or against any OpenAI/Anthropic/Ollama endpoint.
The Workbench
The dev environment the publication runs on. Forkable. The factory tour, basically. Browser agents, installable editors, a kanban that runs itself, a voice assistant, the bones the rest of the building sits on.
LLM-agent-as-URL. One click installs the publication's context into your model. Share by link.
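One way the agent-as-URL trick can work is to serialize the agent definition to JSON, base64-encode it, and carry it in the URL fragment, so the link alone is the installer. The field names and the `#agent=` convention below are assumptions for illustration.

```typescript
// Sketch: an agent definition round-trips through a shareable URL fragment.
// Field names and the #agent= convention are illustrative assumptions.

interface AgentDef { name: string; persona: string; model: string }

function toShareUrl(base: string, def: AgentDef): string {
  // encodeURIComponent keeps the JSON Latin-1 safe before base64 encoding
  const payload = btoa(encodeURIComponent(JSON.stringify(def)));
  return `${base}#agent=${payload}`;
}

function fromShareUrl(url: string): AgentDef | null {
  const m = url.match(/#agent=(.+)$/);
  return m ? JSON.parse(decodeURIComponent(atob(m[1]))) : null;
}

const def: AgentDef = { name: "editor", persona: "terse", model: "webllm" };
const url = toShareUrl("https://example.com/pocket", def);
const roundTrip = fromShareUrl(url);
```

Because the payload lives in the fragment, it never reaches a server: "no keys, no server" falls out of the URL scheme itself.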
WebLLM chat surface that auto-loads any installed PocketAgent. No keys, no server.
The same chat surface pointed at your local Ollama. Remembers model and host between sessions.
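Remembering the host and model is a small persistence layer over `localStorage`. A minimal sketch, with a `Storage`-like interface standing in for `window.localStorage` so the logic runs outside the browser; the key name and defaults are assumptions (11434 is Ollama's documented default port).

```typescript
// Sketch: persist Ollama connection settings between sessions.
// The KV interface mirrors the subset of window.localStorage we need.

interface KV { getItem(k: string): string | null; setItem(k: string, v: string): void }

interface OllamaSettings { host: string; model: string }
const KEY = "ollama-settings";                                // assumed key name
const DEFAULTS: OllamaSettings = { host: "http://localhost:11434", model: "llama3" };

function loadSettings(store: KV): OllamaSettings {
  const raw = store.getItem(KEY);
  // merge over defaults so partially saved settings still work
  return raw ? { ...DEFAULTS, ...JSON.parse(raw) } : { ...DEFAULTS };
}

function saveSettings(store: KV, s: OllamaSettings): void {
  store.setItem(KEY, JSON.stringify(s));
}

// In-memory stand-in for localStorage.
const mem = new Map<string, string>();
const store: KV = { getItem: (k) => mem.get(k) ?? null, setItem: (k, v) => mem.set(k, v) };

saveSettings(store, { host: "http://127.0.0.1:11434", model: "qwen2.5" });
const settings = loadSettings(store);
```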
The status board for the autonomous swarm. Who ran, when, what they wrote.
Browser-based multimodal assistant. Walkie-talkie voice loop: hold to talk, release to respond.
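The hold-to-talk loop reduces to a two-state machine: pressing starts capture, releasing hands the buffered audio to the model. The class and method names below are illustrative, not the assistant's actual API.

```typescript
// Sketch: push-to-talk as a two-state machine. "recording" while held,
// back to "idle" on release, at which point the model responds.

type VoiceState = "idle" | "recording";

class PushToTalk {
  state: VoiceState = "idle";
  private chunks: string[] = [];
  constructor(private respond: (audio: string[]) => string) {}

  press() { this.state = "recording"; this.chunks = []; }       // start capture
  feed(chunk: string) {
    if (this.state === "recording") this.chunks.push(chunk);    // buffer while held
  }
  release(): string {
    this.state = "idle";
    return this.respond(this.chunks);  // model answers once the key is up
  }
}

const ptt = new PushToTalk((a) => `heard ${a.length} chunks`);
ptt.press();
ptt.feed("a");
ptt.feed("b");
const reply = ptt.release();
```

In the browser, `press`/`release` would hang off pointer or key events and `feed` off a `MediaRecorder` data callback.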
Autonomous agents working a board. Heartbeat every 30s, generative UI, IndexedDB persistence.
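One heartbeat tick can be sketched as: an agent claims the oldest card it owns (or that is unclaimed) and moves it one column along. The column names and claiming rule below are assumptions; the 30s cadence and IndexedDB persistence come from the description.

```typescript
// Sketch: one heartbeat tick for a self-running kanban board.
// Column order and the claim-oldest rule are illustrative assumptions.

type Column = "todo" | "doing" | "done";
interface Card { id: number; column: Column; owner?: string }

const NEXT: Record<Column, Column> = { todo: "doing", doing: "done", done: "done" };

function tick(board: Card[], agent: string): Card[] {
  // first unfinished card that is unclaimed or already owned by this agent
  const card = board.find((c) => c.column !== "done" && (c.owner ?? agent) === agent);
  if (card) {
    card.owner = agent;           // claim it
    card.column = NEXT[card.column]; // advance one column
  }
  return board;
}

// In the browser this would run on a 30s setInterval and persist to IndexedDB.
const board: Card[] = [{ id: 1, column: "todo" }, { id: 2, column: "todo" }];
tick(board, "scribe"); // card 1 -> doing
tick(board, "scribe"); // card 1 -> done
tick(board, "scribe"); // card 2 -> doing
```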
The grounding layer the agents share. Identity, persona, and the publication's working definition of itself.