PocketAgent × Ollama

Chat with a model running locally on your machine via Ollama. Your installed PocketAgent is automatically prepended to the system prompt. Nothing leaves your device.

Ollama server

Not connected
First time? Install Ollama and enable CORS for this origin so the browser can talk to it:
OLLAMA_ORIGINS="https://johnjboren.github.io,http://localhost:*" ollama serve
Then pull at least one model: ollama pull llama3.2.

Active agent

(no agent installed)

Install an agent from the gallery or paste a #pa=… URL anywhere on this site. The agent is prepended to every request below.
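How the agent text gets prepended can be sketched as follows. This is a minimal illustration, not PocketAgent's actual implementation: `buildChatRequest` is a hypothetical helper, and the request shape follows Ollama's `/api/chat` API (a `model` name plus a `messages` array of `{role, content}` objects).

```javascript
// Hypothetical sketch: prepend the installed agent as a system message,
// then send the full conversation to the local Ollama server.
function buildChatRequest(agentPrompt, history, userMessage, model = "llama3.2") {
  const messages = [];
  // The agent text goes first, as the system prompt.
  if (agentPrompt) messages.push({ role: "system", content: agentPrompt });
  // Then the prior turns and the new user message.
  messages.push(...history, { role: "user", content: userMessage });
  return { model, messages, stream: true };
}

// The browser would then POST this body to the local server, e.g.:
// fetch("http://localhost:11434/api/chat", {
//   method: "POST",
//   body: JSON.stringify(buildChatRequest(agent, history, text)),
// });
```

Because the request goes to `localhost`, the conversation never leaves the machine; the CORS setup above is only needed so the browser is allowed to make that local request from this origin.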

Conversation
