Solo
Free
forever · open source core
The full br CLI. All 84 tools. Run it on your Mac, your Pi, your servers. No account required.
✓ 84 CLI tools
✓ Local Ollama models
✓ Fleet management (br nodes)
✓ GEB oracle (br geb)
✓ CECE identity (br cece)
– Tokenless gateway
– Agent emails
– 30K agent runtime
Get on GitHub →
Most Popular
Pro
$49
/ month · billed annually
Everything in Solo plus the tokenless gateway, agent email addresses, and the full agent runtime. Your AI, properly armed.
✓ Everything in Solo
✓ Tokenless gateway (:8787)
✓ Agent email routing (@blackroad.io)
✓ PS-SHA∞ memory persistence
✓ Web dashboard + SSE fleet
✓ br oracle (LLM reflection)
✓ Priority support
– 30K agent runtime
Get Pro →
Enterprise
Custom
contact us · unlimited scale
30,000 agents. Railway GPU cluster. Custom agent identities. Dedicated infra. SLA. White-glove onboarding.
✓ Everything in Pro
✓ 30,000 agent runtime
✓ Railway A100/H100 GPU
✓ Custom agent identities
✓ Dedicated Cloudflare Workers
✓ SLA + uptime guarantee
✓ Direct line to Alexa
✓ On-prem deployment option
Talk to us →
FAQ
Can I run this completely offline?
Yes. The br CLI and local Ollama models work without internet. Only cloud provider features (Claude, GPT) require connectivity, and those route through your gateway.
What hardware do I need?
Anything from a Raspberry Pi to a datacenter. Most of BlackRoad OS was built on a Mac and a fleet of Pis. The Pro plan runs comfortably on a Mac Mini + 2–3 Pis.
Do agents need their own API keys?
Never. That's the point of the tokenless gateway. Agents call the gateway; the gateway owns the secrets. Verified by verify-tokenless-agents.sh on every commit.
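The pattern is easiest to see from the agent's side. A minimal sketch, assuming the gateway listens on :8787 as above; the endpoint path and request shape here are hypothetical, modeled on an OpenAI-style chat API, not the documented gateway API:

```shell
# Hypothetical agent-side request. The key property: the agent's
# environment holds no provider secrets -- only the gateway URL.
# The gateway injects real credentials (Claude, GPT) server-side.
gateway="${BR_GATEWAY:-http://localhost:8787}"
payload='{"model":"claude","messages":[{"role":"user","content":"ping"}]}'

# The actual call would look like this (requires a running gateway):
#   curl -s "$gateway/v1/chat/completions" \
#     -H 'Content-Type: application/json' -d "$payload"
echo "POST $gateway/v1/chat/completions"
```

Note there is no Authorization header and no key in the payload; rotating or revoking a provider key touches only the gateway, never the agents.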
Can I use my own LLM models?
Absolutely. BlackRoad wraps Ollama — point it at any model. We have first-class support for Qwen, DeepSeek, Llama, Mistral, and custom Modelfile-defined personalities like CECE and Lucidia.
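A custom personality is just an Ollama Modelfile. This is an illustrative sketch, not the actual CECE definition; the FROM/SYSTEM/PARAMETER directives are standard Ollama syntax, but the base model and prompt are placeholders:

```
# Hypothetical Modelfile for a custom personality (illustrative only)
FROM qwen2.5:7b
PARAMETER temperature 0.7
SYSTEM """
You are CECE, a calm, precise operator for a small fleet of machines.
Answer tersely. Prefer shell commands over prose.
"""
```

Build and run it with the standard Ollama commands: `ollama create cece -f Modelfile`, then `ollama run cece`.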
What's the catch?
There isn't one. We're a small team building infrastructure we use ourselves. The pricing reflects what it costs to keep the lights on and keep building.