You installed OpenClaw on your Umbrel server—promising “the AI that actually does things”—but instead of managing your calendar or clearing your inbox, it’s refusing to connect.
Whether you are staring at a 401 Authentication Error when connecting Anthropic/Claude, or your local LLMs (via Ollama) are failing to handshake with the OpenClaw container, you are not alone. Because OpenClaw runs in a sandboxed Docker environment on Umbrel, networking and authentication require specific configurations that aren’t always obvious in the default setup wizard.
This guide provides technical solutions to the most common configuration and runtime errors for OpenClaw on Umbrel.
What “Connection Refused” and “401 Errors” Actually Mean
When troubleshooting OpenClaw, you will typically encounter two distinct categories of failure:
HTTP 401: Authentication Error (Invalid Bearer Token): This usually happens with Anthropic/Claude configurations. It means the token you provided is either expired, malformed, or—most commonly—the wrong type of token (e.g., trying to use a standard API key where an OAuth setup token is required, or vice versa).
ECONNREFUSED / Network Error: This occurs when OpenClaw tries to offload inference to a local model (like Ollama running on the same Umbrel). Since OpenClaw runs inside its own isolated Docker container, localhost refers to the container itself, not your Umbrel device. It cannot “see” your other apps without the correct bridge IP.
Main Causes of Failure
Token Format Mismatch: Users often confuse the sk-ant- API key with the temporary OAuth tokens generated via the Claude CLI.
Docker Network Isolation: OpenClaw cannot access local services (Ollama, Home Assistant) via 127.0.0.1 or localhost.
Gateway Conflicts: The OpenClaw Gateway service failing to bind to the correct port due to conflicts with other Umbrel apps.
Insufficient Resources: Running the OpenClaw agent plus a local LLM/inference engine often exceeds the RAM available on Raspberry Pi 4/5 setups, causing the container to silently OOM (Out of Memory) kill.
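Because the token-type mismatch is the single most common cause of 401s, it helps to sanity-check a credential by its prefix before pasting it anywhere. The helper below is written for this guide (it is not an OpenClaw or Anthropic tool), and the prefix conventions are assumptions based on observed Anthropic credentials: dashboard API keys start with `sk-ant-api`, while OAuth setup tokens from the Claude CLI start with `sk-ant-oat`.

```shell
# classify_token is a hypothetical helper, not part of OpenClaw or the Claude CLI.
# Prefixes are assumptions from observed Anthropic credentials:
#   API keys:           sk-ant-api...
#   OAuth setup tokens: sk-ant-oat...
classify_token() {
  case "$1" in
    sk-ant-api*) echo "api-key" ;;
    sk-ant-oat*) echo "oauth-token" ;;
    *)           echo "unknown" ;;
  esac
}

classify_token "sk-ant-api03-example"   # prints "api-key"
```

If OpenClaw's Anthropic integration asks for a setup token and your string classifies as `api-key`, that mismatch is almost certainly your 401.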
Step-by-Step Fixes
Fix 1: Resolving Claude/Anthropic 401 Authentication Errors
If you are seeing HTTP 401: authentication_error while setting up Claude, you likely used a standard API key in a field expecting an OAuth token, or the token generation failed.
Verify Token Type: OpenClaw’s native integration often requires a setup-token generated via the Claude CLI, not just a raw API key from the dashboard.
Generate a Fresh Token: SSH into your Umbrel:
ssh umbrel@umbrel.local
If you haven’t installed the Claude CLI, you may need to do so, or perform this step on your local machine if OpenClaw supports remote auth.
Inject the Token Correctly: Instead of relying on the web UI, use the OpenClaw CLI from within the container to force the auth update:
# Enter the OpenClaw container
docker exec -it openclaw /bin/bash
# Manually add the auth token
openclaw models auth add --provider anthropic --token "YOUR_NEW_TOKEN_HERE"
Restart the Service: Exit the container and restart the app via Umbrel’s dashboard or CLI:
docker restart openclaw
Fix 2: Connecting Local Ollama Models (Network Fix)
If you are trying to run OpenClaw using local compute (Ollama) to save on API costs, and it fails to connect, follow this network fix.
Identify the Docker Bridge IP: Inside the OpenClaw container, localhost is isolated. You must point it to the Umbrel host’s Docker bridge IP, which is usually 172.17.0.1.
Update the Model Config: In your OpenClaw configuration (either via the TUI or config.json), change the Ollama endpoint.
- Incorrect: http://localhost:11434
- Correct: http://172.17.0.1:11434
Verify Ollama Bind Address: Ensure your Ollama instance is listening on all interfaces, not just localhost. Check the Ollama service file or environment variables for OLLAMA_HOST; it should be set to 0.0.0.0.
- Note: On standard Umbrel installs, apps are containerized. You may need to ensure the Ollama container maps port 11434 to the host.
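Put together, the relevant stanza of config.json might look like the sketch below. The exact key names depend on your OpenClaw version, so treat every key here as an assumption; the point is the base URL targeting the Docker bridge IP rather than localhost.

```json
{
  "models": {
    "ollama": {
      "baseUrl": "http://172.17.0.1:11434",
      "model": "llama3:8b"
    }
  }
}
```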
Fix 3: Clearing Corrupt “State” Files
If OpenClaw loops on startup or gets stuck “Thinking…”, the agent’s state file might be corrupt.
Stop the App: Use the Umbrel dashboard to stop OpenClaw.
SSH and Navigate to App Data:
cd ~/umbrel/app-data/openclaw/data
(Note: The path may vary slightly by umbrelOS version; check ls ~/umbrel/app-data.)
Remove State Files: Look for .state or session.db files and rename or delete them to force a fresh handshake.
mv session.db session.db.bak
Restart.
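The rename step above can be wrapped in a small script that backs up every candidate state file with a timestamp instead of deleting anything. `backup_state_files` is a helper written for this guide, not an OpenClaw command, and the filenames (session.db, *.state) are taken from the step above rather than any official list.

```shell
# Hedged sketch: back up (rather than delete) OpenClaw's state files so the
# agent regenerates them on the next start. backup_state_files is a helper
# written for this guide, not an OpenClaw command.
backup_state_files() {
  dir="$1"
  ts=$(date +%Y%m%d-%H%M%S)
  for f in "$dir"/session.db "$dir"/*.state; do
    # Unmatched globs expand literally; the -e check skips them.
    [ -e "$f" ] && mv -- "$f" "$f.bak.$ts"
  done
  return 0
}

backup_state_files ~/umbrel/app-data/openclaw/data
```

Keeping the .bak copies means you can roll back by renaming them if a fresh state does not fix the loop.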
Hardware Limitations: Raspberry Pi vs. Mini PC
OpenClaw is resource-intensive. If you are experiencing random crashes (the agent simply stops responding), diagnose the hardware.
- Raspberry Pi 4 (4GB/8GB):
- Verdict: Unstable.
- Running the OpenClaw agent logic plus a local LLM is usually impossible; the Pi will run out of RAM and start swap-thrashing.
- Solution: Use external API providers (Anthropic/OpenAI) exclusively. Do not try to run local embeddings or local LLMs on the Pi while running the agent.
- Umbrel Home / Mini PCs (16GB+ RAM):
- Verdict: Stable.
- Capable of running local Quantized models (Llama 3 8B) alongside the agent.
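A rough back-of-envelope check makes the RAM verdicts above concrete: a quantized model's weight footprint is roughly parameters × bits-per-weight ÷ 8, before KV cache and runtime overhead. The helper below is written for this guide, not a real sizing tool; treat its result as a floor, not an accurate requirement.

```shell
# Rough estimate: weight bytes = params * bits / 8.
# Ignores KV cache and runtime overhead, so treat the result as a floor (GB).
model_gb() {
  params_b="$1"   # parameters, in billions
  bits="$2"       # quantization bits per weight
  awk -v p="$params_b" -v b="$bits" 'BEGIN { printf "%.1f\n", p * b / 8 }'
}

model_gb 8 4    # Llama 3 8B at 4-bit -> prints "4.0"
```

Roughly 4 GB of weights alone explains why an 8 GB Pi that is also running the agent and the OS has no real headroom, while a 16 GB mini PC does.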
Check for OOM Kills
To confirm if your hardware is the bottleneck, run this via SSH:
dmesg | grep -i "killed process"
If you see openclaw or python being killed, you need to upgrade hardware or switch to cloud-based inference.
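Besides checking dmesg after the fact, you can watch for trouble before the OOM killer fires. `check_headroom` is a helper written for this guide (not an Umbrel tool); it reads MemAvailable from /proc/meminfo, which exists on Linux hosts like umbrelOS, and the 2048 MB threshold is an arbitrary example.

```shell
# check_headroom is a helper written for this guide, not an Umbrel tool.
# Reads MemAvailable from /proc/meminfo (Linux-only) and warns when available
# memory drops below a threshold in MB.
check_headroom() {
  threshold_mb="${1:-2048}"
  avail_mb=$(awk '/^MemAvailable:/ {print int($2/1024)}' /proc/meminfo)
  if [ "$avail_mb" -lt "$threshold_mb" ]; then
    echo "LOW: ${avail_mb}MB available (below ${threshold_mb}MB)"
  else
    echo "OK: ${avail_mb}MB available"
  fi
}

check_headroom 2048
```

If this reports LOW while OpenClaw and a local model are both running, the crashes are almost certainly memory pressure rather than a bug.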
When Reinstalling is Necessary
If you have messed with the internal config.json inside the container or broken the Python environment dependencies:
Uninstall via the Umbrel UI (Right-click -> Uninstall).
Prune Docker Volumes: Sometimes data persists. To truly wipe it:
ssh umbrel@umbrel.local
cd ~/umbrel/scripts
sudo ./app stop openclaw
sudo rm -rf ~/umbrel/app-data/openclaw
Reinstall from the App Store.
How to Prevent Future Issues
- Use the TUI for Setup: The Terminal User Interface (TUI) often provides more verbose error messages than the Web UI. You can access it by attaching to the container: docker attach openclaw.
- Pin Your Model Versions: AI models update frequently. In your config, specify exact model versions (e.g., claude-3-5-sonnet-20240620) rather than generic tags like latest to prevent sudden behavior changes.
- Monitor API Limits: OpenClaw can burn through tokens rapidly if it gets stuck in a loop. Set hard limits in your OpenAI/Anthropic dashboard.
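A pinned-model stanza in config.json might look like the sketch below. As before, the key names are assumptions (they depend on your OpenClaw version); the point is the dated model identifier in place of a floating tag.

```json
{
  "agent": {
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20240620"
  }
}
```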
FAQ
Q: Can I run OpenClaw on a Raspberry Pi 4?
A: Yes, but only if you use cloud APIs (OpenAI/Anthropic). Local models will likely crash the device.
Q: Why does OpenClaw say “Gateway Error”?
A: This usually means the internal communication port (often 8080 or 3000 inside the container) is blocked or the service crashed. Check logs with docker logs openclaw.
Q: Where is the config file located on Umbrel?
A: Typically found in ~/umbrel/app-data/openclaw/data/config/.
Q: OpenClaw is installed but I can’t open the UI.
A: Give it 2-3 minutes after status “Running”. The AI agent initialization takes longer than standard web apps.
Conclusion
OpenClaw is a powerful tool for automating your digital life, but it requires a stable network environment to function. By ensuring your auth tokens are the correct type and pointing your local model configs to the Docker Bridge IP (172.17.0.1), you can resolve 90% of startup failures.