COMBINE GUIDE
Main takeaway: connect three local tools in this order: Ollama, OpenClaw, then n8n.
Local means the services run on your own machine, not a cloud server.
WINDOWS
Start services in three terminals
ollama serve
openclaw gateway start
n8n start
Expected result: all three services stay active without errors.
macOS
Start services in three terminals
ollama serve
openclaw gateway start
n8n start
Expected result: same output pattern as Windows.
Linux
Start services in three terminals
ollama serve
openclaw gateway start
n8n start
Expected result: all services run and ports are reachable.
N8N TO OLLAMA
HTTP Request node config
In n8n, add an HTTP Request node that POSTs to http://localhost:11434/api/generate with a JSON body containing "model", "prompt", and "stream": false.
Expected result: n8n receives JSON text from Ollama.
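The request body the HTTP Request node sends can be sketched outside n8n to confirm its shape. A minimal Python sketch, assuming a model name of "llama3" (an assumption for illustration; use any model you have pulled with ollama pull):

```python
import json

def build_generate_payload(model, prompt):
    """Build the JSON body that n8n POSTs to Ollama's /api/generate."""
    # stream: False makes Ollama return one complete JSON response
    # instead of a stream of partial chunks, which is simpler for n8n to parse.
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_payload("llama3", "Say hello in one sentence.")
print(json.dumps(payload))
```

Setting "stream" to false matters here: with streaming on, Ollama returns many newline-delimited JSON fragments, which the default n8n JSON handling does not expect.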
VALIDATION
Check health quickly
openclaw gateway status
ollama list
n8n --version
Expected result: each command returns a normal response.
TROUBLESHOOTING
Common fixes
If n8n cannot reach Ollama, confirm Ollama is on port 11434.
If OpenClaw is down, run openclaw gateway restart.
If n8n editor is blocked, check for a port conflict on 5678.
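The two port checks above can be automated with a small TCP probe. A minimal Python sketch, assuming the default ports 11434 (Ollama) and 5678 (n8n); OpenClaw's port depends on your gateway configuration, so it is left out here:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising an exception.
        return s.connect_ex((host, port)) == 0

# Default ports: Ollama 11434, n8n 5678.
for name, port in [("Ollama", 11434), ("n8n", 5678)]:
    state = "reachable" if port_open("localhost", port) else "NOT reachable"
    print(f"{name} on port {port}: {state}")
```

If a port reports NOT reachable, either the service is down or it was started on a non-default port; restart it and re-run the probe.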
BACK
Guides hub
Review setup and use pages if needed.
Open guides hub →
NEXT
Full local stack
Add Agent Zero as the fourth layer.
Open full stack guide →