Install openclaw.ai on Docker

Beginner ⏱ 15 minutes 📅 Updated Feb 2026

1. Quick Start

Run openclaw.ai with a local LLM (Ollama) in one go. No API key required.

bash
docker run -it --rm \
  -v openclaw-data:/root/.openclaw \
  -e OPENCLAW_LLM_PROVIDER=ollama \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  openclaw/openclaw:latest
💡 Ollama Requirement

Ensure Ollama is running on your host machine; host.docker.internal lets the container reach it. On Linux, also pass --add-host=host.docker.internal:host-gateway to docker run, since that hostname is only built in on Docker Desktop.
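Before starting the container, you can confirm that Ollama is actually answering on the host. A minimal probe against the default port (assumes curl is installed):

```shell
# Probe the local Ollama API (assumes the default port, 11434)
status="$(curl -fsS http://localhost:11434/api/version 2>/dev/null || echo unreachable)"
if [ "$status" = "unreachable" ]; then
  echo "Ollama is not reachable on :11434 -- start it with 'ollama serve'"
else
  echo "Ollama is up: $status"
fi
```

If the probe reports unreachable, fix that before launching the openclaw container; the container will otherwise start but fail to connect to its LLM provider.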

Alternatively, run with an OpenAI or Anthropic API key (substitute ANTHROPIC_API_KEY for Anthropic):

bash
docker run -it --rm \
  -e OPENAI_API_KEY="your-api-key-here" \
  -v openclaw-data:/root/.openclaw \
  openclaw/openclaw:latest
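Rather than pasting the key value into the command, you can export it once and pass only the variable name: `-e OPENAI_API_KEY` with no value tells Docker to copy it from the host environment at run time. A sketch (the docker run line is commented so the snippet stands alone):

```shell
# Export the key once; Docker can then inherit it by name.
export OPENAI_API_KEY="your-api-key-here"

# -e OPENAI_API_KEY (no =value) copies the variable from the host environment,
# keeping the secret out of the command you type:
# docker run -it --rm \
#   -e OPENAI_API_KEY \
#   -v openclaw-data:/root/.openclaw \
#   openclaw/openclaw:latest
```

This keeps the key value out of the visible command line, though it still lands in your shell history via the export; prefer sourcing it from a protected file in real use.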
✓ Expected Output
🦀 openclaw.ai v1.0.0
✓ Configuration loaded
✓ LLM provider connected (ollama/openai)
✓ Ready!

openclaw> _
2. Docker Compose (Recommended)

For a more permanent setup, create a docker-compose.yml file:

yaml
version: '3.8'

services:
  openclaw:
    image: openclaw/openclaw:latest
    container_name: openclaw
    restart: unless-stopped
    stdin_open: true
    tty: true
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      # Or use other providers:
      # - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    volumes:
      - openclaw-data:/root/.openclaw
      - ./workspace:/workspace
    ports:
      - "8080:8080"  # Web UI if available

volumes:
  openclaw-data:

Create .env File

bash
echo "OPENAI_API_KEY=your-api-key-here" > .env

Start the Container

bash
docker-compose up -d
docker-compose exec openclaw openclaw run
3. With Local LLM (Ollama)

Run completely offline with local LLMs:

yaml
version: '3.8'

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    volumes:
      - ollama-data:/root/.ollama
    ports:
      - "11434:11434"

  openclaw:
    image: openclaw/openclaw:latest
    container_name: openclaw
    depends_on:
      - ollama
    environment:
      - OLLAMA_HOST=http://ollama:11434
      - OPENCLAW_LLM_PROVIDER=ollama
      - OPENCLAW_LLM_MODEL=llama2
    volumes:
      - openclaw-data:/root/.openclaw

volumes:
  ollama-data:
  openclaw-data:
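Note that depends_on on its own only waits for the ollama container to start, not for its API to be ready. If openclaw races ahead of Ollama on startup, one option is a healthcheck plus a readiness condition — a sketch, assuming the ollama binary in the image works as a probe:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    healthcheck:
      test: ["CMD", "ollama", "list"]   # succeeds once the API is answering
      interval: 5s
      timeout: 3s
      retries: 10

  openclaw:
    image: openclaw/openclaw:latest
    depends_on:
      ollama:
        condition: service_healthy      # wait for the healthcheck, not just start
```

Merge these keys into the compose file above; volumes and environment stay as shown.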

Pull a Model and Run

bash
docker-compose up -d
docker-compose exec ollama ollama pull llama2
docker-compose exec openclaw openclaw run
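The ollama pull step can fail if it runs before the Ollama container has finished starting. A small POSIX-sh retry helper covers that gap (the docker-compose usage line is hypothetical and shown commented):

```shell
# Re-run a command until it succeeds or the attempt budget runs out
retry() {
  tries=$1; shift
  n=0
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$tries" ] && return 1
    sleep 2
  done
}

# Hypothetical usage right after `docker-compose up -d`:
# retry 10 docker-compose exec ollama ollama pull llama2
```

The helper is generic: any command that exits non-zero while a service warms up can be wrapped the same way.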

📋 Useful Docker Commands

bash
# View logs
docker-compose logs -f openclaw

# Stop containers
docker-compose down

# Update to latest version
docker-compose pull
docker-compose up -d

# Enter container shell
docker-compose exec openclaw /bin/bash

# Remove all data (fresh start)
docker-compose down -v