Step by Step: Install and Use OpenClaw on macOS

Posted February 5, 2026 by XAI Tech Team ‐ 7 min read


This is a from-scratch guide for running OpenClaw locally on macOS. It covers:

  • Install the OpenClaw CLI
  • Copy the config file (Path A, pick 1 of 2)
  • Enable Memory Search (optional, recommended)
  • Add official OpenAI Codex OAuth (optional advanced path)
  • Start the Gateway (foreground / background)
  • Access the Control UI
  • Add Telegram (optional)
  • Enable macOS system capabilities (optional)

0) Prerequisites

  • Node.js 22+ (CLI + Gateway only)
  • macOS Terminal
  • If you want macOS system actions (system.run / screen / camera / notifications), install OpenClaw.app (menu-bar app)

Node.js official download: https://nodejs.org/ (LTS recommended)
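To confirm the installed version meets the 22+ requirement before continuing, a quick shell check (a sketch, assuming node is already on your PATH) is:

```shell
# Print the installed Node.js version and compare its major number to 22.
if command -v node >/dev/null 2>&1; then
  v=$(node --version)                          # e.g. "v22.11.0"
  major=$(printf '%s' "$v" | sed 's/^v\([0-9]*\).*/\1/')
  if [ "$major" -ge 22 ]; then
    msg="Node.js $v meets the 22+ requirement"
  else
    msg="Node.js $v is too old; install 22+ from nodejs.org"
  fi
else
  msg="Node.js not found; install it from nodejs.org"
fi
echo "$msg"
```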


1) Install the OpenClaw CLI

sudo npm install -g openclaw@latest

If your npm global prefix is user-writable, you can drop sudo:

npm install -g openclaw@latest

Verify:

openclaw --version

Tip: Do not use sudo to run OpenClaw. It will write configs under /var/root and break macOS permissions.
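One way to make the npm global prefix user-writable so that sudo is never needed (a sketch assuming the default zsh shell; the ~/.npm-global directory name is arbitrary):

```shell
# Point npm's global prefix at a directory you own, so -g installs need no sudo.
mkdir -p "$HOME/.npm-global"
if command -v npm >/dev/null 2>&1; then
  npm config set prefix "$HOME/.npm-global"
fi
# Make globally installed binaries visible in new shells (zsh is the macOS default).
echo 'export PATH="$HOME/.npm-global/bin:$PATH"' >> "$HOME/.zshrc"
```

Open a new terminal window (or `source ~/.zshrc`) before re-running the install command without sudo.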


2) Prepare the config file (Path A: pick 1 of 2)

For most users, Path A is enough. You do not need to understand environment variables, provider switching, or OAuth routing on day one. The simplest path is to pick 1 of the 2 full configs below, paste it, and get OpenClaw running first.

Create the config directory and open the default config file:

mkdir -p ~/.openclaw
nano ~/.openclaw/openclaw.json

If you use nano, save it like this:

  • paste the full content
  • press Ctrl + O to save
  • press Enter to confirm the filename
  • press Ctrl + X to exit
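If you would rather not edit interactively, a heredoc writes the file in one shot (a sketch; replace the placeholder body with the full config you picked):

```shell
# Non-interactive alternative to nano: write the config in one command.
mkdir -p "$HOME/.openclaw"
cat > "$HOME/.openclaw/openclaw.json" <<'EOF'
{ "replace-this": "with the full config you chose" }
EOF
```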

Keep these 5 points in mind:

  • The two configs below are pick 1, not both.
  • No matter which option you choose below, the final file on disk should still be ~/.openclaw/openclaw.json.
  • You must replace every "apiKey": "sk..." example with your own xairouter.com API key. The placeholder value will not work as-is.
  • If environment variables mean nothing to you, the easiest option is to replace sk... in the example with your own API key and leave everything else unchanged for now.
  • The example gateway.auth.token can also be used as-is for your first run. You will need that same token later for curl testing and the web UI. You can replace it with your own random string later if you want.
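When you do swap in your own token, any long random string works; one common way to generate a 64-character hex value with the openssl CLI that ships with macOS is:

```shell
# Generate a random 64-hex-character string suitable for gateway.auth.token.
token=$(openssl rand -hex 32)
echo "$token"
```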

Path A / Option 1: (openai-responses)

If you want the openai-responses route first, copy this full content into ~/.openclaw/openclaw.json:

{
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
	"baseUrl": "https://api.xairouter.com",
	// Replace with your own xairouter.com API key
	"apiKey": "sk...",
	"api": "openai-responses",
	"models": [
	  {
	    "id": "gpt-5.4",
	    "name": "GPT",
	  }
	]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
	"primary": "xairouter/gpt-5.4"
      },
      "models": {
	"xairouter/gpt-5.4": {
	  "alias": "GPT"
	}
      }
    }
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01"
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "plugins": {
    "entries": {
      "telegram": {
	"enabled": true
      }
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.1.30",
    "lastTouchedAt": "2026-02-02T08:31:26.394Z"
  }
}

Path A / Option 2: (openai-completions)

If you prefer the openai-completions route, copy this full content into ~/.openclaw/openclaw.json:

{
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
	"baseUrl": "https://api.xairouter.com",
	// Replace with your own xairouter.com API key
	"apiKey": "sk...",
	"api": "openai-completions",
	"models": [{ "id": "MiniMax-M2.5", "name": "MiniMax-M2.5" }]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
	"primary": "xairouter/MiniMax-M2.5"
      },
      "models": {
	"xairouter/MiniMax-M2.5": {
	  "alias": "MiniMax"
	}
      },
      "maxConcurrent": 4,
      "subagents": {
	"maxConcurrent": 8
      }
    }
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01"
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "plugins": {
    "entries": {
      "telegram": {
	"enabled": true
      }
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.1.30",
    "lastTouchedAt": "2026-02-02T08:31:26.394Z"
  }
}

Enable Memory Search (optional, recommended)

If your goal right now is simply to get OpenClaw running, you can skip this section for now. If you want OpenClaw to later search notes inside MEMORY.md and memory/*.md, come back and add the block below.

At a glance, choose like this:

  • text-embedding-3-small: turn it on first with steadier cost; best for most users
  • text-embedding-3-large: better recall with a larger budget; best for larger note sets and stronger semantic matching

Before you paste anything, remember these 4 points:

  • memorySearch.provider should be "openai" here because this uses an OpenAI-compatible embeddings endpoint.
  • memorySearch.remote.baseUrl should be https://api.xairouter.com/v1 here, not the https://api.xairouter.com value used above for the chat model provider.
  • To avoid confusion with the custom xairouter chat provider above, the simplest path is to put the key directly in memorySearch.remote.apiKey.
  • memorySearch.remote.apiKey must also be replaced with your own xairouter.com API key. sk... is only a placeholder.

Add this block inside your existing agents.defaults:

If you paste this at the end of agents.defaults, remember to add a comma after the previous field.

"memorySearch": {
  "enabled": true,
  "provider": "openai",
  "model": "text-embedding-3-small",
  "fallback": "none",
  "remote": {
    "baseUrl": "https://api.xairouter.com/v1",
    // Replace with your own xairouter.com API key
    "apiKey": "sk..."
  },
  "query": {
    "maxResults": 8,
    "minScore": 0.25
  }
}

If you want the quality-first version, change only:

"model": "text-embedding-3-large"

After the Gateway starts, run these 3 checks:

openclaw memory status --deep
openclaw memory index --force
openclaw memory search --query "deployment notes"

If you do not have any memory files yet, create a tiny starter example first:

mkdir -p ~/.openclaw/workspace/memory
printf '# Team Notes\n\n- Deploy with blue-green rollout.\n' > ~/.openclaw/workspace/memory/team-notes.md

Do not want to assemble it yourself? Here are 2 full copy-paste configs with Memory Search built in

  • The two full configs below already include Memory Search, using text-embedding-3-small by default.
  • If you prefer text-embedding-3-large, simply change "model": "text-embedding-3-small" to "model": "text-embedding-3-large".
  • These are still pick 1, not both.
  • Replace every apiKey in the full config you copy with your own xairouter.com API key.

Full config 1: openai-responses with Memory Search

{
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com",
        // Replace with your own xairouter.com API key
        "apiKey": "sk...",
        "api": "openai-responses",
        "models": [
          {
            "id": "gpt-5.4",
            "name": "GPT"
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "xairouter/gpt-5.4"
      },
      "models": {
        "xairouter/gpt-5.4": {
          "alias": "GPT"
        }
      },
      "memorySearch": {
        "enabled": true,
        "provider": "openai",
        "model": "text-embedding-3-small",
        "fallback": "none",
        "remote": {
          "baseUrl": "https://api.xairouter.com/v1",
          // Replace with your own xairouter.com API key
          "apiKey": "sk..."
        },
        "query": {
          "maxResults": 8,
          "minScore": 0.25
        }
      }
    }
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01"
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "plugins": {
    "entries": {
      "telegram": {
        "enabled": true
      }
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.1.30",
    "lastTouchedAt": "2026-02-02T08:31:26.394Z"
  }
}

Full config 2: openai-completions with Memory Search

{
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com",
        // Replace with your own xairouter.com API key
        "apiKey": "sk...",
        "api": "openai-completions",
        "models": [
          {
            "id": "MiniMax-M2.5",
            "name": "MiniMax-M2.5"
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "xairouter/MiniMax-M2.5"
      },
      "models": {
        "xairouter/MiniMax-M2.5": {
          "alias": "MiniMax"
        }
      },
      "maxConcurrent": 4,
      "subagents": {
        "maxConcurrent": 8
      },
      "memorySearch": {
        "enabled": true,
        "provider": "openai",
        "model": "text-embedding-3-small",
        "fallback": "none",
        "remote": {
          "baseUrl": "https://api.xairouter.com/v1",
          // Replace with your own xairouter.com API key
          "apiKey": "sk..."
        },
        "query": {
          "maxResults": 8,
          "minScore": 0.25
        }
      }
    }
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01"
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "plugins": {
    "entries": {
      "telegram": {
        "enabled": true
      }
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.1.30",
    "lastTouchedAt": "2026-02-02T08:31:26.394Z"
  }
}

Path B (optional advanced step): official OpenAI Codex OAuth

If your goal right now is simply to get OpenClaw working, you can skip this for now. Only run this when you want direct ChatGPT / official OpenAI Codex sign-in:

openclaw models auth login --provider openai-codex

If you intentionally want to switch your default model to the official Codex path, add --set-default:

openclaw models auth login --provider openai-codex --set-default

This flow will:

  • open a browser for ChatGPT sign-in
  • use http://127.0.0.1:1455/auth/callback as the local callback URL
  • ask you to paste the callback URL / code back into the terminal if auto-callback does not complete
  • store OAuth credentials in ~/.openclaw/agents/<agentId>/agent/auth-profiles.json
  • leave your Path A config file untouched

If OAuth fails with a TLS certificate validation error before the browser opens, repair the Homebrew Node / OpenSSL trust chain first:

brew postinstall ca-certificates
brew postinstall openssl@3

3) Start the Gateway (foreground)

openclaw gateway --bind loopback --port 18789 --verbose

After it starts, run the quick check that matches the config you chose.

If you chose openai-responses

curl -sS http://127.0.0.1:18789/v1/responses \
  -H "Authorization: Bearer db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","input":"ping"}'

If you chose openai-completions

curl -sS http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","messages":[{"role":"user","content":"ping"}]}'

If you changed gateway.auth.token yourself, replace the token in the command with the value from your current config file.

The model: "openclaw" value is the Gateway's unified model name. You do not need to replace it with xairouter/gpt-5.4 or xairouter/MiniMax-M2.5.
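To copy the token without opening the config file, a sed one-liner can pull it out. The demo below runs against a sample line written to /tmp, but the same pattern works on ~/.openclaw/openclaw.json as long as your token line looks like the examples above:

```shell
# Demo: extract the value of "token" from a config-style line with sed.
printf '      "token": "db86e9b5a527ddf76339d61153eb59915ab58d14ed1cfdfc75b2c35dc0f98c01"\n' \
  > /tmp/openclaw-token-demo.txt
token=$(sed -n 's/.*"token": *"\([^"]*\)".*/\1/p' /tmp/openclaw-token-demo.txt)
echo "$token"
```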

Control UI:

http://127.0.0.1:18789/

If the UI asks for a token, enter the gateway.auth.token value from your config file.


4) Install as a background service (launchd)

To auto-start at login:

openclaw gateway install

Common maintenance commands:

openclaw gateway status
openclaw gateway restart
openclaw gateway stop

Logs:

~/.openclaw/logs/gateway.log
~/.openclaw/logs/gateway.err.log

If you run OpenClaw.app in Local mode, it manages the Gateway for you; avoid installing the service manually to prevent conflicts.


5) Add Telegram (optional)

Fastest way (CLI writes config):

openclaw channels add --channel telegram --token <BOT_TOKEN>

Or edit config directly:

{
  "channels": {
    "telegram": {
      "enabled": true,
      "botToken": "BOT_TOKEN",
      "dmPolicy": "pairing",
      "groupPolicy": "allowlist"
    }
  }
}

First DM requires pairing approval:

openclaw pairing list telegram
openclaw pairing approve telegram <code>

6) Enable macOS system capabilities (optional)

To use system.run, screen recording, camera, notifications, etc:

  1. Install and launch OpenClaw.app (menu bar)
  2. Grant permissions (Notifications, Accessibility, Screen Recording, Microphone, Speech Recognition, Automation/AppleScript)
  3. Configure Exec approvals in the app

Exec approvals file:

~/.openclaw/exec-approvals.json

Recommended allowlist example:

{
  "version": 1,
  "defaults": { "security": "allowlist", "ask": "on-miss" },
  "agents": {
    "main": {
      "security": "allowlist",
      "ask": "on-miss",
      "allowlist": [
        { "pattern": "/opt/homebrew/bin/rg" },
        { "pattern": "/usr/bin/osascript" }
      ]
    }
  }
}
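Files like this are easy to break with a stray comma. One quick way to catch syntax errors before restarting anything, shown here against a sample file in /tmp rather than your real approvals file, is python3's built-in json.tool:

```shell
# Write a minimal sample approvals file, then check that it parses as JSON.
cat > /tmp/exec-approvals-sample.json <<'EOF'
{
  "version": 1,
  "defaults": { "security": "allowlist", "ask": "on-miss" }
}
EOF
python3 -m json.tool /tmp/exec-approvals-sample.json >/dev/null && echo "valid JSON"
```

Run the same `python3 -m json.tool` command against ~/.openclaw/exec-approvals.json after each edit.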

If you fully understand the risks and want no prompts, set security: "full" and ask: "off".

Fully open example (high risk):

{
  "version": 1,
  "defaults": { "security": "full", "ask": "off" },
  "agents": {
    "main": {
      "security": "full",
      "ask": "off"
    }
  }
}

7) Common checks

openclaw health
openclaw status
openclaw models status
openclaw channels status

Summary

  • Path A: the default choice for most users, pick 1 full config and save it as ~/.openclaw/openclaw.json
  • Option 1: openai-responses
  • Option 2: openai-completions
  • Memory Search: optional but recommended; when using xairouter OpenAI-compatible embeddings, set provider to openai and baseUrl to https://api.xairouter.com/v1, and this guide now includes 2 full copy-paste configs with Memory Search already built in
  • Path B: official openai-codex OAuth, optional and more advanced
  • OpenClaw.app: unlock system permissions and device features
  • Gateway install: make it auto-start and run in the background

Note: if you follow this guide with Option 1 (openai-responses / gpt-5.4), choose the matching Codex (Codex/Claude/OpenClaw) package on the recharge page. If you use Option 2 (openai-completions / MiniMax-M2.5), choose the matching MiniMax-M2.5 (OpenClaw/Claude) package. Purchase page: m.xairouter.com. Use the exact package name shown on the recharge page.