# PromptShield

## OpenClaw

Run PromptShield as the proxy between OpenClaw and your LLM provider.

OpenClaw supports custom providers via `openclaw.json`. Point it at PromptShield and every request gets scanned, rate limited, and logged before it reaches the LLM.

OpenClaw → PromptShield (:8080) → Anthropic / OpenAI / Gemini
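Because the proxy is exposed to OpenClaw as an OpenAI-compatible provider (the `openai-completions` API in the config below), the traffic flowing through it is ordinary chat-completions payloads. A minimal sketch of what one such request looks like; the `/v1/chat/completions` path follows the usual OpenAI convention, and the prompt is illustrative:

```python
import json

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Assemble an OpenAI-compatible chat-completions request for the proxy."""
    url = "http://localhost:8080/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Placeholder credential; PromptShield holds the real upstream key.
        "Authorization": "Bearer local",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("claude-sonnet-4-5", "Hello")
```

Every payload shaped like this passes through PromptShield's scanning and rate limiting before being forwarded upstream.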

## Configure the proxy

Set your provider and key in `.env`:

```
PROMPTSHIELD_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-api03-xxxx
```

For multi-provider routing, see Providers.
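The `.env` file above is plain `KEY=VALUE` lines. If you want to sanity-check what the proxy will pick up, a small sketch of how a typical dotenv loader reads it (this is an illustration, not PromptShield's actual loader):

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
PROMPTSHIELD_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-api03-xxxx
"""
config = parse_env(sample)
```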

Then start the proxy:

```sh
make run
```

## Configure OpenClaw

Edit `~/.openclaw/openclaw.json`:

```json
{
  "models": {
    "providers": {
      "promptshield": {
        "baseUrl": "http://localhost:8080/v1",
        "apiKey": "local",
        "api": "openai-completions"
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "promptshield/claude-sonnet-4-5"
      }
    }
  }
}
```

The `apiKey` value is a placeholder; the proxy holds the real upstream key. You can copy this config from `examples/openclaw.json`.
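One way to picture what happens to that placeholder: before forwarding a request upstream, the proxy discards the client's credential and substitutes the real provider key from its own environment. A hypothetical sketch of that substitution (names are illustrative, not PromptShield's actual code):

```python
def rewrite_auth(headers: dict[str, str], upstream_key: str) -> dict[str, str]:
    """Return forwarded headers with the placeholder replaced by the real key."""
    forwarded = dict(headers)  # copy so the client's headers are untouched
    forwarded["Authorization"] = f"Bearer {upstream_key}"
    return forwarded

client_headers = {"Authorization": "Bearer local", "Content-Type": "application/json"}
upstream = rewrite_auth(client_headers, "sk-ant-api03-xxxx")
```

This is why OpenClaw never needs to see the real key: `"local"` only has to satisfy OpenClaw's config schema, not authenticate with the provider.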
