So you’ve signed up, you’ve got your API key, and you’re staring at a terminal wondering what to do next. Let’s fix that. By the end of this guide, you’ll have a running OpenClaw agent that you can talk to, that remembers what you said, and that supports multiple conversation threads. All through curl.
This guide assumes you already have an organization and API key. If not, hit the Quickstart first — it takes 2 minutes.

Before we start

Set your API key as an environment variable so you don’t have to paste it into every command:
export CHOWDER_KEY="chd_org_your_key_here"
Every request in this guide uses $CHOWDER_KEY for auth.
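If you’d rather follow along in Python, here’s a minimal sketch of the same setup, assuming the requests library is installed. Later examples follow this pattern; nothing here is specific to Chowder beyond the base URL and the Authorization header.

import os
import requests

# Read the key you exported above; fail loudly if it's missing.
CHOWDER_KEY = os.environ["CHOWDER_KEY"]
BASE_URL = "https://api.chowder.dev/v1"

# A shared session that attaches auth to every request.
api = requests.Session()
api.headers.update({"Authorization": f"Bearer {CHOWDER_KEY}"})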

Create an instance

Spin it up
An instance is a fully isolated OpenClaw agent — its own sandbox, workspace, memory, and gateway. Let’s create one:
curl
curl -X POST https://api.chowder.dev/v1/instances \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-first-agent",
    "model_provider": "anthropic"
  }'
python
import os
import requests

# Read the key you exported above.
CHOWDER_KEY = os.environ["CHOWDER_KEY"]

resp = requests.post(
    "https://api.chowder.dev/v1/instances",
    headers={"Authorization": f"Bearer {CHOWDER_KEY}"},
    json={"name": "my-first-agent", "model_provider": "anthropic"},
)
print(resp.json())
{
  "id": "ead7b76b-f34e-4b91-93e7-9c979cf9e41c",
  "organization_id": "4737cccd-3d1d-4790-979b-6825d6a333de",
  "name": "my-first-agent",
  "status": "provisioning",
  "model_provider": "anthropic",
  "sandbox_provider": "sandbox",
  "sandbox_id": null,
  "openclaw_version": "v1",
  "openclaw_config": null,
  "gateway_url": null,
  "region": null,
  "error_message": null,
  "created_at": "2026-02-14T10:30:00Z",
  "updated_at": "2026-02-14T10:30:00Z"
}
Notice the "status": "provisioning". Behind the scenes, Chowder is creating a cloud sandbox, installing OpenClaw, running the onboarding wizard, and starting the gateway. This takes about 60–90 seconds.
You can also pass "model_provider": "openai" or "model_provider": "gemini" if that’s your jam. Just make sure you’ve configured the corresponding API key on your organization first via PATCH /v1/organization.
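If you go that route, here’s a rough sketch of that PATCH from Python. The body field name (openai_api_key) is a placeholder assumed for illustration; check the Organization API reference for the exact schema.

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]

# NOTE: "openai_api_key" is a hypothetical field name -- confirm the real
# body schema in the Organization API reference before relying on this.
resp = requests.patch(
    "https://api.chowder.dev/v1/organization",
    headers={"Authorization": f"Bearer {CHOWDER_KEY}"},
    json={"openai_api_key": os.environ["OPENAI_API_KEY"]},
)
print(resp.status_code, resp.json())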
Save that id — you’ll need it for everything else:
export INSTANCE_ID="ead7b76b-f34e-4b91-93e7-9c979cf9e41c"
Wait for it to be ready
Poll the status endpoint until you see "running":
curl https://api.chowder.dev/v1/instances/$INSTANCE_ID/status \
  -H "Authorization: Bearer $CHOWDER_KEY"
{
  "id": "ead7b76b-f34e-4b91-93e7-9c979cf9e41c",
  "status": "provisioning",
  "gateway_url": null
}
Wait a minute, try again:
{
  "id": "ead7b76b-f34e-4b91-93e7-9c979cf9e41c",
  "status": "running",
  "gateway_url": "https://ead7b76b-f34e.preview.sandbox.work"
}
Once you see "status": "running", you’re good to go. The gateway_url is the internal gateway; you don’t need to use it directly, since Chowder proxies everything for you.
If the status is "error", check the error_message field on the full instance object (GET /v1/instances/{id}). Common causes: missing model API key on your org, or a transient sandbox provisioning issue. Delete it and try again.
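If you’re scripting this, a small polling loop covers both the happy path and the error case. Here’s a sketch with the requests library, hitting the same status endpoint shown above:

import os
import time
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]
STATUS_URL = f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}/status"
HEADERS = {"Authorization": f"Bearer {CHOWDER_KEY}"}

# Check every 10 seconds; provisioning usually finishes within 60-90 seconds.
while True:
    status = requests.get(STATUS_URL, headers=HEADERS).json()
    if status["status"] == "running":
        print("ready:", status["gateway_url"])
        break
    if status["status"] == "error":
        raise RuntimeError("provisioning failed; check error_message on the instance")
    time.sleep(10)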
Send your first message
Now for the fun part. Send a message and specify which model the agent should use for this turn:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/responses \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "input": "Hey! Who are you and what can you do?"
  }'
{
  "id": "resp_238a6f70-1fe6-42fc-8488-6a3762ef7e68",
  "status": "completed",
  "model": "claude-sonnet-4-20250514",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "Hey! I just came online — fresh instance, blank slate. I can help you with all sorts of things: write code, search the web, manage files in my workspace, run shell commands, and more. What are we building?"
        }
      ]
    }
  ]
}
The response follows the OpenAI-compatible Responses API format. The output array contains message objects, each with a content array of typed blocks.
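In code you usually just want the text. Here’s a sketch (requests again) that sends a message and joins the output_text blocks into one string:

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]

resp = requests.post(
    f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}/responses",
    headers={"Authorization": f"Bearer {CHOWDER_KEY}"},
    json={"model": "claude-sonnet-4-20250514", "input": "Hey! Who are you?"},
).json()

# Walk the output array: message items contain typed content blocks.
text = "".join(
    block["text"]
    for item in resp["output"] if item["type"] == "message"
    for block in item["content"] if block["type"] == "output_text"
)
print(text)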
You can switch models per-request. Use claude-sonnet-4-20250514 for complex reasoning, gpt-4o for general tasks, or gpt-4o-mini when you want speed. The agent’s memory persists regardless of which model you pick.
Verify it remembers context
The agent maintains conversational memory. Send a follow-up and watch it reference your first message:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/responses \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "input": "What did I just ask you?"
  }'
{
  "id": "resp_c91f3a10-2b5e-4d8c-a2f7-84fe1e0d37a2",
  "status": "completed",
  "model": "claude-sonnet-4-20250514",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "You asked me who I am and what I can do. I told you I'm a freshly spun-up instance that can write code, search the web, manage files, and run commands. Still waiting on what we're building together!"
        }
      ]
    }
  ]
}
It remembers. That’s the OpenClaw memory system at work — conversations persist across requests, not just within a single API call.
Create a named session
By default, all messages go to the agent’s main conversation thread. But what if you want separate conversations? That’s what sessions are for.
Create a new session by sending a message to the session endpoint:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/session/project-alpha/responses \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "input": "This session is for project alpha. We are building a CLI tool in Rust. Remember that."
  }'
{
  "id": "resp_f2e81b34-9a1c-4e57-b6d0-3c7a1f8e5d92",
  "status": "completed",
  "model": "claude-sonnet-4-20250514",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "Got it — project alpha, Rust CLI tool. I'll keep that context for this session. What's the first feature we're tackling?"
        }
      ]
    }
  ]
}
The session ID (project-alpha) is whatever string you want. It’s created automatically the first time you use it.
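From Python, the only difference between the main thread and a named session is the URL. A small helper sketch:

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]
HEADERS = {"Authorization": f"Bearer {CHOWDER_KEY}"}

def send(text, model="claude-sonnet-4-20250514", session=None):
    """Send to the main thread, or to a named session if one is given."""
    base = f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}"
    url = f"{base}/session/{session}/responses" if session else f"{base}/responses"
    resp = requests.post(url, headers=HEADERS, json={"model": model, "input": text})
    resp.raise_for_status()
    return resp.json()

print(send("What's the first feature we're tackling?", session="project-alpha"))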
Switch between sessions
Each session has its own isolated memory. Your main conversation doesn’t know about project-alpha, and vice versa.
Talk to the main session (no session ID in the URL):
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/responses \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "input": "What project are we working on?"
  }'
{
  "id": "resp_a4d29e8f-7c3b-4a1e-9d5f-2b6c8e4f1a73",
  "status": "completed",
  "model": "gpt-4o",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "You haven't told me about a specific project yet in this conversation. You did ask who I am and what I can do earlier. Want to start something?"
        }
      ]
    }
  ]
}
Now ask the same thing in the project-alpha session:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/session/project-alpha/responses \
  -H "Authorization: Bearer $CHOWDER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "input": "What project are we working on?"
  }'
{
  "id": "resp_d8f41c26-3e7a-4b92-8f1d-5a9c2e7b3d04",
  "status": "completed",
  "model": "gpt-4o",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "We're working on project alpha — a CLI tool in Rust. You set that context at the start of this session. Ready to dive in?"
        }
      ]
    }
  ]
}
Completely isolated. This is great for multi-tenant apps where each user gets their own conversation thread.
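One simple pattern: derive the session ID from your own user identifier so every user lands in their own thread. A sketch (the user-{id} naming is just a convention assumed here, not something Chowder requires):

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]
HEADERS = {"Authorization": f"Bearer {CHOWDER_KEY}"}

def chat_as_user(user_id, text, model="gpt-4o"):
    """Route each end user to their own isolated session, e.g. user-42."""
    url = (
        f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}"
        f"/session/user-{user_id}/responses"
    )
    return requests.post(url, headers=HEADERS, json={"model": model, "input": text}).json()

print(chat_as_user("42", "What project are we working on?"))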
Stop and restart
When you’re done for a while, stop the instance to free up resources:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/stop \
  -H "Authorization: Bearer $CHOWDER_KEY"
{
  "id": "ead7b76b-f34e-4b91-93e7-9c979cf9e41c",
  "name": "my-first-agent",
  "status": "stopped",
  "model_provider": "anthropic",
  "sandbox_provider": "sandbox",
  "openclaw_version": "v1",
  "created_at": "2026-02-14T10:30:00Z",
  "updated_at": "2026-02-14T11:45:00Z"
}
The sandbox is paused but your data and sessions are preserved. Start it back up when you need it:
curl -X POST https://api.chowder.dev/v1/instances/$INSTANCE_ID/start \
  -H "Authorization: Bearer $CHOWDER_KEY"
The instance comes back with "status": "running" and all your conversations are intact.
You can only stop a running instance and only start a stopped one. If the instance is in any other state (like provisioning or error), you’ll get a 409 Conflict.
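Here’s a sketch of a stop call that tolerates the 409 instead of crashing:

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]
HEADERS = {"Authorization": f"Bearer {CHOWDER_KEY}"}
BASE = f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}"

resp = requests.post(f"{BASE}/stop", headers=HEADERS)
if resp.status_code == 409:
    # The instance wasn't running (e.g. still provisioning, or already stopped).
    print("can't stop right now:", resp.text)
else:
    resp.raise_for_status()
    print("stopped:", resp.json()["status"])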
Delete when done
When you’re truly finished with an instance:
curl -X DELETE https://api.chowder.dev/v1/instances/$INSTANCE_ID \
  -H "Authorization: Bearer $CHOWDER_KEY"
This returns 204 No Content. The sandbox is destroyed and the instance is marked as terminated. This is permanent — there’s no undo.
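The same call from Python, as a sketch that checks for the 204:

import os
import requests

CHOWDER_KEY = os.environ["CHOWDER_KEY"]
INSTANCE_ID = os.environ["INSTANCE_ID"]

resp = requests.delete(
    f"https://api.chowder.dev/v1/instances/{INSTANCE_ID}",
    headers={"Authorization": f"Bearer {CHOWDER_KEY}"},
)
# A successful delete returns 204 No Content with an empty body.
assert resp.status_code == 204, resp.text
print("instance deleted")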

Recap

Here’s what you just did:
| Step | Endpoint | What happened |
|------|----------|---------------|
| Create | POST /v1/instances | Provisioned a sandbox + OpenClaw agent |
| Poll status | GET /v1/instances/{id}/status | Waited for "running" |
| Chat | POST /v1/instances/{id}/responses | Sent messages, got responses |
| Sessions | POST /v1/instances/{id}/session/{name}/responses | Created isolated conversation threads |
| Stop/Start | POST /v1/instances/{id}/stop and /start | Paused and resumed the instance |
| Delete | DELETE /v1/instances/{id} | Cleaned up |

What’s next?