Quick Start

Get Swarm AI running locally in one command.

1. Install & start

npx @peonai/swarm

The interactive installer prompts for a port (default 3777), an admin token, and whether to install Swarm AI as a background service (systemd on Linux, launchd on macOS).

Service commands

npx @peonai/swarm start    # start the server
npx @peonai/swarm stop     # stop
npx @peonai/swarm status   # health check
npx @peonai/swarm uninstall # remove service (keeps data)

Verify with:

curl http://localhost:3777/api/health

Alternative: Manual

git clone https://github.com/euynahz/swarm-ai.git
cd swarm-ai
npm install
npm run dev

2. Register an agent

curl -X POST http://localhost:3777/api/v1/admin/agents \
  -H "X-Admin-Token: swarm-admin-dev" \
  -H "Content-Type: application/json" \
  -d '{"id":"my-agent","name":"My Agent"}'

Save the returned apiKey — your agent needs it for all subsequent requests.
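If the registration response is a flat JSON object with an `apiKey` field (an assumption — check your server's actual output), the key can be captured in one step. The sample response and the `sed` pattern below are illustrative:

```shell
# Stand-in for the registration response; in practice, pipe the curl output
# from step 2 instead of this sample string (the exact shape is an assumption).
RESPONSE='{"id":"my-agent","name":"My Agent","apiKey":"swarm_example_key"}'

# Pull the apiKey field out with sed (avoids a jq dependency).
API_KEY=$(printf '%s' "$RESPONSE" | sed -n 's/.*"apiKey":"\([^"]*\)".*/\1/p')
echo "$API_KEY"
```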

3. Write and read your first profile entry

# Write
curl -X PATCH http://localhost:3777/api/v1/profile \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"layer":"identity","entries":{"name":"Alice","language":"en"}}'

# Read
curl http://localhost:3777/api/v1/profile \
  -H "Authorization: Bearer YOUR_API_KEY"

Platform Integration

Swarm AI ships shell scripts that work with any agent platform. The key script is bootstrap.sh, which reads the user profile and outputs it as context text for injection into agent sessions.
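As a rough sketch of what such a bootstrap step amounts to — fetch the profile, flatten it into plain text for the agent prompt — here is a minimal stand-in. The JSON shape and output format are assumptions, not bootstrap.sh's actual output:

```shell
# Stand-in for the profile returned by GET /api/v1/profile; the shape here is
# an assumption for illustration -- bootstrap.sh fetches the real thing.
PROFILE='{"identity":{"name":"Alice","language":"en"}}'

# Flatten layer/key/value pairs into plain context lines for injection.
CONTEXT=$(printf '%s' "$PROFILE" | python3 -c '
import json, sys
profile = json.load(sys.stdin)
for layer, entries in profile.items():
    for key, value in entries.items():
        print(f"{layer}.{key}: {value}")
')
echo "$CONTEXT"
```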

OpenClaw

Add to your AGENTS.md under the "Every Session" checklist:

Run `bash ~/.openclaw/workspace/skills/swarm-ai-skill/scripts/bootstrap.sh` to load user profile context

Claude Code

Add to your project's CLAUDE.md:

At the start of every task, run:
  bash /path/to/skills/swarm-ai-skill/scripts/bootstrap.sh
When you learn something new about the user, run:
  bash /path/to/skills/swarm-ai-skill/scripts/observe.sh '[{"key":"...","value":"..."}]'

Codex

Add skill/CODEX.md content to your project's AGENTS.md.

Gemini CLI

Add skill/GEMINI.md content to your project's GEMINI.md.

OpenCode

Add skill/OPENCODE.md content to your agent instructions file.

iFlow

Add skill/IFLOW.md content to your workflow configuration.

Cursor

Add to .cursorrules:

At session start, execute scripts/bootstrap.sh and use the output as user context.

Any other platform

If your agent can execute shell commands, add this to its system prompt or startup instructions:

Run `scripts/bootstrap.sh` and use the output as user context for personalization.

Configuration

Copy scripts/env.sh.example to scripts/env.sh and set your API key:

export SWARM_API_URL="http://localhost:3777"
export SWARM_API_KEY="swarm_your_key_here"

Profile Structure

Profiles are organized into layers — free-form namespaces that group related knowledge.

identity
  name: "Alice Chen"                    (confidence: 1.0)
  timezone: "Asia/Shanghai"             (confidence: 1.0)

work
  tech_stack: ["TypeScript", "React"]   (tags: dev)
  role: "founder"                       (confidence: 0.9)

context (24h TTL)
  current_task: "building swarm-ai"     (auto-expires)

Confidence Scoring

Every observation carries a confidence value between 0 and 1. When an agent submits an observation via the /observe endpoint:

  • If the new confidence is higher than the existing value, the observation overwrites it.
  • If the new confidence is lower or equal, the existing value is preserved.

This ensures that confirmed facts (confidence 1.0) are never overwritten by guesses (confidence 0.3), regardless of which agent submits them.
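The merge rule boils down to a single comparison. The helper below is hypothetical — it mirrors the documented behavior, not the server's actual code:

```shell
# Hypothetical helper mirroring the documented merge rule: an incoming
# observation wins only when its confidence is strictly higher.
merge_decision() {
  existing=$1
  incoming=$2
  awk -v e="$existing" -v n="$incoming" \
    'BEGIN { if (n + 0 > e + 0) print "overwrite"; else print "keep" }'
}

merge_decision 1.0 0.3   # a guess never displaces a confirmed fact
merge_decision 0.5 0.9   # higher confidence overwrites
merge_decision 0.7 0.7   # ties preserve the existing value
```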

Tags and Filtering

Each profile entry can carry tags for categorization. Query by layer or tag:

# Filter by layer
curl "http://localhost:3777/api/v1/profile?layer=work" -H "Authorization: Bearer $KEY"

# Filter by tag
curl "http://localhost:3777/api/v1/profile?tag=preference" -H "Authorization: Bearer $KEY"

Automatic Expiry

Observations submitted to the context layer via /observe receive a default 24-hour TTL. Expired entries are automatically excluded from query results. Permanent layers like identity and work have no expiry.
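The filtering rule amounts to a timestamp comparison at query time. This sketch is illustrative only — the server performs the check internally:

```shell
# Illustrative expiry check: an entry is returned only while "now" is before
# its expiry timestamp (Unix seconds). The server does this filtering itself.
entry_status() {
  expires_at=$1
  now=$2
  if [ "$now" -lt "$expires_at" ]; then echo "included"; else echo "excluded"; fi
}

TTL=$((24 * 60 * 60))                           # default context-layer TTL: 86400s
entry_status $((1700000000 + TTL)) 1700000000   # fresh entry
entry_status 1700000000 $((1700000000 + TTL))   # expired entry
```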

API Reference

All agent endpoints require an Authorization: Bearer <api_key> header.

| Method | Endpoint                 | Description |
| ------ | ------------------------ | ----------- |
| GET    | /api/v1/profile          | Read profile. Query params: `layer`, `tag` |
| PATCH  | /api/v1/profile          | Update entries. Body: `{"layer":"...","entries":{...}}` |
| POST   | /api/v1/profile/observe  | Submit observations with confidence merging |
| GET    | /api/v1/memory           | Search memory. Params: `q`, `mode=semantic`, `tag`, `type`, `limit` |
| POST   | /api/v1/memory           | Write memory (auto-embeds). Body: `{"content":"...","tags":[...],"type":"..."}` |
| POST   | /api/v1/reflect          | Trigger memory→profile reflection. Body: `{"since":"ISO","limit":100}` |
| GET    | /api/v1/persona/me       | Read current agent persona |
| GET    | /api/v1/persona/:agentId | Read another agent's persona |
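For the memory endpoints, a minimal write body matching the documented shape can be built and sanity-checked locally before sending. The field values below are placeholders, and the curl call is commented out since it needs a running server:

```shell
# Body matching the documented shape for POST /api/v1/memory; the values are
# placeholders. Validate it as JSON locally before sending.
BODY='{"content":"User prefers concise answers","tags":["preference"],"type":"note"}'
printf '%s' "$BODY" | python3 -c 'import json, sys; json.load(sys.stdin); print("valid")'

# With a server running (assumes scripts/env.sh has been sourced):
# curl -X POST http://localhost:3777/api/v1/memory \
#   -H "Authorization: Bearer $SWARM_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```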

Observe Example

POST /api/v1/profile/observe
curl -X POST http://localhost:3777/api/v1/profile/observe \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "observations": [{
      "layer": "work",
      "key": "tech_stack",
      "value": ["TypeScript", "React", "Next.js"],
      "confidence": 0.9,
      "tags": ["dev", "skills"]
    }]
  }'

Shell Scripts

Located in skill/swarm-ai-skill/scripts/. All scripts source env.sh for configuration.

| Script            | Usage              | When to Use |
| ----------------- | ------------------ | ----------- |
| bootstrap.sh      | No args            | Session start — outputs profile as context text |
| profile-read.sh   | `[layer] [tag]`    | Query specific profile data |
| profile-update.sh | `<layer> <json>`   | Confirmed new user information |
| observe.sh        | `<json_array>`     | Discovered preferences or habits |
| memory-read.sh    | `[query]`          | Search historical context |
| memory-write.sh   | `<content> [tags]` | Record significant events |

Admin API

Admin endpoints require either an X-Admin-Token header or a JWT passed as Authorization: Bearer <jwt>.

| Method | Endpoint                  | Description |
| ------ | ------------------------- | ----------- |
| GET    | /api/v1/admin/agents      | List all agents |
| POST   | /api/v1/admin/agents      | Create agent. Body: `{"id":"...","name":"..."}` |
| PATCH  | /api/v1/admin/agents      | Update agent persona/name |
| DELETE | /api/v1/admin/agents/:id  | Delete agent and revoke API key |
| GET    | /api/v1/admin/profile     | View all raw profile data |
| PUT    | /api/v1/admin/profile     | Admin profile update |
| GET    | /api/v1/admin/audit       | Audit log. Params: `action`, `agent`, `limit` |
| GET    | /api/v1/admin/history     | Profile change history. Params: `layer`, `key`, `limit` |
| GET    | /api/v1/admin/export      | Export all data as JSON |