The crystal-memory plugin integrates directly into OpenClaw’s conversation lifecycle. Once installed, every session is automatically captured, and relevant memories are injected before each AI response. No manual recall needed.

Install

Run the installer script:
```bash
curl -fsSL https://memorycrystal.ai/crystal | bash
```
The installer sets up the plugin, configures your memory backend, and wires everything into your OpenClaw installation.

Enable and disable

```bash
# Wire the plugin into your OpenClaw config
npm run crystal:enable

# Remove the plugin wiring
npm run crystal:disable
```

To remove plugin files entirely when disabling:

```bash
npm run crystal:disable -- --purge
```

Verify

Run the doctor command to check that the plugin is wired correctly and can reach the memory backend:
```bash
npm run crystal:doctor
```

Lifecycle hooks

The plugin registers hooks for these OpenClaw lifecycle events:
| Hook | What it does |
| --- | --- |
| `before_agent_start` | Injects wake context and relevant recall at session start |
| `before_tool_call` | Surfaces warnings before risky tool calls |
| `before_dispatch` | Rate limiting, proactive recall, and reinforcement injection |
| `message_received` | Captures incoming user messages |
| `llm_output` | Captures assistant responses and extracts durable memories |
| `session_end` | Clears per-session state |
The plugin also handles before_compaction and after_compaction to preserve context across OpenClaw’s context compaction boundaries.
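The hook flow can be pictured with a minimal event registry. This is an illustrative sketch only, not the actual OpenClaw plugin API: the event names come from the table above, but the `on`/`dispatch` functions and the handler signature are assumptions.

```typescript
// Illustrative sketch of a lifecycle hook registry (NOT the real
// OpenClaw API; only the event names are taken from the docs).
type HookEvent = { type: string; payload: Record<string, unknown> };
type HookHandler = (event: HookEvent) => string | void;

const handlers = new Map<string, HookHandler[]>();

function on(eventType: string, handler: HookHandler): void {
  const list = handlers.get(eventType) ?? [];
  list.push(handler);
  handlers.set(eventType, list);
}

// Run every handler for an event and collect any context they inject.
function dispatch(event: HookEvent): string[] {
  const injected: string[] = [];
  for (const handler of handlers.get(event.type) ?? []) {
    const result = handler(event);
    if (result) injected.push(result);
  }
  return injected;
}

// before_agent_start: inject wake context at session start.
on("before_agent_start", (e) =>
  `wake context for session ${String(e.payload.sessionId)}`,
);

const context = dispatch({
  type: "before_agent_start",
  payload: { sessionId: "abc123" },
});
console.log(context);
```

The key property the sketch captures is that hooks returning context (like `before_agent_start`) can inject text into the session, while capture-only hooks (like `message_received`) return nothing.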

Tools

The plugin registers these tools, available in your AI session:

Memory tools

  • crystal_recall — semantic search across long-term memory
  • crystal_remember — store a durable memory manually
  • crystal_recent — fetch recent memory-backed messages
  • crystal_search_messages — search conversation history
  • crystal_what_do_i_know — topic knowledge snapshot
  • crystal_why_did_we — decision archaeology
  • crystal_checkpoint — milestone memory snapshot
  • crystal_preflight — pre-flight check before risky actions
  • crystal_forget — archive or delete a memory
  • crystal_trace — trace a memory to its source conversation
  • crystal_wake — session startup briefing
  • crystal_stats — memory and usage statistics
  • crystal_set_scope — override channel scope for the session
  • crystal_who_owns — find ownership context for files or modules
  • crystal_explain_connection — explain relationships between concepts
  • crystal_dependency_chain — trace dependency chains

Local tools (when a local store is available)

  • crystal_grep — search in-session local history and summaries
  • crystal_describe — inspect a local summary node
  • crystal_expand — expand a local summary into underlying context

Legacy compatibility tools (registered for older integrations)

  • memory_search — returns crystal/<id>.md paths
  • memory_get — reads a memory by ID or path
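
As an illustration, a recall request from the model might look like the following function-calling payload. The argument names (`query`, `mode`, `limit`) are assumptions chosen to mirror the configuration options, not a documented schema.

```typescript
// Hypothetical tool-call payload for crystal_recall; the argument
// names are illustrative assumptions, not a documented schema.
const toolCall = {
  name: "crystal_recall",
  arguments: {
    query: "why did we pick Convex for the backend?",
    mode: "decision", // one of the recall modes listed under Configuration
    limit: 5,
  },
};
console.log(JSON.stringify(toolCall, null, 2));
```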

Configuration

Set these options in your OpenClaw plugin config under plugins.entries.crystal-memory.config:
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `apiKey` | string | (none) | Your Memory Crystal API key |
| `convexUrl` | string | `https://your-deployment.convex.site` | Memory Crystal backend URL. Override for self-hosted deployments. |
| `defaultRecallMode` | string | `general` | Default recall mode. Options: `general`, `decision`, `project`, `people`, `workflow`, `conversation` |
| `defaultRecallLimit` | number | `8` | Number of memories to recall per query (1–20) |
| `channelScope` | string | (none) | Namespace prefix for tenant, client, or agent isolation. Example: `coach` |
| `localSummaryInjection` | boolean | `true` | Inject relevant local summaries into the context window based on the current query |
| `localSummaryMaxTokens` | number | `2000` | Maximum tokens to spend on injected local summaries (0–8000) |

Example config

```json
{
  "plugins": {
    "entries": {
      "crystal-memory": {
        "enabled": true,
        "config": {
          "apiKey": "your-api-key-here",
          "defaultRecallMode": "project",
          "defaultRecallLimit": 10,
          "channelScope": "my-team",
          "localSummaryMaxTokens": 3000
        }
      }
    }
  }
}
```
Use channelScope when running multiple agents or teams on the same Memory Crystal account. All API calls will be namespaced as {channelScope}:{peerId}, keeping memories isolated between contexts.
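
The scoping rule above can be sketched as a one-line helper. The function name `scopedPeer` is illustrative; only the `{channelScope}:{peerId}` format comes from the docs.

```typescript
// Sketch of the scoping rule: API calls are namespaced as
// `${channelScope}:${peerId}`. The helper name is illustrative.
function scopedPeer(peerId: string, channelScope?: string): string {
  return channelScope ? `${channelScope}:${peerId}` : peerId;
}

// Two teams sharing one Memory Crystal account stay isolated:
const coachKey = scopedPeer("agent-1", "coach"); // "coach:agent-1"
const teamKey = scopedPeer("agent-1", "my-team"); // "my-team:agent-1"
```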