Wyrm Docs

Everything you need to install, configure, and use Wyrm — persistent memory for AI agents.

// INSTALLATION

Install Wyrm

npm (global)
$ npm install -g wyrm-mcp
npx (no install)
$ npx wyrm-mcp
From source
$ git clone https://github.com/ghosts-lk/Wyrm.git
$ cd Wyrm
$ npm install
$ npm run build
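If you built from source, point your MCP client at the built entry point instead of the global `wyrm-mcp` binary. The exact output path depends on the build configuration; `dist/index.js` below is an assumption — substitute whatever `npm run build` produces in your checkout:

```json
{
  "mcpServers": {
    "wyrm": {
      "command": "node",
      "args": ["/path/to/Wyrm/dist/index.js"]
    }
  }
}
```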
// CLIENT CONFIGURATION

Configure Your AI Client

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "wyrm": {
      "command": "wyrm-mcp"
    }
  }
}

GitHub Copilot (VS Code)

Add to your VS Code settings.json:

{
  "github.copilot.chat.mcpServers": {
    "wyrm": {
      "command": "wyrm-mcp"
    }
  }
}

Cursor

Add to .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "wyrm": {
      "command": "wyrm-mcp"
    }
  }
}

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "wyrm": {
      "command": "wyrm-mcp"
    }
  }
}
// TOOLS REFERENCE

All 31 MCP Tools

Organized by category. Every tool is available in the Free tier.

Project

4 tools
wyrm_create_project: Create a new project with name, description, and tech stack
wyrm_list_projects: List all projects with optional status filter
wyrm_get_project: Get full project details by ID
wyrm_update_project: Update project metadata, status, or tech stack

Session

5 tools
wyrm_create_session: Start a new session linked to a project
wyrm_list_sessions: List sessions for a project with date range filters
wyrm_get_session: Get session details including summary and decisions
wyrm_update_session: Update session summary, status, or metadata
wyrm_end_session: End a session with final summary and outcomes

Quest

4 tools
wyrm_create_quest: Create a task with priority, tags, and dependencies
wyrm_list_quests: List quests with status, priority, and tag filters
wyrm_get_quest: Get quest details including progress and blockers
wyrm_update_quest: Update quest status, priority, or assignee

Context & Skill

5 tools
wyrm_save_context: Save a context entry (decision, preference, pattern)
wyrm_list_contexts: List context entries with category filters
wyrm_save_skill: Store a reusable skill or pattern
wyrm_list_skills: List all stored skills with optional category filter
wyrm_get_skill: Get full skill details and usage instructions

Data Lake

5 tools
wyrm_data_insert: Insert structured data with category and tags
wyrm_data_batch_insert: Batch insert multiple data entries at once
wyrm_data_query: Query data lake with filters, sorting, and pagination
wyrm_data_update: Update existing data lake entries
wyrm_data_delete: Delete data lake entries by ID or filter

Search

3 tools
wyrm_search: Full-text search across all data types
wyrm_search_context: Search specifically within context entries
wyrm_search_sessions: Search session summaries and content

Stats & Usage

2 tools
wyrm_usage: Get token usage, cost estimation, and cache hit rates
wyrm_stats: Get database statistics — projects, sessions, storage size

System

3 tools
wyrm_sync: Bi-directional sync with .wyrm/ markdown files
wyrm_export: Export all data as JSON for backup or migration
wyrm_health: Health check — DB status, version, feature flags
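From the client's side, each tool above is invoked through the standard MCP `tools/call` JSON-RPC request. For example, a full-text search might look like this (the `id` and the query string are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "wyrm_search",
    "arguments": { "query": "auth middleware decision" }
  }
}
```

MCP-compatible clients construct these requests for you; you normally never write them by hand.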
// CONFIGURATION

Environment Variables

WYRM_DB_PATH (default: ~/.wyrm/wyrm.db)

Path to the SQLite database file

WYRM_ENCRYPTION_KEY (default: none)

AES-256 encryption key for data at rest. Set to enable encryption.

WYRM_VECTOR_PROVIDER (default: none)

Vector embedding provider for semantic search (optional). Supports 'openai' or 'local'.

WYRM_LOG_LEVEL (default: info)

Logging verbosity: debug, info, warn, error

WYRM_CACHE_SIZE (default: 500)

Maximum entries in the in-memory read cache

Example with encryption
WYRM_DB_PATH=~/.wyrm/wyrm.db \
WYRM_ENCRYPTION_KEY=your-32-byte-hex-key \
WYRM_LOG_LEVEL=debug \
wyrm-mcp
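WYRM_ENCRYPTION_KEY expects a 32-byte key in hex. One common way to generate one, assuming OpenSSL is installed, is:

```shell
# Generate a random 32-byte key, hex-encoded (64 hex characters)
openssl rand -hex 32
```

Store the key somewhere safe: if it is lost, encrypted data cannot be recovered.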
// ARCHITECTURE

How Wyrm Works

Wyrm is built as a Model Context Protocol (MCP) server that any compatible AI client can connect to. It uses a local SQLite database as its storage engine, enhanced with FTS5 for full-text search.

SQLite with WAL mode for concurrent read/write access
FTS5 full-text search engine for instant queries
In-memory cache layer with auto-invalidation on writes
Optional AES-256-GCM encryption for data at rest
Optional vector embeddings for semantic search
MCP protocol with cache_control hints for prompt caching
TypeScript with full type safety, zero external runtime deps
Architecture
┌─────────────────────────┐
│   AI Client (Claude,    │
│   Copilot, Cursor, etc) │
└───────────┬─────────────┘
            │ MCP Protocol
┌───────────▼─────────────┐
│     Wyrm MCP Server     │
│  ┌───────────────────┐  │
│  │   MemCache Layer  │  │
│  │   (sub-ms reads)  │  │
│  └─────────┬─────────┘  │
│  ┌─────────▼─────────┐  │
│  │  Encryption Layer │  │
│  │   (AES-256-GCM)   │  │
│  └─────────┬─────────┘  │
│  ┌─────────▼─────────┐  │
│  │   SQLite + FTS5   │  │
│  │    (WAL mode)     │  │
│  └───────────────────┘  │
└─────────────────────────┘
// FAQ

Frequently Asked Questions

Is Wyrm free?

Yes. Wyrm is free and open source under the MIT license. The core MCP server with all 31 tools, full-text search, and local storage is completely free.

Where is my data stored?

Locally in a SQLite database file on your machine (default: ~/.wyrm/wyrm.db). Your data never leaves your computer unless you enable cloud sync (coming in Pro).

Is it secure?

Yes. Wyrm includes optional AES-256-GCM encryption for data at rest, input sanitization on all tool inputs, path validation to prevent directory traversal, and parameterized queries to prevent SQL injection.

Which AI clients work with Wyrm?

Any MCP-compatible client: Claude Desktop, GitHub Copilot (VS Code), Cursor, Windsurf, Continue, and any other tool that supports the Model Context Protocol.

Will it slow down my AI?

No. Wyrm uses SQLite WAL mode for concurrent access and an in-memory cache layer that returns cached reads in under 1ms. Write operations are typically under 5ms.

Can I use it with multiple projects?

Yes. Wyrm supports unlimited projects, each with its own sessions, quests, and context. Switch between projects seamlessly.

Ready to Get Started?

Install Wyrm in under 60 seconds and give your AI persistent memory.