Private Beta — April 2026

Your AI forgets.
Not anymore.

Persistent memory for AI, delivered over MCP. One config entry. Every session, every surface, every tool — your context travels with you.

4
AI surfaces connected
MCP
Open protocol, zero plugins
AES‑256
Encrypted at rest
The problem

Every session
starts from zero

You've explained your architecture to Claude fourteen times. You re-described your tech stack to Copilot this morning. You told Telegram which model you prefer three times this week.

Your tools are brilliant in isolation. But they forget the moment the session ends. Context is trapped — per-surface, per-session, per-conversation. That's the default state of AI tooling in 2026.

We're the exception.

context.status — live
without What Next
claude_desktop context: null
vscode_copilot context: null
github_copilot context: null
telegram_bot context: null
with What Next
all_surfaces waiting…
How it works

Add it once.
It's everywhere.

What Next implements the Model Context Protocol — the open standard for AI tool integrations. Drop in a single config entry. Every MCP-compatible surface shares the same persistent memory, automatically.

Claude Desktop — using What Next via MCP
Architecture

Built on three
hard principles

Always-on sync

SQLite caches locally on your machine. Postgres on Railway handles the cloud side. Every write goes through both. Works offline — catches up when you're back.
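
The dual-write flow can be sketched roughly like this, with in-memory Maps standing in for the real SQLite and Postgres stores (all names here are illustrative, not What Next's actual code):

```javascript
// Sketch of the always-on sync principle: every write lands in the
// local cache first, then goes through to the cloud store; writes made
// while offline queue up and are replayed when connectivity returns.
// In-memory Maps stand in for SQLite (local) and Postgres (cloud).
class MemoryStore {
  constructor() {
    this.local = new Map();   // stand-in for the local SQLite cache
    this.cloud = new Map();   // stand-in for Postgres in the cloud
    this.pending = [];        // writes waiting for connectivity
    this.online = true;
  }

  write(key, value) {
    this.local.set(key, value);        // local write always succeeds
    if (this.online) {
      this.cloud.set(key, value);      // write-through to the cloud
    } else {
      this.pending.push([key, value]); // queue for later replay
    }
  }

  // Called when connectivity returns: replay queued writes in order.
  reconnect() {
    this.online = true;
    for (const [key, value] of this.pending) this.cloud.set(key, value);
    this.pending = [];
  }

  read(key) {
    return this.local.get(key); // reads come from the local cache
  }
}

const store = new MemoryStore();
store.write("stack", "node + postgres");
store.online = false;
store.write("model", "claude"); // offline: lands locally, queued for cloud
store.reconnect();              // catch-up: cloud now holds both entries
console.log(store.cloud.get("model")); // "claude"
```

Serving reads from the local cache is what makes offline work: the cloud copy only ever has to catch up, never to answer.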

AES-256-GCM encryption

Memory is encrypted per-user at rest. The cloud stores ciphertext. Your key never leaves your machine. We cannot read your data — structurally.

Open protocol

Built on MCP. No proprietary plugins, no locked-in extensions. Any MCP-compatible surface you add in the future gets full memory — no extra config needed.

Integrations

Everywhere you
already work

If the surface supports MCP, What Next is already compatible. No additional setup required.

Claude Desktop
Live
VS Code
Live
GitHub Copilot
Live
Telegram
Live
ChatGPT
Soon
Setup

Two minutes to
add it anywhere

Add What Next to your MCP config. No browser extensions, no per-surface accounts, no OAuth flows.

claude_desktop_config.json
{
  "mcpServers": {
    "what-next": {
      "command": "node",
      "args": ["src/server.js"],
      "env": {
        "WHATNEXT_API_KEY": "..."
      }
    }
  }
}
mcp.json (VS Code & Copilot)
{
  "servers": {
    "what-next": {
      "command": "node",
      "args": ["src/server.js"],
      "env": {
        "WHATNEXT_API_KEY": "..."
      }
    }
  }
}
Private Beta

A small first
cohort. Real support.

We're admitting a first wave of developers. Not a waitlist of 50,000 that goes nowhere — a real group we can actually onboard, support, and learn from.

Beta access gets you an API key, full cloud sync, and a direct line to us while the product is still being shaped.

What Next does one thing well: it remembers. It's rough at the edges. If you're comfortable with that tradeoff, we want to work with you.
