Platform Examples

Working examples of MemNexus memory integration for Claude Desktop, Claude Code, and Cursor.

These examples show MemNexus in action on real platforms. Each example includes the configuration, a sample interaction, and what happens behind the scenes.

Claude Desktop

Setup

Add to your claude_desktop_config.json file (~/Library/Application Support/Claude/ on macOS, %APPDATA%\Claude\ on Windows):

{
  "mcpServers": {
    "memnexus": {
      "command": "npx",
      "args": ["@memnexus-ai/mcp-server"],
      "env": {
        "MEMNEXUS_API_KEY": "cmk_live_xxx.yyy"
      }
    }
  }
}

Restart Claude Desktop.

Example conversation

You: I'm starting a new project using Next.js 15 with the app router, Tailwind v4, and PostgreSQL. We're calling it "Acme Dashboard".

Claude: I'll remember that for future conversations.

Behind the scenes, Claude calls create_memory:

{
  "content": "User is starting project 'Acme Dashboard' using Next.js 15 (app router), Tailwind v4, and PostgreSQL.",
  "topics": ["acme-dashboard", "project-setup", "preferences"],
  "importance": 0.9
}

Next session:

You: What tech stack are we using for my dashboard project?

Claude: Based on my memory, your Acme Dashboard project uses Next.js 15 with the app router, Tailwind v4, and PostgreSQL.

Behind the scenes, Claude calls search_memories with the query "dashboard project tech stack" and finds the stored memory.
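
The underlying tool call looks roughly like this (the payload is illustrative, mirroring the search examples later in this page):

{ "query": "dashboard project tech stack" }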

Recommended system prompt for Claude Desktop

Add this to Claude Desktop's custom instructions:

You have access to MemNexus memory tools. Use them proactively:
- Search memory before answering questions about my projects
- Save important decisions, preferences, and project details
- Use topics for categorization: project names, "preferences", "decisions"
- Set importance: 0.9 for decisions/preferences, 0.7 for context

Claude Code

Setup

Configure MemNexus as an MCP server for Claude Code. Add the following to the .mcp.json file in your project root (or register it with the claude mcp add command):

{
  "mcpServers": {
    "memnexus": {
      "command": "npx",
      "args": ["@memnexus-ai/mcp-server"],
      "env": {
        "MEMNEXUS_API_KEY": "cmk_live_xxx.yyy"
      }
    }
  }
}

Example: Coding assistant with project memory

You: Help me set up authentication for this Express app.

Claude Code searches memory first:

{ "query": "authentication preferences and patterns" }

Finds a previous memory: "User prefers JWT with refresh tokens, uses bcrypt for password hashing, stores sessions in Redis."

Claude Code: Based on your preferences from previous projects, I'll set up JWT authentication with refresh tokens and bcrypt password hashing. Want me to include Redis session storage as well?
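
For that memory to exist, an earlier session would have saved it with create_memory. A sketch of what that call might have looked like (the topics and importance score are illustrative, following the conventions described above):

{
  "content": "User prefers JWT with refresh tokens, uses bcrypt for password hashing, stores sessions in Redis.",
  "topics": ["preferences", "authentication"],
  "importance": 0.9
}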

Example: Remembering debugging sessions

You: We fixed that CORS issue — the problem was that the API gateway was stripping the Access-Control-Allow-Credentials header.

Claude Code saves this:

{
  "content": "CORS issue resolved: API gateway was stripping Access-Control-Allow-Credentials header. Fix was to add header preservation rule in Kong gateway config.",
  "topics": ["debugging", "cors", "api-gateway"],
  "importance": 0.8
}

Next time you hit a CORS issue, Claude Code will recall this fix.
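
When a similar issue comes up later, a search along these lines (query text illustrative) would surface the saved fix:

{ "query": "CORS Access-Control-Allow-Credentials header missing" }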

Cursor

Setup

Add to .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "memnexus": {
      "command": "npx",
      "args": ["@memnexus-ai/mcp-server"],
      "env": {
        "MEMNEXUS_API_KEY": "cmk_live_xxx.yyy"
      }
    }
  }
}

Example: Project context across sessions

You: What patterns have we been using for error handling in this project?

Cursor searches memory:

{
  "query": "error handling patterns this project",
  "limit": 5
}

Finds memories from previous sessions:

  1. "Using custom AppError class extending Error with status codes"
  2. "Global error middleware catches AppError and returns structured JSON"
  3. "Validation errors use Zod with custom error formatter"

Cursor: Based on our previous work, this project uses three error handling patterns: a custom AppError class with status codes, a global error middleware for structured JSON responses, and Zod for validation with a custom error formatter. Want me to follow these patterns?

Example: Cross-platform memory

If you discussed a design decision in Claude Desktop, Cursor can access it:

In Claude Desktop: "Let's use the repository pattern for database access in the Acme Dashboard project."

Later, in Cursor: "How should I structure the database layer?"

Cursor finds the memory from Claude Desktop: "Repository pattern chosen for database access in Acme Dashboard."

This is the power of cross-platform memory — what you discuss in one tool is available in all of them.
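
Under the hood this is just two tool calls against the same memory store: a create_memory issued from Claude Desktop and a search_memories issued from Cursor. A sketch of both (payloads are illustrative; topics and importance follow the conventions above):

create_memory (from Claude Desktop):

{
  "content": "Repository pattern chosen for database access in Acme Dashboard.",
  "topics": ["acme-dashboard", "decisions"],
  "importance": 0.9
}

search_memories (from Cursor):

{ "query": "database layer structure Acme Dashboard" }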

SDK integration example

For custom applications, use the SDK to build memory-aware features:

import { MemnexusClient } from "@memnexus-ai/mx-typescript-sdk";
import OpenAI from "openai";

const mx = new MemnexusClient({ apiKey: process.env.MX_API_KEY });
const openai = new OpenAI();

async function chatWithMemory(userMessage: string) {
  // 1. Search for relevant context
  const memories = await mx.memories.search({
    query: userMessage,
    limit: 5,
  });

  const context = memories.data
    .map((r) => r.memory.content)
    .join("\n");

  // 2. Include context in the prompt
  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: `You are a helpful assistant with access to the user's memory.

Relevant memories:
${context || "No relevant memories found."}`,
      },
      { role: "user", content: userMessage },
    ],
  });

  const reply = response.choices[0].message.content;

  // 3. Save the interaction if meaningful
  await mx.memories.create({
    content: `User asked: "${userMessage}". Key points: ${reply?.slice(0, 200)}`,
    topics: ["conversation"],
    importance: 0.6,
  });

  return reply;
}
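
A minimal usage sketch, assuming the SDK surface shown above (error handling omitted):

// Call from any async context
const answer = await chatWithMemory("What tech stack are we using for Acme Dashboard?");
console.log(answer);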

Next steps