ChatGPT
Use MemNexus memory with ChatGPT via custom GPTs or the API.
ChatGPT doesn't support MCP natively, but you can still connect it to MemNexus memory using Custom GPTs with API actions or by building with the OpenAI API.
Option 1: Custom GPT with API actions
Create a Custom GPT that calls the MemNexus API directly.
1. Create a Custom GPT
In ChatGPT, go to Explore GPTs > Create a GPT.
2. Add instructions
You have access to MemNexus, a persistent memory system. Use the provided
actions to search and create memories.
WHEN TO SEARCH: Before answering questions about the user's projects,
preferences, or past conversations.
WHEN TO SAVE: When the user shares important decisions, preferences,
or project context.
Always search memory before asking the user to repeat information.
3. Add API actions
In the GPT configuration, add actions pointing to the MemNexus API:
Search memories:
openapi: 3.0.0
info:
  title: MemNexus Memory Search
  version: 1.0.0
servers:
  - url: https://api.memnexus.ai
paths:
  /api/memories/search:
    post:
      operationId: searchMemories
      summary: Search memories by meaning
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                query:
                  type: string
                limit:
                  type: integer
                  default: 5
      responses:
        '200':
          description: Search results
Create memory (add this path under the same paths key in the schema):
  /api/memories:
    post:
      operationId: createMemory
      summary: Create a new memory
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [content]
              properties:
                content:
                  type: string
                topics:
                  type: array
                  items:
                    type: string
                importance:
                  type: number
      responses:
        '201':
          description: Memory created
4. Configure authentication
Set the authentication to API Key with:
- Auth type: Bearer
- API Key: Your MemNexus API key (cmk_live_xxx.yyy)
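Before saving the GPT, it can help to verify the key and endpoints outside ChatGPT. A minimal smoke-test sketch (TypeScript, Node 18+ with built-in fetch), assuming the request bodies follow the schema above; the response shape is not specified here, so it is simply logged:

const MX_API_KEY = process.env.MX_API_KEY!; // your cmk_live_... key
const BASE_URL = "https://api.memnexus.ai";

async function callMemnexus(path: string, body: unknown) {
  const res = await fetch(`${BASE_URL}${path}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${MX_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${path} failed with status ${res.status}`);
  return res.json();
}

// Save a memory, then search for it with the same payloads the GPT actions will send.
await callMemnexus("/api/memories", {
  content: "User prefers TypeScript examples in documentation.",
  topics: ["preferences"],
});
console.log(
  await callMemnexus("/api/memories/search", {
    query: "What language does the user prefer?",
    limit: 5,
  }),
);

If both calls succeed, the same key and schema will work when pasted into the GPT action configuration.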
Option 2: OpenAI API with MemNexus SDK
Build a custom chatbot that combines OpenAI with MemNexus:
import { MemnexusClient } from "@memnexus-ai/mx-typescript-sdk";
import OpenAI from "openai";

const mx = new MemnexusClient({ apiKey: process.env.MX_API_KEY });
const openai = new OpenAI();

async function chat(message: string) {
  // Search for relevant memories
  const memories = await mx.memories.search({
    query: message,
    limit: 5,
  });

  const context = memories.data
    .map((r) => `- ${r.memory.content}`)
    .join("\n");

  // Chat with memory context
  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: `You are a helpful assistant with persistent memory.
Relevant memories:
${context || "No relevant memories found."}
If the user shares important information, note it so we can save it.`,
      },
      { role: "user", content: message },
    ],
  });

  return response.choices[0].message.content;
}
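The function above only reads memories. To close the loop, save new information back to MemNexus after the exchange. A sketch; mx.memories.create is a hypothetical method name chosen to mirror the POST /api/memories endpoint from Option 1, so check the SDK reference for the actual call:

// Hypothetical write-back step: the create method name mirrors
// POST /api/memories and may differ in the actual SDK.
async function chatAndRemember(message: string) {
  const reply = await chat(message);

  // Naive heuristic: only persist messages the user explicitly flags.
  if (message.toLowerCase().includes("remember")) {
    await mx.memories.create({
      content: message,
      topics: ["chat"],
    });
  }

  return reply;
}

console.log(
  await chatAndRemember(
    "Remember that our API rate limit is 100 requests per minute.",
  ),
);

In practice you would likely replace the keyword check with a model-driven decision, for example a tool call that decides when something is worth saving.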
Limitations
- No MCP support — ChatGPT uses Custom GPT actions, not MCP tools
- Action limits — Custom GPTs have limits on the number and frequency of API calls
- No proactive saving — The GPT needs explicit instructions to save memories
For the best memory experience, consider using Claude Desktop or Cursor, which support MCP natively.
Next steps
- Claude Desktop — Native MCP integration
- SDK Search — Build custom integrations
- Agent Patterns — Memory design patterns