
Using MetaMemory with OpenAI Embeddings

OpenAI offers some of the most widely adopted embedding models in the industry, and MetaMemory is optimized to take full advantage of them. The text-embedding-3 family delivers excellent semantic understanding across a broad range of domains, making it an ideal default choice for most MetaMemory deployments. The text-embedding-3-small model strikes a strong balance between quality and cost, capturing nuanced meaning while keeping per-token costs low. For use cases that demand maximum fidelity, text-embedding-3-large scales up to 3072 dimensions, capturing even finer distinctions in meaning.

MetaMemory automatically handles dimension mapping, normalization, and batching for both models, so you get optimal throughput without manual tuning. Because OpenAI embeddings are the most battle-tested option in production environments, MetaMemory includes specific optimizations for them: adaptive batching that stays within rate limits, automatic retry with exponential backoff, and pre-computed similarity thresholds calibrated to the text-embedding-3 vector space.

If you are building your first MetaMemory integration or need a reliable default, OpenAI is the recommended starting point. The combination of broad language support, consistent quality, and deep MetaMemory optimization means you can go from zero to production-ready memory in minutes.
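The retry behavior described above can be sketched roughly as follows. This is an illustrative shell loop, not MetaMemory's actual internals: `make_request` stands in for an embedding API call and always fails here so every retry path is exercised, and the delays are printed rather than slept so the sketch runs instantly.

```shell
# Exponential backoff sketch: retry up to 5 times, doubling the wait
# after each failure.
make_request() {
  return 1   # stand-in: a real version would call the embeddings endpoint
}

attempt=1
delay=1
max_attempts=5
while [ "$attempt" -le "$max_attempts" ]; do
  if make_request; then
    echo "success on attempt $attempt"
    break
  fi
  echo "attempt $attempt failed; would wait ${delay}s before retrying"
  delay=$((delay * 2))        # double the wait after each failure
  attempt=$((attempt + 1))
done
```

A production version would also cap the maximum delay and add jitter so many clients retrying at once do not synchronize their requests.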

Setup Guide

1. Get Your OpenAI API Key

Sign in to your OpenAI account at platform.openai.com and navigate to the API Keys section under your profile settings. Click "Create new secret key" and give it a descriptive name like "MetaMemory Production". Copy the key immediately — OpenAI only shows it once. Make sure your account has billing enabled and sufficient credits, as embedding requests are billed per token processed.

2. Add the Key to Your MetaMemory Dashboard

Open your MetaMemory dashboard and go to Settings then Provider Keys. Select "OpenAI" from the provider dropdown and paste your API key into the key field. Choose your preferred default model — we recommend text-embedding-3-small for most use cases. MetaMemory will validate the key by making a test embedding request. Once validated, the key is encrypted and stored securely. You can rotate keys at any time without downtime.

3. Make Your First Memory

With your OpenAI key configured, you are ready to store your first memory. Use the MetaMemory API to send a memory payload — the system will automatically route the text through your OpenAI embedding model, generate multi-vector representations across all four embedding types, and store the result. Try a simple test: store a short paragraph and then retrieve it with a semantic query to confirm everything is connected and working end to end.
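A sketch of that end-to-end test is shown below, in the same curl style as the configuration example. Note that the `/v1/memories` and `/v1/search` paths and the payload field names here are illustrative placeholders, not confirmed routes — check the MetaMemory API reference for the actual endpoints and request shapes.

```shell
# Store a short memory (endpoint path and fields are illustrative):
curl -X POST https://api.metamemory.tech/v1/memories \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "Our Q3 launch is scheduled for October 14."}'

# Then retrieve it with a semantic query (again, illustrative):
curl -X POST https://api.metamemory.tech/v1/search \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "When is the product launch?"}'
```

If the stored paragraph comes back as the top result for the semantic query, the OpenAI key, embedding pipeline, and storage are all connected end to end.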

Configuration Example

curl -X POST https://api.metamemory.tech/v1/providers \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "openai",
    "api_key": "sk-YOUR_OPENAI_API_KEY",
    "default_model": "text-embedding-3-small",
    "settings": {
      "dimensions": 1536,
      "batch_size": 100
    }
  }'

Supported Models

text-embedding-3-small (default)
text-embedding-3-large
text-embedding-ada-002
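For context, this is what a raw request to OpenAI's embeddings endpoint looks like. The optional `dimensions` parameter, supported by the text-embedding-3 models but not by text-embedding-ada-002, lets you shorten vectors below the model's native size (1536 for small, 3072 for large); MetaMemory issues requests of this shape on your behalf.

```shell
curl https://api.openai.com/v1/embeddings \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "text-embedding-3-small",
    "input": "MetaMemory stores this text as a vector.",
    "dimensions": 512
  }'
```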

Capabilities

Embeddings, LLM

Ready to use OpenAI with MetaMemory?

Get started in minutes. Connect your OpenAI API key and give your agents persistent, intelligent memory.

Your agents deserve to remember

Bring your own AI keys. Integrate in minutes. Your data stays yours.