Using MetaMemory with Voyage AI
Voyage AI has emerged as a top-tier embedding provider by focusing on domain-specific optimization, and MetaMemory integrates seamlessly with all Voyage models. The voyage-3 model consistently ranks at or near the top of the MTEB benchmark for general-purpose retrieval, making it an excellent choice when raw embedding quality is the priority.

What sets Voyage AI apart is its lineup of specialized models. The voyage-code-3 model is purpose-built for code understanding: it embeds source code, documentation, and technical discussions with significantly higher fidelity than general-purpose models. For agents that work with developers or interact with codebases, this specialization translates directly into better memory retrieval. The voyage-3-lite model offers a compelling cost-performance tradeoff, producing smaller vectors at a fraction of the cost while maintaining surprisingly strong retrieval quality. MetaMemory supports all Voyage models interchangeably, so you can even use different models for different memory types, such as voyage-code-3 for process memories and voyage-3 for everything else.

MetaMemory handles Voyage AI's batching requirements, automatic truncation, and rate limit management out of the box. The integration also supports Voyage's reranking capabilities, which MetaMemory can optionally use as a second-pass filter to further improve retrieval precision. For teams building developer tools, code assistants, or technical support agents, Voyage AI combined with MetaMemory delivers best-in-class memory quality.
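As a sketch of what mixing models per memory type might look like, consider the provider configuration below. Note that the `model_overrides` field and the `process` memory-type key are assumptions for illustration, not documented configuration fields:

```json
{
  "provider": "voyage-ai",
  "default_model": "voyage-3",
  "model_overrides": {
    "process": "voyage-code-3"
  }
}
```

Under this hypothetical layout, process memories would be embedded with the code-optimized model while all other memory types fall back to the default.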
Setup Guide
Get Your Voyage AI API Key
Visit dash.voyageai.com and create an account. After signing in, navigate to the API Keys page and click "Create New API Key". Give it a name that identifies your MetaMemory deployment. Voyage AI offers a free tier with enough credits to build and test your integration. Copy the key — you will need it in the next step. Review the rate limits for your plan tier so you can configure MetaMemory's batching accordingly.
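Rather than pasting the key into scripts, a common pattern is to keep it in an environment variable. The variable name below is a convention, not something MetaMemory requires; the `pa-` prefix matches the key format shown in the configuration example later on:

```shell
# Keep the Voyage AI key out of source control by exporting it
# in your shell profile or deployment environment.
export VOYAGE_API_KEY="pa-your-key-here"  # placeholder, not a real key
```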
Configure Voyage AI in MetaMemory
Open your MetaMemory dashboard and go to Settings then Provider Keys. Select "Voyage AI" from the provider dropdown and paste your API key. Choose voyage-3 as the default model for general-purpose memory encoding, or select voyage-code-3 if your primary use case involves code and technical content. MetaMemory will validate the connection and display the available models and their dimensions. You can change the default model at any time.
Create and Query Memories
With Voyage AI configured, use the MetaMemory API to store your first memory. The system routes your content to the Voyage embedding endpoint and generates all four multi-vector representations. Test retrieval by querying with both exact and semantically related terms. Voyage models are particularly strong at capturing paraphrased meaning, so try queries that express the same idea in different words to see the embedding quality in action.
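The calls might look like the following sketch. The `/v1/memories` and `/v1/memories/search` endpoints and their request fields are assumptions patterned after the provider-configuration call below, not confirmed API paths:

```shell
# Store a memory (hypothetical endpoint and fields)
curl -X POST https://api.metamemory.tech/v1/memories \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"content": "The deploy script lives in scripts/deploy.sh and requires VERSION to be set."}'

# Query with a paraphrase to exercise semantic retrieval
curl -X POST https://api.metamemory.tech/v1/memories/search \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "how do we release a new version?"}'
```

Querying with a paraphrase rather than the stored wording is a quick way to confirm the embedding model is capturing meaning, not just keywords.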
Configuration Example
curl -X POST https://api.metamemory.tech/v1/providers \
  -H "Authorization: Bearer YOUR_METAMEMORY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "voyage-ai",
    "api_key": "pa-YOUR_VOYAGE_API_KEY",
    "default_model": "voyage-3",
    "settings": {
      "input_type": "document",
      "truncation": true
    }
  }'
}'

Supported Models
voyage-3 (default, general-purpose retrieval)
voyage-3-lite (cost-optimized, smaller vectors)
voyage-code-3 (code and technical content)

Capabilities
Automatic batching, input truncation, rate limit management, and optional reranking as a second-pass retrieval filter.
Related Integrations
Cohere
OpenAI
Mistral AI
Ready to use Voyage AI with MetaMemory?
Get started in minutes. Connect your Voyage AI API key and give your agents persistent, intelligent memory.