
Adaptive Retrieval

A 7-layer self-improving system that continuously refines retrieval quality.

Overview

MetaMemory's adaptive retrieval system is a 7-layer stack that continuously learns which retrieval strategies work best for different query types. It starts with no assumptions and improves with every search.

Layer 1: Meta-Memory Rules

A library of 50+ LLM-discovered patterns that map query characteristics to effective retrieval strategies. For example: “temporal queries about recent events perform better with higher recency decay weights.” Rules require a minimum confidence of 0.6 to be applied.
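The rule structure below is a hypothetical illustration of the confidence gate; the actual rule schema and pattern-matching logic in MetaMemory are not shown here.

```python
# Hypothetical rule records illustrating the 0.6 confidence gate.
MIN_CONFIDENCE = 0.6

RULES = [
    {"pattern": "temporal",                     # query characteristic matched
     "strategy": {"recency_decay": 0.9},        # illustrative parameter tweak
     "confidence": 0.82},
    {"pattern": "code",
     "strategy": {"exact_match_boost": 1.5},
     "confidence": 0.45},                       # below the gate, never applied
]

def applicable_rules(query_type):
    """Return only rules that match the query type AND clear the gate."""
    return [r for r in RULES
            if r["pattern"] == query_type and r["confidence"] >= MIN_CONFIDENCE]
```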

Layer 2: Thompson Sampling

A Beta-Bernoulli multi-armed bandit that balances exploration and exploitation. Each strategy arm maintains a Beta(α, β) posterior distribution, starting from a uniform Beta(1,1) prior. On each query, the system samples from each arm's posterior and selects the strategy with the highest sample.
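A minimal sketch of the Beta-Bernoulli sampler described above; the arm names and update API are illustrative, not MetaMemory's actual interface.

```python
import random

class ThompsonSampler:
    """Beta-Bernoulli bandit over retrieval strategies (names illustrative)."""

    def __init__(self, arms):
        # Start every arm at the uniform Beta(1, 1) prior.
        self.posteriors = {arm: [1, 1] for arm in arms}

    def select(self):
        # Sample once from each arm's Beta posterior; pick the highest sample.
        samples = {arm: random.betavariate(a, b)
                   for arm, (a, b) in self.posteriors.items()}
        return max(samples, key=samples.get)

    def update(self, arm, success):
        # Bernoulli reward: success increments alpha, failure increments beta.
        self.posteriors[arm][0 if success else 1] += 1
```

After each retrieval, the chosen arm is updated with a binary success signal, so arms that keep producing good results concentrate their posteriors near 1 and win the sampling step more often.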

Layer 3: UCB Selection

Upper Confidence Bound selection provides a complementary exploration signal. It selects the arm that maximizes the mean reward plus an exploration bonus proportional to uncertainty, ensuring under-explored strategies get tried.
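A sketch of the classic UCB1 scoring rule; the per-arm statistics format is an assumption for illustration.

```python
import math

def ucb_select(stats, c=2.0):
    """Pick the arm maximizing mean reward + exploration bonus.

    stats: {arm: (pulls, total_reward)}. The bonus sqrt(c * ln(N) / pulls)
    grows with total pulls N and shrinks as an arm is tried, so
    under-explored arms eventually get selected.
    """
    total = sum(pulls for pulls, _ in stats.values())

    def score(item):
        _, (pulls, reward) = item
        if pulls == 0:
            return float("inf")  # untried arms are always selected first
        return reward / pulls + math.sqrt(c * math.log(total) / pulls)

    return max(stats.items(), key=score)[0]
```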

Layer 4: Gradient Boosting

After 100+ retrieval samples have accumulated, a gradient boosting model (50 decision stumps, learning rate η=0.1) activates. It predicts retrieval effectiveness based on query features, context, and historical performance.
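The single-feature sketch below shows the 50-stump, η=0.1 configuration on squared loss; MetaMemory's real feature set (query features, context, history) is multi-dimensional and not reproduced here.

```python
def fit_stump(xs, residuals):
    """Best threshold split on one feature, minimizing squared error."""
    best = None
    for t in sorted(set(xs))[:-1]:  # max value would leave the right side empty
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    return best[1], best[2], best[3]

def fit_boosted(xs, ys, n_stumps=50, lr=0.1):
    """Gradient boosting on squared loss: each stump fits the residuals."""
    base = sum(ys) / len(ys)
    preds = [base] * len(ys)
    stumps = []
    for _ in range(n_stumps):
        residuals = [y - p for y, p in zip(ys, preds)]
        t, lm, rm = fit_stump(xs, residuals)
        stumps.append((t, lm, rm))
        preds = [p + lr * (lm if x <= t else rm) for x, p in zip(xs, preds)]
    return base, lr, stumps

def predict(model, x):
    base, lr, stumps = model
    return base + sum(lr * (lm if x <= t else rm) for t, lm, rm in stumps)
```

With η=0.1, each round removes roughly 10% of the remaining residual, which is why the model needs the 100+ sample warm-up before its predictions are trustworthy.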

Layer 5: Ensemble Orchestration

Combines all strategy signals using weighted averaging:

| Signal                  | Weight |
|-------------------------|--------|
| ML (Gradient Boosting)  | 0.4    |
| Collaborative Filtering | 0.3    |
| MAB (Thompson/UCB)      | 0.2    |
| Meta-Memory Rules       | 0.1    |
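The weighting above can be sketched as follows; the renormalization for missing signals (e.g., before the ML layer activates at 100 samples) is an assumption about the behavior, not a documented detail.

```python
# Ensemble weights from the table above.
WEIGHTS = {"ml": 0.4, "collaborative": 0.3, "mab": 0.2, "rules": 0.1}

def ensemble_score(signals):
    """Weighted average of per-signal scores in [0, 1].

    signals: {signal_name: score}. Missing signals contribute nothing;
    their weight is renormalized away (an assumption for illustration).
    """
    present = {k: v for k, v in signals.items() if k in WEIGHTS}
    total_w = sum(WEIGHTS[k] for k in present)
    return sum(WEIGHTS[k] * v for k, v in present.items()) / total_w
```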

Layer 6: Bayesian Parameter Optimization

Optimizes 7 continuous parameters (similarity threshold, decay rate, channel weights, etc.) using Bayesian optimization. Targets the 75th percentile of retrieval quality rather than the mean, ensuring consistently good results rather than high-variance performance.
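The key idea is the percentile objective. The sketch below substitutes plain random search for the Bayesian optimizer (the surrogate-model machinery is omitted) and optimizes only one hypothetical parameter, the similarity threshold, to keep the example short.

```python
import random
import statistics

def p75(samples):
    # 75th percentile of observed retrieval-quality scores
    return statistics.quantiles(samples, n=4)[2]

def optimize(evaluate, n_trials=50, seed=0):
    """Random search standing in for Bayesian optimization (sketch).

    evaluate(threshold) -> list of quality scores for queries run with
    that similarity threshold. The objective is the 75th percentile,
    not the mean, favoring consistently good settings over high-variance ones.
    """
    rng = random.Random(seed)
    best_t, best_score = None, float("-inf")
    for _ in range(n_trials):
        t = rng.uniform(0.0, 1.0)  # candidate similarity threshold
        score = p75(evaluate(t))
        if score > best_score:
            best_t, best_score = t, score
    return best_t, best_score
```

A real Bayesian optimizer would fit a surrogate (e.g., a Gaussian process) over all 7 parameters and choose candidates by expected improvement, but the percentile objective plugs in the same way.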

Layer 7: Online Drift Detection

Monitors retrieval quality in real time. If average quality drops 10% or more below the rolling baseline, the system automatically retrains the gradient boosting model and resets the Bayesian optimizer, preventing degradation from distributional shift.
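A minimal sketch of the drift check; the window sizes are illustrative assumptions, only the 10% drop threshold comes from the description above.

```python
from collections import deque

class DriftDetector:
    """Flags a retrain when recent average quality drops >= 10%
    below the rolling baseline (window sizes are illustrative)."""

    def __init__(self, baseline_window=100, recent_window=20, drop=0.10):
        self.baseline = deque(maxlen=baseline_window)
        self.recent = deque(maxlen=recent_window)
        self.drop = drop

    def observe(self, quality):
        """Record one quality score; return True if a retrain should fire."""
        self.baseline.append(quality)
        self.recent.append(quality)
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough recent data yet
        base = sum(self.baseline) / len(self.baseline)
        now = sum(self.recent) / len(self.recent)
        return now <= base * (1 - self.drop)
```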