LEGBA's memory system mirrors biological memory formation: short-term working memory for immediate recall and long-term consolidation through sleep-like processes.
Implemented via a local vector store, STM holds recently verified events, each weighted by recency and importance. Like human working memory, it naturally decays over time unless reinforced.
Fast semantic search and retrieval
Slow exponential fading of unused items
Agent can mark important memories for retention
Dual scoring for retrieval prioritization
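A minimal sketch of how dual-scored retrieval with slow exponential fading could be wired up is shown below; the MemoryItem fields, the 24-hour half-life, and the 0.7/0.3 blend are illustrative assumptions rather than LEGBA's actual API.

```python
import math
import time
from dataclasses import dataclass, field

@dataclass
class MemoryItem:
    text: str
    embedding: list[float]
    importance: float            # 0..1, raised when the agent marks the memory as important
    pinned: bool = False         # pinned items are exempt from decay
    created_at: float = field(default_factory=time.time)

def recency_weight(item: MemoryItem, half_life_hours: float = 24.0) -> float:
    """Slow exponential fading: the weight halves every half_life_hours unless pinned."""
    if item.pinned:
        return 1.0
    age_hours = (time.time() - item.created_at) / 3600.0
    return 0.5 ** (age_hours / half_life_hours)

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a)) or 1e-9
    norm_b = math.sqrt(sum(x * x for x in b)) or 1e-9
    return dot / (norm_a * norm_b)

def retrieve(query_embedding: list[float], store: list[MemoryItem], k: int = 5) -> list[MemoryItem]:
    """Dual scoring: semantic similarity blended with recency x importance."""
    def score(item: MemoryItem) -> float:
        return (0.7 * cosine(query_embedding, item.embedding)
                + 0.3 * recency_weight(item) * item.importance)
    return sorted(store, key=score, reverse=True)[:k]
```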
Implemented as LoRA (Low-Rank Adaptation) adapters on top of the base model, LTM enables persistent learning without catastrophic forgetting or full model retraining.
Lightweight weight modifications
Consolidation during idle periods
Version control with zero downtime
Avoids catastrophic forgetting
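The low-rank adaptation idea behind LTM can be sketched as follows: the base weight stays frozen and only two small factors are trained on top of it. The dimensions, rank, and scaling convention here are assumed for illustration, not taken from LEGBA's configuration.

```python
import numpy as np

d_out, d_in, rank = 1024, 1024, 8              # illustrative sizes

W0 = np.random.randn(d_out, d_in) * 0.02       # frozen base-model weight, never retrained
A = np.random.randn(rank, d_in) * 0.01         # trainable low-rank factor
B = np.zeros((d_out, rank))                    # starts at zero, so the adapter begins as a no-op
alpha = 16.0                                   # scaling; effective update is (alpha / rank) * B @ A

def forward(x: np.ndarray) -> np.ndarray:
    """Base projection plus the lightweight low-rank correction."""
    return W0 @ x + (alpha / rank) * (B @ (A @ x))

# The adapter stores only A and B: 2 * rank * 1024 parameters instead of 1024 * 1024,
# which keeps each memory version small, versionable, and easy to roll back.
```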
LEGBA's autonomous sleep cycle processes verified experiences through the Planner, validates them via the Verifier, and generates low-rate LoRA updates with bounded influence and rollback on failure.
LEGBA's sleep cycle mirrors long-term potentiation, the biological process in which repeated neural activation strengthens synaptic connections. Sleep consolidation reinforces successful patterns through slow, continual LoRA weight adjustments with bounded influence. This creates durable learning without the catastrophic forgetting that plagues traditional fine-tuning.
Verified outcomes from the cognitive cycle enter short-term memory
Sleep cycle processes and filters experiences for long-term storage
LoRA adapters encode persistent behavioral improvements
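A minimal sketch of one consolidation step under these constraints is shown below; `grad` stands in for the gradient derived from verified experiences and `eval_fn` for the Verifier's held-out score, so both are assumptions about the surrounding system rather than LEGBA's real interfaces.

```python
import numpy as np

def clip_norm(delta: np.ndarray, max_norm: float) -> np.ndarray:
    """Bounded influence: scale the update down if it exceeds the norm budget."""
    norm = np.linalg.norm(delta)
    return delta if norm <= max_norm else delta * (max_norm / norm)

def consolidation_step(adapter: np.ndarray,
                       grad: np.ndarray,
                       eval_fn,
                       lr: float = 1e-4,
                       max_delta_norm: float = 0.05) -> np.ndarray:
    """One sleep-cycle step: low learning rate, clipped delta, rollback on regression."""
    baseline_score = eval_fn(adapter)
    delta = clip_norm(-lr * grad, max_delta_norm)   # low-rate update with a hard influence bound
    candidate = adapter + delta
    # Keep the update only if the Verifier confirms no regression; otherwise roll back.
    return candidate if eval_fn(candidate) >= baseline_score else adapter
```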
Unlike traditional fine-tuning, which can overwrite existing knowledge, LEGBA's cumulative LoRA updates preserve prior learning while adding new capabilities. Each sleep cycle builds upon previous adaptations rather than replacing them.
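One way to picture this cumulative behaviour is sketched below, assuming each sleep cycle contributes its own (B, A) factor pair that is kept alongside earlier ones; this composition strategy is an illustrative assumption, not necessarily how LEGBA stacks its adapters.

```python
import numpy as np

class CumulativeLTM:
    """Illustrative composition: each sleep cycle appends its own (B, A) pair and
    earlier pairs keep contributing, so prior learning is never overwritten."""

    def __init__(self, base_weight: np.ndarray):
        self.W0 = base_weight                                    # frozen base weight
        self.adapters: list[tuple[np.ndarray, np.ndarray]] = []  # one (B, A) pair per sleep cycle

    def consolidate(self, B: np.ndarray, A: np.ndarray) -> None:
        self.adapters.append((B, A))        # build on prior adaptations, never replace them

    def forward(self, x: np.ndarray) -> np.ndarray:
        out = self.W0 @ x
        for B, A in self.adapters:
            out = out + B @ (A @ x)         # every earlier adapter still applies
        return out
```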
Hot-swappable LoRA adapters allow the system to continue operating with the previous memory version while new consolidation occurs in the background. Updates apply seamlessly without service interruption.
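A generic hot-swap pattern of this kind might look like the sketch below; AdapterRegistry and its methods are hypothetical names used for illustration, not LEGBA's API.

```python
import threading

class AdapterRegistry:
    """Requests always read a complete adapter version; a version built in the
    background replaces it with one atomic swap, so serving never pauses."""

    def __init__(self, initial_version):
        self._lock = threading.Lock()
        self._active = initial_version

    def active(self):
        with self._lock:
            return self._active             # serving path: previous version until the swap

    def swap(self, new_version):
        with self._lock:
            self._active = new_version      # applied seamlessly, no service interruption
```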
LoRA adapters require only a fraction of the storage of full model copies. Each memory version is lightweight, versioned, and can be rolled back if needed, enabling extensive experimentation without storage concerns.
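For a rough sense of scale, the arithmetic below assumes a 7B-parameter base model stored at 2 bytes per parameter and rank-16 adapters on four projection matrices per layer; these are illustrative numbers, not LEGBA's actual configuration.

```python
# Illustrative storage comparison (assumed numbers):
# a 7B-parameter base model at 2 bytes per parameter versus rank-16 adapters on the
# query/key/value/output projections of 32 layers with hidden size 4096.
base_bytes = 7e9 * 2                                          # ~14 GB per full model copy
hidden, rank, layers, projections = 4096, 16, 32, 4
adapter_params = layers * projections * 2 * rank * hidden     # A and B factors
adapter_bytes = adapter_params * 2                            # ~34 MB per memory version

print(f"full copy: {base_bytes / 1e9:.1f} GB, adapter version: {adapter_bytes / 1e6:.1f} MB")
```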
Every memory consolidation is driven by verified performance deltas. The system only learns from confirmed successes and failures, ensuring that each adaptation moves toward measurable improvement rather than random drift.
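A small gate of this kind could sit in front of the consolidation step; the Experience fields and the delta threshold below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Experience:
    description: str
    verified: bool              # outcome confirmed by the Verifier
    performance_delta: float    # measured change against the prior baseline

def consolidation_candidates(experiences: list[Experience],
                             min_abs_delta: float = 0.02) -> list[Experience]:
    """Only verified outcomes with a measurable performance delta may drive a
    weight update; everything else is ignored, preventing random drift."""
    return [e for e in experiences
            if e.verified and abs(e.performance_delta) >= min_abs_delta]
```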
Dive into the engineering details, performance targets, and deployment requirements