Cognitive memory
for AI agents.
No LLM required. No embeddings. No cloud. `pip install aura-memory`, 2.7 MB binary, works offline.
Autonomous cognitive memory
DNA Layering, Cognitive Crystallization, SDR Indexing, and Synaptic Synthesis — a unified architecture that learns, consolidates, and evolves without external model retraining.
DNA Memory Layers
Hierarchical memory partitioning: Identity (user_core) for permanent immutable data, Wisdom (super_core) for synthesized abstractions, Context (general) for transient high-velocity data.
Cognitive Crystallization
Autonomous promotion of memories from transient to permanent layers based on semantic intensity, temporal frequency, and cross-contextual relevance. Includes Flash-Crystallization for safety-critical triggers.
SDR Indexing
Deterministic O(k) search via Sparse Distributed Representations with Tanimoto similarity. Bitwise operations on 256K-bit vectors. Sub-millisecond recall, zero garbage collection pauses.
Synaptic Synthesis
Semantic deduplication via SDR resonance. High-resonance entries (Tanimoto score ≥ 0.75) are merged into dense super-synapses, eliminating redundancy without data loss.
Encryption at Rest
ChaCha20-Poly1305 with Argon2id key derivation. Append-only binary storage ensures transactional data integrity and power-loss resilience across edge and cloud.
Trust & Provenance
Source authority scoring, provenance stamping, credibility tracking for 60+ domains. Auto-protect guards detect PII (phone, email, wallets, API keys) automatically.
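As an illustration of what such auto-protect guards can look like, here is a minimal regex-based PII scanner. The patterns and category names are invented for this sketch and are not Aura's actual detection rules:

```python
import re

# Hypothetical patterns for a few of the categories the guards cover
# (phone, email, API keys); real-world detectors need broader rules.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "api_key": re.compile(r"\b(?:sk|pk)_[A-Za-z0-9]{16,}\b"),
}

def detect_pii(text: str) -> set[str]:
    """Return the set of PII categories detected in the text."""
    return {name for name, pat in PII_PATTERNS.items() if pat.search(text)}
```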
Cognitive Crystallization Process
From input to permanent memory — no LLM calls, no embedding API, no cloud. Pure deterministic computation in Rust.
Input Encoding
Text is converted into a Sparse Distributed Representation (SDR) — a 256K-bit vector via xxHash3. Deterministic, no neural model needed.
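A rough sketch of this encoding step, using Python's stdlib `blake2b` in place of xxHash3 and a much smaller bit width than the 256K bits described (all sizes here are illustrative):

```python
import hashlib

SDR_BITS = 4096        # demo width; the text describes 256K-bit vectors
BITS_PER_TOKEN = 8     # illustrative sparsity, not Aura's actual value

def encode_sdr(text: str) -> set[int]:
    """Hash each token to a few active bit positions.

    The SDR is represented as the set of active indices. Deterministic:
    the same text always yields the same SDR, with no neural model.
    """
    active: set[int] = set()
    for token in text.lower().split():
        for seed in range(BITS_PER_TOKEN):
            h = hashlib.blake2b(f"{seed}:{token}".encode(), digest_size=8)
            active.add(int.from_bytes(h.digest(), "big") % SDR_BITS)
    return active
```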
Anchor Check
Flash-Crystallization scans for safety-critical, emotional, or identity triggers. If detected, the record is immediately committed to user_core.
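The anchor check can be pictured as a scan against trigger lists. The example anchors below are hypothetical, not Aura's real trigger sets:

```python
# Hypothetical anchor terms; the real safety/identity trigger lists
# are internal to Aura and not shown in this document.
SAFETY_ANCHORS = {"allergic", "emergency", "never", "my name is"}

def flash_crystallize(text: str) -> bool:
    """Return True when an anchor is present, meaning the record
    bypasses the general layer and commits straight to user_core."""
    lowered = text.lower()
    return any(anchor in lowered for anchor in SAFETY_ANCHORS)
```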
Resonance Search
Tanimoto similarity is computed against existing synapses via bitwise operations. O(k) complexity where k = active bits.
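With an SDR packed into a big integer, the Tanimoto computation reduces to two bitwise operations and two popcounts, for example:

```python
def tanimoto(a: int, b: int) -> float:
    """Tanimoto similarity of two bit vectors stored as Python ints:
    popcount(a & b) / popcount(a | b). Cost scales with the number
    of set bits, i.e. O(k) for sparse vectors."""
    union = bin(a | b).count("1")
    return bin(a & b).count("1") / union if union else 0.0
```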
Store or Merge
A Tanimoto score ≥ 0.75 triggers Synaptic Synthesis (merge into a super-synapse); a score above 0.2 updates the existing synapse; below 0.2, a new synapse is created in the general layer.
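The threshold routing above, plus a bitwise-OR merge showing why synthesis loses no data (every active bit of both entries survives), might be sketched as:

```python
def route(score: float) -> str:
    """Route a record by its best Tanimoto score (0.75 / 0.2 thresholds)."""
    if score >= 0.75:
        return "merge"    # Synaptic Synthesis into a super-synapse
    if score > 0.2:
        return "update"   # reinforce the existing synapse
    return "create"       # new synapse in the general layer

def synthesize(a: int, b: int) -> int:
    """Merge two SDRs by bitwise OR: the super-synapse keeps every
    active bit of both entries, so nothing is dropped."""
    return a | b
```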
Crystallization
Background process autonomously promotes memories from general to super_core to user_core based on semantic intensity, access frequency, and cross-contextual relevance.
Kinetic Decay
Low-stability records are pruned via entropy-weighted decay. Each DNA layer has its own retention rate. Power-loss resilient via append-only binary storage.
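One way to picture per-layer kinetic decay; the decay rates and exponential form below are invented for this sketch, since Aura's actual entropy-weighted formula is not given in this text:

```python
import math

# Illustrative per-layer decay rates: user_core is protected from
# decay, general decays fastest. Values are made up for the sketch.
DECAY_RATE = {"user_core": 0.0, "super_core": 0.01, "general": 0.2}

def stability(initial: float, layer: str, hours_idle: float) -> float:
    """Exponential decay of a record's stability while unaccessed."""
    return initial * math.exp(-DECAY_RATE[layer] * hours_idle)

def prune(records, threshold=0.1):
    """Drop records whose decayed stability fell below the threshold."""
    return [r for r in records
            if stability(r["stability"], r["layer"], r["idle_h"]) >= threshold]
```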
How Aura compares
Most agent memory solutions require LLM calls for basic operations. Aura is pure local computation.
| Feature | Aura | Mem0 | Zep | Letta/MemGPT |
|---|---|---|---|---|
| LLM required | No | Yes | Yes | Yes |
| Embedding model required | No | Yes | Yes | No |
| Works fully offline | Yes | Partial | No | With local LLM |
| Cost per operation | $0 | API billing | Credit-based | LLM cost |
| Recall latency (1K records) | <1ms | ~200ms+ | ~200ms | LLM-bound |
| Binary size | 2.7 MB | ~50 MB+ (Python) | Cloud service | ~50 MB+ (Python) |
| Memory lifecycle (decay/promote) | Built-in | Via LLM | — | Via LLM |
| Trust & provenance | Yes | No | No | No |
| Encryption at rest | ChaCha20-Poly1305 | — | — | — |
| Language | Rust | Python | Proprietary | Python |
Three lines to remember everything
Python SDK works with any LLM framework. Store, recall, done.
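Since the SDK surface itself is not shown in this text, here is a self-contained toy mock of the store/recall shape. The class and method names are hypothetical, and recall here is naive token overlap rather than SDR resonance:

```python
# Toy in-memory mock of a store/recall API; not the real aura-memory SDK.
class ToyMemory:
    def __init__(self):
        self._records: list[str] = []

    def store(self, text: str) -> None:
        self._records.append(text)

    def recall(self, query: str) -> list[str]:
        """Rank stored records by token overlap with the query."""
        q = set(query.lower().split())
        scored = [(len(q & set(r.lower().split())), r) for r in self._records]
        return [r for s, r in sorted(scored, reverse=True) if s > 0]

# Store, recall, done:
mem = ToyMemory()
mem.store("User prefers dark mode")
print(mem.recall("dark mode theme")[0])  # -> User prefers dark mode
```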
DNA Memory Layers
Memories are partitioned into hierarchical layers with adaptive kinetic decay. Cognitive Crystallization promotes records upward automatically.
Identity (user_core): Permanent, immutable identity. User preferences, name, core traits. Protected from decay.
Wisdom (super_core): Synthesized wisdom and generalizations. Learned facts, decisions, domain knowledge. Auto-promoted via Crystallization.
Context (general): Transient high-velocity context. Recent messages, current tasks. Decays naturally, promotes on access frequency.
Install in one line
`pip install aura-memory` (Python 3.9+). Pre-built wheels for Linux, macOS, and Windows. No compilation needed.
Built in Ukraine