Memary is a Python library designed to streamline memory management in AI agents. It offers a unified interface supporting various storage backends like in-memory, Redis, and vector databases. Developers can define schemas for short-term and long-term memory and perform semantic retrieval using embeddings. Memary ensures seamless context persistence across sessions, empowering LLM applications such as chatbots and virtual assistants with improved conversational continuity and knowledge retention.
At its core, Memary provides a modular memory management system tailored for large language model agents. By abstracting memory interactions through a common API, it supports multiple storage backends, including in-memory dictionaries, Redis for distributed caching, and vector stores like Pinecone or FAISS for semantic search. Users define schema-based memories (episodic, semantic, or long-term) and leverage embedding models to populate vector stores automatically. Retrieval functions allow contextually relevant memory recall during conversations, enhancing agent responses with past interactions or domain-specific data. Designed for extensibility, Memary can integrate custom memory backends and embedding functions, making it ideal for developing robust, stateful AI applications such as virtual assistants, customer service bots, and research tools requiring persistent knowledge over time.
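The snippet below is a minimal sketch of that pattern in plain Python, not Memary's actual API: `MemoryEntry`, `MemoryBackend`, `InMemoryBackend`, and the toy `embed` function are illustrative assumptions used only to show how a unified interface over pluggable backends and embedding-based retrieval fits together.

```python
# Illustrative sketch only: these names (MemoryEntry, MemoryBackend,
# InMemoryBackend, embed) are assumptions for explanation, not Memary's API.
from dataclasses import dataclass, field
from typing import Protocol
import math


def embed(text: str) -> list[float]:
    """Stand-in embedding: a real setup would call an embedding model."""
    vec = [0.0] * 8
    for i, ch in enumerate(text.lower()):
        vec[i % 8] += ord(ch)
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


@dataclass
class MemoryEntry:
    text: str
    kind: str                      # e.g. "episodic", "semantic", "long-term"
    vector: list[float] = field(default_factory=list)


class MemoryBackend(Protocol):
    """Common interface a unified memory API could expose."""
    def add(self, entry: MemoryEntry) -> None: ...
    def search(self, query: str, top_k: int = 3) -> list[MemoryEntry]: ...


class InMemoryBackend:
    """Dictionary-style store; Redis or a vector DB would implement the same methods."""
    def __init__(self) -> None:
        self._entries: list[MemoryEntry] = []

    def add(self, entry: MemoryEntry) -> None:
        entry.vector = embed(entry.text)
        self._entries.append(entry)

    def search(self, query: str, top_k: int = 3) -> list[MemoryEntry]:
        q = embed(query)
        scored = sorted(
            self._entries,
            key=lambda e: sum(a * b for a, b in zip(q, e.vector)),
            reverse=True,
        )
        return scored[:top_k]


store = InMemoryBackend()
store.add(MemoryEntry("User prefers concise answers.", kind="long-term"))
store.add(MemoryEntry("User asked about vector databases yesterday.", kind="episodic"))
for hit in store.search("What databases did the user mention?"):
    print(hit.kind, "-", hit.text)
```

In this pattern, swapping the backend (Redis, Pinecone, FAISS) changes only the class behind the interface; the agent code that calls `add` and `search` stays the same.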
Who will use Memary?
AI developers
LLM researchers
Chatbot builders
Software architects
Enterprise teams
How to use Memary?
Step 1: Install Memary via pip (`pip install memary`)
Step 2: Import Memary and select a memory store backend
Step 3: Define memory schemas for short-term and long-term contexts
Step 4: Add memory entries and generate embeddings as needed
Step 5: Retrieve relevant memories during agent runtime
Step 6: Integrate Memary into your AI agent pipeline for context enrichment (an illustrative sketch of these steps follows)
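The following is a hedged, self-contained walk-through of Steps 2 through 6 in plain Python; `SimpleStore`, `add`, `retrieve`, and `build_prompt` are stand-in names for illustration and may not match Memary's real classes or methods.

```python
# Illustrative end-to-end flow following the steps above. The store here is a
# plain-Python stand-in; Memary's actual classes and method names may differ.

class SimpleStore:
    """Stand-in for a memory store (Step 2)."""
    def __init__(self) -> None:
        self.short_term: list[str] = []   # Step 3: short-term context
        self.long_term: list[str] = []    # Step 3: long-term knowledge

    def add(self, text: str, long_term: bool = False) -> None:
        # Step 4: add memory entries (embeddings omitted in this sketch).
        (self.long_term if long_term else self.short_term).append(text)

    def retrieve(self, query: str, top_k: int = 3) -> list[str]:
        # Step 5: naive keyword overlap; a real setup would use embeddings.
        words = {w.strip("?.,!").lower() for w in query.split() if len(w) > 3}
        pool = self.short_term + self.long_term
        hits = [t for t in pool
                if words & {w.strip("?.,!").lower() for w in t.split()}]
        return hits[:top_k]


def build_prompt(store: SimpleStore, user_message: str) -> str:
    """Step 6: enrich the agent prompt with retrieved context."""
    context = "\n".join(store.retrieve(user_message))
    return f"Relevant memory:\n{context}\n\nUser: {user_message}"


store = SimpleStore()
store.add("User's name is Priya.", long_term=True)
store.add("User asked about Redis-backed deployment earlier today.")
print(build_prompt(store, "What did I ask about Redis?"))
```

The retrieved memories are prepended to the prompt before the LLM call, which is the "context enrichment" Step 6 refers to.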
Platform
macOS
Windows
Linux
Memary's Core Features & Benefits
The Core Features
Unified memory API for AI agents
Support for in-memory, Redis, and vector store backends
Schema-based short-term and long-term memory definitions
Automatic embedding integration for semantic search
Contextual memory retrieval during conversations
Extensible architecture for custom backends (see the backend sketch after the benefits list)
The Benefits
Improves conversational continuity
Enables stateful AI applications
Scalable backend support
Modular and customizable
Persistent context across sessions
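To illustrate the extensibility point above, here is a hedged sketch of a custom Redis-backed store exposing the same `add`/`search` shape as the earlier in-memory example. It assumes a local Redis server and the redis-py client; `RedisBackend` and its methods are illustrative stand-ins, not part of Memary.

```python
# Illustrative custom backend: a Redis-backed store with the same add/search
# interface as the in-memory sketch. Assumes a local Redis server and the
# redis-py client; these names are stand-ins, not Memary's actual API.
import json

import redis


class RedisBackend:
    """Stores memory entries as JSON in a Redis list; retrieval stays client-side."""

    def __init__(self, key: str = "agent:memories") -> None:
        self.client = redis.Redis(decode_responses=True)
        self.key = key

    def add(self, text: str, kind: str = "episodic") -> None:
        self.client.rpush(self.key, json.dumps({"text": text, "kind": kind}))

    def search(self, query: str, top_k: int = 3) -> list[dict]:
        entries = [json.loads(raw) for raw in self.client.lrange(self.key, 0, -1)]
        words = {w.lower() for w in query.split()}
        scored = sorted(
            entries,
            key=lambda e: len(words & set(e["text"].lower().split())),
            reverse=True,
        )
        return scored[:top_k]
```

Because the interface matches the in-memory version, agent code written against the common API does not change when the backend does.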
Memary's Main Use Cases & Applications
Chatbots maintaining user context over multiple turns