Memor is an open-source Python library offering AI agents a robust memory management system. It enables persistent storage of embeddings, context-aware retrieval, filtering, and segmentation across multiple backends. Developers can seamlessly integrate memory operations into chatbots or autonomous agents, allowing models to recall past interactions and maintain continuity over long conversations.
Memor offers a memory subsystem for language model agents, allowing them to store embeddings of past events, user preferences, and contextual data in vector databases. It supports multiple backends such as FAISS, Elasticsearch, and in-memory stores. Using semantic similarity search, agents can retrieve relevant memories based on query embeddings and metadata filters. Memor’s customizable memory pipelines include chunking, indexing, and eviction policies, ensuring scalable, long-term context management. Integrate it within your agent’s workflow to enrich prompts with dynamic historical context and boost response relevance over multi-session interactions.
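A minimal sketch of the retrieval flow described above, assuming a hypothetical Memory class with a query() method and an embed_text() helper; these names and parameters are illustrative assumptions, not Memor's confirmed API.

```python
# Illustrative only: the class, method, and parameter names below are
# assumptions based on the description above, not the confirmed Memor API.
from memor import Memory  # hypothetical import

def embed_text(text):
    """Placeholder embedding function; use your embedding model in practice."""
    return [0.0] * 768

# Open a FAISS-backed memory store (Elasticsearch and in-memory backends
# are described as alternatives).
memory = Memory(backend="faiss", embedding_dim=768)

# Semantic retrieval: find memories similar to the query embedding,
# narrowed by a metadata filter.
results = memory.query(
    embedding=embed_text("What did the user say about travel plans?"),
    top_k=5,
    filters={"user_id": "u-123", "topic": "travel"},
)
for item in results:
    print(item.text, item.metadata)
```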
Who will use Memor?
AI Developers
LLM Researchers
Chatbot Builders
Software Engineers
How to use Memor?
Step 1: Install Memor via pip install memor
Step 2: Import the Memor client and configure your vector store backend
Step 3: Initialize a memory session with the desired parameters
Step 4: Use memory.write() to store embeddings and metadata
Step 5: Use memory.query() to retrieve relevant past memories
Step 6: Integrate memory operations within your agent’s interaction loop (see the end-to-end sketch below)
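Putting the steps together, here is a hedged end-to-end sketch. The memory.write() and memory.query() calls mirror the step names above; the client class, keyword arguments, and the embed()/call_llm() helpers are illustrative assumptions rather than the documented Memor interface.

```python
# Step 1 (shell): pip install memor
# Steps 2-6 sketched with assumed names; adjust to the actual Memor API.
from memor import MemoryClient  # hypothetical client class

def embed(text):
    """Placeholder embedding function; swap in your embedding model."""
    return [0.0] * 768

def call_llm(prompt):
    """Placeholder LLM call; replace with your model or framework of choice."""
    return "..."

# Steps 2-3: configure the vector store backend and open a memory session.
client = MemoryClient(backend="faiss", index_name="assistant-memory")
memory = client.session(user_id="u-123")

def agent_turn(user_message):
    # Step 5: retrieve relevant past memories for this turn.
    recalled = memory.query(embedding=embed(user_message), top_k=3)
    history = "\n".join(item.text for item in recalled)

    # Step 6: enrich the prompt with recalled context before calling the model.
    reply = call_llm(f"Relevant history:\n{history}\n\nUser: {user_message}")

    # Step 4: store the new exchange so later sessions can recall it.
    memory.write(embedding=embed(user_message), text=user_message,
                 metadata={"role": "user"})
    memory.write(embedding=embed(reply), text=reply,
                 metadata={"role": "assistant"})
    return reply

print(agent_turn("Remind me what cities we discussed for my trip."))
```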
Platform
macOS
Windows
Linux
Memor's Core Features & Benefits
The Core Features
Vector-based memory storage
Multi-backend support (FAISS, Elasticsearch, in-memory)
Semantic retrieval with similarity search
Metadata filtering and chunking
Customizable eviction policies
Context segmentation and indexing (see the configuration sketch after this list)
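A hedged configuration sketch for the pipeline features listed above (multi-backend support, chunking, eviction policies, and segmentation). All class names and parameters here are assumptions chosen for illustration; check Memor's documentation for the real configuration surface.

```python
# Hypothetical configuration sketch; names are illustrative, not Memor's
# documented API.
from memor import MemoryClient  # hypothetical client class

client = MemoryClient(
    backend="elasticsearch",   # or "faiss" / "in_memory"
    index_name="agent-memory",
    chunk_size=512,            # split long documents into 512-token chunks
    chunk_overlap=64,          # overlap to preserve context across chunk edges
    eviction_policy="lru",     # drop least-recently-used items when full
    max_items=50_000,          # capacity before eviction kicks in
)

# Segment memories per conversation so retrieval stays scoped to one session.
memory = client.session(user_id="u-123", segment="support-chat")
```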
The Benefits
Enhanced context continuity across sessions
Improved response relevance
Scalable long-term storage
Open-source flexibility
Easy integration with LLM frameworks
Memor's Main Use Cases & Applications
Contextual chatbots that remember user preferences across sessions
Autonomous agents with long-term memory for multi-step tasks
Personalized virtual assistants with user-specific knowledge