An MCP server that gives LLMs persistent, searchable semantic memory.
```bash
pip install mcp-external-memory
```

```python
from mcp_external_memory import memory_store, memory_search

# Store a memory
result = memory_store(content="Alice prefers dark mode", namespace="users", tags=["alice", "ui"])

# Search memories
results = memory_search(query="what does Alice prefer?", namespace="users")
```

Run `mcp-external-memory --help` to see the available CLI options.

| Tool | Description |
|---|---|
| `memory_store` | Persist text + optional namespace/tags/metadata |
| `memory_search` | Semantic search (cosine similarity) over all memories |
| `memory_get` | Retrieve a single memory by ID |
| `memory_delete` | Delete a memory by ID |
| `memory_list` | List memories with optional namespace/tag filter + pagination |
| `memory_stats` | Count of memories, namespaces, DB path |
| `memory_update` | Update an existing memory |
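To illustrate the namespace/tag filtering and pagination that `memory_list` describes, here is a minimal, self-contained sketch. The `Memory` dataclass and `list_memories` helper are hypothetical illustrations of the semantics, not the server's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    id: int
    content: str
    namespace: str = "default"
    tags: list[str] = field(default_factory=list)

def list_memories(store, namespace=None, tag=None, offset=0, limit=10):
    """Keep memories matching the optional namespace/tag filters, then paginate."""
    hits = [m for m in store
            if (namespace is None or m.namespace == namespace)
            and (tag is None or tag in m.tags)]
    return hits[offset:offset + limit]

store = [
    Memory(1, "Alice prefers dark mode", "users", ["alice", "ui"]),
    Memory(2, "Bob uses vim keybindings", "users", ["bob"]),
    Memory(3, "Deploys run nightly", "ops", ["ci"]),
]
print([m.id for m in list_memories(store, namespace="users", tag="alice")])  # → [1]
```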
The server supports multiple embedding backends:
- TF-IDF (default): Pure Python, no external dependencies
- OpenAI: Uses the `text-embedding-3-small` model
- Ollama: Local embeddings with Ollama
Set via the `MEMORY_EMBED_BACKEND` environment variable.
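As a rough sketch of what a pure-Python TF-IDF backend computes (term-frequency vectors weighted by inverse document frequency, compared by cosine similarity), assuming nothing about the server's internals — the tokenizer, smoothing, and helper names below are illustrative choices:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse TF-IDF vectors (dicts) for a small corpus."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter(term for toks in tokenized for term in set(toks))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}  # smoothed IDF
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        vecs.append({t: tf[t] / len(toks) * idf[t] for t in tf})
    return vecs, idf

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["alice prefers dark mode", "bob likes light mode", "deploys run nightly"]
vecs, idf = tfidf_vectors(docs)

# Embed a query with the corpus IDF; unseen terms get zero weight.
qtokens = "dark mode preference".lower().split()
qtf = Counter(qtokens)
qvec = {t: qtf[t] / len(qtokens) * idf.get(t, 0.0) for t in qtf}

scores = [cosine(qvec, v) for v in vecs]
best = max(range(len(docs)), key=lambda i: scores[i])  # index of the closest memory
```

The same ranking idea carries over to the OpenAI and Ollama backends; only the embedding function changes.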
```bash
git clone https://github.com/daedalus/mcp-external-memory.git
cd mcp-external-memory
pip install -e ".[test]"

# run tests
pytest

# format
ruff format src/ tests/

# lint
ruff check src/ tests/

# type check
mypy src/
```

mcp-name: io.github.daedalus/mcp-external-memory