# Migrate from Mem0
Mem0 provides agent memory as a standalone service. HatiData provides agent memory as part of a complete data infrastructure -- with SQL queryability, hybrid search, governance, chain-of-thought logging, branch isolation, and a Postgres-compatible wire protocol. This guide covers what changes, how to migrate your data, and how to rewrite Mem0 API calls.
## Feature Comparison
| Capability | Mem0 | HatiData |
|---|---|---|
| Memory storage | Vector + metadata | SQL + vector hybrid |
| Search | Vector similarity | Hybrid: vector ANN + SQL filters |
| Metadata filtering | Key-value filters | Full SQL predicates, joins, aggregates |
| Wire protocol | REST API | Postgres wire protocol (port 5439) + REST |
| SQL queries on memories | Not supported | Native -- `SELECT * FROM _hatidata_agent_memory` |
| Cross-table joins | Not supported | Join memories with any business table |
| Chain-of-thought ledger | Not supported | Cryptographically hash-chained, immutable |
| Semantic triggers | Not supported | Cosine-based event firing |
| Branch isolation | Not supported | Copy-on-write schema branches |
| RBAC / ABAC | Basic API keys | Full RBAC + ABAC with row-level security |
| Audit trail | Not supported | Immutable, hash-chained audit log |
| Per-agent billing | Not supported | Native per-agent metering |
| MCP tools | Not supported | 24 MCP tools for Claude, Cursor, etc. |
| Framework integrations | LangChain, CrewAI | LangChain, CrewAI, AutoGen, LlamaIndex, Vercel AI SDK |
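The "Search" row is the key architectural difference: HatiData evaluates vector similarity and SQL-style predicates in a single pass, rather than filtering after a pure vector lookup. A toy in-memory sketch of the idea (plain Python, not HatiData code):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Memories as (embedding, metadata) pairs -- in HatiData these live in one table.
memories = [
    ([0.9, 0.1], {"agent_id": "support-agent", "category": "billing"}),
    ([0.8, 0.2], {"agent_id": "support-agent", "category": "preference"}),
    ([0.1, 0.9], {"agent_id": "other-agent",   "category": "billing"}),
]

query = [1.0, 0.0]

# Hybrid search = structured predicate AND vector ranking in the same query.
hits = sorted(
    (m for m in memories
     if m[1]["agent_id"] == "support-agent"   # SQL-style predicate
     and cosine(query, m[0]) >= 0.7),         # vector similarity threshold
    key=lambda m: cosine(query, m[0]),
    reverse=True,
)
print(len(hits))  # → 2
```

Both `support-agent` memories clear the 0.7 similarity threshold; the third is excluded by the predicate, not the vector score.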
## Data Migration
### Step 1: Export from Mem0

Export your memories from Mem0 using their API:

```python
from mem0 import MemoryClient as Mem0Client
import json

mem0 = Mem0Client(api_key="your-mem0-api-key")

# Export all memories for each agent
agent_ids = ["agent-1", "agent-2", "support-agent"]
all_memories = []
for agent_id in agent_ids:
    memories = mem0.get_all(agent_id=agent_id)
    for m in memories:
        all_memories.append({
            "id": m["id"],
            "agent_id": agent_id,
            "content": m["memory"],
            "metadata": m.get("metadata", {}),
            "created_at": m.get("created_at"),
        })

with open("mem0-export.jsonl", "w") as f:
    for m in all_memories:
        f.write(json.dumps(m) + "\n")

print(f"Exported {len(all_memories)} memories.")
```
### Step 2: Import into HatiData

```python
from hatidata import HatiDataClient
from hatidata.memory import MemoryClient
import json

client = HatiDataClient(
    host="localhost",
    port=5439,
    api_key="hd_live_your_api_key",
)
memory = MemoryClient(client)

with open("mem0-export.jsonl") as f:
    for line in f:
        m = json.loads(line)
        memory.store(
            agent_id=m["agent_id"],
            content=m["content"],
            metadata={
                **m["metadata"],
                "migrated_from": "mem0",
                "original_id": m["id"],
            },
        )

print("Import complete.")
```
### Step 3: Verify Import

```sql
-- Count imported memories
SELECT agent_id, COUNT(*) AS memory_count
FROM _hatidata_agent_memory
WHERE metadata->>'migrated_from' = 'mem0'
GROUP BY agent_id;

-- Spot-check content
SELECT memory_id, agent_id, content, created_at
FROM _hatidata_agent_memory
WHERE metadata->>'migrated_from' = 'mem0'
LIMIT 10;
```
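The per-agent counts from the SQL above should match the export file exactly. A quick local tally of the JSONL gives you the expected numbers to compare against (shown here with inline sample lines; in practice, iterate over `mem0-export.jsonl`):

```python
import json
from collections import Counter

# Sample lines in the same shape as mem0-export.jsonl from Step 1.
export_lines = [
    '{"id": "m1", "agent_id": "agent-1", "content": "a", "metadata": {}}',
    '{"id": "m2", "agent_id": "agent-1", "content": "b", "metadata": {}}',
    '{"id": "m3", "agent_id": "support-agent", "content": "c", "metadata": {}}',
]
# In practice: export_lines = open("mem0-export.jsonl")

counts = Counter(json.loads(line)["agent_id"] for line in export_lines)
for agent_id, n in sorted(counts.items()):
    print(f"{agent_id}: {n}")  # compare with memory_count from the SQL query
```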
## API Migration Reference
### Store Memory

```python
# Mem0
mem0.add(
    messages="User prefers email notifications over SMS.",
    agent_id="support-agent",
    metadata={"category": "preference"},
)

# HatiData
memory.store(
    agent_id="support-agent",
    content="User prefers email notifications over SMS.",
    metadata={"category": "preference"},
)
```
### Search Memory

```python
# Mem0
results = mem0.search(
    query="notification preferences",
    agent_id="support-agent",
    limit=5,
)

# HatiData -- SDK
results = memory.search(
    agent_id="support-agent",
    query="notification preferences",
    top_k=5,
)

# HatiData -- SQL (not possible with Mem0)
rows = client.query("""
    SELECT content, metadata,
           semantic_rank(content, 'notification preferences') AS relevance
    FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
      AND semantic_match(content, 'notification preferences', 0.7)
    ORDER BY relevance DESC
    LIMIT 5
""")
```
### Get All Memories

```python
# Mem0
memories = mem0.get_all(agent_id="support-agent")

# HatiData -- SDK
memories = memory.list(agent_id="support-agent")

# HatiData -- SQL (with full filtering power)
rows = client.query("""
    SELECT memory_id, content, metadata, created_at
    FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
    ORDER BY created_at DESC
""")
```
### Delete Memory

```python
# Mem0
mem0.delete(memory_id="mem-abc123")

# HatiData -- SDK
memory.delete(memory_id="mem-abc123")

# HatiData -- SQL (bulk delete with filters)
client.execute("""
    DELETE FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
      AND CAST(metadata->>'importance' AS FLOAT) < 0.3
      AND created_at < NOW() - INTERVAL '90 days'
""")
```
### Filtered Search

```python
# Mem0
results = mem0.search(
    query="billing issue",
    agent_id="support-agent",
    filters={"category": "billing"},
)

# HatiData -- with full SQL filter power
results = memory.search(
    agent_id="support-agent",
    query="billing issue",
    filters={
        "category": "billing",
        "created_at": {"gte": "2025-10-01"},
    },
    top_k=10,
    min_score=0.7,
)
```
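The same filtered search can also be expressed directly in SQL, where the metadata filter and the date cutoff become ordinary predicates alongside the `semantic_match`/`semantic_rank` calls shown earlier (a sketch of one equivalent query, not the only form):

```sql
SELECT content, metadata,
       semantic_rank(content, 'billing issue') AS relevance
FROM _hatidata_agent_memory
WHERE agent_id = 'support-agent'
  AND metadata->>'category' = 'billing'
  AND created_at >= '2025-10-01'
  AND semantic_match(content, 'billing issue', 0.7)
ORDER BY relevance DESC
LIMIT 10;
```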
## What You Gain
Migrating from Mem0 to HatiData gives agents capabilities that are not possible with a memory-only service:
- SQL on memories -- Query, filter, join, and aggregate memories with standard SQL
- Hybrid search -- Vector ANN + SQL predicates in a single query
- Cross-table joins -- Join memories with business data, events, and knowledge bases
- Chain-of-thought ledger -- Immutable reasoning traces alongside memories
- Semantic triggers -- Fire events when stored content matches a concept
- Branch isolation -- Explore data safely without affecting production
- Governance -- Row-level security, column masking, per-agent audit trails
- Postgres wire protocol -- Connect with psycopg2, asyncpg, SQLAlchemy, any BI tool
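For example, the cross-table join above means a memory lookup can pull in live business data in the same statement. A sketch, assuming a hypothetical `orders` table keyed by a `customer_id` stored in memory metadata:

```sql
SELECT m.content, o.order_id, o.status
FROM _hatidata_agent_memory AS m
JOIN orders AS o
  ON o.customer_id = m.metadata->>'customer_id'
WHERE m.agent_id = 'support-agent'
  AND semantic_match(m.content, 'shipping complaint', 0.7);
```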
## Related Concepts
- Persistent Memory -- Memory architecture
- Hybrid SQL -- `semantic_match` and `semantic_rank`
- Memory Patterns -- Advanced memory patterns
- MCP Tools Reference -- `store_memory`, `search_memory` tools
- Migrate from Pinecone -- Vector database migration