Migrate from Mem0

Mem0 provides agent memory as a standalone service. HatiData provides agent memory as part of a complete data infrastructure -- with SQL queryability, hybrid search, governance, chain-of-thought logging, branch isolation, and a Postgres-compatible wire protocol. This guide covers what changes, how to migrate your data, and how to rewrite Mem0 API calls.

Feature Comparison

| Capability | Mem0 | HatiData |
| --- | --- | --- |
| Memory storage | Vector + metadata | SQL + vector hybrid |
| Search | Vector similarity | Hybrid: vector ANN + SQL filters |
| Metadata filtering | Key-value filters | Full SQL predicates, joins, aggregates |
| Wire protocol | REST API | Postgres wire protocol (port 5439) + REST |
| SQL queries on memories | Not supported | Native -- SELECT * FROM _hatidata_agent_memory |
| Cross-table joins | Not supported | Join memories with any business table |
| Chain-of-thought ledger | Not supported | Cryptographically hash-chained, immutable |
| Semantic triggers | Not supported | Cosine-based event firing |
| Branch isolation | Not supported | Copy-on-write schema branches |
| RBAC / ABAC | Basic API keys | Full RBAC + ABAC with row-level security |
| Audit trail | Not supported | Immutable, hash-chained audit log |
| Per-agent billing | Not supported | Native per-agent metering |
| MCP tools | Not supported | 24 MCP tools for Claude, Cursor, etc. |
| Framework integrations | LangChain, CrewAI | LangChain, CrewAI, AutoGen, LlamaIndex, Vercel AI SDK |

Data Migration

Step 1: Export from Mem0

Export your memories from Mem0 using their API:

from mem0 import MemoryClient as Mem0Client
import json

mem0 = Mem0Client(api_key="your-mem0-api-key")

# Export all memories for each agent
agent_ids = ["agent-1", "agent-2", "support-agent"]
all_memories = []

for agent_id in agent_ids:
    memories = mem0.get_all(agent_id=agent_id)
    for m in memories:
        all_memories.append({
            "id": m["id"],
            "agent_id": agent_id,
            "content": m["memory"],
            "metadata": m.get("metadata", {}),
            "created_at": m.get("created_at"),
        })

with open("mem0-export.jsonl", "w") as f:
    for m in all_memories:
        f.write(json.dumps(m) + "\n")

print(f"Exported {len(all_memories)} memories.")

Step 2: Import into HatiData

from hatidata import HatiDataClient
from hatidata.memory import MemoryClient
import json

client = HatiDataClient(
    host="localhost",
    port=5439,
    api_key="hd_live_your_api_key",
)
memory = MemoryClient(client)

with open("mem0-export.jsonl") as f:
    for line in f:
        m = json.loads(line)
        memory.store(
            agent_id=m["agent_id"],
            content=m["content"],
            metadata={
                **m["metadata"],
                "migrated_from": "mem0",
                "original_id": m["id"],
            },
        )

print("Import complete.")
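For large exports, calling memory.store once per record can be slow and hard to resume after a failure. A minimal sketch of chunking the JSONL file into batches, using only the Python standard library (batch size and helper names are illustrative, not part of either SDK):

```python
import itertools
import json

def batched(iterable, size):
    """Yield successive lists of up to `size` items from an iterable."""
    it = iter(iterable)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

def load_batches(path, size=100):
    """Parse a JSONL export and yield its records in batches."""
    with open(path) as f:
        records = (json.loads(line) for line in f if line.strip())
        yield from batched(records, size)
```

Each batch can then be imported inside a try/except, so a transient error only requires retrying that one batch instead of restarting the whole import.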

Step 3: Verify Import

-- Count imported memories
SELECT agent_id, COUNT(*) AS memory_count
FROM _hatidata_agent_memory
WHERE metadata->>'migrated_from' = 'mem0'
GROUP BY agent_id;

-- Spot-check content
SELECT memory_id, agent_id, content, created_at
FROM _hatidata_agent_memory
WHERE metadata->>'migrated_from' = 'mem0'
LIMIT 10;
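You can also cross-check the SQL counts against the export file itself. This stand-alone snippet tallies memories per agent directly from mem0-export.jsonl (pure standard library, no server needed); any agent whose count differs from the SQL result above had stores that should be re-run:

```python
import json
from collections import Counter

def export_counts(path):
    """Count exported memories per agent_id in a JSONL export file."""
    counts = Counter()
    with open(path) as f:
        for line in f:
            if line.strip():
                counts[json.loads(line)["agent_id"]] += 1
    return dict(counts)
```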

API Migration Reference

Store Memory

# Mem0
mem0.add(
    messages="User prefers email notifications over SMS.",
    agent_id="support-agent",
    metadata={"category": "preference"},
)

# HatiData
memory.store(
    agent_id="support-agent",
    content="User prefers email notifications over SMS.",
    metadata={"category": "preference"},
)

Search Memory

# Mem0
results = mem0.search(
    query="notification preferences",
    agent_id="support-agent",
    limit=5,
)

# HatiData -- SDK
results = memory.search(
    agent_id="support-agent",
    query="notification preferences",
    top_k=5,
)

# HatiData -- SQL (not possible with Mem0)
rows = client.query("""
    SELECT content, metadata,
           semantic_rank(content, 'notification preferences') AS relevance
    FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
      AND semantic_match(content, 'notification preferences', 0.7)
    ORDER BY relevance DESC
    LIMIT 5
""")

Get All Memories

# Mem0
memories = mem0.get_all(agent_id="support-agent")

# HatiData -- SDK
memories = memory.list(agent_id="support-agent")

# HatiData -- SQL (with full filtering power)
rows = client.query("""
    SELECT memory_id, content, metadata, created_at
    FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
    ORDER BY created_at DESC
""")

Delete Memory

# Mem0
mem0.delete(memory_id="mem-abc123")

# HatiData -- SDK
memory.delete(memory_id="mem-abc123")

# HatiData -- SQL (bulk delete with filters)
client.execute("""
    DELETE FROM _hatidata_agent_memory
    WHERE agent_id = 'support-agent'
      AND CAST(metadata->>'importance' AS FLOAT) < 0.3
      AND created_at < NOW() - INTERVAL '90 days'
""")
Search with Metadata Filters

# Mem0
results = mem0.search(
    query="billing issue",
    agent_id="support-agent",
    filters={"category": "billing"},
)

# HatiData -- with full SQL filter power
results = memory.search(
    agent_id="support-agent",
    query="billing issue",
    filters={
        "category": "billing",
        "created_at": {"gte": "2025-10-01"},
    },
    top_k=10,
    min_score=0.7,
)
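The filters dict maps naturally onto SQL predicates. The sketch below is illustrative only, not the SDK's actual implementation: it treats every key as a metadata field for simplicity (in the real table, created_at is a column) and inlines values as literals where production code would use bind parameters:

```python
# Hypothetical translation of a filters dict into SQL predicates,
# for illustration of why SQL subsumes key-value filtering.
OPS = {"gte": ">=", "lte": "<=", "gt": ">", "lt": "<"}

def filters_to_sql(filters):
    """Render a filters dict as an AND-joined SQL predicate string."""
    preds = []
    for key, cond in filters.items():
        if isinstance(cond, dict):
            # Range condition, e.g. {"gte": "2025-10-01"}
            for op, value in cond.items():
                preds.append(f"metadata->>'{key}' {OPS[op]} '{value}'")
        else:
            # Equality condition, e.g. "billing"
            preds.append(f"metadata->>'{key}' = '{cond}'")
    return " AND ".join(preds)
```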

What You Gain

Migrating from Mem0 to HatiData gives agents capabilities that are not possible with a memory-only service:

  • SQL on memories -- Query, filter, join, and aggregate memories with standard SQL
  • Hybrid search -- Vector ANN + SQL predicates in a single query
  • Cross-table joins -- Join memories with business data, events, and knowledge bases
  • Chain-of-thought ledger -- Immutable reasoning traces alongside memories
  • Semantic triggers -- Fire events when stored content matches a concept
  • Branch isolation -- Explore data safely without affecting production
  • Governance -- Row-level security, column masking, per-agent audit trails
  • Postgres wire protocol -- Connect with psycopg2, asyncpg, SQLAlchemy, any BI tool
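Because HatiData speaks the Postgres wire protocol, any Postgres driver should be able to connect with a standard libpq-style URL. A minimal sketch of building one, with hypothetical credentials and database name (substitute your deployment's values); the psycopg2 call is shown as a comment since it needs a live server:

```python
from urllib.parse import quote

# Hypothetical credentials -- replace with your deployment's values.
user, password = "support-agent", "hd_live_your_api_key"
host, port, database = "localhost", 5439, "hatidata"

# Standard libpq-style connection URL, accepted by psycopg2, asyncpg,
# SQLAlchemy, and most BI tools. quote() escapes special characters.
dsn = f"postgresql://{quote(user)}:{quote(password)}@{host}:{port}/{database}"

# With psycopg2 (not executed here):
#   import psycopg2
#   conn = psycopg2.connect(dsn)
#   with conn.cursor() as cur:
#       cur.execute("SELECT COUNT(*) FROM _hatidata_agent_memory")
print(dsn)
```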
