HatiData Documentation
The data warehouse that gives every AI agent a brain. Sub-10ms SQL, persistent memory, semantic triggers -- all through a single Postgres connection.
What is HatiData
HatiData is a data warehouse built for AI agents. It speaks the Postgres wire protocol, understands Snowflake-compatible SQL, and delivers sub-10ms query latency -- so your existing queries, dashboards, and dbt models work without changes. Beyond SQL, HatiData gives agents persistent memory, semantic triggers, chain-of-thought logging, and branch isolation out of the box.
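Because the wire protocol is Postgres, any standard client or driver connects with an ordinary connection string. A hypothetical example (host, port, database, and user are placeholders, not real defaults):

```
postgresql://agent-7@hatidata.internal:5432/analytics
```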
Why HatiData
Legacy cloud warehouses were designed for human analysts running batch queries. AI agents need something different:
- Sub-10ms latency -- Agents make hundreds of queries per reasoning chain. Seconds-per-query latency breaks the loop.
- Your VPC, your data -- The data plane runs inside your infrastructure. No data leaves your network.
- Agent-native billing -- Per-agent metering, per-query audit trails, and reasoning chain tracking out of the box.
- Drop-in compatibility -- Write SQL the way you already do. HatiData transpiles it to DuckDB automatically.
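The latency point is a throughput argument: queries in a reasoning chain run sequentially, so per-query latency multiplies. A back-of-envelope sketch (query counts and latencies below are illustrative assumptions, not measured benchmarks):

```python
# Sequential queries in a reasoning chain: per-query latency multiplies.
# Numbers are illustrative assumptions, not measured figures.
def chain_seconds(n_queries: int, latency_ms: float) -> float:
    """Wall-clock seconds an agent spends waiting on n sequential queries."""
    return n_queries * latency_ms / 1000

print(chain_seconds(200, 2000))  # batch warehouse at 2 s/query: 400 s
print(chain_seconds(200, 8))     # sub-10 ms warehouse: 1.6 s
```

At hundreds of queries per chain, the difference is minutes of waiting versus an interactive response.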
Three-Layer Architecture
HatiData scales from a developer laptop to a multi-tenant enterprise deployment through three layers:
| Layer | Price | Data Location | Best For |
|---|---|---|---|
| Local | Free | Your machine | Development, prototyping, offline work |
| Cloud | $29/month | HatiData-managed | Small teams, startups, CI pipelines |
| Enterprise | Custom | Your VPC | Regulated industries, large-scale production |
How It Works
Every SQL query flows through a secure pipeline that handles access control, SQL transpilation, execution, column masking, metering, and audit automatically. Security and compliance are built into every query path -- not bolted on after.
For the full pipeline details, see Core Concepts.
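The pipeline described above can be sketched as a sequence of stages. The stage names and their order follow this page; every function body below is a placeholder assumption, not HatiData's implementation:

```python
# Hypothetical sketch of the per-query pipeline; stage names come from the
# docs, but each implementation here is a stand-in.
from dataclasses import dataclass, field

@dataclass
class QueryContext:
    agent_id: str
    sql: str
    result: list = field(default_factory=list)
    audit_log: list = field(default_factory=list)

def check_access(ctx):
    # Access control: a real deployment consults per-agent grants;
    # here we only require that the agent identified itself.
    if not ctx.agent_id:
        raise PermissionError("agent must identify itself")
    return ctx

def transpile(ctx):
    # Snowflake -> DuckDB rewrite; a single illustrative mapping.
    ctx.sql = ctx.sql.replace("NVL(", "COALESCE(")
    return ctx

def execute(ctx):
    # Execution: a real deployment runs the query on the engine.
    ctx.result = [("stub-row",)]
    return ctx

def mask_columns(ctx):
    # Column masking: stand-in no-op.
    return ctx

def meter_and_audit(ctx):
    # Metering + audit trail: record which agent ran what.
    ctx.audit_log.append((ctx.agent_id, ctx.sql))
    return ctx

PIPELINE = (check_access, transpile, execute, mask_columns, meter_and_audit)

def run_query(agent_id: str, sql: str) -> QueryContext:
    ctx = QueryContext(agent_id, sql)
    for stage in PIPELINE:
        ctx = stage(ctx)
    return ctx

ctx = run_query("agent-7", "SELECT NVL(name, 'n/a') FROM users")
print(ctx.sql)        # transpiled SQL
print(ctx.audit_log)  # one audit entry per query
```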
Key Differentiators
- Local-first -- Start with `hati init` and query immediately. No cloud account needed.
- Agent-native -- Agents identify themselves via startup parameters. Per-agent billing, scheduling, and audit happen automatically.
- Snowflake SQL compatible -- The transpiler handles function mapping (NVL, IFF, DATEDIFF, etc.), type mapping (VARIANT, TIMESTAMP_NTZ, etc.), and construct rewriting (FLATTEN, OBJECT_CONSTRUCT, etc.).
- Secure by default -- Access control, column masking, metering, and audit built into every query path.
- Three-tier caching -- Layered caching from in-memory to disk to object storage for minimal latency.
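The function mappings named above (NVL, IFF, DATEDIFF) can be illustrated with a toy lookup. The real transpiler parses SQL properly; the word-boundary substitution below is only a sketch of the idea:

```python
# Illustrative Snowflake -> DuckDB function mappings; the mapped names
# follow this page, but this naive substitution is not the real transpiler.
import re

FUNCTION_MAP = {
    "NVL": "COALESCE",        # NVL(a, b)             -> COALESCE(a, b)
    "IFF": "IF",              # IFF(cond, t, f)       -> IF(cond, t, f)
    "DATEDIFF": "DATE_DIFF",  # DATEDIFF('day', a, b) -> DATE_DIFF('day', a, b)
}

def sketch_transpile(sql: str) -> str:
    for snow_fn, duck_fn in FUNCTION_MAP.items():
        # Word boundary keeps IFF from matching inside DATEDIFF.
        sql = re.sub(rf"\b{snow_fn}\(", f"{duck_fn}(", sql)
    return sql

print(sketch_transpile("SELECT IFF(score > 90, 'A', 'B') FROM grades"))
```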
Open Source
The CLI, SDKs, and integrations are open source under the Apache-2.0 license:
- GitHub: HatiOS-AI/HatiData-Core -- CLI, Python SDK, TypeScript SDK, dbt adapter, LangChain, CrewAI, MCP configs, and examples
Next Steps
- Quickstart -- Install and run your first query in under 5 minutes
- SQL Compatibility -- See exactly which functions and types are supported
- Python SDK -- Agent-aware database access for Python
- Enterprise Deployment -- In-VPC architecture with PrivateLink