
Migrate from Snowflake

HatiData transpiles Snowflake SQL automatically — most queries run unchanged on day one. This guide walks through Shadow Mode validation, schema and data migration, and connection cutover.


SQL Compatibility

HatiData's transpiler handles the majority of Snowflake SQL at query time. You do not need to rewrite queries before migrating.

Function Mappings

| Snowflake | HatiData | Notes |
| --- | --- | --- |
| NVL(a, b) | COALESCE(a, b) | Auto-transpiled |
| IFF(cond, a, b) | IF(cond, a, b) | Auto-transpiled |
| DATEDIFF('day', a, b) | DATE_DIFF('day', a, b) | Auto-transpiled |
| LISTAGG(col, ',') | STRING_AGG(col, ',') | Auto-transpiled |
| ARRAY_AGG(col) | LIST(col) | Auto-transpiled |
| ZEROIFNULL(a) | COALESCE(a, 0) | Auto-transpiled |
| IFNULL(a, b) | COALESCE(a, b) | Auto-transpiled |
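These rewrites are behavior-preserving. As a quick sanity check of the target semantics, the snippet below runs the COALESCE and conditional forms through Python's stdlib sqlite3 engine, used here purely as a stand-in (it is not HatiData, and it spells the conditional IIF rather than IF):

```python
import sqlite3

# Stand-in engine: stdlib sqlite3. It is NOT HatiData, but it implements the
# same COALESCE semantics the transpiler targets.
conn = sqlite3.connect(":memory:")

# NVL(a, b) -> COALESCE(a, b): return the first non-NULL argument.
assert conn.execute("SELECT COALESCE(NULL, 'fallback')").fetchone()[0] == "fallback"

# ZEROIFNULL(a) -> COALESCE(a, 0): NULL becomes 0, non-NULL passes through.
assert conn.execute("SELECT COALESCE(NULL, 0)").fetchone()[0] == 0
assert conn.execute("SELECT COALESCE(42, 0)").fetchone()[0] == 42

# IFF(cond, a, b) -> IF(cond, a, b): SQLite spells the same conditional IIF.
assert conn.execute("SELECT IIF(1 > 0, 'yes', 'no')").fetchone()[0] == "yes"

print("all equivalences hold")
```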

Type Mappings

| Snowflake Type | HatiData Type | Notes |
| --- | --- | --- |
| VARIANT | JSON | Full JSON path support |
| TIMESTAMP_NTZ | TIMESTAMP | UTC assumed |
| TIMESTAMP_TZ | TIMESTAMPTZ | Timezone preserved |
| NUMBER(p, s) | NUMERIC(p, s) | Exact precision |
| TEXT / STRING | VARCHAR | Unlimited by default |
| BINARY | BLOB | Byte-for-byte compatible |
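To make the table concrete, here is a minimal sketch of how these type mappings can be applied mechanically to a column type string. The snowflake_type_to_hatidata helper is hypothetical, for illustration only; it is not part of the hati CLI or transpiler:

```python
import re

# Hypothetical helper, not part of the hati CLI: rewrites Snowflake column
# types into their HatiData equivalents per the table above.
TYPE_MAP = {
    "VARIANT": "JSON",
    "TIMESTAMP_NTZ": "TIMESTAMP",
    "TIMESTAMP_TZ": "TIMESTAMPTZ",
    "TEXT": "VARCHAR",
    "STRING": "VARCHAR",
    "BINARY": "BLOB",
}

def snowflake_type_to_hatidata(sf_type: str) -> str:
    # NUMBER(p, s) keeps its precision and scale; only the keyword changes.
    m = re.fullmatch(r"NUMBER\((\d+),\s*(\d+)\)", sf_type, re.IGNORECASE)
    if m:
        return f"NUMERIC({m.group(1)}, {m.group(2)})"
    return TYPE_MAP.get(sf_type.upper(), sf_type)

print(snowflake_type_to_hatidata("NUMBER(38, 2)"))   # NUMERIC(38, 2)
print(snowflake_type_to_hatidata("TIMESTAMP_NTZ"))   # TIMESTAMP
```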

Construct Mappings

| Snowflake Construct | HatiData Equivalent |
| --- | --- |
| FLATTEN(input => col) | UNNEST(col) |
| OBJECT_CONSTRUCT('k', v) | json_object('k', v) |
| GET_PATH(col, 'a.b') | col->'a'->'b' |
| PARSE_JSON('...') | '...'::JSON |
| TO_ARRAY(col) | LIST_VALUE(col) |
| LATERAL FLATTEN | LATERAL UNNEST |
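The GET_PATH and -> forms share the same semantics: walk nested keys one segment at a time, yielding NULL when a segment is missing. A pure-Python illustration of that path logic (get_path is a made-up helper, not a HatiData API):

```python
import json
from functools import reduce

# Illustration only: GET_PATH(col, 'a.b') and col->'a'->'b' both walk nested
# keys one segment at a time, returning NULL (None here) on a missing segment.
def get_path(doc: dict, path: str):
    return reduce(
        lambda node, key: node.get(key) if isinstance(node, dict) else None,
        path.split("."),
        doc,
    )

variant = json.loads('{"a": {"b": 42}}')   # a VARIANT payload, now plain JSON
print(get_path(variant, "a.b"))    # 42
print(get_path(variant, "a.zzz"))  # None, i.e. SQL NULL
```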

Shadow Mode Validation

Shadow Mode lets you validate HatiData against your existing Snowflake environment before cutting over. No production traffic is affected.

# Step 1: Export your Snowflake query history (last 30 days)
hati shadow export-history \
  --source snowflake \
  --account myaccount.us-east-1 \
  --days 30 \
  --output query-history.jsonl

# Step 2: Upload and replay against HatiData
hati shadow replay \
  --queries query-history.jsonl \
  --connection postgresql://admin@localhost:5439/mydb \
  --sample-rate 0.2

# Step 3: Generate compatibility report
hati shadow report --output shadow-report.html

The report shows query-by-query result comparison, latency comparison, and any transpilation errors. Target a 95%+ pass rate before cutover.
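If you want to gate cutover in CI rather than eyeball the HTML report, the pass rate can be computed from the replay results directly. The JSONL record shape below (one status field per replayed query) is an assumption for illustration; check your actual hati shadow output for the real schema:

```python
import json

# Assumed record shape for illustration; the real `hati shadow` output schema
# may differ. One line per query: {"query_id": ..., "status": "pass" | "fail"}
replay_lines = [
    '{"query_id": "q1", "status": "pass"}',
    '{"query_id": "q2", "status": "pass"}',
    '{"query_id": "q3", "status": "fail"}',
]

results = [json.loads(line) for line in replay_lines]
pass_rate = sum(r["status"] == "pass" for r in results) / len(results)

print(f"pass rate: {pass_rate:.1%}")
if pass_rate < 0.95:
    print("below the 95% cutover target; inspect shadow-report.html")
```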

Step-by-Step Migration

Step 1: Export Schema

# Export DDL from Snowflake
snowsql -a myaccount -u myuser -d mydb -s myschema \
  --query "SELECT GET_DDL('SCHEMA', 'myschema')" \
  --output-format csv > schema.sql

# HatiData auto-transpiles the DDL on import
hati schema import --file schema.sql --target myschema

Step 2: Migrate Data via Parquet

Parquet is the recommended format for large tables — it preserves types and compresses well.

# Export tables from Snowflake as Parquet
snowsql -a myaccount -u myuser -d mydb \
  --query "COPY INTO @~/exports/orders FROM orders FILE_FORMAT=(TYPE=PARQUET)"

# Push Parquet files to HatiData
hati push --source ./exports/ --schema myschema --format parquet

For smaller tables, CSV export works fine:

hati push --source ./exports/ --schema myschema --format csv
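After pushing, it is worth confirming that row counts match between source and target. Because both snowflake.connector and any Postgres client follow DB-API 2.0, one helper can query both sides; the sketch below uses two in-memory sqlite3 databases as stand-ins for the real connections:

```python
import sqlite3

# Works with any DB-API 2.0 connection (snowflake.connector, psycopg2, ...).
# Two in-memory sqlite3 databases stand in for Snowflake and HatiData here.
def row_count(conn, table: str) -> int:
    # Table names cannot be bound as parameters; validate before interpolating.
    if not table.isidentifier():
        raise ValueError(f"suspicious table name: {table!r}")
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER)")
    db.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(100)])

assert row_count(source, "orders") == row_count(target, "orders")
print("row counts match:", row_count(target, "orders"))
```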

Step 3: Swap the Connection String

# Before: Snowflake connector
import snowflake.connector
conn = snowflake.connector.connect(
    account="myaccount.us-east-1",
    user="myuser",
    password="...",
    database="mydb",
    schema="myschema",
)

# After: any Postgres client — same queries, no rewrites required
import psycopg2
conn = psycopg2.connect(
    "postgresql://myuser:mypass@localhost:5439/mydb"
)

Your existing SQL continues to work. HatiData transpiles Snowflake SQL at query time, inside your VPC.

What You Gain

| Capability | Legacy Cloud Warehouse | HatiData |
| --- | --- | --- |
| Query latency | Seconds (cold) / 100ms+ (warm) | Sub-10ms (in-process query engine) |
| Deployment | Vendor cloud only | In your VPC |
| Agent memory | Not supported | Built-in SQL + vector hybrid |
| Chain-of-thought ledger | Not supported | Cryptographically hash-chained |
| Semantic triggers | Not supported | Cosine similarity evaluation |
| Branch isolation | Not supported | Per-agent schema branching |
| Billing model | Compute credits per query | Per-agent, predictable |
| Wire protocol | Proprietary | Postgres (any client works) |
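The hash-chained ledger in the table refers to a standard construction: each entry's digest covers the previous entry's digest, so tampering with any step invalidates every later link. A generic sketch of the idea (not HatiData's internal format):

```python
import hashlib

# Generic hash-chain sketch (not HatiData's actual ledger format): each
# entry's digest covers the previous digest, so edits break all later links.
def chain(entries):
    prev = "0" * 64          # genesis digest
    ledger = []
    for text in entries:
        digest = hashlib.sha256((prev + text).encode()).hexdigest()
        ledger.append({"entry": text, "prev": prev, "hash": digest})
        prev = digest
    return ledger

def verify(ledger):
    prev = "0" * 64
    for row in ledger:
        expected = hashlib.sha256((prev + row["entry"]).encode()).hexdigest()
        if row["prev"] != prev or row["hash"] != expected:
            return False
        prev = row["hash"]
    return True

ledger = chain(["plan: join orders", "ran query q1", "observed 42 rows"])
assert verify(ledger)
ledger[1]["entry"] = "ran query q999"   # tamper with the middle step
assert not verify(ledger)
print("tampering detected")
```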
