Convert any CSV into CREATE TABLE DDL and INSERT statements instantly. Paste your CSV data, pick a target dialect (Snowflake, PostgreSQL, BigQuery, or ANSI SQL), and get a complete scaffold with inferred column types. The parser follows RFC 4180, handling quoted fields, escaped quotes, and embedded commas. All processing runs in your browser.
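RFC 4180 quoting covers commas inside quoted fields and literal quotes escaped by doubling (`""`). Python's standard `csv` module follows the same rules, so a quick sketch of what the parser must handle looks like this:

```python
import csv
import io

# RFC 4180 sample: a quoted field with an embedded comma,
# and a doubled quote ("") escaping a literal quote character.
raw = 'name,motto\n"Smith, Jane","She said ""hi"""\n'

rows = list(csv.reader(io.StringIO(raw)))
print(rows)  # [['name', 'motto'], ['Smith, Jane', 'She said "hi"']]
```

The embedded comma stays inside the second row's first field, and the doubled quotes collapse to a single literal quote.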
Every column is scanned across all of its non-empty values. If every value matches an integer pattern, the column is typed as NUMBER(38,0) for Snowflake, BIGINT for Postgres, or INT64 for BigQuery. Decimals get NUMBER(38,10) / NUMERIC. ISO dates (YYYY-MM-DD) become DATE; ISO timestamps become TIMESTAMP. Booleans (true/false) become BOOLEAN. Any value that breaks the candidate pattern downgrades the column to VARCHAR / STRING / TEXT, with the length set by the longest observed value.
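The inference pass can be sketched as a simple scan, checking types from strictest to loosest and falling back to a VARCHAR sized by the longest value. This is a hypothetical re-implementation (Snowflake type names used for illustration), not the tool's actual code:

```python
import re

# Candidate patterns, strictest first; a column gets the first type
# that every non-empty value matches (hypothetical sketch).
PATTERNS = [
    ("BOOLEAN",       re.compile(r"^(true|false)$", re.I)),
    ("NUMBER(38,0)",  re.compile(r"^-?\d+$")),
    ("NUMBER(38,10)", re.compile(r"^-?\d+\.\d+$")),
    ("DATE",          re.compile(r"^\d{4}-\d{2}-\d{2}$")),
    ("TIMESTAMP",     re.compile(r"^\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}")),
]

def infer_type(values):
    non_empty = [v for v in values if v != ""]
    if not non_empty:
        return "VARCHAR(1)"
    for sql_type, pattern in PATTERNS:
        if all(pattern.match(v) for v in non_empty):
            return sql_type
    # Fallback: VARCHAR sized by the longest observed value.
    return f"VARCHAR({max(len(v) for v in non_empty)})"

print(infer_type(["1", "42", "-7"]))             # NUMBER(38,0)
print(infer_type(["2024-01-01", "2024-02-29"]))  # DATE
print(infer_type(["1", "hello", ""]))            # VARCHAR(5)
```

A single stray value like "hello" in an otherwise numeric column forces the downgrade, which is exactly the behavior described above.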
Generated INSERT statements are ideal for seeding test data, dbt seeds, fixtures, and small lookup tables (under ~1,000 rows). For production ingestion of larger CSVs, use your warehouse's bulk-load command instead: bulk loads are typically 100-1000x faster and cheaper than row-by-row INSERTs. Use this tool to generate just the CREATE TABLE DDL, then load with one of the following:
Snowflake: COPY INTO my_table FROM @stage/file.csv FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
PostgreSQL (psql): \copy my_table FROM 'file.csv' CSV HEADER
BigQuery: bq load --source_format=CSV --skip_leading_rows=1 dataset.table gs://bucket/file.csv
Amazon Redshift: COPY my_table FROM 's3://bucket/file.csv' IAM_ROLE '...' CSV IGNOREHEADER 1

The CSV parser and SQL generator run 100% in your browser. No data is uploaded, nothing is stored, and the tool works offline after the first load. Paste internal or regulated data with confidence: nothing leaves your machine.
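As an end-to-end example, a Snowflake workflow might pair the tool's generated DDL with a staged bulk load. The table, column, and stage names below are hypothetical placeholders, not output the tool is guaranteed to produce:

```sql
-- DDL generated by the tool (hypothetical example output)
CREATE TABLE customers (
    id        NUMBER(38,0),
    signup_dt DATE,
    email     VARCHAR(64)
);

-- Bulk-load the full CSV instead of running row-by-row INSERTs
COPY INTO customers
FROM @my_stage/customers.csv
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

The same pattern applies to the other dialects: create the table from the generated DDL, then point the dialect's bulk loader at the file.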
See the JSON to SQL DDL Generator for JSON inputs, the SQL Formatter to pretty-print the output, and the dbt Schema.yml Generator to turn your new CREATE TABLE into a full dbt staging scaffold. For Snowflake loading patterns, see the Snowflake SQL reference.