CSV to SQL

Generate SQL INSERT statements from CSV data

What is CSV to SQL Conversion?

CSV to SQL conversion transforms comma-separated values into Structured Query Language statements that a relational database can execute. The most common output is a pair: a CREATE TABLE statement that defines columns, and one or more INSERT INTO statements that populate those columns with the rows from the CSV file.

CSV files follow the RFC 4180 specification. Each line is a record, fields are separated by a delimiter (usually a comma), and fields containing the delimiter or newlines are wrapped in double quotes. SQL, on the other hand, is the standard language for managing data in systems like PostgreSQL, MySQL, SQLite, and SQL Server. Converting between these two formats is one of the most frequent data import tasks in software development.

A proper CSV-to-SQL converter handles quoting, escapes single quotes inside values (doubling them to '' per SQL standard), maps column headers to valid SQL identifiers, and can optionally infer data types. Beyond basic escaping, a good converter normalises header names (replacing spaces and hyphens with underscores) and wraps the output in a transaction block, so an import failure rolls back cleanly instead of leaving partial data in the table. Without a converter, hand-writing INSERT statements for even a few hundred rows is error-prone and slow.
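The behavior described above (header normalisation, quote doubling, transaction wrapping) can be sketched in a few lines of Python. This is a minimal illustration, not this tool's implementation; the `csv_row_to_insert` helper is a name chosen for the example.

```python
import re

def csv_row_to_insert(table, headers, row):
    # Normalise headers: lowercase, spaces and hyphens become underscores
    cols = [re.sub(r"[\s-]+", "_", h.strip().lower()) for h in headers]
    # Escape single quotes by doubling them, per the SQL standard
    vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
    return f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({vals});"

stmts = [csv_row_to_insert("users", ["Full Name", "city"], ["O'Brien", "Cork"])]
# Wrap in a transaction so a failed import rolls back cleanly
sql = "BEGIN;\n" + "\n".join(stmts) + "\nCOMMIT;"
print(sql)
# β†’ BEGIN;
# β†’ INSERT INTO users (full_name, city) VALUES ('O''Brien', 'Cork');
# β†’ COMMIT;
```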

Why Use a CSV to SQL Converter?

Writing SQL INSERT statements by hand from spreadsheet data is tedious and invites syntax errors. A converter automates the repetitive parts so you can focus on schema design and data validation.

⚡
Generate SQL in seconds
Paste your CSV, get ready-to-run SQL output. No need to manually quote strings, escape apostrophes, or count columns. The conversion runs entirely in your browser.
🔒
Keep data in your browser
Your CSV never leaves your machine. All parsing and SQL generation happens client-side, which means no server upload, no logging, and no third-party access to your dataset.
🎯
Produce correct SQL syntax
Single quotes inside values are escaped as '' per the SQL standard. Column names are sanitized to valid identifiers. The output is syntactically valid for PostgreSQL, MySQL, and SQLite.
📋
Handle any CSV structure
The tool auto-detects delimiters (comma, semicolon, tab, pipe) and handles quoted fields with embedded commas or newlines, following RFC 4180 rules.

CSV to SQL Use Cases

Database Seeding
Convert a CSV fixture file into INSERT statements for development or staging databases. Useful in CI pipelines where you need repeatable seed data without an ORM.
Data Migration
Move data exported from one system (CRM, spreadsheet, legacy app) into a new relational database. Generate the SQL, review it, then run it in a transaction.
Backend API Development
Quickly populate a local PostgreSQL or MySQL instance with test data from a CSV. Faster than writing a migration script when you only need a one-time load.
QA and Test Automation
Generate SQL from CSV test fixtures to set up database state before integration tests. Pair with a teardown script to reset the table between test runs.
Data Analysis Prep
Load CSV datasets into SQLite for ad-hoc queries. SQLite reads SQL INSERT statements directly, making this a fast path from spreadsheet export to queryable data.
Learning SQL
Students can convert sample CSVs into SQL to practice SELECT, JOIN, and aggregate queries on real-looking data without setting up a schema from scratch.

SQL Data Type Reference

When converting CSV to SQL, every column defaults to TEXT because CSV has no type metadata. If you know the column types, you can replace TEXT in the CREATE TABLE output. This table shows the most common SQL types and when to use each.
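If you want to go beyond the TEXT default, simple type inference can be done by testing each column's values. A rough sketch (the `infer_sql_type` function is a hypothetical helper, not part of this tool):

```python
def infer_sql_type(values):
    """Guess a SQL column type from sample string values; fall back to TEXT."""
    non_empty = [v for v in values if v != ""]
    if not non_empty:
        return "TEXT"
    # Only digits (with optional leading minus): whole numbers
    if all(v.lstrip("-").isdigit() for v in non_empty):
        return "INTEGER"
    # Everything parses as a float: decimal numbers
    try:
        for v in non_empty:
            float(v)
        return "REAL"
    except ValueError:
        return "TEXT"

print(infer_sql_type(["30", "25"]))        # β†’ INTEGER
print(infer_sql_type(["19.99", "3.14"]))   # β†’ REAL
print(infer_sql_type(["Berlin", "Tokyo"])) # β†’ TEXT
```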

| Type | Used For | Notes |
| --- | --- | --- |
| TEXT / VARCHAR | Strings, mixed content | Default safe choice; works in every SQL dialect |
| INTEGER / INT | Whole numbers (age, count, ID) | Use when column contains only digits with no decimals |
| REAL / FLOAT | Decimal numbers (price, rate) | Needed for columns with dots like 19.99 or 3.14 |
| DATE | ISO 8601 dates (2024-01-15) | Requires consistent formatting; varies by database |
| BOOLEAN | true/false, 1/0, yes/no | MySQL uses TINYINT(1); PostgreSQL has native BOOL |
| NUMERIC / DECIMAL | Exact precision (currency) | Specify scale: DECIMAL(10,2) for money columns |
| BLOB / BYTEA | Binary data | Rarely needed in CSV imports; use for hex-encoded fields |

SQL Dialect Comparison

SQL syntax varies between database engines. The generated INSERT statements use standard SQL that works in most systems, but certain features differ. This table summarizes the differences that matter most when importing CSV data.

| Feature | PostgreSQL | MySQL | SQLite | SQL Server |
| --- | --- | --- | --- | --- |
| Identifier quoting | "col" | `col` | "col" | [col] |
| String escape | '' | '' or \' | '' | '' |
| INSERT syntax | INSERT INTO | INSERT INTO | INSERT INTO | INSERT INTO |
| Batch INSERT | VALUES (),()… | VALUES (),()… | VALUES (),()… | max 1000 rows |
| Auto-increment | SERIAL | AUTO_INCREMENT | AUTOINCREMENT | IDENTITY(1,1) |
| Upsert / merge | ON CONFLICT | ON DUPLICATE KEY | ON CONFLICT | MERGE |
| NULL handling | IS NULL | IS NULL / <=> | IS NULL | IS NULL |
| COPY from CSV | COPY … FROM | LOAD DATA INFILE | .import | BULK INSERT |

INSERT vs COPY: Choosing an Import Method

For small to medium datasets (under 10,000 rows), INSERT statements work well and are portable across every SQL database. For large imports, databases provide bulk-loading commands that bypass the SQL parser entirely.

INSERT INTO
Standard SQL. Works everywhere. Each row is parsed as a SQL statement, so overhead is higher for large datasets. Supports conditional logic (ON CONFLICT, ON DUPLICATE KEY). Best for seed data, small migrations, and cases where you need row-level control.
COPY / LOAD DATA
Database-specific bulk loader. PostgreSQL uses COPY, MySQL uses LOAD DATA INFILE, SQLite uses .import, and SQL Server uses BULK INSERT. Reads CSV directly, skipping the SQL parser. 10-100x faster for large files (100K+ rows). Requires file system access on the server or client.

Code Examples

These examples show how to convert CSV to SQL INSERT statements in different languages. Each handles single-quote escaping and column name sanitization.

JavaScript (Node.js)
// CSV β†’ SQL INSERT statements
const csv = `name,age,city
Alice,30,Berlin
Bob,25,Tokyo`

function csvToSql(csv, table = 'data') {
  // Naive parse: assumes no quoted fields with embedded commas or newlines
  const rows = csv.trim().split('\n').map(r => r.split(','))
  const [headers, ...data] = rows
  const cols = headers.map(h => h.trim().toLowerCase().replace(/\s+/g, '_'))

  const values = data.map(row =>
    '  (' + row.map(v => `'${v.replace(/'/g, "''").trim()}'`).join(', ') + ')'
  )

  return `INSERT INTO ${table} (${cols.join(', ')}) VALUES
${values.join(',\n')};`
}

console.log(csvToSql(csv, 'users'))
// β†’ INSERT INTO users (name, age, city) VALUES
//     ('Alice', '30', 'Berlin'),
//     ('Bob', '25', 'Tokyo');
Python
import csv
import io

csv_string = """name,age,city
Alice,30,Berlin
Bob,25,Tokyo"""

reader = csv.reader(io.StringIO(csv_string))
headers = [h.strip().lower().replace(' ', '_') for h in next(reader)]

table = 'users'
rows = list(reader)

# CREATE TABLE
col_defs = ', '.join(f'{h} TEXT' for h in headers)
print(f'CREATE TABLE {table} ({col_defs});')
# β†’ CREATE TABLE users (name TEXT, age TEXT, city TEXT);

# INSERT statements
for row in rows:
    # chr(39) is a single quote; doubling it produces the SQL '' escape
    vals = ', '.join(f"'{v.replace(chr(39), chr(39)*2)}'" for v in row)
    print(f"INSERT INTO {table} ({', '.join(headers)}) VALUES ({vals});")
# β†’ INSERT INTO users (name, age, city) VALUES ('Alice', '30', 'Berlin');
# β†’ INSERT INTO users (name, age, city) VALUES ('Bob', '25', 'Tokyo');
Go
package main

import (
	"encoding/csv"
	"fmt"
	"strings"
)

func csvToSQL(data, table string) string {
	r := csv.NewReader(strings.NewReader(data))
	records, _ := r.ReadAll() // error ignored for brevity
	if len(records) < 2 {
		return ""
	}

	headers := records[0]
	var vals []string
	for _, row := range records[1:] {
		escaped := make([]string, len(row))
		for i, v := range row {
			escaped[i] = "'" + strings.ReplaceAll(v, "'", "''") + "'"
		}
		vals = append(vals, "  ("+strings.Join(escaped, ", ")+")")
	}

	return fmt.Sprintf("INSERT INTO %s (%s) VALUES\n%s;",
		table, strings.Join(headers, ", "), strings.Join(vals, ",\n"))
}

func main() {
	csv := "name,age,city\nAlice,30,Berlin\nBob,25,Tokyo"
	fmt.Println(csvToSQL(csv, "users"))
	// β†’ INSERT INTO users (name, age, city) VALUES
	//     ('Alice', '30', 'Berlin'),
	//     ('Bob', '25', 'Tokyo');
}
CLI (sqlite3 / psql)
# SQLite: import CSV directly into a table
sqlite3 mydb.db <<EOF
.mode csv
.import data.csv users
SELECT * FROM users LIMIT 5;
EOF

# PostgreSQL: COPY from CSV file (server-side)
psql -c "COPY users FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);"

# PostgreSQL: \copy from CSV (client-side, no superuser needed)
psql -c "\copy users FROM 'data.csv' WITH (FORMAT csv, HEADER true);"

# MySQL: LOAD DATA from CSV
mysql mydb -e "LOAD DATA INFILE '/path/to/data.csv' INTO TABLE users
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n' IGNORE 1 ROWS;"

Frequently Asked Questions

How are single quotes in CSV values handled in SQL output?
Every single quote (') in a CSV value is doubled to '' in the SQL output. This is the SQL standard escape sequence. For example, a value like "O'Brien" becomes 'O''Brien' in the INSERT statement. This works in PostgreSQL, MySQL, SQLite, and SQL Server.
Can I convert CSV to SQL for a specific database like MySQL or PostgreSQL?
The generated INSERT statements use standard SQL syntax that both MySQL and PostgreSQL accept. The main difference is identifier quoting: PostgreSQL uses double quotes, MySQL uses backticks. For basic INSERT operations, the output works in either without modification.
What happens if my CSV has no header row?
The converter treats the first row as column headers. If your CSV lacks headers, add a header row before converting, or the first data row will become the column names in the CREATE TABLE statement. Most converters, including this one, require a header row.
Is there a row limit for CSV to SQL conversion?
Since the conversion runs in your browser, the practical limit depends on your device's memory. Files with tens of thousands of rows convert without issues. For very large files (500K+ rows), consider using a database's native COPY or LOAD DATA command instead of INSERT statements.
Why does the output use TEXT for all columns instead of INTEGER or DATE?
CSV is a plain text format with no type metadata. The converter uses TEXT as a safe default to avoid incorrect type inference. You can change the column types in the generated CREATE TABLE statement after reviewing your data. INTEGER for numeric columns and DATE for date columns are common adjustments.
How should I handle CSV files with semicolons or tabs as delimiters?
This tool auto-detects the delimiter by analyzing the first row of your CSV. It checks for commas, semicolons, tabs, and pipe characters, then uses whichever appears most frequently. European-format CSVs that use semicolons work without any configuration change.
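Python's standard-library `csv.Sniffer` does a comparable job of guessing the delimiter from a sample, and is a reasonable approach if you are scripting the conversion yourself:

```python
import csv

sample = "name;age;city\nAlice;30;Berlin"

# Sniffer inspects the sample and guesses the dialect, restricted here
# to the same candidate delimiters the tool checks
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print(dialect.delimiter)
# β†’ ;

rows = list(csv.reader(sample.splitlines(), dialect))
print(rows[0])
# β†’ ['name', 'age', 'city']
```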
Is the generated SQL safe from injection if I run it directly?
The output escapes single quotes, which prevents accidental syntax errors. However, if your CSV data comes from an untrusted source, treat the generated SQL the same way you would treat any unvalidated input. Review the output before running it against a production database. For programmatic imports, parameterized queries are always safer than string concatenation.