CSV to SQL
Generate SQL INSERT statements from CSV data
What is CSV to SQL Conversion?
CSV to SQL conversion transforms comma-separated values into Structured Query Language statements that a relational database can execute. The most common output is a pair: a CREATE TABLE statement that defines columns, and one or more INSERT INTO statements that populate those columns with the rows from the CSV file.
CSV files are loosely standardized by RFC 4180: each line is a record, fields are separated by a delimiter (usually a comma), and fields containing the delimiter, double quotes, or newlines are wrapped in double quotes. SQL, on the other hand, is the standard language for managing data in relational systems such as PostgreSQL, MySQL, SQLite, and SQL Server. Converting between these two formats is one of the most frequent data-import tasks in software development.
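Python's built-in csv module implements these quoting rules, which makes them easy to see in action:

```python
import csv
import io

# A field containing the delimiter must be wrapped in double quotes (RFC 4180);
# a literal double quote inside a quoted field is written twice.
raw = 'name,quote\n"Smith, Jane","She said ""hi"""\n'

rows = list(csv.reader(io.StringIO(raw)))
print(rows)  # → [['name', 'quote'], ['Smith, Jane', 'She said "hi"']]
```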
A proper CSV-to-SQL converter handles quoting, escapes single quotes inside values (doubling them to '' per the SQL standard), maps column headers to valid SQL identifiers, and can optionally infer data types. Beyond basic escaping, a good converter normalizes header names (replacing spaces and hyphens with underscores) and wraps the output in a transaction block, so an import failure rolls back cleanly instead of leaving partial data in the table. Without a converter, hand-writing INSERT statements for even a few hundred rows is slow and error-prone.
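The two escaping rules above fit in a few lines of Python. This is a minimal sketch; the helper names are illustrative, not from any particular library:

```python
import re

def sql_quote(value):
    # Escape by doubling single quotes, per the SQL standard: O'Brien → O''Brien
    return "'" + value.replace("'", "''") + "'"

def sql_ident(header):
    # Normalize a header into a safe identifier: "Unit-Price" → unit_price
    return re.sub(r'[\s-]+', '_', header.strip()).lower()

print(sql_quote("O'Brien"))     # → 'O''Brien'
print(sql_ident("Unit-Price"))  # → unit_price
```

Emitting `BEGIN;` before the statements and `COMMIT;` after them gives the transactional rollback behavior described above.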
Why Use a CSV to SQL Converter?
Writing SQL INSERT statements by hand from spreadsheet data is tedious and invites syntax errors. A converter automates the repetitive parts so you can focus on schema design and data validation.
SQL Data Type Reference
When converting CSV to SQL, every column defaults to TEXT because CSV has no type metadata. If you know the column types, you can replace TEXT in the CREATE TABLE output. This table shows the most common SQL types and when to use each.
| Type | Used For | Notes |
|---|---|---|
| TEXT / VARCHAR | Strings, mixed content | Default safe choice; works in every SQL dialect |
| INTEGER / INT | Whole numbers (age, count, ID) | Use when column contains only digits with no decimals |
| REAL / FLOAT | Decimal numbers (price, rate) | Needed for columns with dots like 19.99 or 3.14 |
| DATE | ISO 8601 dates (2024-01-15) | Requires consistent formatting; varies by database |
| BOOLEAN | true/false, 1/0, yes/no | MySQL uses TINYINT(1); PostgreSQL has native BOOL |
| NUMERIC / DECIMAL | Exact precision (currency) | Specify scale: DECIMAL(10,2) for money columns |
| BLOB / BYTEA | Binary data | Rarely needed in CSV imports; use for hex-encoded fields |
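Type inference can be as simple as testing every value in a column against progressively looser patterns, falling back to TEXT whenever a column is ambiguous. The helper below is a hypothetical sketch, not taken from any particular tool:

```python
import re

def infer_sql_type(values):
    """Pick a SQL type that fits every non-empty value in a column."""
    non_empty = [v for v in values if v != '']
    if not non_empty:
        return 'TEXT'
    if all(re.fullmatch(r'-?\d+', v) for v in non_empty):
        return 'INTEGER'
    if all(re.fullmatch(r'-?\d+\.\d+', v) for v in non_empty):
        return 'REAL'
    if all(v.lower() in ('true', 'false', 'yes', 'no') for v in non_empty):
        return 'BOOLEAN'
    return 'TEXT'  # safe default when nothing narrower matches

print(infer_sql_type(['30', '25']))        # → INTEGER
print(infer_sql_type(['19.99', '3.14']))   # → REAL
print(infer_sql_type(['Berlin', 'Tokyo'])) # → TEXT
```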
SQL Dialect Comparison
SQL syntax varies between database engines. The generated INSERT statements use standard SQL that works in most systems, but certain features differ. This table summarizes the differences that matter most when importing CSV data.
| Feature | PostgreSQL | MySQL | SQLite | SQL Server |
|---|---|---|---|---|
| Identifier quoting | "col" | `col` | "col" | [col] |
| String escape | '' | '' or \' | '' | '' |
| INSERT syntax | INSERT INTO | INSERT INTO | INSERT INTO | INSERT INTO |
| Batch INSERT | VALUES (),()… | VALUES (),()… | VALUES (),()… | max 1000 rows |
| Auto-increment | SERIAL | AUTO_INCREMENT | AUTOINCREMENT | IDENTITY(1,1) |
| Upsert / merge | ON CONFLICT | ON DUPLICATE KEY | ON CONFLICT | MERGE |
| NULL handling | IS NULL | IS NULL / <=> | IS NULL | IS NULL |
| COPY from CSV | COPY β¦ FROM | LOAD DATA INFILE | .import | BULK INSERT |
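When targeting a specific engine, identifier quoting is usually the first difference that bites. A small lookup table covering the four dialects above might look like this (a sketch, not a library API):

```python
# Open/close identifier quote characters per dialect, from the table above
IDENT_QUOTES = {
    'postgresql': ('"', '"'),
    'mysql':      ('`', '`'),
    'sqlite':     ('"', '"'),
    'sqlserver':  ('[', ']'),
}

def quote_ident(name, dialect):
    open_q, close_q = IDENT_QUOTES[dialect]
    return f'{open_q}{name}{close_q}'

print(quote_ident('order', 'mysql'))      # → `order`
print(quote_ident('order', 'sqlserver'))  # → [order]
```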
INSERT vs COPY: Choosing an Import Method
For small to medium datasets (under roughly 10,000 rows), INSERT statements work well and are portable across every SQL database. For larger imports, databases provide bulk-loading commands (COPY, LOAD DATA, BULK INSERT) that skip per-statement parsing and planning, and are typically far faster.
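Between plain INSERTs and server-side bulk loading sits a middle ground: driver-level batching through a single prepared statement. With Python's sqlite3 module, for example, executemany streams parsed CSV rows through one statement:

```python
import csv
import io
import sqlite3

csv_string = "name,age,city\nAlice,30,Berlin\nBob,25,Tokyo"
reader = csv.reader(io.StringIO(csv_string))
next(reader)  # skip the header row

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE users (name TEXT, age TEXT, city TEXT)')
# One prepared statement, many parameter rows — no string escaping needed,
# since the driver binds values instead of splicing them into SQL text.
conn.executemany('INSERT INTO users VALUES (?, ?, ?)', reader)
conn.commit()
print(conn.execute('SELECT COUNT(*) FROM users').fetchone()[0])  # → 2
```

Parameter binding also sidesteps SQL injection entirely, which matters if the CSV comes from an untrusted source.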
Code Examples
These examples show how to convert CSV to SQL INSERT statements in different languages. Each handles single-quote escaping and column name sanitization.
// CSV β SQL INSERT statements
const csv = `name,age,city
Alice,30,Berlin
Bob,25,Tokyo`
function csvToSql(csv, table = 'data') {
  // Naive parse: split on commas — assumes no quoted fields containing delimiters
  const rows = csv.trim().split('\n').map(r => r.split(','))
  const [headers, ...data] = rows
  const cols = headers.map(h => h.trim().toLowerCase().replace(/\s+/g, '_'))
  // Escape single quotes by doubling them, per the SQL standard
  const values = data.map(row =>
    ' (' + row.map(v => `'${v.trim().replace(/'/g, "''")}'`).join(', ') + ')'
  )
  return `INSERT INTO ${table} (${cols.join(', ')}) VALUES
${values.join(',\n')};`
}
console.log(csvToSql(csv, 'users'))
// → INSERT INTO users (name, age, city) VALUES
//    ('Alice', '30', 'Berlin'),
//    ('Bob', '25', 'Tokyo');

import csv
import io
csv_string = """name,age,city
Alice,30,Berlin
Bob,25,Tokyo"""
reader = csv.reader(io.StringIO(csv_string))
headers = [h.strip().lower().replace(' ', '_') for h in next(reader)]
table = 'users'
rows = list(reader)
# CREATE TABLE
col_defs = ', '.join(f'{h} TEXT' for h in headers)
print(f'CREATE TABLE {table} ({col_defs});')
# → CREATE TABLE users (name TEXT, age TEXT, city TEXT);
# INSERT statements
sq = "'"
for row in rows:
    # Double single quotes inside values, per the SQL standard
    vals = ', '.join(f"'{v.replace(sq, sq * 2)}'" for v in row)
    print(f"INSERT INTO {table} ({', '.join(headers)}) VALUES ({vals});")
# → INSERT INTO users (name, age, city) VALUES ('Alice', '30', 'Berlin');
# → INSERT INTO users (name, age, city) VALUES ('Bob', '25', 'Tokyo');

package main
import (
	"encoding/csv"
	"fmt"
	"strings"
)
func csvToSQL(data, table string) string {
	r := csv.NewReader(strings.NewReader(data))
	records, err := r.ReadAll()
	if err != nil || len(records) < 2 {
		return ""
	}
	headers := records[0]
	var vals []string
	for _, row := range records[1:] {
		escaped := make([]string, len(row))
		for i, v := range row {
			// Double single quotes per the SQL standard
			escaped[i] = "'" + strings.ReplaceAll(v, "'", "''") + "'"
		}
		vals = append(vals, " ("+strings.Join(escaped, ", ")+")")
	}
	return fmt.Sprintf("INSERT INTO %s (%s) VALUES\n%s;",
		table, strings.Join(headers, ", "), strings.Join(vals, ",\n"))
}
func main() {
	data := "name,age,city\nAlice,30,Berlin\nBob,25,Tokyo"
	fmt.Println(csvToSQL(data, "users"))
	// → INSERT INTO users (name, age, city) VALUES
	//    ('Alice', '30', 'Berlin'),
	//    ('Bob', '25', 'Tokyo');
}
}

# SQLite: import CSV directly into a table
sqlite3 mydb.db <<EOF
.mode csv
.import data.csv users
SELECT * FROM users LIMIT 5;
EOF

# PostgreSQL: COPY from CSV file (server-side)
psql -c "COPY users FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);"

# PostgreSQL: \copy from CSV (client-side, no superuser needed)
psql -c "\copy users FROM 'data.csv' WITH (FORMAT csv, HEADER true);"

# MySQL: LOAD DATA from CSV
mysql -e "LOAD DATA INFILE '/path/to/data.csv' INTO TABLE users
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n' IGNORE 1 ROWS;"