JSON to CSV conversion transforms structured data from JSON format into comma-separated values, a tabular format that spreadsheets, databases, and analytics tools consume natively. JSON stores data as nested objects and arrays with no fixed schema. CSV stores data as rows and columns with a header row defining field names. The conversion is how API data reaches spreadsheets and databases.
CSV (Comma-Separated Values) is defined by RFC 4180. Each line is one record, and fields within a line are separated by a delimiter character, most commonly a comma. Fields containing the delimiter, double quotes, or line breaks must be enclosed in double quotes, with internal quotes escaped by doubling them. This escaping is the primary source of bugs when writing a JSON-to-CSV converter by hand.
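The quoting rule above can be sketched as a small function. This is a minimal illustration of RFC 4180 escaping, not the converter's actual implementation; the `escapeCSVField` helper name is hypothetical.

```go
package main

import (
	"fmt"
	"strings"
)

// escapeCSVField applies the RFC 4180 quoting rule: a field containing
// the delimiter, a double quote, or a line break is wrapped in double
// quotes, and each embedded quote is escaped by doubling it.
func escapeCSVField(field string, delimiter rune) string {
	if strings.ContainsAny(field, string(delimiter)+"\"\n\r") {
		return "\"" + strings.ReplaceAll(field, "\"", "\"\"") + "\""
	}
	return field
}

func main() {
	fmt.Println(escapeCSVField("plain", ','))    // plain
	fmt.Println(escapeCSVField("a,b", ','))      // "a,b"
	fmt.Println(escapeCSVField(`say "hi"`, ',')) // "say ""hi"""
}
```

Forgetting the quote-doubling step is exactly the hand-rolled-converter bug the paragraph above warns about.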
The conversion is straightforward when your input is a flat array of objects with consistent keys. Each object becomes one row, and each unique key becomes a column header. Nested objects and arrays require a flattening step, and inconsistent keys across objects require a strategy for handling missing fields, typically leaving the cell empty. A reliable converter handles all of these edge cases automatically.
Why Convert JSON to CSV?
APIs return JSON, but spreadsheets, SQL databases, and BI tools expect tabular data. Converting JSON to CSV lets you move data between these systems without writing custom import scripts.
Instant Browser Conversion
Paste your JSON and download the file immediately. No server upload, no file size limits from external APIs, no waiting for processing queues.
Privacy-First Processing
Your data stays in your browser. The conversion runs entirely in JavaScript on your device. Database exports, user records, and financial data never leave your machine.
Multiple Delimiter Support
Choose between comma, semicolon, tab, or pipe delimiters. Use semicolons for European locale spreadsheets, tabs for TSV files, or pipes for legacy system imports.
No Account Required
Open the page and convert. No sign-up, no API key, no CLI installation. Works on any device with a modern browser.
JSON to CSV Use Cases
API Data Export for Spreadsheets
REST APIs return JSON. Product managers and analysts need that data in Excel or Google Sheets. Convert the API response to CSV and open it directly in any spreadsheet application.
Database Bulk Import
PostgreSQL COPY, MySQL LOAD DATA, and SQLite .import all accept CSV. Convert your dataset to tabular format for fast bulk loading without writing a custom import script.
ETL Pipeline Prototyping
ETL pipelines often have intermediate outputs that are hard to inspect as raw JSON. Convert a step's output to CSV and open it in a spreadsheet to verify transforms before wiring up the full pipeline.
QA Test Data Preparation
QA engineers generate test fixtures as JSON, but many test frameworks and data-driven testing tools accept CSV for parameterized test inputs. Convert fixtures to CSV without manual reformatting.
Log Analysis and Reporting
Structured JSON logs from applications and cloud services can be converted to CSV for import into BI tools like Tableau, Power BI, or Looker for visualization and reporting.
Academic Data Processing
Students and researchers working with open data APIs receive JSON responses. Converting to CSV allows analysis in R, pandas, SPSS, or Excel without writing parsing code.
CSV Delimiter Reference
The delimiter character separates fields within each row. Comma is the most common, but other delimiters are standard in specific contexts. Choosing the wrong delimiter causes fields to merge or split incorrectly when the file is opened.
| Delimiter | Character | Extension | When to Use |
| --- | --- | --- | --- |
| Comma | `,` | .csv | Default for most spreadsheets and databases |
| Semicolon | `;` | .csv | Standard in locales where comma is a decimal separator (DE, FR, BR) |
| Tab | `\t` | .tsv | Avoids escaping when field values contain commas or semicolons |
| Pipe | `\|` | .csv | Used in fixed-width legacy systems and some ETL pipelines |
Handling Nested JSON in CSV
CSV is a flat format with no native way to represent nested objects or arrays. When your JSON contains nested structures, the converter must flatten them into columns. There are several strategies, and the right choice depends on how the CSV will be consumed.
Dot-Notation Flattening
Nested keys are joined with dots: {"address": {"city": "Berlin"}} becomes a column named address.city with value Berlin. This is the most common approach and works well with tools that support nested field references.
Array Index Columns
Arrays are expanded into numbered columns: {"tags": ["a", "b"]} becomes tags.0 = a, tags.1 = b. This preserves all values but creates many columns when arrays are large.
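The two strategies above can be combined in one recursive walk. A sketch assuming the JSON was decoded into `map[string]interface{}` values with `encoding/json`; the `flatten` helper is hypothetical:

```go
package main

import (
	"fmt"
	"sort"
)

// flatten joins nested object keys with dots and expands array
// elements into numbered keys, e.g. {"tags":["a","b"]} -> tags.0, tags.1.
// Scalars are written to out under the accumulated key.
func flatten(prefix string, v interface{}, out map[string]interface{}) {
	switch val := v.(type) {
	case map[string]interface{}:
		for k, child := range val {
			key := k
			if prefix != "" {
				key = prefix + "." + k
			}
			flatten(key, child, out)
		}
	case []interface{}:
		for i, child := range val {
			flatten(fmt.Sprintf("%s.%d", prefix, i), child, out)
		}
	default:
		out[prefix] = v
	}
}

func main() {
	doc := map[string]interface{}{
		"name":    "Alice",
		"address": map[string]interface{}{"city": "Berlin"},
		"tags":    []interface{}{"a", "b"},
	}
	out := map[string]interface{}{}
	flatten("", doc, out)

	// Print in sorted key order for a stable demonstration.
	keys := make([]string, 0, len(out))
	for k := range out {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Printf("%s = %v\n", k, out[k])
	}
	// address.city = Berlin
	// name = Alice
	// tags.0 = a
	// tags.1 = b
}
```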
JSON String Fallback
Complex nested values are serialized as JSON strings within the CSV cell: the cell contains the raw JSON text. This preserves the full structure but requires the consumer to parse the cell value.
Ignoring Nested Fields
Some converters drop nested objects and arrays entirely, keeping only scalar (string, number, boolean, null) fields. This produces clean CSV but loses data. Useful only when you know the nested fields are not needed.
Code Examples
Converting JSON to CSV programmatically requires handling header extraction, field quoting, and delimiter escaping. Most languages have built-in or standard library support for CSV writing.
Shell
# Using jq to convert a JSON array to CSV
echo '[{"name":"Alice","age":30},{"name":"Bob","age":25}]' | \
  jq -r '(.[0] | keys_unsorted) as $k | $k, (.[] | [.[$k[]]]) | @csv'
# → "name","age"
# → "Alice",30
# → "Bob",25

# Using Miller (mlr) for streaming conversion
echo '[{"name":"Alice","age":30}]' | mlr --json --ocsv cat
# → name,age
# → Alice,30
Go
package main

import (
	"encoding/csv"
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	jsonStr := `[{"name":"Alice","age":30},{"name":"Bob","age":25}]`
	var data []map[string]interface{}
	if err := json.Unmarshal([]byte(jsonStr), &data); err != nil {
		panic(err)
	}

	w := csv.NewWriter(os.Stdout)

	// Write header
	headers := []string{"name", "age"}
	w.Write(headers)

	// Write rows
	for _, row := range data {
		record := make([]string, len(headers))
		for i, h := range headers {
			record[i] = fmt.Sprintf("%v", row[h])
		}
		w.Write(record)
	}
	w.Flush()
	// → name,age
	// → Alice,30
	// → Bob,25
}
Frequently Asked Questions
What JSON structure does this converter expect?
The converter expects a JSON array of objects, like [{"name":"Alice"},{"name":"Bob"}]. Each object in the array becomes one row in the output, and the object keys become column headers. A single JSON object (not wrapped in an array) is treated as a one-row table.
How are nested objects and arrays handled?
Nested values are flattened using dot notation. For example, {"address":{"city":"Berlin"}} produces a column named address.city. Arrays are expanded into indexed columns (tags.0, tags.1). This preserves data while keeping the output flat.
What happens when objects have different keys?
The converter collects all unique keys across all objects in the array and uses them as column headers. Objects missing a key get an empty cell for that column. No data is lost, and the column order follows the order keys first appear.
Can I use a semicolon or tab instead of a comma?
Yes. The tool supports comma, semicolon, tab, and pipe delimiters. Use semicolons when your data or locale uses commas as decimal separators (common in German, French, and Brazilian spreadsheets). Use tabs for TSV files consumed by Unix tools.
Is the conversion lossless?
For flat JSON arrays with consistent scalar values, yes. The output file contains the same data and can be converted back to identical JSON. For nested structures, flattening changes the shape of the data. Array values serialized into indexed columns or JSON strings can be reconstructed, but the round-trip requires knowing the original structure.
How large a JSON file can I convert?
The tool runs in your browser and processes data in memory, so files up to 10–20 MB convert without issues on modern devices. For files larger than that, use a CLI tool like jq, Miller, or a Python script with the csv module, which process data as a stream.
Is it safe to paste sensitive data into this tool?
Yes. All processing happens in your browser using JavaScript. No data is sent to any server. You can confirm this by opening your browser's developer tools and checking the Network tab during conversion.