CSV to JSON JavaScript — Converter + Code Examples

Front-end & Node.js Developer · Reviewed by Sophie Laurent · Published

Use the free online CSV to JSON converter directly in your browser — no install required.

Try CSV to JSON Online →

Most CSV data I encounter arrives as a flat string from a file upload, a database export, or an API that still speaks the 1970s format. To convert CSV to JSON in JavaScript, you need two things the language gives you for free: string splitting to parse the rows and JSON.stringify() to serialize the result. No npm packages required for the basics — this guide covers the full pipeline from a reusable csvToJson() utility through PapaParse and Node.js file I/O. For quick conversions without code, the online CSV to JSON converter handles it instantly. All examples target Node.js 18+ and modern browsers.

  • ✓ Split CSV by newline, extract headers from row 0, map remaining rows to objects, then JSON.stringify(array, null, 2) for pretty output.
  • ✓ JSON.stringify() produces a string; JSON.parse() converts it back to a live JavaScript array — know which one you have before operating on it.
  • ✓ Map instances do not serialize to JSON automatically — call Object.fromEntries(map) first.
  • ✓ For CSV with quoted fields, commas inside values, or newlines in cells, use PapaParse or csv-parse instead of manual splitting.
  • ✓ csvtojson (npm) handles type coercion, streaming, and RFC 4180 edge cases in a single call.

What is CSV to JSON Conversion?

CSV to JSON conversion transforms a flat, comma-delimited text format into a structured array of objects where each row becomes a JavaScript object keyed by the column headers. The CSV format has no data types — everything is a string. JSON adds structure, nesting, and explicit types (numbers, booleans, null). This conversion is the first step in almost every data pipeline that starts with a spreadsheet export, a legacy system dump, or a user-uploaded file. The underlying data stays the same; the format changes from position-based columns to named properties.

Before · csv
After · json
name,email,role,active
Sarah Chen,schen@nexuslabs.io,Engineering Lead,true
Raj Patel,rpatel@nexuslabs.io,Product Manager,true
[
  {
    "name": "Sarah Chen",
    "email": "schen@nexuslabs.io",
    "role": "Engineering Lead",
    "active": "true"
  },
  {
    "name": "Raj Patel",
    "email": "rpatel@nexuslabs.io",
    "role": "Product Manager",
    "active": "true"
  }
]

csvToJson() — Building a Reusable Conversion Function

The full CSV-to-JSON pipeline in JavaScript breaks down into three steps: split the CSV string by newline to get rows, extract headers from the first row with split(','), then map each remaining row to a plain JavaScript object where the keys come from the headers and the values come from the corresponding column positions. The final call to JSON.stringify() converts that array of objects to a JSON string. Here is a minimal working version:

JavaScript — minimal csvToJson
function csvToJson(csv) {
  const lines = csv.trim().split('\n')
  const headers = lines[0].split(',').map(h => h.trim())

  const rows = lines.slice(1)
    .filter(line => line.trim() !== '')
    .map(line => {
      const values = line.split(',')
      return Object.fromEntries(
        headers.map((header, i) => [header, values[i]?.trim()])
      )
    })

  return JSON.stringify(rows, null, 2)
}

const csv = `server,port,region,status
api-gateway,8080,us-east-1,healthy
auth-service,8443,eu-west-1,degraded
payments-api,9090,ap-south-1,healthy`

console.log(csvToJson(csv))
// [
//   { "server": "api-gateway", "port": "8080", "region": "us-east-1", "status": "healthy" },
//   { "server": "auth-service", "port": "8443", "region": "eu-west-1", "status": "degraded" },
//   { "server": "payments-api", "port": "9090", "region": "ap-south-1", "status": "healthy" }
// ]

That function handles the basics: trailing newlines, empty lines, whitespace around values. Every CSV field comes through as a string. Notice that port is "8080" (a string), not 8080 (a number). If you need proper types in the JSON output, you have to coerce them yourself. Here is an extended version with type detection:

JavaScript — csvToJson with type coercion
function coerceValue(val) {
  if (val === undefined || val === '') return null
  if (val === 'true') return true
  if (val === 'false') return false
  if (val === 'null') return null
  const num = Number(val)
  if (!isNaN(num) && val.trim() !== '') return num
  return val
}

function csvToJson(csv, { coerce = false } = {}) {
  const lines = csv.trim().split('\n')
  const headers = lines[0].split(',').map(h => h.trim())

  const rows = lines.slice(1)
    .filter(line => line.trim() !== '')
    .map(line => {
      const values = line.split(',')
      return Object.fromEntries(
        headers.map((header, i) => [
          header,
          coerce ? coerceValue(values[i]?.trim()) : values[i]?.trim()
        ])
      )
    })

  return JSON.stringify(rows, null, 2)
}

const csv = `endpoint,port,max_connections,debug
/api/v2/orders,8443,500,true
/api/v2/health,8080,100,false`

console.log(csvToJson(csv, { coerce: true }))
// [
//   { "endpoint": "/api/v2/orders", "port": 8443, "max_connections": 500, "debug": true },
//   { "endpoint": "/api/v2/health", "port": 8080, "max_connections": 100, "debug": false }
// ]

The coerce flag is opt-in because automatic type detection can backfire — a field like a zip code ("07302") loses its leading zero when converted to a number. Keep coercion off by default and enable it only when you control the schema. Quick note: JSON.stringify() accepts a third space argument for indentation. Pass 2 for two spaces, 4 for four, or "\t" for tabs. Omit it entirely for compact single-line output — useful when sending the JSON string as an API request body where whitespace just wastes bandwidth.
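Both caveats are quick to verify in a REPL; a minimal sketch:

```javascript
// Coercion pitfall: a zip code only survives as a string
const zip = '07302'
console.log(Number(zip))                     // 7302 (leading zero lost)
console.log(JSON.stringify({ zip }))         // '{"zip":"07302"}' (safe as a string)

// The space argument controls indentation
const row = { server: 'api-gateway', port: 8080 }
console.log(JSON.stringify(row))             // '{"server":"api-gateway","port":8080}'
console.log(JSON.stringify(row, null, 2))    // two-space indented, one property per line
console.log(JSON.stringify(row, null, '\t')) // tab-indented
```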

Note: A raw CSV string is not JSON. Calling JSON.parse() directly on CSV text throws a SyntaxError. You must first convert the CSV to JavaScript objects with your csvToJson() function, which internally calls JSON.stringify() to produce the actual JSON string.

Handling Maps, Dates, and Custom Objects from CSV Data

Not every CSV conversion ends with a flat array of plain objects. Sometimes you need to build a Map from header-value pairs, parse date strings into Date objects, or attach computed properties before serialization. JavaScript has a quirk that trips people up: Map instances do not serialize with JSON.stringify(). You get an empty object. The fix is Object.fromEntries() to convert the Map back to a plain object before stringifying.

The reason Map serializes to {} is that JSON.stringify() iterates over an object's own enumerable properties. A Map stores its entries in an internal slot, not as enumerable properties on the object itself, so the serializer sees an object with no keys. The Map prototype also lacks a toJSON() method, which is the hook JSON.stringify() calls first on any value before deciding how to serialize it. If a value has toJSON(), the method's return value is what gets serialized — not the object itself. This is why Date objects serialize correctly: Date.prototype.toJSON returns an ISO 8601 string, so JSON.stringify(new Date()) produces a quoted timestamp rather than an empty object. Understanding this hook lets you define the same behavior on your own classes — as shown in the EmployeeRecord example below — to control exactly which CSV-derived fields appear in the final JSON output.
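To see the toJSON() hook directly, here is a small sketch (the JsonMap subclass is purely illustrative):

```javascript
// A plain Map serializes to '{}' (no enumerable own properties, no toJSON)
const plain = new Map([['server', 'api-gateway']])
console.log(JSON.stringify(plain))  // '{}'

// A subclass with toJSON() serializes via the hook's return value
class JsonMap extends Map {
  toJSON() {
    // JSON.stringify() calls this first and serializes what it returns
    return Object.fromEntries(this)
  }
}

const row = new JsonMap([['server', 'api-gateway'], ['port', '8080']])
console.log(JSON.stringify(row))  // '{"server":"api-gateway","port":"8080"}'
```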

Converting a Map to JSON

JavaScript — Map to JSON via Object.fromEntries()
// Build a Map from CSV header→value pairs
const headers = ['server', 'port', 'region']
const values = ['payments-api', '9090', 'ap-south-1']
const rowMap = new Map(headers.map((h, i) => [h, values[i]]))

// Map does NOT serialize directly
console.log(JSON.stringify(rowMap))
// "{}"  — empty object, data lost!

// Convert to plain object first
const rowObj = Object.fromEntries(rowMap)
console.log(JSON.stringify(rowObj, null, 2))
// {
//   "server": "payments-api",
//   "port": "9090",
//   "region": "ap-south-1"
// }

Date Strings and toJSON()

CSV date fields arrive as strings. If you parse them into Date objects during processing, those Dates serialize correctly because Date has a built-in toJSON() method that returns an ISO 8601 string. You can also define custom toJSON() on your own classes to control which CSV columns appear in the serialized output — for example, omitting internal tracking fields like _rowIndex.

JavaScript — toJSON() for custom serialization
class EmployeeRecord {
  constructor(csvRow) {
    this._rowIndex = csvRow._rowIndex  // internal, not for JSON
    this.employeeId = csvRow.employee_id
    this.name = csvRow.name
    this.hiredAt = new Date(csvRow.hired_date)
    this.salary = Number(csvRow.salary)
  }

  toJSON() {
    // Only expose fields we want in the JSON output
    return {
      employee_id: this.employeeId,
      name: this.name,
      hired_at: this.hiredAt,  // Date.toJSON() → ISO string automatically
      salary: this.salary,
    }
  }
}

const csvRow = {
  _rowIndex: 42,
  employee_id: 'EMP-2847',
  name: 'Sarah Chen',
  hired_date: '2024-03-15',
  salary: '128000',
}

const record = new EmployeeRecord(csvRow)
console.log(JSON.stringify(record, null, 2))
// {
//   "employee_id": "EMP-2847",
//   "name": "Sarah Chen",
//   "hired_at": "2024-03-15T00:00:00.000Z",
//   "salary": 128000
// }
// Note: _rowIndex is excluded, salary is a number, date is ISO format

Reviver for Date Deserialization

After writing CSV-derived JSON to a file or sending it over the network, JSON.parse() gives you back plain objects — the Date objects become strings again. Use a reviver function to convert ISO 8601 strings back into Date objects during parsing:

JavaScript — reviver to reconstruct Date objects
const jsonString = '{"employee_id":"EMP-2847","name":"Sarah Chen","hired_at":"2024-03-15T00:00:00.000Z","salary":128000}'

const isoDatePattern = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/

const parsed = JSON.parse(jsonString, (key, value) => {
  if (typeof value === 'string' && isoDatePattern.test(value)) {
    return new Date(value)
  }
  return value
})

console.log(parsed.hired_at instanceof Date)  // true
console.log(parsed.hired_at.getFullYear())    // 2024
Warning: JSON.stringify() returns undefined (not a string) only when the top-level value itself is a function, a Symbol, or undefined. Inside an object, such values are silently dropped: if your CSV-derived objects accidentally pick up an undefined value from a missing column, that property disappears from the JSON output. Always default missing values to null instead.
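The dropped-property behavior is easy to check directly:

```javascript
// Top-level function, Symbol, or undefined: stringify returns undefined
console.log(JSON.stringify(undefined))   // undefined (not a string)
console.log(JSON.stringify(() => {}))    // undefined

// Inside an object, such properties vanish from the output
const row = { sku: 'WDG-2847', quantity: undefined }
console.log(JSON.stringify(row))         // '{"sku":"WDG-2847"}'

// Defaulting to null keeps the column visible
const safeRow = { sku: 'WDG-2847', quantity: row.quantity ?? null }
console.log(JSON.stringify(safeRow))     // '{"sku":"WDG-2847","quantity":null}'
```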

JSON.stringify() Parameters Reference

The function signature is JSON.stringify(value, replacer?, space?). The replacer and space arguments are both optional — pass null for the replacer when you only need indentation.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| value | any | (required) | The JavaScript value to serialize — object, array, string, number, boolean, or null |
| replacer | Function \| Array \| null | undefined | Filter or transform values during serialization. Array whitelists property names; function receives (key, value) pairs |
| space | number \| string | undefined | Indentation: number 0–10 for spaces, string (e.g. "\t") for custom indent. Omit for compact single-line output |
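Both replacer forms can be sketched in a few lines (the row fields here are illustrative):

```javascript
const row = { name: 'Sarah Chen', email: 'schen@nexuslabs.io', salary: 128000 }

// Array replacer: a whitelist of property names to keep
console.log(JSON.stringify(row, ['name', 'email']))
// '{"name":"Sarah Chen","email":"schen@nexuslabs.io"}'

// Function replacer: transform or drop values per key
const redacted = JSON.stringify(row, (key, value) =>
  key === 'salary' ? undefined : value  // returning undefined drops the key
)
console.log(redacted)
// '{"name":"Sarah Chen","email":"schen@nexuslabs.io"}'
```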

JSON.parse() parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| text | string | (required) | The JSON string to parse — must be valid JSON or a SyntaxError is thrown |
| reviver | Function \| undefined | undefined | Called for each key-value pair. Return value replaces the original; return undefined to delete the property |

JSON.parse() — Consuming the JSON Output

Once you have a JSON string from csvToJson(), the next step is usually to parse it back into a live JavaScript array for filtering, mapping, or feeding into an API. The difference between a JSON string (typeof === "string") and a JavaScript object matters. You cannot call .filter() or access [0].name on a string — you need JSON.parse() first. This round-trip (stringify then parse) also works as a validation technique: if your CSV conversion produced something that is not valid JSON, parse will throw. The optional reviver argument lets you transform each key-value pair during parsing — useful for restoring Date objects from ISO strings or renaming keys without a separate pass.

JavaScript — parse JSON output and query rows
const csv = `endpoint,method,avg_latency_ms,error_rate
/api/v2/orders,POST,342,0.02
/api/v2/health,GET,12,0.00
/api/v2/payments,POST,890,0.15
/api/v2/users,GET,45,0.01`

// Step 1: convert CSV to JSON string
const jsonString = csvToJson(csv, { coerce: true })

// Step 2: parse back to JavaScript array
const endpoints = JSON.parse(jsonString)

// Verify it is an array
console.log(Array.isArray(endpoints))  // true

// Filter high-latency endpoints
const slow = endpoints.filter(ep => ep.avg_latency_ms > 200)
console.log(slow.map(ep => ep.endpoint))
// ["/api/v2/orders", "/api/v2/payments"]

// Destructure the first row
const [first, ...rest] = endpoints
console.log(first.endpoint)  // "/api/v2/orders"
console.log(rest.length)     // 3

A safe wrapper for JSON.parse() is useful when validating conversion output before downstream processing. If the CSV conversion produces malformed JSON for any reason (truncated input, encoding errors), this catches it without crashing:

JavaScript — safe parse wrapper
function safeParse(jsonString) {
  try {
    return { data: JSON.parse(jsonString), error: null }
  } catch (err) {
    return { data: null, error: err.message }
  }
}

// Valid output
const result = safeParse(csvToJson(csv))
if (result.error) {
  console.error('CSV conversion produced invalid JSON:', result.error)
} else {
  console.log(`Parsed ${result.data.length} rows`)
}

// Accidentally passing raw CSV to JSON.parse — this fails
const bad = safeParse('name,email\nSarah,schen@nexuslabs.io')
console.log(bad.error)  // "Unexpected token 'a', "name,email"... is not valid JSON"

Reviver for Key Renaming and Validation

The reviver function receives every key-value pair during parsing, from the innermost properties outward. Returning undefined for a key removes it from the result entirely; returning a different value replaces it. The reviver is useful for renaming headers (camelCase to snake_case), stripping internal fields, or checking required columns exist. It is called with the root value last (empty string key), which is where you throw if the result is not an array.

JavaScript — reviver for key renaming and shape validation
const jsonString = csvToJson(`employeeId,firstName,hiredDate
EMP-2847,Sarah,2024-03-15
EMP-3012,Raj,2023-11-01`, { coerce: false })

const camelToSnake = str => str.replace(/[A-Z]/g, c => '_' + c.toLowerCase())

const employees = JSON.parse(jsonString, function(key, value) {
  // Root value — validate shape
  if (key === '') {
    if (!Array.isArray(value)) throw new Error('Expected JSON array from CSV')
    return value
  }
  // Rename camelCase header keys to snake_case
  if (typeof value === 'object' && value !== null && !Array.isArray(value)) {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [camelToSnake(k), v])
    )
  }
  return value
})

console.log(employees[0])
// { employee_id: 'EMP-2847', first_name: 'Sarah', hired_date: '2024-03-15' }

Convert CSV from a File and API Response

The two places CSV data actually comes from in production: files on disk and HTTP responses. Both scenarios need error handling because the input is external and uncontrolled.

Read CSV File, Convert, Write JSON

Node.js 18+ — file conversion
import { readFileSync, writeFileSync } from 'node:fs'

function csvToJsonFromFile(inputPath, outputPath) {
  let csvText
  try {
    csvText = readFileSync(inputPath, 'utf8')
  } catch (err) {
    throw new Error(`Failed to read ${inputPath}: ${err.message}`)
  }

  const lines = csvText.trim().split('\n')
  if (lines.length < 2) {
    throw new Error(`${inputPath} has no data rows (only ${lines.length} line)`)
  }

  const headers = lines[0].split(',').map(h => h.trim())
  const rows = lines.slice(1)
    .filter(line => line.trim() !== '')
    .map(line => {
      const values = line.split(',')
      return Object.fromEntries(headers.map((h, i) => [h, values[i]?.trim()]))
    })

  const jsonOutput = JSON.stringify(rows, null, 2)
  writeFileSync(outputPath, jsonOutput, 'utf8')
  console.log(`Converted ${rows.length} rows → ${outputPath}`)
  return rows
}

// Usage
const data = csvToJsonFromFile('inventory.csv', 'inventory.json')
console.log(data[0])
// { sku: "WDG-2847", warehouse: "us-east-1", quantity: "150", ... }

Fetch CSV from an API Endpoint

Node.js 18+ — API response conversion
async function fetchCsvAsJson(url) {
  const response = await fetch(url)
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`)
  }

  const contentType = response.headers.get('content-type') || ''
  if (!contentType.includes('text/csv') && !contentType.includes('text/plain')) {
    console.warn(`Unexpected content-type: ${contentType}`)
  }

  const csvText = await response.text()
  const lines = csvText.trim().split('\n')
  const headers = lines[0].split(',').map(h => h.trim())
  const rows = lines.slice(1)
    .filter(line => line.trim() !== '')
    .map(line => {
      const values = line.split(',')
      return Object.fromEntries(headers.map((h, i) => [h, values[i]?.trim()]))
    })

  return rows
}

// Example: fetch exchange rate CSV from a data provider
try {
  const rates = await fetchCsvAsJson('https://data.ecb.internal/rates/daily.csv')
  console.log(JSON.stringify(rates.slice(0, 3), null, 2))
  // Send as JSON to downstream service
  await fetch('https://api.internal/v2/rates', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ data: rates }),
  })
} catch (err) {
  console.error('Rate sync failed:', err.message)
}
Note: The replacer argument in JSON.stringify() lets you whitelist specific columns from the CSV. Pass an array of header names to include only those fields: JSON.stringify(rows, ['name', 'email', 'department']). Properties not in the array are silently excluded from the output.

Command-Line CSV to JSON Conversion

Node.js can run inline scripts, and there are dedicated CLI tools that handle CSV-to-JSON conversion without writing a script.

bash — Node.js one-liner
# Pipe CSV to a Node.js inline script
cat servers.csv | node -e "
  const lines = require('fs').readFileSync('/dev/stdin','utf8').trim().split('\n');
  const h = lines[0].split(',');
  const rows = lines.slice(1).map(l => Object.fromEntries(h.map((k,i) => [k.trim(), l.split(',')[i]?.trim()])));
  console.log(JSON.stringify(rows, null, 2));
"
bash — Miller (mlr) for CSV to JSON
# Miller is a Swiss Army knife for structured data
# Install: brew install miller (macOS) or apt install miller (Debian/Ubuntu)
mlr --icsv --ojson cat inventory.csv

# Filter rows during conversion
mlr --icsv --ojson filter '$quantity > 100' inventory.csv

# Select specific columns
mlr --icsv --ojson cut -f sku,warehouse,quantity inventory.csv
bash — csvtojson CLI
# Install globally
npm install -g csvtojson

# Convert file
csvtojson servers.csv > servers.json

# Pipe from stdin
cat exports/q1-metrics.csv | csvtojson > q1-metrics.json

For large files, Miller is usually the better choice over csvtojson. Miller is a compiled native tool (written in Go since version 6, C before that) that processes CSV as a stream without loading the entire file into memory, which means it handles multi-gigabyte exports at constant memory usage. It also supports in-place field-level operations — renaming columns, type-casting values, filtering rows — before the data ever becomes JSON, so you avoid a two-step parse-then-transform pipeline. csvtojson, on the other hand, runs in Node.js and is more convenient when the rest of your toolchain is JavaScript: you can pipe its output directly into Node streams, import it as a library, or use its colParser API for per-column type coercion in code. Prefer Miller for raw throughput and shell pipelines; prefer csvtojson when you need tight integration with a Node.js application.

Note: jq does not parse CSV natively. If you need jq in the pipeline, convert to JSON first with csvtojson or mlr, then pipe the JSON output to jq for filtering and transformation.

High-Performance Alternative — PapaParse

The manual split(',') approach fails on real-world CSV files. Quoted fields containing commas, embedded newlines, escaped double quotes — all of these break a naive splitter. PapaParse is the library I reach for when the CSV comes from an unknown source. It handles every RFC 4180 edge case, auto-detects delimiters, and works in both Node.js and browsers.

bash — install PapaParse
npm install papaparse
JavaScript — PapaParse with type coercion
import Papa from 'papaparse'

const csv = `product,description,price,in_stock
"Widget, Large","A premium widget with ""extra"" features",29.99,true
Bolt Assembly,Standard M8 bolt kit,4.50,true
"Gasket Set","Includes gasket, seal, and O-ring",12.75,false`

const { data, errors, meta } = Papa.parse(csv, {
  header: true,
  dynamicTyping: true,     // auto-converts numbers and booleans
  skipEmptyLines: true,
  transformHeader: h => h.trim().toLowerCase().replace(/\s+/g, '_'),
})

if (errors.length > 0) {
  console.error('Parse errors:', errors)
}

console.log(JSON.stringify(data, null, 2))
// [
//   {
//     "product": "Widget, Large",
//     "description": "A premium widget with \"extra\" features",
//     "price": 29.99,
//     "in_stock": true
//   },
//   ...
// ]
console.log(`Parsed ${data.length} rows, delimiter: "${meta.delimiter}"`)

PapaParse's dynamicTyping option does the type coercion automatically — numbers become numbers, "true"/"false" become booleans. The transformHeader callback normalizes column names to snake_case, which saves you from dealing with inconsistent headers from different CSV exports. For quick conversions without writing any parsing code, the CSV to JSON converter handles all of this in the browser.

Terminal Output with Syntax Highlighting

Dumping a large JSON array to the terminal makes your eyes glaze over fast. Adding syntax highlighting to the output makes it readable during debugging and development. The cli-highlight package colorizes JSON output in Node.js terminals.

bash — install cli-highlight
npm install cli-highlight
JavaScript — colorized JSON output in terminal
import { highlight } from 'cli-highlight'

// After converting CSV to JSON array
const jsonOutput = JSON.stringify(rows, null, 2)

// Print with syntax highlighting
console.log(highlight(jsonOutput, { language: 'json' }))
// Keys, strings, numbers, and booleans each get distinct colors

Colorized output earns its keep when you are inspecting a large conversion result interactively. JSON keys, string values, numbers, and booleans each get distinct ANSI colors, which makes it easy to spot a field whose type is wrong — for example, a port number that should be 8080 but is highlighted as a string because coercion was off. This is especially useful when debugging CSV files exported from spreadsheet tools where column types are inconsistent across rows. Without color, scanning 50 rows of JSON for a single mistyped field means reading every value individually. With color, a string-colored number jumps out immediately.

Warning: Syntax highlighting adds ANSI escape codes to the output. Do not use it when writing JSON to a file, piping to another program, or sending as an API response body — the escape codes would corrupt the JSON. Only use highlighting for terminal display.

Working with Large CSV Files

Loading a 500 MB CSV file into a string with readFileSync() will eat memory and potentially crash your process. For large files, stream the CSV line by line and emit JSON objects as they arrive. The csv-parse package (part of the csv ecosystem on npm) provides a streaming parser that works with Node.js streams.

Streaming CSV to NDJSON with csv-parse

NDJSON (Newline-Delimited JSON) is a format where each line of the output file is a self-contained JSON object. Unlike a single large JSON array — which requires the entire file to be in memory before you can begin reading it — NDJSON files can be processed line by line. This makes NDJSON ideal for large datasets that will be consumed by log processors, stream pipelines, or databases with bulk-import APIs. The csv-parse package emits one JavaScript object per CSV row in object mode, so you can pipe it directly into a transform stream that appends \n after each JSON.stringify(row).

Node.js 18+ — streaming CSV to NDJSON
import { createReadStream, createWriteStream } from 'node:fs'
import { parse } from 'csv-parse'
import { Transform } from 'node:stream'
import { pipeline } from 'node:stream/promises'

// Transform each CSV row object to a JSON line
const toNdjson = new Transform({
  objectMode: true,
  transform(record, encoding, callback) {
    callback(null, JSON.stringify(record) + '\n')
  },
})

await pipeline(
  createReadStream('telemetry-2026-03.csv'),
  parse({
    columns: true,       // use first row as headers
    skip_empty_lines: true,
    trim: true,
    cast: true,          // auto-convert numbers and booleans
  }),
  toNdjson,
  createWriteStream('telemetry-2026-03.ndjson')
)

console.log('Streaming conversion complete')
// Each line in the output file is one JSON object:
// {"timestamp":"2026-03-15T08:22:00Z","service":"gateway","latency_ms":42,"status":200}
// {"timestamp":"2026-03-15T08:22:01Z","service":"auth","latency_ms":156,"status":401}
// ...

PapaParse Streaming for Browser and Node.js

PapaParse's streaming mode uses a step callback that fires once per row rather than collecting all rows in memory. You pass it a Node.js ReadStream (in Node.js) or a File object (in the browser) and PapaParse handles the chunking internally. No stream pipeline to wire up — just a callback. Use it when you need RFC 4180 compliance without pulling in csv-parse.

Node.js — PapaParse streaming large file
import Papa from 'papaparse'
import { createReadStream } from 'node:fs'

let rowCount = 0
const errors = []

const fileStream = createReadStream('warehouse-inventory.csv')

Papa.parse(fileStream, {
  header: true,
  dynamicTyping: true,
  step(result) {
    // Process one row at a time — constant memory
    rowCount++
    if (result.data.quantity === 0) {
      errors.push(`Row ${rowCount}: ${result.data.sku} is out of stock`)
    }
  },
  complete() {
    console.log(`Processed ${rowCount} rows`)
    if (errors.length > 0) {
      console.log(`Issues found: ${errors.length}`)
      errors.forEach(e => console.log(`  ${e}`))
    }
  },
  error(err) {
    console.error('Parse failed:', err.message)
  },
})
Note: Switch to streaming when your CSV file exceeds 50 MB or when you are processing an unbounded input (WebSocket feed, server-sent events, piped stdin). The NDJSON format (one JSON object per line) is often a better output format than a giant JSON array for large datasets — it can be processed line by line without loading the entire file into memory.
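Reading NDJSON back is the mirror image: split on newlines and parse each line independently. A minimal sketch (the sample lines stand in for the streaming example's output):

```javascript
// NDJSON: one self-contained JSON object per line
const ndjson = `{"service":"gateway","latency_ms":42,"status":200}
{"service":"auth","latency_ms":156,"status":401}
{"service":"payments","latency_ms":890,"status":200}`

// Parse line by line; each line is independently valid JSON
const records = ndjson
  .split('\n')
  .filter(line => line.trim() !== '')
  .map(line => JSON.parse(line))

console.log(records.length)  // 3
console.log(records.filter(r => r.status >= 400).map(r => r.service))  // [ 'auth' ]
```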

Common Mistakes

❌ Calling JSON.parse() directly on a CSV string

Problem: CSV is not JSON. Passing a raw CSV string to JSON.parse() throws a SyntaxError because commas and newlines are not valid JSON syntax.

Fix: Parse the CSV into JavaScript objects first using split() or a library, then use JSON.stringify() to produce JSON. Only call JSON.parse() on strings that are already valid JSON.

Before · JavaScript
After · JavaScript
const csv = 'name,email\nSarah Chen,schen@nexuslabs.io'
const data = JSON.parse(csv)
// SyntaxError: Unexpected token 'a'
const csv = 'name,email\nSarah Chen,schen@nexuslabs.io'
const lines = csv.trim().split('\n')
const headers = lines[0].split(',')
const rows = lines.slice(1).map(line =>
  Object.fromEntries(headers.map((h, i) => [h, line.split(',')[i]]))
)
const json = JSON.stringify(rows, null, 2)  // valid JSON string
❌ Using toString() instead of JSON.stringify()

Problem: Calling toString() on a JavaScript object returns the useless string '[object Object]' instead of the actual data. This silently destroys your CSV-to-JSON output.

Fix: Always use JSON.stringify() to convert JavaScript objects to JSON strings. toString() exists for primitive-to-string coercion, not for serialization.

Before · JavaScript
After · JavaScript
const row = { server: 'api-gateway', port: 8080 }
const output = row.toString()
// "[object Object]"  — data is gone
const row = { server: 'api-gateway', port: 8080 }
const output = JSON.stringify(row)
// '{"server":"api-gateway","port":8080}'  — actual data preserved
❌ Splitting on comma without handling quoted fields

Problem: A naive split(",") breaks when CSV values contain commas inside quoted fields: "Widget, Large" becomes two separate values instead of one.

Fix: Use PapaParse or csv-parse for any CSV data you do not fully control. If you must parse manually, implement a state-machine parser that tracks whether the current position is inside a quoted field.

Before · JavaScript
After · JavaScript
const line = '"Widget, Large","Premium quality",29.99'
const values = line.split(',')
// ["\"Widget", " Large\"", "\"Premium quality\"", "29.99"]
// 4 values instead of 3 — first field split incorrectly
import Papa from 'papaparse'
const { data } = Papa.parse('"Widget, Large","Premium quality",29.99')
// data[0] = ["Widget, Large", "Premium quality", "29.99"]
// 3 values, correctly parsed
❌ Forgetting that all CSV values are strings

Problem: Without type coercion, port: "8080" stays as a string in JSON instead of a number. Downstream systems expecting numeric types reject or mishandle the data.

Fix: Apply explicit type coercion during the mapping step, or use PapaParse with dynamicTyping: true. Always be deliberate about which fields should be numeric.

Before · JavaScript
After · JavaScript
const row = { port: '8443', debug: 'true', workers: '4' }
JSON.stringify(row)
// {"port":"8443","debug":"true","workers":"4"}  — all strings
const row = {
  port: Number('8443'),           // 8443
  debug: 'true' === 'true',      // true
  workers: Number('4'),           // 4
}
JSON.stringify(row)
// {"port":8443,"debug":true,"workers":4}  — correct types

Manual Parsing vs Libraries — Quick Comparison

| Method | Pretty Output | Valid JSON | Custom Types | Streaming | Requires Install |
|--------|---------------|------------|--------------|-----------|------------------|
| JSON.stringify() | ✓ (with space) | ✓ | ✓ via toJSON()/replacer | ✗ | No (built-in) |
| JSON.parse() | N/A (parsing) | ✓ (validates) | ✓ via reviver | ✗ | No (built-in) |
| csv-parse (Node.js) | ✗ (returns objects) | ✗ (not JSON) | ✗ | ✓ | npm install |
| PapaParse | ✗ (returns objects) | ✗ (not JSON) | ✗ | ✓ | npm install |
| csvtojson | ✓ | ✓ | ✓ via colParser | ✓ | npm install |
| jq (CLI) | ✓ | ✓ | N/A | ✓ | System install |
| Miller (CLI) | ✓ | ✓ | N/A | ✓ | System install |

For quick scripts where you control the CSV format and know there are no quoted fields, the built-in split() + JSON.stringify() approach works and requires zero dependencies. For production systems processing user-uploaded CSV files, use PapaParse in the browser or csv-parse in Node.js — both handle RFC 4180 correctly and support streaming. The csvtojson package is the only one that outputs JSON directly, handling both parsing and serialization in a single call. When you need the fastest path from paste-to-result, the CSV to JSON converter handles it all in the browser without any setup.

Frequently Asked Questions

How do I convert CSV to JSON in JavaScript without a library?

Split the CSV string by newline to get rows, extract headers from the first row with split(","), then map the remaining rows to objects keyed by those headers. Finish with JSON.stringify(array, null, 2) to produce a formatted JSON string. This approach works well for simple CSV files where you control the format and know there are no quoted fields or embedded commas. For data from spreadsheet exports or third-party systems, quoted fields and multi-line values will break a naive splitter — switch to PapaParse or csv-parse in those cases. For very small files processed in the browser, this zero-dependency approach is ideal.

JavaScript
const csv = `name,email,department
Sarah Chen,schen@nexuslabs.io,Engineering
Raj Patel,rpatel@nexuslabs.io,Product`

const lines = csv.trim().split('\n')
const headers = lines[0].split(',')
const rows = lines.slice(1).map(line => {
  const values = line.split(',')
  return Object.fromEntries(headers.map((h, i) => [h.trim(), values[i]?.trim()]))
})

console.log(JSON.stringify(rows, null, 2))
// [
//   {
//     "name": "Sarah Chen",
//     "email": "schen@nexuslabs.io",
//     "department": "Engineering"
//   },
//   ...
// ]

What is the difference between JSON.stringify() and toString() for objects?

toString() on a plain object returns the useless string "[object Object]" — it says nothing about the actual data. JSON.stringify() produces a valid JSON string with all keys and values properly serialized, including nested objects and arrays. Always use JSON.stringify() when you need to convert a JavaScript object (like a CSV-derived row) to a JSON string. To reverse the operation — reconstruct live JavaScript objects from a JSON string — use JSON.parse(), which is the exact inverse of JSON.stringify(). These two functions form a complete round-trip: stringify to persist or transmit data, parse to consume it.

JavaScript
const row = { name: 'Sarah Chen', role: 'Engineer' }

console.log(row.toString())       // "[object Object]"
console.log(JSON.stringify(row))   // '{"name":"Sarah Chen","role":"Engineer"}'
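A quick round-trip, using only the built-ins, shows the two directions:

```javascript
// Round trip: stringify to a JSON string, then parse back to a live object.
const original = { name: 'Sarah Chen', tags: ['js', 'node'] }

const json = JSON.stringify(original)  // a string, safe to store or transmit
const restored = JSON.parse(json)      // a plain object again

console.log(typeof json)       // "string"
console.log(restored.tags[1])  // "node"
```

Note that restored is a new object, not the same reference as original, even though the contents match.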

How do I handle CSV fields that contain commas or quotes?

RFC 4180 specifies that fields containing commas, newlines, or double quotes must be wrapped in double quotes, with embedded quotes escaped by doubling them ("" inside a quoted field represents a single literal quote). A manual split(",") breaks on these files — the field boundary is no longer a plain comma. Use PapaParse or csv-parse for production data, or write a state-machine parser that tracks whether you are inside a quoted field. Writing a correct state machine from scratch is surprisingly tricky: you must handle quotes at the start of a field, mid-field escaped quotes, newlines inside quoted fields, and different line-ending conventions (CRLF vs LF). For anything beyond toy CSV data, use a well-tested library.

JavaScript
// PapaParse handles RFC 4180 correctly
import Papa from 'papaparse'

const csv = `product,description,price
"Widget, Large","A big ""widget""",29.99
Bolt,Standard bolt,1.50`

const { data } = Papa.parse(csv, { header: true })
console.log(JSON.stringify(data, null, 2))
// description field correctly contains: A big "widget"
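For illustration, a minimal quote-aware splitter for a single record might look like the sketch below. It tracks the in-quotes state character by character; it does NOT handle newlines inside quoted fields, which is one reason a tested library is the safer choice:

```javascript
// Sketch of a quote-aware field splitter for ONE CSV record.
// Handles commas inside quotes and doubled-quote escapes only.
function splitCsvLine(line) {
  const fields = []
  let current = ''
  let inQuotes = false
  for (let i = 0; i < line.length; i++) {
    const ch = line[i]
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { current += '"'; i++ }  // "" is an escaped quote
        else inQuotes = false                              // closing quote
      } else {
        current += ch
      }
    } else if (ch === '"') {
      inQuotes = true            // opening quote
    } else if (ch === ',') {
      fields.push(current)       // unquoted comma ends the field
      current = ''
    } else {
      current += ch
    }
  }
  fields.push(current)           // last field has no trailing comma
  return fields
}

console.log(splitCsvLine('"Widget, Large","A big ""widget""",29.99'))
// [ 'Widget, Large', 'A big "widget"', '29.99' ]
```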

Can I convert CSV to JSON directly in the browser?

Yes. Both JSON.stringify() and JSON.parse() are built into every browser engine. For the CSV parsing step, you can split by newline and comma for simple files, or include PapaParse via a CDN (it has no Node.js dependencies). The entire conversion happens client-side with no server round-trip, which is useful for file privacy — the raw CSV data never leaves the user's machine. When a user uploads a CSV file via an <input type="file"> element, the File API's file.text() method returns a Promise that resolves to the file's content as a string, which you can then pass to your conversion function. This makes fully in-browser CSV-to-JSON conversion practical for dashboards, developer tools, and any app that needs to process uploaded data without a backend.

JavaScript
// Browser: user uploads a CSV file
const fileInput = document.querySelector('input[type="file"]')
fileInput.addEventListener('change', async (event) => {
  const file = event.target.files[0]
  const text = await file.text()
  const lines = text.trim().split('\n')
  const headers = lines[0].split(',')
  const rows = lines.slice(1).map(line => {
    const values = line.split(',')  // split each row once, not once per header
    return Object.fromEntries(headers.map((h, i) => [h.trim(), values[i]?.trim()]))
  })
  console.log(JSON.stringify(rows, null, 2))
})

How do I parse numeric and boolean values from CSV instead of keeping everything as strings?

CSV has no type system — every field is a string. You must coerce values during the mapping step. Check for numeric patterns with Number() or parseFloat(), convert "true"/"false" to booleans, and handle empty strings as null. Be careful with fields that look like numbers but must stay as strings: zip codes like "07302" lose their leading zero when coerced with Number(), and phone numbers or product codes with numeric characters are similarly fragile. The csvtojson library does automatic type coercion via its colParser option, which lets you specify per-column conversion functions and override the auto-detection for problem columns. PapaParse's dynamicTyping option applies the same coercion globally across all columns.

JavaScript
function coerceValue(val) {
  if (val === '') return null
  if (val === 'true') return true
  if (val === 'false') return false
  const num = Number(val)
  if (!isNaN(num) && val.trim() !== '') return num
  return val
}

// Apply during CSV-to-object mapping
const row = { port: coerceValue('8443'), debug: coerceValue('true'), host: coerceValue('api.internal') }
// { port: 8443, debug: true, host: "api.internal" }
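One way to protect string-like columns from that coercion is a per-column override. The column names and helper below are illustrative, not part of any library:

```javascript
// Columns that must never be coerced to numbers (illustrative names).
const STRING_COLUMNS = new Set(['zip', 'phone', 'product_code'])

// Generic fallback coercion for everything else.
function coerce(val) {
  if (val === '') return null
  if (val === 'true') return true
  if (val === 'false') return false
  const num = Number(val)
  return Number.isNaN(num) ? val : num
}

function coerceField(column, val) {
  return STRING_COLUMNS.has(column) ? val : coerce(val)
}

console.log(coerceField('zip', '07302'))  // "07302" (leading zero preserved)
console.log(coerceField('port', '8443'))  // 8443 (number)
```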

How do I write CSV-to-JSON output to a file in Node.js?

Use fs.writeFileSync() with the JSON.stringify() output. The third argument to JSON.stringify controls indentation — pass 2 for two-space indent or "\t" for tabs. To read the file back, use JSON.parse(fs.readFileSync(path, "utf8")), which reconstructs the live JavaScript array of objects. If you are writing the file in an async context (inside an async function or at the top level of an ES module), prefer fs.promises.writeFile() to avoid blocking the event loop while the file write completes. For large JSON outputs over a few megabytes, consider piping a JSON stream to a WriteStream instead of building the entire string in memory before writing.

JavaScript
import { writeFileSync, readFileSync } from 'node:fs'

// Write
const jsonOutput = JSON.stringify(rows, null, 2)
writeFileSync('employees.json', jsonOutput, 'utf8')

// Read back
const parsed = JSON.parse(readFileSync('employees.json', 'utf8'))
console.log(Array.isArray(parsed))  // true
console.log(parsed[0].name)         // "Sarah Chen"
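The async variant can be sketched with node:fs/promises (the filename and sample rows are illustrative):

```javascript
import { writeFile, readFile } from 'node:fs/promises'

const rows = [
  { name: 'Sarah Chen', department: 'Engineering' },
  { name: 'Raj Patel', department: 'Product' },
]

// Non-blocking write: the event loop stays free while the file is flushed.
await writeFile('employees.json', JSON.stringify(rows, null, 2), 'utf8')

// Read back and reconstruct the live array.
const parsed = JSON.parse(await readFile('employees.json', 'utf8'))
console.log(parsed.length)  // 2
```

Top-level await requires an ES module (a .mjs file or "type": "module" in package.json); inside CommonJS, wrap the calls in an async function instead.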
Also available in: Python
Alex Chen, Front-end & Node.js Developer

Alex is a front-end and Node.js developer with extensive experience building web applications and developer tooling. He is passionate about web standards, browser APIs, and the JavaScript ecosystem. In his spare time he contributes to open-source projects and writes about modern JavaScript patterns, performance optimisation, and everything related to the web platform.

Sophie Laurent, Technical Reviewer

Sophie is a full-stack developer focused on TypeScript across the entire stack β€” from React frontends to Express and Fastify backends. She has a particular interest in type-safe API design, runtime validation, and the patterns that make large JavaScript codebases stay manageable. She writes about TypeScript idioms, Node.js internals, and the ever-evolving JavaScript module ecosystem.