JSON Formatter JavaScript: JSON.stringify()
Use the free online JSON Formatter & Beautifier directly in your browser, no install required.
Try JSON Formatter & Beautifier Online
When I debug API responses in Node.js, a wall of minified JSON is the first thing that slows me down: one call to JSON.stringify(data, null, 2) and the structure becomes instantly readable. To format JSON in JavaScript, you need nothing beyond the runtime: JSON.stringify is built into every browser and Node.js environment, no install required. If you just need a quick one-off result without writing code, the online JSON Formatter handles it instantly. This guide covers everything practical for use in scripts and applications: the space parameter, replacer arrays and functions, handling Date, BigInt, and circular references, reading and writing JSON files in Node.js (18+), CLI pretty-printing, and the fast-json-stringify library for production serialization.
- JSON.stringify(data, null, 2) is built into every browser and Node.js; no install required.
- The replacer parameter accepts an array (key whitelist) or a function (transform values); use it to mask sensitive fields.
- Date objects serialize automatically via toJSON() as ISO 8601 strings; BigInt throws TypeError and requires a custom replacer.
- Circular references throw TypeError; fix with a seen Set in a replacer function, or use the flatted library.
- For CLI pretty-printing use node -p "JSON.stringify(require('./file.json'),null,2)"; no additional tools needed.
What is JSON Formatting?
JSON formatting (also called pretty-printing) transforms a compact, minified JSON string into a human-readable layout with consistent indentation and line breaks. The underlying data is identical; only the whitespace changes. Compact JSON is optimal for network transfer where every byte matters; formatted JSON is optimal for debugging, code review, and log inspection. JavaScript's JSON.stringify() handles both in a single call by toggling the space parameter.
{"orderId":"ord_8f2a91bc","status":"shipped","items":[{"sku":"HDMI-4K-2M","qty":2,"unitPrice":12.99}],"total":25.98}
{
"orderId": "ord_8f2a91bc",
"status": "shipped",
"items": [
{
"sku": "HDMI-4K-2M",
"qty": 2,
"unitPrice": 12.99
}
],
"total": 25.98
}
JSON.stringify(): The Built-in Formatter
JSON.stringify() is a global function in every JavaScript environment (browsers, Node.js, Deno, Bun) with no import needed. Its third argument, space, controls indentation: pass a number for that many spaces per level, or the string '\t' for tabs. Omit it (or pass null) and you get compact single-line output.
const serverConfig = {
host: "api.payments.internal",
port: 8443,
workers: 4,
tls: { enabled: true, cert: "/etc/ssl/certs/api.pem" },
rateLimit: { requestsPerMinute: 1000, burst: 50 }
}
console.log(JSON.stringify(serverConfig, null, 2))
// {
// "host": "api.payments.internal",
// "port": 8443,
// "workers": 4,
// "tls": {
// "enabled": true,
// "cert": "/etc/ssl/certs/api.pem"
// },
// "rateLimit": {
// "requestsPerMinute": 1000,
// "burst": 50
// }
// }
The space parameter accepts either a number (1-10 spaces) or a string. Passing a tab character produces output that many editors and diff tools prefer. Here is a real-world set of patterns I use when writing formatted JSON to config files:
const telemetryEvent = {
eventId: "evt_3c7f9a2b",
service: "checkout-api",
severity: "warn",
latencyMs: 342,
region: "eu-west-1",
tags: ["payment", "timeout", "retry"]
}
// 2-space indent (most common in JS projects)
JSON.stringify(telemetryEvent, null, 2)
// Tab indent (preferred by some linters and config tools)
JSON.stringify(telemetryEvent, null, '\t')
// Compact β no whitespace (for network transfer)
JSON.stringify(telemetryEvent)
// {"eventId":"evt_3c7f9a2b","service":"checkout-api",...}
undefined, functions, and Symbol values are silently omitted from the output. If a property value is undefined, that key will not appear in the serialized string at all; this is a common source of bugs when logging objects that have optional fields.
Replacer Functions: Filter and Transform Output
The second argument to JSON.stringify() is the replacer. It comes in two forms: an array that whitelists specific keys, or a function that is called for every key/value pair and can filter, transform, or redact values. I reach for the array form when I need a quick subset, and the function form when I need to mask sensitive data before logging.
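Before looking at each form, it may help to see the replacer's calling convention in isolation. A minimal sketch (object and variable names are hypothetical, not from the examples below); note that `this` is only bound when you pass a regular function, not an arrow function:

```javascript
const sample = { a: 1, b: { c: 2 } }

const seenKeys = []
let parentOfC = null

// A regular function (not an arrow) so `this` is bound on each call
const json = JSON.stringify(sample, function (key, value) {
  seenKeys.push(key)                // first call: key is "" and value is the root
  if (key === 'c') parentOfC = this // `this` is the object that contains `c`
  return value                      // return unchanged to serialize normally
})

// seenKeys is ["", "a", "b", "c"]; parentOfC === sample.b
```

The key order mirrors the traversal: the root first (empty key), then each property depth-first in insertion order.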
Array Replacer: Whitelist Specific Keys
const order = {
orderId: "ord_8f2a91bc",
customer: {
id: "usr_4421",
email: "j.morgan@example.com",
passwordHash: "bcrypt:$2b$12$XKzV..."
},
items: [{ sku: "HDMI-4K-2M", qty: 2, unitPrice: 12.99 }],
createdAt: "2026-03-10T14:22:00Z"
}
// Only include safe fields in the log output
const safeLog = JSON.stringify(order, ["orderId", "items", "createdAt"], 2)
// {
// "orderId": "ord_8f2a91bc",
// "items": [{ "sku": "HDMI-4K-2M", "qty": 2, "unitPrice": 12.99 }],
// "createdAt": "2026-03-10T14:22:00Z"
// }
// passwordHash and customer.email are excluded
Function Replacer: Transform Values
const auditRecord = {
requestId: "req_7d2e91",
user: { id: "usr_4421", email: "j.morgan@example.com", apiKey: "sk-live-eKx9..." },
action: "update_billing",
timestamp: new Date("2026-03-10T14:22:00Z"),
durationMs: 87
}
function safeReplacer(key, value) {
// Redact fields that look like secrets or PII
if (key === "apiKey") return "[REDACTED]"
if (key === "email") return value.replace(/(?<=.{2}).+(?=@)/, "***")
return value
}
console.log(JSON.stringify(auditRecord, safeReplacer, 2))
// {
// "requestId": "req_7d2e91",
// "user": { "id": "usr_4421", "email": "j.***@example.com", "apiKey": "[REDACTED]" },
// "action": "update_billing",
// "timestamp": "2026-03-10T14:22:00.000Z",
// "durationMs": 87
// }
The replacer function is called with `this` set to the object containing the current key. The very first call passes an empty string as the key and the entire value being serialized as the value; return it unchanged to proceed with normal serialization.
Handling Non-Serializable Types
Not all JavaScript values map cleanly to JSON. Knowing how each type behaves prevents silent data loss and unexpected runtime errors in production code.
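The subsections below cover each type in detail, but the most common conversions can be seen in one call. A hypothetical object built to show each behavior; note that NaN and Infinity (not covered separately below) serialize as null:

```javascript
const mixed = {
  when: new Date(0),   // Date: converted to an ISO 8601 string via toJSON()
  skip: undefined,     // undefined: key omitted entirely
  fn: () => {},        // function: key omitted entirely
  n: NaN,              // NaN: serialized as null
  inf: Infinity        // Infinity: serialized as null
}

JSON.stringify(mixed)
// '{"when":"1970-01-01T00:00:00.000Z","n":null,"inf":null}'
```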
Date: Automatic via toJSON()
Date objects implement a toJSON() method that returns an ISO 8601 string. JSON.stringify() calls toJSON() automatically before serializing, so no custom handling is required.
const webhook = {
eventType: "payment.succeeded",
occurredAt: new Date("2026-03-10T14:22:00Z"),
processedAt: new Date()
}
JSON.stringify(webhook, null, 2)
// {
// "eventType": "payment.succeeded",
// "occurredAt": "2026-03-10T14:22:00.000Z",
// "processedAt": "2026-03-10T14:22:01.347Z"
// }
// Any object with a toJSON() method gets the same treatment:
const custom = { toJSON: () => "custom-value", hidden: 42 }
JSON.stringify(custom) // '"custom-value"'
Custom Classes: Implement toJSON()
Any class can implement a toJSON() method and JSON.stringify() will call it automatically during serialization. This is cleaner than a global replacer for domain types that appear throughout your codebase.
class Money {
constructor(amount, currency) {
this.amount = amount
this.currency = currency
}
toJSON() {
// Called automatically by JSON.stringify
return { amount: this.amount, currency: this.currency, formatted: `${this.currency} ${this.amount.toFixed(2)}` }
}
}
class OrderId {
constructor(id) { this.id = id }
toJSON() { return this.id } // Serialize as plain string
}
const invoice = {
invoiceId: new OrderId('inv_8f2a91bc'),
subtotal: new Money(199.00, 'USD'),
tax: new Money(15.92, 'USD'),
issuedAt: new Date('2026-03-10T14:22:00Z')
}
JSON.stringify(invoice, null, 2)
// {
// "invoiceId": "inv_8f2a91bc",
// "subtotal": { "amount": 199, "currency": "USD", "formatted": "USD 199.00" },
// "tax": { "amount": 15.92, "currency": "USD", "formatted": "USD 15.92" },
// "issuedAt": "2026-03-10T14:22:00.000Z"
// }
toJSON() takes priority over the replacer function. If both are present, toJSON() runs first; the replacer receives the already-converted value, not the original class instance.
BigInt: TypeError Without a Replacer
// This throws: TypeError: Do not know how to serialize a BigInt
// JSON.stringify({ sessionId: 9007199254741234n })
// Fix: convert BigInt to string in the replacer
function bigIntReplacer(_key, value) {
return typeof value === 'bigint' ? value.toString() : value
}
const metrics = {
requestCount: 9007199254741234n, // exceeds Number.MAX_SAFE_INTEGER
service: "ingestion-worker",
region: "us-east-1"
}
JSON.stringify(metrics, bigIntReplacer, 2)
// {
// "requestCount": "9007199254741234",
// "service": "ingestion-worker",
// "region": "us-east-1"
// }
Circular References: TypeError Without a Seen Set
// This throws: TypeError: Converting circular structure to JSON
// const node = { id: "n1" }; node.self = node; JSON.stringify(node)
// Fix: track seen objects with a WeakSet
function circularReplacer() {
const seen = new WeakSet()
return function (_key, value) {
if (typeof value === 'object' && value !== null) {
if (seen.has(value)) return '[Circular]'
seen.add(value)
}
return value
}
}
const parent = { id: "node_parent", label: "root" }
const child = { id: "node_child", parent }
parent.child = child // circular
JSON.stringify(parent, circularReplacer(), 2)
// {
// "id": "node_parent",
// "label": "root",
// "child": { "id": "node_child", "parent": "[Circular]" }
// }
// Alternatively: npm install flatted
// import { stringify } from 'flatted'
// stringify(parent) // handles circular refs natively
JSON.stringify() Parameters Reference
All three parameters are well-supported in every modern JavaScript runtime. The defaults produce compact, single-line JSON; pass parameters explicitly for readable output.
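The parameters also compose in a single call. A minimal sketch (field names are illustrative) combining an array replacer with a 2-space indent:

```javascript
const record = { id: 'rec_1', apiSecret: 'sk-live-...', size: 3 }

// Whitelist safe keys AND pretty-print in one call
const json = JSON.stringify(record, ['id', 'size'], 2)
// {
//   "id": "rec_1",
//   "size": 3
// }
```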
Format JSON from a File and API Response
In Node.js (18+) you often need to reformat a JSON config file or pretty-print an API response for debugging. Both patterns use the same two-step approach: parse the raw text with JSON.parse(), then re-serialize with JSON.stringify().
Reading and Rewriting a JSON Config File
import { readFileSync, writeFileSync } from 'fs'
try {
const raw = readFileSync('./config/database.json', 'utf8')
const config = JSON.parse(raw)
writeFileSync('./config/database.json', JSON.stringify(config, null, 2))
console.log('Config reformatted successfully')
} catch (err) {
console.error('Failed to reformat config:', err.message)
// JSON.parse throws SyntaxError if the file contains invalid JSON
// readFileSync throws ENOENT if the file does not exist
}
Async File Reformat (fs/promises)
import { readFile, writeFile } from 'fs/promises'
async function reformatJson(filePath) {
const raw = await readFile(filePath, 'utf8')
const parsed = JSON.parse(raw)
const formatted = JSON.stringify(parsed, null, 2)
await writeFile(filePath, formatted, 'utf8')
return { keys: Object.keys(parsed).length }
}
// Usage
const { keys } = await reformatJson('./config/feature-flags.json')
console.log(`Reformatted ${keys} top-level keys`)
Pretty Print JSON from a fetch() Response
When building or debugging an API client in Node.js 18+ or the browser, pretty-printing the response body is the fastest way to understand what the server returned. The standard pattern is to pass the parsed object from response.json() into JSON.stringify(). If you need the raw string first, for example to compute a hash or log it verbatim, use response.text() and then JSON.parse().
// Pattern 1: response.json() β pretty print
async function debugEndpoint(url) {
const res = await fetch(url, {
headers: { Authorization: `Bearer ${process.env.API_TOKEN}` }
})
if (!res.ok) throw new Error(`HTTP ${res.status}: ${res.statusText}`)
const data = await res.json()
console.log(JSON.stringify(data, null, 2))
}
await debugEndpoint('https://api.stripe.com/v1/charges?limit=3')
// Pattern 2: response.text() -> parse -> format
// Useful when you want to log the raw response AND pretty-print it
async function inspectRawResponse(url) {
const res = await fetch(url)
const raw = await res.text()
console.log('Raw response length:', raw.length)
try {
const parsed = JSON.parse(raw)
console.log('Formatted:')
console.log(JSON.stringify(parsed, null, 2))
} catch {
console.error('Response is not valid JSON:', raw.slice(0, 200))
}
}
Command-Line Pretty Printing
Node.js ships with enough capability to pretty-print JSON from the terminal without any additional tools. These one-liners are useful in CI scripts, deployment pipelines, and shell aliases. For even faster interactive use, install jq, the de facto standard for JSON manipulation on the command line.
# Pretty-print package.json using node -p (print)
node -p "JSON.stringify(require('./package.json'), null, 2)"
# Pretty-print from stdin (pipe-friendly, works in CI)
echo '{"port":3000,"env":"production"}' | node -e "
const d = require('fs').readFileSync(0, 'utf8')
console.log(JSON.stringify(JSON.parse(d), null, 2))
"
# Pretty-print an API response saved to a file
cat api-response.json | node -e "
process.stdin.setEncoding('utf8')
let s = ''
process.stdin.on('data', c => s += c)
process.stdin.on('end', () => console.log(JSON.stringify(JSON.parse(s), null, 2)))
"
# Python fallback (pre-installed on macOS and most Linux distros)
cat api-response.json | python3 -m json.tool
# jq β fastest and most feature-rich (brew install jq / apt install jq)
cat api-response.json | jq .
# jq β pretty-print and filter in one step
cat api-response.json | jq '.data.users[] | {id, email}'
In ES module projects ("type": "module" in package.json), require() is not available in one-liners. Use --input-type=module and fs.readFileSync instead, or switch to node -e with a CommonJS snippet as shown above.
If you're not in a terminal at all, pasting a response from Postman or a log file, the ToolDeck JSON Formatter lets you paste, format, and copy in one step, with syntax highlighting and validation built in.
High-Performance Alternative β fast-json-stringify
fast-json-stringify generates a dedicated serializer function from a JSON Schema. Because it knows the shape of the data ahead of time, it can skip type-checking and use string concatenation instead of recursive descent; benchmarks typically show a 2-5x throughput improvement over JSON.stringify() on large, repetitive payloads. I use it in high-frequency API routes where serialization cost shows up in profiler traces.
npm install fast-json-stringify
import fastJson from 'fast-json-stringify'
const serializeTelemetryEvent = fastJson({
title: 'TelemetryEvent',
type: 'object',
properties: {
eventId: { type: 'string' },
service: { type: 'string' },
severity: { type: 'string', enum: ['info', 'warn', 'error'] },
latencyMs: { type: 'integer' },
timestamp: { type: 'string' },
region: { type: 'string' }
},
required: ['eventId', 'service', 'severity', 'latencyMs', 'timestamp']
})
const event = {
eventId: 'evt_3c7f9a2b',
service: 'checkout-api',
severity: 'warn',
latencyMs: 342,
timestamp: new Date().toISOString(),
region: 'eu-west-1'
}
const json = serializeTelemetryEvent(event)
// '{"eventId":"evt_3c7f9a2b","service":"checkout-api","severity":"warn",...}'
fast-json-stringify is designed for production serialization of structured data; it always produces compact output (no pretty-printing). For human-readable output during development, use JSON.stringify(data, null, 2) as normal.
Terminal Output with Syntax Highlighting
Node.js's built-in util.inspect() produces colored, human-readable output optimized for terminal display. It handles circular references and BigInt natively, and recursively renders nested objects to any depth. The output is not valid JSON: it uses JavaScript syntax (unquoted property names, single-quoted strings) and renders functions and Symbols rather than omitting them, which makes it ideal for Node.js debugging scripts but unsuitable for API responses or file output.
import { inspect } from 'util'
const payload = {
requestId: "req_7d2e91",
user: { id: "usr_4421", roles: ["admin", "billing"] },
metadata: { ipAddress: "203.0.113.42", userAgent: "Mozilla/5.0" },
createdAt: new Date()
}
// depth: null β expand all nested levels; colors: true β ANSI terminal colors
console.log(inspect(payload, { colors: true, depth: null }))
Do not use util.inspect() for JSON written to files, sent over the network, or stored in a database. Its output is not valid JSON and will cause parse errors in any downstream system that calls JSON.parse() on it. Reserve it for interactive terminal debugging only.
Working with Large JSON Files
JSON.parse() loads the entire file into memory before parsing, which is fine for small payloads but impractical for files above 50-100 MB such as database exports, application log dumps, or analytics batches. For these cases Node.js Streams and the stream-json library let you process records one at a time without blowing the heap.
Streaming Parsing with stream-json
npm install stream-json
import { pipeline } from 'stream/promises'
import { createReadStream } from 'fs'
import { parser } from 'stream-json'
import { streamArray } from 'stream-json/streamers/StreamArray.js'
// events.json = array of millions of objects β never fully loaded into memory
await pipeline(
createReadStream('./events.json'),
parser(),
streamArray(),
async function* (source) {
for await (const { value: event } of source) {
if (event.severity === 'error') {
console.log(JSON.stringify(event, null, 2))
}
}
}
)
NDJSON / JSON Lines: No Extra Dependencies
NDJSON (Newline Delimited JSON) stores one JSON object per line and is common in Kafka exports, BigQuery outputs, and structured log pipelines. Node.js's built-in readline handles it without any third-party packages.
import { createReadStream } from 'fs'
import { createInterface } from 'readline'
// Format: one JSON object per line (logs, Kafka exports, BigQuery)
const rl = createInterface({
input: createReadStream('./logs.ndjson'),
crlfDelay: Infinity
})
for await (const line of rl) {
if (!line.trim()) continue
const entry = JSON.parse(line)
if (entry.level === 'error') {
console.log(JSON.stringify(entry, null, 2))
}
// }
Switch from JSON.parse() to streaming when your JSON file exceeds 50-100 MB or when processing an unbounded stream (Kafka, log pipeline). For NDJSON / JSON Lines, use readline; it requires no extra dependencies.
Common Mistakes
These four mistakes show up repeatedly in code reviews and production bug reports. Each one involves a subtle behavior of JSON.stringify() that is easy to miss and hard to debug after the fact.
Problem: Object properties with undefined values are omitted entirely from the JSON output; there is no warning or error. This causes invisible data loss when optional fields exist on the object.
Fix: Use null for intentionally absent values that must appear in the serialized output. Reserve undefined only for fields that should be excluded from JSON.
const userProfile = {
userId: "usr_4421",
displayName: "Jordan Morgan",
avatarUrl: undefined, // will disappear silently
bio: undefined // will disappear silently
}
JSON.stringify(userProfile, null, 2)
// { "userId": "usr_4421", "displayName": "Jordan Morgan" }
// avatarUrl and bio are gone; no warning
const userProfile = {
userId: "usr_4421",
displayName: "Jordan Morgan",
avatarUrl: null, // explicitly absent; appears in output
bio: null // explicitly absent; appears in output
}
JSON.stringify(userProfile, null, 2)
// {
// "userId": "usr_4421",
// "displayName": "Jordan Morgan",
// "avatarUrl": null,
// "bio": null
// }
Problem: Passing a BigInt value to JSON.stringify() throws a TypeError at runtime. This is a hard crash, not silent omission; it will surface in production if any numeric field exceeds Number.MAX_SAFE_INTEGER.
Fix: Use a replacer function that converts BigInt values to strings before serialization. Alternatively, convert BigInt to string or Number at the data layer before passing to JSON.stringify.
const session = {
sessionId: 9007199254741234n, // BigInt literal
userId: "usr_4421",
startedAt: "2026-03-10T14:00:00Z"
}
JSON.stringify(session, null, 2)
// Uncaught TypeError: Do not know how to serialize a BigInt
function bigIntReplacer(_key, value) {
return typeof value === 'bigint' ? value.toString() : value
}
const session = {
sessionId: 9007199254741234n,
userId: "usr_4421",
startedAt: "2026-03-10T14:00:00Z"
}
JSON.stringify(session, bigIntReplacer, 2)
// {
// "sessionId": "9007199254741234",
// "userId": "usr_4421",
// "startedAt": "2026-03-10T14:00:00Z"
// }
Problem: Objects that reference themselves (common in DOM trees, linked lists, and some ORM result sets) throw a TypeError when passed to JSON.stringify(). The error only occurs at serialization time, often far from where the circular reference was created.
Fix: Use a replacer function with a WeakSet to detect and replace circular references, or install the flatted library for a drop-in replacement that handles circular structures natively.
const dept = { id: "dept_eng", name: "Engineering" }
const team = { id: "team_frontend", dept }
dept.teams = [team] // circular: dept -> team -> dept
JSON.stringify(dept, null, 2)
// Uncaught TypeError: Converting circular structure to JSON
import { stringify } from 'flatted' // npm install flatted
const dept = { id: "dept_eng", name: "Engineering" }
const team = { id: "team_frontend", dept }
dept.teams = [team]
// flatted handles circular refs; note: output is flatted format, not standard JSON
stringify(dept)
// Or use a WeakSet-based replacer if you need standard JSON output:
function circularReplacer() {
const seen = new WeakSet()
return (_key, value) => {
if (typeof value === 'object' && value !== null) {
if (seen.has(value)) return '[Circular]'
seen.add(value)
}
return value
}
}
JSON.stringify(dept, circularReplacer(), 2)
Problem: Passing a space value greater than 10 to JSON.stringify() does not throw an error; the value is silently capped at 10. Developers who expect 20 spaces per indent for deep nesting will get only 10, leading to unexpected formatting in generated files.
Fix: The maximum indentation is 10 spaces. For deep structures, prefer 2-space indentation (the most common convention in JavaScript projects) and rely on collapsible editors for navigation.
const deepConfig = { server: { tls: { certs: { primary: "/etc/ssl/api.pem" } } } }
// Expecting 20-space indent, but space is capped at 10
JSON.stringify(deepConfig, null, 20)
// Same output as JSON.stringify(deepConfig, null, 10)
// No error, no warning; silently truncated
const deepConfig = { server: { tls: { certs: { primary: "/etc/ssl/api.pem" } } } }
// Use 2 (most projects) or 4 spaces; never exceed 10
JSON.stringify(deepConfig, null, 2)
// {
// "server": {
// "tls": {
// "certs": {
// "primary": "/etc/ssl/api.pem"
// }
// }
// }
// }
JSON.stringify vs Alternatives: Quick Comparison
Different situations call for different tools. JSON.stringify with a replacer covers most production use cases without any dependencies. util.inspect is the right choice for quick terminal debugging when you need color output and do not need valid JSON. fast-json-stringify pays off in high-throughput routes where profiling shows serialization cost; for everything else the overhead of schema maintenance is not worth it.
Frequently Asked Questions
How do I pretty print JSON in JavaScript?
Call JSON.stringify(data, null, 2); the third argument controls indentation. Pass 2 or 4 for spaces, or "\t" for tabs. No import or install required: JSON is a global object in every JavaScript environment, including browsers and Node.js.
const config = { host: "api.payments.internal", port: 8443, tls: true }
console.log(JSON.stringify(config, null, 2))
// {
// "host": "api.payments.internal",
// "port": 8443,
// "tls": true
// }
What does the `space` parameter do in JSON.stringify()?
The space parameter controls indentation in the output. Pass a number (1-10) for that many spaces per level, or a string such as "\t" to use a tab character. Values above 10 are silently capped at 10. Passing null, 0, or omitting the parameter produces compact single-line JSON.
const data = { service: "payments", version: 3, active: true }
JSON.stringify(data, null, 2) // 2-space indent
JSON.stringify(data, null, 4) // 4-space indent
JSON.stringify(data, null, '\t') // tab indent
JSON.stringify(data) // compact: {"service":"payments","version":3,"active":true}
Why does JSON.stringify() return undefined for some values?
JSON.stringify silently omits object properties whose values are undefined, functions, or Symbols; these types have no JSON representation. If the top-level value itself is undefined, the function returns undefined (not the string "undefined"). Use null instead of undefined for optional fields that must appear in the output.
const event = {
traceId: "tr_9a2f",
handler: () => {}, // function: omitted
requestId: undefined, // undefined: omitted
sessionId: Symbol("s"), // Symbol: omitted
status: "ok"
}
JSON.stringify(event, null, 2)
// { "traceId": "tr_9a2f", "status": "ok" }
How do I handle Date objects when formatting JSON?
Date objects have a built-in toJSON() method that returns an ISO 8601 string, so JSON.stringify handles them automatically. You do not need a custom replacer for dates; the serialized value will be a string like "2026-03-10T14:22:00.000Z".
const order = {
orderId: "ord_8f2a91bc",
placedAt: new Date("2026-03-10T14:22:00Z"),
total: 42.98
}
JSON.stringify(order, null, 2)
// {
// "orderId": "ord_8f2a91bc",
// "placedAt": "2026-03-10T14:22:00.000Z",
// "total": 42.98
// }
How do I format a JSON string (not an object) in JavaScript?
Parse the string with JSON.parse() first, then re-serialize it with JSON.stringify(). The two calls can be chained in one line for quick debugging.
const raw = '{"endpoint":"/api/v2/users","timeout":30,"retry":true}'
const formatted = JSON.stringify(JSON.parse(raw), null, 2)
console.log(formatted)
// {
// "endpoint": "/api/v2/users",
// "timeout": 30,
// "retry": true
// }
Can I use JSON.stringify() in the browser?
Yes. JSON is a built-in global in every modern browser since IE8; no script tags or imports needed. Open DevTools console and call JSON.stringify() directly. It works identically to the Node.js version, with the same parameter signature and the same limitations around BigInt and circular references.
// Works in Chrome, Firefox, Safari, Edge; no imports needed
const payload = { userId: "usr_7b3c", action: "checkout", cart: ["SKU-001", "SKU-002"] }
copy(JSON.stringify(payload, null, 2)) // copy() is a DevTools helper
Working with JSON in JavaScript gives you full control: replacer functions, custom toJSON(), streaming large files in Node.js. When you just need to inspect or share a formatted snippet, ToolDeck's JSON Formatter is the faster path: paste your JSON and get an indented, highlighted result with no environment setup.
Alex is a front-end and Node.js developer with extensive experience building web applications and developer tooling. He is passionate about web standards, browser APIs, and the JavaScript ecosystem. In his spare time he contributes to open-source projects and writes about modern JavaScript patterns, performance optimisation, and everything related to the web platform.
Marcus specialises in JavaScript performance, build tooling, and the inner workings of the V8 engine. He has spent years profiling and optimising React applications, working on bundler configurations, and squeezing every millisecond out of critical rendering paths. He writes about Core Web Vitals, JavaScript memory management, and the tools developers reach for when performance really matters.