Format JSON in Bash: jq Guide & Examples
Use the free online JSON Formatter & Beautifier directly in your browser; no install required.
Try JSON Formatter & Beautifier Online
When a deploy script starts processing API responses or validating config files in CI, knowing how to format JSON in bash quickly becomes essential. The two tools that cover 99% of real-world cases are jq and python3 -m json.tool: both can format JSON in bash pipelines reliably, validate with exit codes, and integrate cleanly into CI/CD workflows. For one-off inspection without a terminal, the browser-based JSON Formatter handles it instantly. This guide covers jq installation, pipe and file formatting, validation functions, CI/CD integration in GitHub Actions, pre-commit hooks, heredoc patterns, and when to reach for the Python stdlib fallback.
- jq . formats AND validates simultaneously: it exits non-zero on invalid JSON
- Use jq -e in CI pipelines: non-zero exit on empty/false/null output
- jq . file.json > /dev/null && echo "valid" validates without changing output
- python3 -m json.tool works on any system without extra installation
- Never do jq . f.json > f.json: the shell truncates the source file before jq reads it
What Is JSON Formatting in Bash?
JSON formatting in bash means transforming compact, minified JSON into indented, human-readable output. The underlying data is unchanged β only the whitespace and line breaks differ. In scripting contexts this matters for two reasons: readability when debugging, and validation when the formatter double-checks syntax as a side effect. Tools like jq parse the JSON fully before reformatting it, which means a successful format run is also an implicit validity check. That dual behaviour β format and validate in one step β is what makes jq so useful in automated pipelines.
{"service":"payments-api","version":"2.4.1","database":{"host":"db-prod-01.internal","port":5432,"pool_size":20},"cache":{"enabled":true,"ttl":300}}
{
"service": "payments-api",
"version": "2.4.1",
"database": {
"host": "db-prod-01.internal",
"port": 5432,
"pool_size": 20
},
"cache": {
"enabled": true,
"ttl": 300
}
}
jq: Format JSON in Bash
jq is the de facto standard for JSON processing in shell scripts (jq 1.6+, bash 4+). It is a purpose-built command-line JSON processor that can format, filter, transform, and validate JSON. The identity filter . passes the input through unchanged, but formatted. When jq cannot parse the input it exits with a non-zero code, which is what makes it ideal for scripting: formatting and validation are a single operation.
Install jq
# macOS
brew install jq
# Debian / Ubuntu
apt-get install -y jq
# Fedora / RHEL / CentOS
dnf install jq
# Alpine (Docker images)
apk add --no-cache jq
# Verify
jq --version # jq-1.7.1
Format from stdin and from a file
# Pipe inline JSON through jq
echo '{"host":"db-prod-01.internal","port":5432}' | jq .
# Format a file directly (prints to stdout)
jq . config/feature-flags.json
# Format with 4-space indentation
jq --indent 4 . config/feature-flags.json
# Format using tabs instead of spaces
jq --tab . config/feature-flags.json
Write formatted output to a file
# Save formatted output (do NOT redirect back to the same file)
jq . compact.json > formatted.json
# Compact (minify): the reverse of formatting
jq -c . formatted.json
jq exits 0 on success and non-zero on invalid JSON or usage errors. Use this in if statements and || exit 1 guards throughout your scripts.
Sort keys and strip colour
# Sort all keys alphabetically (useful for deterministic diffs)
jq --sort-keys . config/app-config.json
# Disable colour output when writing to a log file
jq --monochrome-output . response.json >> deploy.log
jq Options Reference
The most commonly used jq flags for formatting and validation workflows are --indent N, --tab, -c (compact output), -S / --sort-keys, -M / --monochrome-output, -C (force colour), and -e / --exit-status.
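As a quick sketch of how these flags behave, run them against an inline sample (the JSON literal here is illustrative, not one of the guide's sample configs):

```shell
json='{"b":1,"a":{"c":true}}'
echo "$json" | jq .            # default formatting: 2-space indent
echo "$json" | jq --indent 4 . # 4-space indent
echo "$json" | jq --tab .      # indent with tabs
echo "$json" | jq -c .         # -c / --compact-output: minify
echo "$json" | jq -S .         # -S / --sort-keys: deterministic key order
echo "$json" | jq -M .         # -M / --monochrome-output: no ANSI colour codes
echo "$json" | jq -e '.a.c' > /dev/null && echo "a.c is truthy"
```

Flags combine freely, so jq -c -S . is a common way to produce canonical one-line JSON for diffing.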
Validate JSON in a Bash Script
Validation and formatting are the same operation in jq β it parses before it prints. Redirect stdout to /dev/null when you only want the exit code without the formatted output. The pattern below is reusable across deploy scripts, pre-commit hooks, and CI pipelines. In incident response, the first thing I do with an unfamiliar API payload is pipe it through jq β it turns a wall of minified JSON into something I can actually read and debug.
Reusable validation function
validate_json() {
local file="$1"
if jq . "$file" > /dev/null 2>&1; then
echo "✓ Valid JSON: $file"
return 0
else
echo "✗ Invalid JSON: $file" >&2
return 1
fi
}
Abort a deploy on invalid config
CONFIG="infra/k8s/app-config.json"
validate_json "$CONFIG" || { echo "Aborting deploy: invalid config" >&2; exit 1; }
Validate all JSON files in a directory
find ./config -name "*.json" | while read -r f; do
  jq . "$f" > /dev/null 2>&1 || echo "INVALID: $f"
done
The -e / --exit-status flag goes further: it also exits with code 1 when the output is false or null. Use it to assert that a specific field is truthy: jq -e '.feature_flags.new_checkout' config.json.
Format JSON from Files and API Responses
Two common sources of JSON in shell scripts are files on disk and HTTP API responses via curl. Each has a slightly different handling pattern. For files, the main concern is safe in-place editing. For API responses, the key detail is suppressing curl's progress bar so it doesn't corrupt jq's input.
Safe in-place formatting of a file
# Format and overwrite safely using a temp file
tmp=$(mktemp)
jq --indent 2 . config/feature-flags.json > "$tmp" && mv "$tmp" config/feature-flags.json
echo "Formatted config/feature-flags.json"
Format a curl API response
# Format deployment status from API
DEPLOY_ID="dep_8f3a2b9c"
curl -s \
  -H "Authorization: Bearer $DEPLOY_API_TOKEN" \
  "https://api.deployments.internal/v1/deploys/$DEPLOY_ID" \
  | jq --indent 2 .
Format and filter simultaneously
# Get formatted + filter to errors only from a monitoring endpoint
curl -s "https://monitoring.internal/api/events?level=error&limit=10" \
| jq '[.events[] | {id, message, timestamp, service}]' \
|| { echo "Failed to fetch or parse events" >&2; exit 1; }
The || { ... } pattern is critical here. Without it, a failed curl or a malformed API response silently passes through, and the next step in your script operates on empty or partial data. If you need to inspect complex nested responses without writing a filter expression first, the browser-based JSON Formatter lets you paste the raw response and navigate the tree interactively.
Format JSON in CI/CD Pipelines
CI is where JSON validation gates matter most: a malformed config that reaches production is far more painful to roll back than a pipeline failure. Most guides document jq for one-off terminal use; the patterns below are the ones I use in production SRE workflows to catch config errors before they reach a deployment slot.
GitHub Actions β validate all JSON configs
- name: Validate JSON configs
run: |
echo "Validating JSON configuration files..."
find . -name "*.json" -not -path "*/node_modules/*" | while read -r f; do
if ! jq . "$f" > /dev/null 2>&1; then
echo "::error file=$f::Invalid JSON syntax"
exit 1
fi
done
echo "All JSON files are valid"
Pre-commit hook: validate staged JSON files
#!/usr/bin/env bash
set -euo pipefail
STAGED=$(git diff --cached --name-only --diff-filter=ACM | grep '\.json$' || true)
[ -z "$STAGED" ] && exit 0
for f in $STAGED; do
jq . "$f" > /dev/null 2>&1 || { echo "Invalid JSON: $f"; exit 1; }
done
echo "JSON validation passed"
Save this script as scripts/validate-json.sh, make it executable with chmod +x scripts/validate-json.sh, then symlink it: ln -s ../../scripts/validate-json.sh .git/hooks/pre-commit.
Format JSON Variables and Heredocs in Bash
Shell scripts often build JSON payloads dynamically β from environment variables, git metadata, or computed values. The safest pattern is jq -n --arg / --argjson rather than string interpolation, which breaks the moment a value contains a quote or a newline. Always double-quote variables when piping to jq to prevent word splitting on whitespace in the JSON.
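To see why string interpolation is fragile, compare the two approaches on a value containing a quote (a sketch; the message value is illustrative):

```shell
msg='she said "hi"'

# Broken: naive interpolation produces invalid JSON when the value contains quotes
echo "{\"message\": \"$msg\"}" | jq . || echo "interpolation broke the JSON"

# Safe: --arg escapes the value correctly before it enters the document
jq -n --arg message "$msg" '{message: $message}'
```

The --arg form also handles newlines, backslashes, and control characters, which interpolation silently corrupts.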
Format a stored API response variable
# Always quote "$API_RESPONSE": an unquoted expansion would word-split on whitespace in the JSON
echo "$API_RESPONSE" | jq --indent 2 .
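The section title also covers heredocs: a heredoc keeps multi-line JSON readable inside a script while still getting jq's format-and-validate pass. A minimal sketch (the payload values are illustrative):

```shell
# Quoted 'EOF' prevents the shell from expanding $ inside the JSON;
# jq formats the document and fails the step if it is malformed
jq . <<'EOF'
{"environment": "staging", "replicas": 2, "flags": {"new_checkout": true}}
EOF
```

Use an unquoted EOF delimiter only when every interpolated value was built safely (e.g. with jq -n --arg), never with raw user input.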
Build and format a payload with jq -n
payload=$(jq -n \
--arg env "production" \
--arg version "$(git describe --tags)" \
--argjson replicas 3 \
'{environment: $env, version: $version, replicas: $replicas}')
# Inspect the built payload
echo "$payload" | jq .
# Post it to an API
curl -s -X POST \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $DEPLOY_API_TOKEN" \
-d "$payload" \
"https://api.deployments.internal/v1/deploys"
--arg always binds a string value. --argjson parses the value as JSON first, so you can pass numbers, booleans, arrays, and objects without quoting them inside the filter expression.
Format JSON in Bash Without Installing jq
When jq is unavailable β minimal Docker images, locked-down CI runners, or systems where you cannot install packages β Python's built-in json.tool module provides the same core capability. It is part of the Python standard library; if Python 3 is installed, it works with no additional dependencies.
# Format from a file
python3 -m json.tool config.json
# Control indent width
python3 -m json.tool --indent 2 config.json
# Sort keys alphabetically
python3 -m json.tool --sort-keys config.json
# Format from stdin (e.g., piped from curl)
curl -s https://api.deployments.internal/v1/status | python3 -m json.tool
python3 -m json.tool is stricter than jq: it rejects trailing commas, comments, and JSON5 extensions. This strictness is desirable for production config validation but can be a friction point when working with lax JSON from third-party tools. It also produces no colourized output, making terminal inspection less ergonomic than jq for interactive use.
# Validation with exit code (same semantics as jq)
python3 -m json.tool config.json > /dev/null && echo "valid" || echo "invalid"
# Inline string validation
echo '{"service":"payments-api","healthy":true}' | python3 -m json.tool > /dev/null
echo "Exit code: $?" # 0 = valid
Terminal Output with Syntax Highlighting
jq colourizes its output by default when writing to a terminal: keys in blue, strings in green, numbers in white. When you need full JSON syntax highlighting in a scrollable pager, or when debugging large nested responses in a terminal session, bat provides the most ergonomic experience. Both are useful for debugging and interactive inspection; neither's coloured output should be written to files or API responses.
Install bat
# macOS
brew install bat
# Debian / Ubuntu (binary may be named batcat; alias if so)
apt-get install -y bat
# alias bat=batcat # add to ~/.bashrc if needed
# Verify
bat --version # bat 0.24.0
View JSON files with syntax highlighting
# Syntax-highlighted JSON in the pager (press q to exit)
bat config/app-config.json
# Disable paging: print directly to the terminal
bat --paging=never config/app-config.json
# Pipe jq output through bat for coloured inspection
jq '.database' infra/app-config.json | bat --language=json --paging=never
Colorized jq output in a scrollable pager
# -C forces colour even when stdout is not a tty (e.g. when piping to less)
jq -C . logs/deploy-response.json | less -R
Use coloured output (bat / jq -C) only for terminal inspection and debugging. Strip ANSI colour codes before writing to log files or piping to other tools: use jq -M . (--monochrome-output) or bat --plain.
Working with Large JSON Files in Bash
When a JSON file exceeds 50–100 MB, loading it into memory with jq's default mode can be slow or trigger OOM on memory-constrained hosts (Docker containers with a 512 MB limit, for instance). jq --stream emits path/value pairs incrementally as it reads, without buffering the entire document. For NDJSON (one JSON object per line), jq has a more efficient native approach.
Stream a large JSON file with jq --stream
# --stream emits [path, scalar] pairs as jq reads the input
# Extract all "status" fields from a large log archive without loading it fully
jq -c --stream 'if length == 2 and (.[0][-1] == "status") then .[1] else empty end' logs/archive-2026-03.json
NDJSON / JSON Lines β process one object per line
# NDJSON: one JSON object per line β common in Kafka exports, Fluentd, and Logstash
# -R reads raw lines; fromjson? skips lines that are not valid JSON
jq -c -R 'fromjson? | {id: .request_id, status: .http_status, latency: .duration_ms}' logs/access-2026-03-13.ndjson > logs/summary.ndjson
# Shell loop alternative: useful when you need per-line error handling
while IFS= read -r line; do
echo "$line" | jq -c '{id: .request_id, status: .http_status}' 2>/dev/null || echo "SKIP: malformed line" >&2
done < logs/access-2026-03-13.ndjson
Switch from jq . file.json to --stream when the file is larger than 50–100 MB or when the process runs inside a container with a memory limit. For NDJSON pipelines, prefer jq -R 'fromjson?' over a shell while read loop: it is significantly faster because it avoids spawning a jq process per line.
Common Mistakes
Problem: The shell opens and truncates the output file before jq reads the input. If source and destination are the same path, jq reads an empty file.
Fix: Write to a mktemp temporary file first, then atomically replace the original with mv.
jq --indent 2 . settings.json > settings.json
tmp=$(mktemp) && jq --indent 2 . settings.json > "$tmp" && mv "$tmp" settings.json
Problem: Without an error handler, the script silently continues with an empty or missing formatted file when JSON is invalid β downstream steps then fail with confusing errors.
Fix: Add || { echo '...' >&2; exit 1; } after every jq call that produces output used by a later step.
jq . response.json > formatted.json
jq . response.json > formatted.json || { echo "Invalid JSON in response.json" >&2; exit 1; }
Problem: curl prints a progress bar to stderr by default. When stderr is merged with stdout (e.g. in subshells or log capture), the progress bar text appears in jq's input and causes a parse error.
Fix: Always pass -s (silent) to curl when piping to jq. Use -v or --fail-with-body separately if you need diagnostic output.
curl https://api.payments.internal/config | jq .
curl -s https://api.payments.internal/config | jq .
Problem: The -r / --raw-output flag is not a formatting switch. It only affects top-level string outputs, where it strips the JSON quotes and emits raw text that is no longer valid JSON; on objects and arrays it changes nothing.
Fix: Use jq . (no -r flag) for formatting. Reserve -r for extracting plain string values like jq -r '.version' config.json.
jq -r . config.json
jq . config.json
jq vs python3 vs json_pp: Quick Comparison
Choosing between tools depends on what is available in your environment and what you need beyond basic formatting:
For most bash scripting and CI/CD work, jq is the right default β it validates, formats, filters, and provides reliable exit codes in a single binary with no runtime dependency. Fall back to python3 -m json.tool when you cannot install additional packages and Python is already present.
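One way to encode that default-plus-fallback choice is a small wrapper function. This is a sketch, and format_json is a hypothetical helper name, not part of either tool:

```shell
# Prefer jq when present, fall back to python3 -m json.tool, else fail loudly.
# Reads JSON on stdin, writes formatted JSON on stdout.
format_json() {
  if command -v jq > /dev/null 2>&1; then
    jq --indent 2 .
  elif command -v python3 > /dev/null 2>&1; then
    python3 -m json.tool --indent 2
  else
    echo "format_json: need jq or python3" >&2
    return 1
  fi
}

echo '{"service":"payments-api","healthy":true}' | format_json
```

Because both backends exit non-zero on invalid input, the wrapper preserves the validation semantics your CI gates rely on.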
Frequently Asked Questions
How do I format a JSON file in place using bash?
Never redirect jq's output back to the same file β the shell truncates the file before jq reads it. Instead, write to a temp file first, then replace the original atomically with mv.
tmp=$(mktemp)
jq --indent 2 . config/app-config.json > "$tmp" && mv "$tmp" config/app-config.json
echo "Formatted in place successfully"
How do I validate JSON in a bash script and exit on error?
Pipe or pass the file to jq and redirect stdout to /dev/null. Use || to catch the non-zero exit and abort the script. jq exits with a non-zero code on any parse error, making it reliable for CI gates.
validate_json() {
local file="$1"
if jq . "$file" > /dev/null 2>&1; then
echo "✓ Valid JSON: $file"
return 0
else
echo "✗ Invalid JSON: $file" >&2
return 1
fi
}
validate_json infra/k8s/app-config.json || exit 1
How do I format JSON in bash without installing jq?
Use python3's built-in json.tool module β it ships with every standard Python installation and produces properly indented output with the same exit-code semantics as jq.
# Format from a file
python3 -m json.tool config.json
# Format from stdin (e.g., a curl response)
curl -s https://api.internal/status | python3 -m json.tool --indent 2
How do I format JSON from a curl response in bash?
Always pass -s (silent) to curl so progress bars don't corrupt jq's input. Pipe curl's stdout directly into jq.
DEPLOY_ID="dep_8f3a2b9c"
curl -s -H "Authorization: Bearer $DEPLOY_API_TOKEN" \
  "https://api.deployments.internal/v1/deploys/$DEPLOY_ID" | jq --indent 2 .
How do I format only part of a JSON file using jq?
Use a jq path expression instead of the identity filter (.) to extract and format a nested object or array. The result is itself formatted JSON.
# Format just the database config block
jq --indent 2 '.database' infra/app-config.json
# Format + filter events array to error level only
jq '[.events[] | select(.level == "error") | {id, message, service}]' events.json
What exit code does jq return for invalid JSON?
jq exits with a non-zero code for any parse error, and with code 1 when the -e / --exit-status flag is set and the output is false or null. Exit code 0 means valid JSON was parsed (and, with -e, produced truthy output). Usage and system errors exit with code 2.
# Test exit code directly
echo '{"ok":true}' | jq . > /dev/null 2>&1; echo "exit: $?" # exit: 0
echo '{bad json}' | jq . > /dev/null 2>&1; echo "exit: $?" # non-zero: parse error
# -e flag: exit 1 if output is false/null
echo 'null' | jq -e . > /dev/null 2>&1; echo "exit: $?" # exit: 1
Related Tools
Browser-based alternatives and complements to bash JSON formatting β useful when you need a visual interface, a shareable link, or are working outside a terminal:
Nadia is a site reliability engineer who lives in the terminal. She writes Bash scripts that process logs, transform data, and orchestrate infrastructure across fleets of servers. She is a heavy user of jq, awk, and sed and writes about shell one-liners, text processing pipelines, data serialisation from the command line, and the practical Bash patterns that SREs reach for when speed matters more than elegance.
Erik is a DevOps engineer who has spent years writing and maintaining the shell scripts that hold CI/CD pipelines together. He writes about Bash best practices, portable POSIX shell, encoding and decoding in shell scripts, secret management from the command line, and the patterns that separate reliable automation scripts from brittle ones. He is a strong believer in making shell scripts readable and testable with tools like bats-core.