JSON to Python

Generate Python dataclasses from JSON


What is JSON to Python Dataclass Conversion?

JSON to Python dataclass conversion takes a raw JSON object and produces a set of Python dataclass definitions with accurate type annotations. Python's dataclasses module, introduced in PEP 557 (Python 3.7), generates __init__, __repr__, and __eq__ methods from annotated class fields. When you work with JSON APIs, configuration files, or message queues, dataclasses give your data a typed structure that editors and type checkers like mypy can verify at development time.

Python's json.loads() returns plain dicts and lists. These work, but they carry no type information: a misspelled key looked up with .get() silently returns None instead of raising an error, and your editor cannot autocomplete field names. Dataclasses solve this by mapping each JSON key to a named, typed field. Nested JSON objects become separate dataclass definitions, arrays become List[T] annotations, and null values become Optional[T] with a default of None.
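The difference shows up immediately in practice. A minimal sketch (field names are illustrative):

```python
import json
from dataclasses import dataclass

raw = '{"user_id": 7, "email": "a@example.com"}'

# Plain dict: a misspelled key fails silently with .get()
d = json.loads(raw)
print(d.get("user_idd"))  # -> None, no error raised

# Dataclass: the same mistake is a visible AttributeError,
# and mypy flags it before the code even runs
@dataclass
class User:
    user_id: int
    email: str

u = User(**d)
print(u.user_id)  # -> 7
```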

Writing these definitions by hand is mechanical work. You read the JSON, figure out each field's type from its value, convert keys from camelCase or snake_case to Python conventions, and handle edge cases like nullable fields and mixed-type arrays. A converter does all of this in milliseconds. You paste JSON, get correct dataclass code, and move on.

Why Use a JSON to Python Converter?

Translating JSON structures into Python class definitions by hand means guessing types from sample data, reordering fields so required ones come before optional ones, and updating everything when the API changes. A converter removes that friction.

⚡
Instant dataclass generation
Paste your JSON and get typed Python dataclass definitions in under a second. Nested objects, lists, and optional fields are handled automatically.
🔒
Privacy-first processing
Conversion runs entirely in your browser using JavaScript. Your JSON never leaves your machine. API keys, tokens, and user records stay private.
📝
Correct type annotations
Every generated field includes a Python type annotation inferred from the JSON value: str, int, float, bool, List[T], or Optional[T] for nulls.
📦
No install or signup
Open the page and paste your JSON. No Python environment required, no pip packages to install, no account to create.

JSON to Python Use Cases

REST API Client Development
Generate dataclasses from API response samples. Paste the JSON returned by a third-party REST endpoint and get type-safe Python classes ready for requests or httpx.
FastAPI Request/Response Models
Start from a JSON payload shape and generate dataclass definitions. Convert them to Pydantic models to get automatic validation in FastAPI route handlers.
Data Pipeline Schemas
Define typed record structures for ETL pipelines. Paste a sample JSON message from Kafka, RabbitMQ, or SQS and generate dataclasses that document the expected shape.
Configuration File Parsing
Turn JSON config files into typed Python classes. Load your config with json.load(), then construct a dataclass instance for editor autocomplete and type checking.
Test Fixture Generation
Create typed fixtures from sample JSON data. QA engineers can paste API response snapshots and produce dataclass definitions for use in pytest test suites.
Learning Python Type Annotations
Students can paste any JSON structure and see how Python represents it with type hints. The generated code shows List, Optional, nested classes, and default values in context.
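The configuration use case above can be sketched in a few lines. The file name and fields here are hypothetical; an inline string stands in for a real config.json:

```python
import json
from dataclasses import dataclass
from typing import List

@dataclass
class AppConfig:
    debug: bool
    port: int
    allowed_hosts: List[str]

# In real use: config = AppConfig(**json.load(open("config.json")))
raw = '{"debug": false, "port": 8080, "allowed_hosts": ["localhost"]}'
config = AppConfig(**json.loads(raw))
print(config.port)  # -> 8080, with autocomplete and type checking on every field
```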

JSON to Python Type Mapping

Every JSON value maps to a specific Python type annotation. The table below shows how the converter translates each JSON type, with both the typing module syntax (Python 3.7+) and the built-in syntax available from Python 3.10 onward.

JSON Type           Example        Python (typing)     Python 3.10+
string              "hello"        str                 str
number (integer)    42             int                 int
number (float)      3.14           float               float
boolean             true           bool                bool
null                null           Optional[str]       str | None
object              {"k": "v"}     @dataclass class    nested model
array of strings    ["a", "b"]     List[str]           list[str]
array of objects    [{"id": 1}]    List[Item]          list[Item]
mixed array         [1, "a"]       List[Any]           list[Any]
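Put together, those mappings produce dataclass fields like these (class and field names are illustrative):

```python
from dataclasses import dataclass, field
from typing import Any, List, Optional

@dataclass
class Example:
    name: str                                       # JSON string
    count: int                                      # JSON integer
    ratio: float                                    # JSON float
    active: bool                                    # JSON boolean
    note: Optional[str] = None                      # JSON null
    tags: List[str] = field(default_factory=list)   # array of strings
    mixed: List[Any] = field(default_factory=list)  # mixed array

e = Example(name="a", count=1, ratio=0.5, active=True)
print(e.note)  # -> None (null became an optional field with a default)
```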

Dataclass Decorator Reference

The @dataclass decorator accepts several parameters that change how the generated class behaves. This reference covers the options most relevant when working with JSON-derived data.

Decorator / Field              Behavior                                                          Use When
@dataclass                     Generates __init__, __repr__, __eq__ from field annotations       Standard dataclasses
@dataclass(frozen=True)        Makes instances immutable (hashable, no attribute reassignment)   Config objects, dict keys
@dataclass(slots=True)         Uses __slots__ for lower memory and faster attribute access       Python 3.10+, large datasets
@dataclass(kw_only=True)       All fields require keyword arguments in __init__                  Python 3.10+, many fields
field(default_factory=list)    Sets a mutable default without sharing state between instances    List/dict/set defaults
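A short sketch of how two of these options behave in practice:

```python
from dataclasses import FrozenInstanceError, dataclass, field
from typing import List

@dataclass(frozen=True)
class Endpoint:
    url: str

e = Endpoint(url="https://example.com")
try:
    e.url = "other"  # frozen instances reject attribute reassignment
except FrozenInstanceError:
    print("immutable")

@dataclass
class Batch:
    # default_factory gives each instance its own list; a bare
    # `items: List[str] = []` would raise ValueError at class creation
    items: List[str] = field(default_factory=list)

a, b = Batch(), Batch()
a.items.append("x")
print(b.items)  # -> [] (state is not shared between instances)
```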

dataclass vs Pydantic vs TypedDict

Python has three common ways to define typed structures from JSON. Each fits a different use case. Dataclasses are the standard library option with zero dependencies. Pydantic adds runtime validation. TypedDict annotates plain dicts without creating a new class.

@dataclass
Standard library (Python 3.7+). Generates __init__, __repr__, and __eq__. No runtime validation. Works with mypy and dataclasses-json for serialization. Best for internal data structures where you control the input.
BaseModel (Pydantic)
Third-party library. Validates types and constraints at runtime. Parses JSON directly via model_validate_json(). Standard choice for FastAPI, settings management, and any code that receives untrusted input.
TypedDict
Standard library (Python 3.8+). Adds type hints to regular dicts. No __init__ or methods generated. Values stay as plain dict access. Use when you need type checking but want to keep the dict interface, such as in legacy codebases.
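For comparison, the same shape as a TypedDict: the type checker verifies key access, but at runtime the value stays a plain dict and nothing is validated:

```python
import json
from typing import List, TypedDict

class UserDict(TypedDict):
    id: int
    name: str
    tags: List[str]

# json.loads already returns a dict, so no construction step is needed
user: UserDict = json.loads('{"id": 1, "name": "Alice", "tags": ["admin"]}')
print(user["name"])  # -> Alice (dict access, checked by mypy)
```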

Code Examples

These examples show how to use generated dataclasses in Python, how to produce them programmatically from JavaScript, and how to use alternative approaches like Pydantic and CLI tools.

Python (dataclasses)
from dataclasses import dataclass
from typing import List, Optional
import json

@dataclass
class Address:
    street: str
    city: str
    zip: str

@dataclass
class User:
    id: int
    name: str
    email: str
    active: bool
    score: float
    address: Address
    tags: List[str]
    metadata: Optional[str] = None

raw = '{"id":1,"name":"Alice","email":"alice@example.com","active":true,"score":98.5,"address":{"street":"123 Main St","city":"Springfield","zip":"12345"},"tags":["admin","user"],"metadata":null}'
data = json.loads(raw)

# Reconstruct nested objects manually
addr = Address(**data["address"])
user = User(**{**data, "address": addr})
print(user.name)     # -> Alice
print(user.address)  # -> Address(street='123 Main St', city='Springfield', zip='12345')
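Reconstructing nested objects by hand gets tedious as nesting deepens. Here is a minimal recursive helper, sketched with its own small classes so it stands alone; it deliberately ignores List[...] and Optional[...] annotations and only descends into fields typed as dataclasses:

```python
from dataclasses import dataclass, fields, is_dataclass

@dataclass
class Address:
    street: str
    city: str

@dataclass
class Profile:
    name: str
    address: Address

def from_dict(cls, data):
    # Recurse into fields whose annotation is itself a dataclass;
    # everything else is passed through unchanged.
    kwargs = {}
    for f in fields(cls):
        value = data[f.name]
        if is_dataclass(f.type) and isinstance(value, dict):
            value = from_dict(f.type, value)
        kwargs[f.name] = value
    return cls(**kwargs)

p = from_dict(Profile, {"name": "Alice",
                        "address": {"street": "Main", "city": "Springfield"}})
print(p.address.city)  # -> Springfield
```

Libraries like dacite or dataclasses-json implement this conversion more completely, including generic containers and unions.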
JavaScript (generate Python from JSON)
// Minimal JSON-to-Python-dataclass generator in JS
function jsonToPython(obj, name = "Root") {
  const classes = [];
  function infer(val, fieldName) {
    if (val === null) return "Optional[str]";
    if (typeof val === "string") return "str";
    if (typeof val === "number") return Number.isInteger(val) ? "int" : "float";
    if (typeof val === "boolean") return "bool";
    if (Array.isArray(val)) {
      // find() skips nulls; compare against undefined so falsy
      // elements like 0, "", or false still infer a concrete type
      const first = val.find(v => v !== null);
      return first !== undefined ? `List[${infer(first, fieldName + "Item")}]` : "List[Any]";
    }
    if (typeof val === "object") {
      const clsName = fieldName.charAt(0).toUpperCase() + fieldName.slice(1);
      build(val, clsName);
      return clsName;
    }
    return "Any";
  }
  function build(obj, cls) {
    const fields = Object.entries(obj).map(([k, v]) => `    ${k}: ${infer(v, k)}`);
    classes.push(`@dataclass\nclass ${cls}:\n${fields.join("\n")}`);
  }
  build(obj, name);
  return classes.join("\n\n");
}

const data = { id: 1, name: "Alice", scores: [98, 85] };
console.log(jsonToPython(data, "User"));
// @dataclass
// class User:
//     id: int
//     name: str
//     scores: List[int]
Python (Pydantic BaseModel alternative)
from pydantic import BaseModel
from typing import List, Optional

class Address(BaseModel):
    street: str
    city: str
    zip: str

class User(BaseModel):
    id: int
    name: str
    email: str
    active: bool
    score: float
    address: Address
    tags: List[str]
    metadata: Optional[str] = None

# Pydantic parses and validates JSON in one step
raw = '{"id":1,"name":"Alice","email":"alice@example.com","active":true,"score":98.5,"address":{"street":"123 Main St","city":"Springfield","zip":"12345"},"tags":["admin","user"],"metadata":null}'
user = User.model_validate_json(raw)
print(user.name)              # -> Alice
print(user.model_dump_json()) # -> re-serializes to JSON
CLI (datamodel-code-generator)
# Install the generator
pip install datamodel-code-generator

# Generate dataclasses from a JSON data file
# (--input-file-type json tells the tool to infer a schema from sample data
# rather than treating the file as JSON Schema)
datamodel-codegen --input data.json --input-file-type json \
  --output models.py --output-model-type dataclasses.dataclass

# Generate Pydantic models instead (the default output type)
datamodel-codegen --input data.json --input-file-type json --output models.py

# From a JSON string via stdin
echo '{"id": 1, "name": "Alice", "tags": ["admin"]}' | \
  datamodel-codegen --input-file-type json --output-model-type dataclasses.dataclass
# Output:
# @dataclass
# class Model:
#     id: int
#     name: str
#     tags: List[str]

Frequently Asked Questions

What is the difference between a Python dataclass and a regular class?
A dataclass uses the @dataclass decorator to auto-generate __init__, __repr__, and __eq__ methods from field annotations. A regular class requires you to write these methods yourself. Dataclasses reduce boilerplate when the class primarily holds data, which is the typical case for JSON-derived structures.
Can I use dataclasses with JSON serialization directly?
The standard library's json module cannot serialize dataclass instances by default. Use dataclasses.asdict() to convert a dataclass to a dict, then pass that to json.dumps(). For more control, the dataclasses-json library adds .to_json() and .from_json() methods, and Pydantic models handle serialization natively.
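A round-trip sketch using only the standard library:

```python
import json
from dataclasses import asdict, dataclass
from typing import List

@dataclass
class User:
    id: int
    tags: List[str]

u = User(id=1, tags=["admin"])
# asdict recurses into nested dataclasses, lists, and dicts
print(json.dumps(asdict(u)))  # -> {"id": 1, "tags": ["admin"]}
```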
How does the converter handle nested JSON objects?
Each nested object becomes a separate @dataclass definition. If a JSON field named "address" contains an object with "street" and "city", the converter creates an Address dataclass and annotates the parent field as address: Address. Deeply nested structures produce multiple dataclass definitions in dependency order.
What happens when a JSON field is null?
Null fields are annotated as Optional[str] (or the appropriate type if it can be inferred from context) with a default value of None. Fields with defaults must come after required fields in a dataclass, so the converter places optional fields at the end of the class definition.
Is there a difference between dataclasses and Pydantic models for JSON?
Dataclasses are part of the standard library and do not validate data at runtime. Pydantic models validate types, enforce constraints, and can parse raw JSON strings directly. If you receive JSON from external sources and need to reject malformed data, Pydantic is the better fit. For internal data passing where you trust the input, dataclasses are lighter and have no external dependencies.
How do I handle camelCase JSON keys in Python dataclasses?
Python convention uses snake_case for variable names. The converter translates camelCase keys like "firstName" to snake_case fields like first_name. If you need to deserialize back from JSON, use the dataclasses-json library with a config that maps between the two naming conventions, or write a custom __post_init__ method.
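The key conversion itself can be done with the standard library alone. A sketch (the regex handles simple camelCase, not acronym runs like HTTPCode):

```python
import re

def camel_to_snake(name: str) -> str:
    # Insert an underscore before each uppercase letter, then lowercase
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def convert_keys(obj):
    """Recursively rename dict keys so json.loads output matches snake_case fields."""
    if isinstance(obj, dict):
        return {camel_to_snake(k): convert_keys(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [convert_keys(v) for v in obj]
    return obj

print(convert_keys({"firstName": "Alice", "homeAddress": {"zipCode": "12345"}}))
# -> {'first_name': 'Alice', 'home_address': {'zip_code': '12345'}}
```

Run the converted dict through a dataclass constructor (e.g. `User(**convert_keys(data))`) and the snake_case fields line up.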
Does this converter support Python 3.10+ syntax like list[str] instead of List[str]?
The converter generates typing module imports (List, Optional) for maximum compatibility with Python 3.7 through 3.12. If your project targets Python 3.10 or later, you can safely replace List[str] with list[str] and Optional[str] with str | None. The type mapping table above shows both forms.