JSON Lines Viewer
Paste or upload a JSONL / NDJSON file and instantly inspect each record — pretty-printed, searchable, with per-line error detection. Runs entirely in your browser.
{ "id": 1, "name": "Ada Lovelace", "role": "engineer", "active": true, "tags": [ "math", "computing" ] }
{ "id": 2, "name": "Alan Turing", "role": "researcher", "active": true, "tags": [ "computing", "cryptography" ] }
{ "id": 3, "name": "Grace Hopper", "role": "admiral", "active": false, "tags": [ "compiler", "navy" ] }
{ "id": 4, "name": "Linus Torvalds", "role": "engineer", "active": true, "tags": [ "kernel", "git" ] }
{ "id": 5, "name": "Margaret Hamilton", "role": "engineer", "active": true, "tags": [ "apollo", "software" ] }
Why a JSONL Viewer?
- Inspect log streams and dataset dumps quickly
- See exactly which line breaks parsing
- Pretty-print without writing throwaway scripts
- Filter records by substring across all fields
- Switch between record cards and a tabular view
What It Handles
- JSON Lines (.jsonl) and NDJSON (.ndjson)
- Mixed Unix (LF) and Windows (CRLF) endings
- Blank lines (skipped silently)
- Per-line errors without aborting the file
- Drag-and-drop or paste, up to ~25 MB
Privacy & Speed
- 100% client-side — nothing is uploaded
- Safe for production logs and personally identifiable information (PII)
- No sign-up, no rate limits, no API keys
- Instant — runs entirely in your browser
- Free and open to use
About the JSON Lines Viewer
The JSON Lines Viewer reads JSONL and NDJSON content — text where each line is an independent JSON value — and turns it into a browsable, searchable, pretty-printed list of records. It identifies which lines parse successfully and which contain errors, so you can quickly find the bad record in a 50,000-line log dump without writing a one-off script.
Everything happens in your browser. No data is uploaded to a server, which makes the tool safe for production logs, customer exports, machine-learning datasets, and anything else that mixes useful records with potentially sensitive values.
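The per-line behavior described above can be sketched in a few lines of JavaScript. This is a simplified illustration, not the viewer's actual source; the `parseJsonl` function and its result fields are made up for the example:

```javascript
// Sketch: parse JSONL/NDJSON text line by line, keeping errors in place.
// Hypothetical helper for illustration; the viewer's real code may differ.
function parseJsonl(text) {
  const records = [];
  const lines = text.split(/\r?\n/); // handles both LF and CRLF endings
  lines.forEach((line, i) => {
    if (line.trim() === "") return; // blank lines are skipped silently
    try {
      records.push({ line: i + 1, ok: true, value: JSON.parse(line) });
    } catch (err) {
      // A bad line becomes an error record instead of aborting the file.
      records.push({ line: i + 1, ok: false, raw: line, error: err.message });
    }
  });
  return records;
}
```

Because each line is parsed independently, one malformed record never hides the rest of the file.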
How to Use the JSON Lines Viewer
- Paste your JSONL / NDJSON content into the JSONL Input box on the left, drop a .jsonl or .ndjson file onto it, or click Upload to pick a file.
- Watch the Records / Valid / Invalid / Size stats update live as you type or load data.
- Browse parsed records in the right panel. Each card shows the line number, pretty-printed JSON, and a one-click Copy.
- Switch to the Table view to see records as rows with shared columns — useful for tabular datasets.
- Use the Search box to filter records by any substring across keys, values, or error messages.
- Click Copy all to copy every valid record (pretty-printed) to your clipboard, or Download .json to save them as a single JSON array.
- Click Sample to load a small demo dataset, or Clear to start fresh.
Common Use Cases
- Application logs — most structured-logging libraries (winston, pino, zap, structlog) emit one JSON object per line. Paste a chunk to inspect it without spinning up an ELK stack.
- Data pipelines — quickly inspect intermediate JSONL files produced by ETL jobs, BigQuery exports, or Spark pipelines.
- ML / LLM datasets — view OpenAI fine-tuning files, HuggingFace datasets, and instruction-tuning corpora that ship as JSONL.
- Streaming APIs — inspect saved responses from APIs that stream NDJSON (Elasticsearch bulk, OpenAI streaming, Twitter v2 filtered stream).
- Debugging bad lines — when a JSONL importer fails on "line 14237", paste the file here and jump straight to the broken record.
- Convert to a JSON array — use Download .json to turn a JSONL stream into a regular JSON array for tools that don't support line-delimited input.
Frequently Asked Questions
What's the difference between JSONL and NDJSON?
In practice, none. Both formats are "one JSON value per line, separated by newlines." JSONL (jsonlines.org) and NDJSON (ndjson.org) are slightly different specs but the on-disk format is identical, and this viewer handles both transparently.
Is my data sent to a server?
No. Parsing and rendering happen entirely in your browser using JavaScript. Your JSONL content, including any field values, is never uploaded — making the tool safe to use with production logs, customer data, or anything else you can't share externally.
Can I view very large files?
Files up to about 25 MB work well. Above that, browsers start to struggle with rendering tens of thousands of DOM nodes. For multi-gigabyte log files, use the Search box to filter down to the records you actually need, or split the file with split on the command line first.
What happens to invalid lines?
Invalid lines are kept in place with the original parse error message and the raw line content shown in red, so you can see exactly which record broke and why — without losing visibility into the surrounding lines.
When does the table view show columns?
Table view collects every key seen in any object record and shows them as columns. If a record is missing a key, the cell is empty. Records that are arrays or primitives have no sensible row representation and are omitted from the table view.
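That column-collection rule can be expressed as a small sketch (my own simplified version under the assumptions above, not the viewer's source):

```javascript
// Sketch: build table columns as the union of keys across all object records.
// Arrays and primitives are filtered out, matching the behavior described.
function tableColumns(records) {
  const columns = new Set();
  for (const rec of records) {
    if (rec !== null && typeof rec === "object" && !Array.isArray(rec)) {
      Object.keys(rec).forEach((k) => columns.add(k));
    }
  }
  return [...columns]; // Set preserves first-seen key order
}
```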
Can I export the records as a regular JSON array?
Yes. The Download .json button bundles every valid record into a single pretty-printed JSON array and downloads it as a .json file. This is the inverse of "convert JSON array to JSON Lines."
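Conceptually, that export is just a filter plus a pretty-printed JSON.stringify. A hedged sketch, assuming the parse output is a list of `{ ok, value }` objects as in a typical per-line parser:

```javascript
// Sketch: bundle every valid record into one pretty-printed JSON array string.
// `records` is assumed to be parse output with an `ok` flag per line.
function toJsonArray(records) {
  const values = records.filter((r) => r.ok).map((r) => r.value);
  return JSON.stringify(values, null, 2); // 2-space indentation
}
```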
Does the search look inside nested objects?
Yes. Each record is serialized to JSON and matched as a substring, so you can search for nested key names, deeply nested values, array contents, or even partial words. Search is case-insensitive.
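One way to implement that kind of matching, sketched under the assumption that each record is re-serialized before comparison (illustrative only):

```javascript
// Sketch: case-insensitive substring search over a serialized record.
// Matches nested keys, values, and array contents alike.
function matchesQuery(record, query) {
  const haystack = JSON.stringify(record).toLowerCase();
  return haystack.includes(query.toLowerCase());
}
```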
Why is line 1 sometimes skipped?
Blank lines (including the typical trailing newline at the end of a file) are silently skipped — they don't count as records and don't generate parse errors. Only non-empty lines produce records in the output.