Launched this week

Streaming JSON in Rust
Terabytes of JSON, kilobytes of memory
Streaming JSON is a set of Rust libraries for parsing and transforming JSON using a sliding-window approach, without loading the full document into memory. It includes:
- "RJiter": a streaming JSON parser
- "scan_json": trigger actions during parsing by matching keys and nested structures
- "id_transform": copy input to output, a base for JSON rewriting
- "ddb_convert": convert JSON to/from DynamoDB format, 12× faster than Python and 2× faster than a baseline Rust implementation
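To give a feel for the sliding-window idea (this is not the RJiter API, just a toy sketch with a made-up `count_array_elements` helper): the scanner below reads a fixed-size chunk at a time and counts the container elements of a top-level JSON array, so memory stays bounded no matter how large the input is.

```rust
use std::io::{Cursor, Read};

// Toy illustration of sliding-window streaming: scan a reader in small
// fixed-size chunks and count the object/array elements of a top-level
// JSON array. A real parser also handles scalar elements, numbers, and
// malformed input; this sketch only tracks nesting depth and string
// boundaries.
fn count_array_elements<R: Read>(mut reader: R) -> usize {
    let mut buf = [0u8; 64]; // the "window": a few dozen bytes
    let mut depth = 0i64;
    let mut count = 0usize;
    let mut in_string = false;
    let mut escaped = false;
    loop {
        let n = match reader.read(&mut buf) {
            Ok(0) | Err(_) => break,
            Ok(n) => n,
        };
        for &b in &buf[..n] {
            if in_string {
                // honor escapes so `\"` and a literal `]` inside a string
                // don't confuse the depth tracking
                if escaped {
                    escaped = false;
                } else if b == b'\\' {
                    escaped = true;
                } else if b == b'"' {
                    in_string = false;
                }
                continue;
            }
            match b {
                b'"' => in_string = true,
                b'[' | b'{' => {
                    depth += 1;
                    // a container opening directly under the top-level
                    // array starts a new element
                    if depth == 2 {
                        count += 1;
                    }
                }
                b']' | b'}' => depth -= 1,
                _ => {}
            }
        }
    }
    count
}

fn main() {
    let json = r#"[{"a":1},{"b":[2,3]},{"c":"]"}]"#;
    let n = count_array_elements(Cursor::new(json.as_bytes()));
    println!("{}", n); // 3 elements, found without buffering the document
}
```

The same loop works unchanged whether the reader wraps a 30-byte string or a multi-terabyte file; only the 64-byte buffer is ever resident.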

Processing JSON at scale is hard.
There are many streaming JSON parsers, but once you add even simple business logic on top of them, the code quickly becomes complex and messy.
"Streaming JSON" is a Rust project designed to balance high performance with clean, usable APIs. It lets you:
- Process huge JSON files (even terabytes) using only a few kilobytes of memory
- Build streaming transformations without tangled parser logic
- Achieve faster performance than many existing libraries (2× in our benchmarks)
Even if you don’t strictly need streaming, the project can simplify and speed up many JSON processing tasks.
I built this based on real-world experience creating several JSON tools.
I’d love feedback from the community:
👉 Have you ever hit JSON bottlenecks: parsing speed, memory usage, or throughput limits?
And if you find it useful, a GitHub star would mean a lot!
@olpa
Looks really promising! I also spent a lot of time experimenting with various ways to process JSON files with a streaming method, and I can say it is not easy if you want to keep memory low and processing fast. Every CPU cycle matters, and every tiny optimization grows large once you try to process large JSON files.
While the JSON format is easy for humans to read and understand, it is a nightmare for computers to process.
Keep up the good work! Definitely following your GitHub repo!