There were several JSON problems mentioned in the article. One was parsing large arrays and floats; another was DOM-style parsing vs. streaming.

The point of using a streaming JSON parser is to avoid building an "output" per se, and instead handle the elements of the input as events, concurrently with reading the input. In other words, parsing becomes an event-handling system with many small events, rather than the production of a single large data structure. One reason to do this is to avoid a string allocation for every key and string value in the input, because that's where a huge chunk of the time goes when you use a non-streaming parser.

Some people call this parsing style SAX parsing (https://en.wikipedia.org/wiki/Simple_API_for_XML). Here's an example: https://rapidjson.org/md_doc_sax.html
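To make the event model concrete, here's a toy hand-rolled sketch in Python (not RapidJSON's actual API; function and event names are made up for illustration). The parser never builds a tree: it scans the input once and invokes a callback per structural element. Note that the allocation win mostly matters in languages like C++, where the handler can be given pointers into the input buffer instead of freshly allocated strings; in Python this only illustrates the event flow.

```python
# Toy SAX-style JSON parser: emits (event, value) pairs instead of
# building a data structure. Handles objects, arrays, strings (no
# escape sequences, for brevity), numbers, and literals.

def parse_events(text, emit):
    n = len(text)

    def skip_ws(i):
        while i < n and text[i] in ' \t\r\n':
            i += 1
        return i

    def parse_string(i):
        # assumes text[i] == '"' and no backslash escapes, for brevity
        j = text.index('"', i + 1)
        return j + 1, text[i + 1:j]

    def parse_value(i):
        i = skip_ws(i)
        c = text[i]
        if c == '{':
            emit('start_object', None)
            i = skip_ws(i + 1)
            while text[i] != '}':
                i, key = parse_string(skip_ws(i))
                emit('key', key)
                i = skip_ws(i)
                assert text[i] == ':'
                i = skip_ws(parse_value(i + 1))
                if text[i] == ',':
                    i = skip_ws(i + 1)
            emit('end_object', None)
            return i + 1
        if c == '[':
            emit('start_array', None)
            i = skip_ws(i + 1)
            while text[i] != ']':
                i = skip_ws(parse_value(i))
                if text[i] == ',':
                    i = skip_ws(i + 1)
            emit('end_array', None)
            return i + 1
        if c == '"':
            i, s = parse_string(i)
            emit('string', s)
            return i
        # number or literal: scan to the next delimiter
        j = i
        while j < n and text[j] not in ',]} \t\r\n':
            j += 1
        token = text[i:j]
        if token == 'true':
            emit('bool', True)
        elif token == 'false':
            emit('bool', False)
        elif token == 'null':
            emit('null', None)
        else:
            emit('number', float(token))
        return j

    parse_value(0)

# Each element of the input becomes one small event:
events = []
parse_events('{"xs": [1, 2.5, true]}', lambda kind, v: events.append((kind, v)))
# events: [('start_object', None), ('key', 'xs'), ('start_array', None),
#          ('number', 1.0), ('number', 2.5), ('bool', True),
#          ('end_array', None), ('end_object', None)]
```

A handler that only cares about, say, the numbers can sum them as they stream past without the rest of the document ever being materialized.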