
The repo is calling it bad, but in production I'd actually prefer a parser that throws an appropriate error based on a configurable setting with reasonable defaults over one that just silently chugs along until it runs out of memory.

I can imagine it'd also be a lot easier to overwhelm and DDoS a server that attempts to parse incoming JSON requests without any depth bounds.
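A minimal Python sketch of the kind of pre-parse guard being described: scan the raw text and reject input whose bracket nesting exceeds a configured bound, before handing it to the real parser. The function name and the 64-level default are illustrative, not from any particular library.

```python
MAX_DEPTH = 64  # hypothetical configurable default


def check_depth(text: str, max_depth: int = MAX_DEPTH) -> None:
    """Raise ValueError if JSON nesting exceeds max_depth.

    Brackets inside string literals are skipped, so '{"a": "]["}'
    does not confuse the counter.
    """
    depth = 0
    in_string = False
    escaped = False
    for ch in text:
        if in_string:
            if escaped:
                escaped = False
            elif ch == "\\":
                escaped = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
        elif ch in "[{":
            depth += 1
            if depth > max_depth:
                raise ValueError(f"JSON nested deeper than {max_depth} levels")
        elif ch in "]}":
            depth -= 1


# A 100-level-deep array trips the 64-level limit before any parsing happens.
try:
    check_depth("[" * 100 + "]" * 100)
except ValueError as err:
    print(err)
```

This runs in O(n) time and O(1) extra memory, so the guard itself can't be used to exhaust the server.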



Just because your data structure is deeply nested doesn’t mean it takes a lot of memory.

[[[[]]]] is still only 4 objects worth of memory on a stack.
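A quick Python illustration of that point: once parsed, the deeply nested document is just four small container objects, regardless of how alarming the nesting looks. The counting helper is my own, for demonstration only.

```python
import json


def count_containers(value) -> int:
    """Count list/dict objects in a parsed JSON value."""
    if isinstance(value, list):
        return 1 + sum(count_containers(v) for v in value)
    if isinstance(value, dict):
        return 1 + sum(count_containers(v) for v in value.values())
    return 0


doc = json.loads("[[[[]]]]")
print(count_containers(doc))  # 4 list objects in total
```

The memory cost during parsing is a different matter: a recursive parser's call stack grows with depth, which is exactly why deep nesting can crash it even when the final data structure is tiny.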


Most JSON parsers use an amount of memory that is proportional to the size of the input, independently of whether they can parse deeply nested structures or not.

A parser that limits the maximum depth of the input can still be made to consume gigabytes of memory on an input of several gigabytes, and there is nothing wrong with that.

Size limits on a production server should be enforced before even starting to parse the JSON, but this has nothing to do with the issue highlighted here.
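A sketch of that ordering, assuming a hypothetical handler and a made-up 1 MiB cap: the size check costs O(1) and happens before the parser ever touches the bytes.

```python
import json

MAX_BODY_BYTES = 1 << 20  # hypothetical 1 MiB cap; tune per service


def parse_body(raw: bytes):
    """Enforce the size limit first, then parse."""
    if len(raw) > MAX_BODY_BYTES:
        raise ValueError("payload too large")  # rejected before parsing
    return json.loads(raw)


print(parse_body(b'{"ok": true}'))  # {'ok': True}
```

In practice a reverse proxy or framework setting (e.g. a max request body size) usually enforces this even earlier, before the application code runs at all.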




