The article was AI-generated to a large extent. I looked at the domain, saw it wasn't standard, assumed .br = Brazil, and gave the benefit of the doubt that AI was used to translate technical content. While the prose sucks, as AI prose does, the content behind the prose did not suck. So I disagree that this was slop. For the record, I've been flagged many times for vitriol against AI, based on my personal, moral, and professional hatred of it. You didn't add anything to the conversation, and I think that's against the HN spirit just as much as AI abuse is.
It's just not what the word axiom means, nor how anyone uses it. An axiom is unprovable by definition - it is a thing we accept as true because it is useful to do so (e.g. "there exists an empty set").
"Provably Correct Axiom" is nonsense. An axiom is unprovable.
Just "provably correct" would've been fine. This chess stuff is hilariously pretentious.
I just don't understand how this can be this slow. What on earth is it doing to get 193 requests per second on static, cached content?
The article sadly doesn't dig too deep into this; it just accepts that it's slow and uses a different tool.
But seriously, this is responding to a request with the contents of a file. How can it be 100,000x slower than the naive solution? What can it possibly be doing, and why is it maxing out the CPU?
If no-one else looks into this, I might get around to it.
But this isn't even true, and NextJS is well into egregious-complexity territory. Remix was an alternative in the space, and it is now deprecated in all but name in favor of React Router v7 - which means, for those just tuning back in, that React Router is now a framework.
If you wrote your app in NextJS 2 years ago, you would already have to rewrite chunks of it to get it to compile today. These tools are NOT solidified; they ship breaking changes at least once a year.
The blog post you've linked doesn't justify what you've said about it at all.
In the Netflix blog post they're complaining about latency increasing over time because they have a function that *reloads all express routes in-memory* but didn't properly remove the previous routes, so the routes array got bigger and bigger. That's not a fundamental problem with express[1]; it's an obscure (ab)use case implemented incorrectly. Hardly a damning indictment of express.
> This turned out to be caused by a periodic (10/hour) function in our code. The main purpose of this was to refresh our route handlers from an external source. This was implemented by deleting old handlers and adding new ones to the array. Unfortunately, it was also inadvertently adding a static route handler with the same path each time it ran.
[1]: Admittedly an array is not the "best" data structure for routing, but that absolutely wasn't the performance issue they were having. Below a couple thousand routes it barely matters.
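The failure mode is easy to reproduce without express at all. A sketch of the bug pattern described in the quote (hypothetical names, not Netflix's actual code): a periodic refresh that appends new handlers without clearing the old ones, against a linear-scan routing table.

```javascript
// Routing table: express matches routes by scanning an array, so lookup
// cost grows linearly with its length.
const routes = [];

function refreshRoutes(externalRoutes) {
  // BUG (the pattern from the post): the old handlers are never removed
  // before the refreshed set is pushed, so every refresh appends a full
  // duplicate copy of every route.
  for (const r of externalRoutes) {
    routes.push(r);
  }
}

function matchRoute(path) {
  // O(n) scan per request, where n climbs by 10 refreshes/hour forever.
  return routes.find((r) => r.path === path);
}

// Simulate one day of 10/hour refreshes of the same 50 routes:
const external = Array.from({ length: 50 }, (_, i) => ({ path: `/r${i}` }));
for (let i = 0; i < 240; i++) refreshRoutes(external);
console.log(routes.length); // 12000 entries for only 50 distinct paths
```

The fix is a one-liner (`routes.length = 0;` at the top of `refreshRoutes`), which is why this reads as an implementation bug rather than a framework problem.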
Where there is smoke, there is fire. And Trump has been boasting about hacking the election, if some Twitter screenshots are to be believed.
This looks a bit thin and a bit far-fetched, but, then again, it might be the beginning of some proof.
If there was fraud, it's hard to believe they managed to cover all their tracks. So let's archive and save the possible clues we can find, and maybe, one day, we will have an answer.
All these replies trying to justify the title... absolutely bonkers. They should be ashamed.
It literally has nothing to do with faking data, hacking data, changing data, or trying to influence any outcome whatsoever. The title is 100% not what his code is about.
But y'all keep trying to fit a square peg into a round hole and wonder why no one cares about your outrage.
Use it on something legitimate, not something 100% made up.