fhars's comments

You are arguing against the opposite of what the comment you replied to said.

Am I? "Can you think of any reasons beyond performance?" implies that the comment author thinks performance would be a valid reason.

Quoting my original message:

> And why do we not anymore make use of it, but instead implemented separate JSON loading functionality in JavaScript?

In other words: I'm asking why the native JSON module was added to JavaScript if we already had eval.

> Can you think of any reasons beyond performance?

One of the reasons is that the native JSON parser is faster than eval; give me some other reason.
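
To make the comparison concrete, here is a minimal sketch of my own (the payload is made up; any JSON text would do) showing the two approaches being compared:

    // Minimal sketch (my example, not from the thread): loading the same
    // JSON text via eval and via the native parser.
    const payload = '{"punctuality": 0.78}';
    const viaEval = eval("(" + payload + ")"); // the pre-JSON.parse idiom
    const viaParse = JSON.parse(payload);      // the native parser
    console.log(viaEval.punctuality === viaParse.punctuality); // true

The parentheses around the payload in the eval variant are needed so the braces are parsed as an object literal rather than a block.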


Watch the original; there you can select an English simultaneous translation: https://media.ccc.de/v/36c3-10652-bahnmining_-_punktlichkeit...


That method won't work; that is too large a change that happens too seldom. What you want is a leap second every hour for five months to switch between standard and daylight saving time and back, with a month of constant time around each solstice. That gives you a smooth transition without perceptible discontinuities.
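
As a back-of-the-envelope check of that rate (my own arithmetic, assuming months of roughly 30 days):

    // Rough sketch (my own numbers): one leap second per hour, applied over
    // five months of ~30 days each, accumulates to the one-hour shift.
    const hoursPerMonth = 30 * 24;          // ~720 hours per month (assumption)
    const shiftSeconds = 5 * hoursPerMonth; // one extra second per hour, five months
    console.log(shiftSeconds / 3600);       // = 1, i.e. a full one-hour shift

So five months at one leap second per hour adds up to almost exactly the one-hour shift between standard and daylight saving time.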


Seems to me the most obvious answer is to return to sundials, no?


Only works during the day? Which, come to think of it, makes me realize I'm not entirely clear on how humans kept time at night long ago. I'm assuming they learned roughly where some constellations were?


I challenge the idea that 10 minutes is too large a change.

I accept that it would have been too many changes back when we didn't have smartphones/clocks controlling the vast majority of timepieces. Even most cars nowadays set themselves from a GPS signal.

Nowadays, though? A surprising number of people flat out don't notice that the time has even changed until someone tells them about it.

As the other response said, though: if you look back at when people were on solar time, the length of an hour simply wasn't constant, so most animals are already used to waking times shifting throughout the year. It was specifically our move to a constant, mechanical method of timekeeping that caused this problem.

To that end, shifting to a change every month would, in many ways, be a step back towards how sundials worked, with continual small adjustments. As you say, we could go even more continuous someday; that feels like it would bring slightly more complications, but by the time everything is controlled by something like a central computer, most of them would be obviated.


People have been wondering that for a while: https://news.ycombinator.com/item?id=19304281


You only have to bundle about 110 ISDN channels to transfer that (four E1 or five T1 trunk lines).
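
For the arithmetic behind those numbers (my own sketch, using the standard figures of 64 kbit/s per ISDN B channel, 30 B channels per E1, and 24 channels per T1):

    // Rough capacity check; all figures are the standard channel counts.
    const kbitPerBChannel = 64;
    const channels = 110;
    console.log(channels * kbitPerBChannel); // 7040 kbit/s, roughly 7 Mbit/s
    console.log(Math.ceil(channels / 30));   // 4 E1 trunks needed
    console.log(Math.ceil(channels / 24));   // 5 T1 trunks needed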


Right, but the point is: assume the "backbone" never got fast enough for a million subscribers to all be doing that at once.

I remember a subscriber T1 costing four figures per month, and I don't think it was because the copper pairs themselves were any different. (They weren't. As long as they didn't have bridge taps, they were just plain old pairs. The repeaters every few kilofeet were not that expensive either.)

I remember the early-90s internet guidance that idle traffic like keepalive pings was discouraged, especially if you were sending traffic overseas, because it cluttered up the backbone links with packets that weren't actually valuable, and that was considered rude and abusive. Presumably edge CDNs would still have happened (or rather, ISPs providing Usenet servers already did a lot of that), but you simply wouldn't be doing video over the internet at large, because the bandwidth charges would kill you.


You would still have video happening, but it would not be the type we have today (streaming arbitrary full-length movies from a nearly infinite catalog and YouTube). It would be used for big events and things like that. We might still have gotten podcasting, though.


Podcasting distributed by NNTP would be so much more efficient than RSS and HTTP, too!


Right, but a T3 could have handled multiple.


Like the "Cancel subscription" dialog with options "Cancel" and "Cancel"...

UX design is hard...



There are even commercially available prototypes of that vacuum cooling technology, if you want to perform your own experiments with that concept: https://www.amazon.com/Thermos-Stainless-Ounce-Drink-Bottle/...


That's my water bottle. 10/10 would recommend for not passing temperature gradients.


To be fair, they have mirror surfaces inside. A more realistic prototype would be ultra-black for something like 10-50x better radiative heat transfer. Of course it would still be more like shitty insulation than like good conduction.


This kind of sarcasm will go over their heads. People truly don't understand vacuums.


I absolutely don't understand how a vacuum works, so I absolutely cannot model how a Dewar flask that has 15 billion light years of thickness between the inner and outer wall (a wall that is very close to absolute zero) will behave.


There is another interpretation: reading "bits" as "set bits" and assuming that the textual description (especially the operator "of the") has a higher precedence than multiplication, your initial number is 9 with 2 bits set, the largest number is 52 with 3 bits set, and 3 < 2 * 3 + 1 = 7.
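
If you want to check the set-bit counts, a quick sketch of my own:

    // Count set bits via the binary string representation.
    const popcount = (n: number) =>
      n.toString(2).split("").filter((b) => b === "1").length;
    console.log(popcount(9));  // 2 (binary 1001)
    console.log(popcount(52)); // 3 (binary 110100)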


Does Base https://github.com/garybernhardt/base still work with current versions?

