Who can know what the world will look like as we "transition"? I sure don't, but I'm thankful the author here has taken a stab at it. I feel like this is one of the first stories I've seen to try to imagine this post-transition world in a way that isn't so gonzo as to be unrelatable. It was so relatable (the human-ness shining all the brighter in a machine-driven world) that I cried as I finished reading. I've felt very anxious about my own future, and to see one possible future painted so vividly, with such human and emotionally focused themes, triggered quite an emotional reaction. I think the feeling was:
> If the world must change, I hope at least we still tell such stories and share how we feel within that change. If so, come what may, that's a future I know I can live in.
Thank you for this comment. I'm so glad it made you feel a little bit better about the future, even if only for a little while!
This is really the whole idea behind this project with Near Zero. I think there's a lot of anxiety out there around AI and the future; I was there for a while too. Ultimately I've ended up pretty optimistic about it all, and, inspired by what the group at Protocolized is doing, I found science fiction a great way to help express that.
You're right that this isn't some groundbreaking revelation. If you're using AI enough to be feeling it, you're feeling/seeing what they're talking about. The purpose of a paper/retreat like this is to get it all together and written down on paper, then to disseminate it to the wider world. I think the paper does a good job of collecting info that isn't wrong, and which has enough substance to help guide folks making decisions.
Don't downvote, it certainly did for me. My first computer was a 13-inch MBP from 2009, as I was Apple-obsessed like the person in the parent article. Time passed, I realized I really didn't like either Windows or Mac, and for the past 10 years I've been Linux-only. It really does happen, even if rarely.
Good on you for rising up to the ranks of Linux/BSD.
You just need to recognize that not everybody aspires to be competent with the lower levels of hardware and software, which Apple makes that much more difficult to reach. Most Apple users are content to use apps written by others, and that is as far as their interest goes.
An analogy is the car market. Most people don't care how the car works, etc. They just want to get to places. If you only need to drive to the shops and do minimal errands, you don't even need a truck - a sedan will do just fine. Same with computers, lots of different market segments with distinct needs and expectations.
> You just need to recognize that not everybody aspires to be competent with lower-levels of hardware and software
You don't really need that to use Linux.
People should stop copy/pasting urban myths or stories from the late 90s. We are in 2026, and one can perfectly well buy a laptop preinstalled with Linux, with full support, and just find the apps they need in an "app store", which in this case is simply a frontend for Flatpak and the package manager. Picking up an app from GNOME Software is no different from installing an app from the Play/Apple/Microsoft store.
Yep everyone has their preference. A lot of us have done both. I’ve run multiple distros. I’ve played with low level software. I have used and continue to use open source tools in places.
And I prefer my Mac to this day as my main machine.
"Consumer user" or "Linux hacker" is a false dichotomy that people sometimes like to slot others into (not accusing you, GianFabien).
My first computer was a Compaq my parents got during that peak era of home PC mass adoption in the late 90s. I immediately played a ton of games, got on AOL, learned VBScript, C++, HTML, etc.
This was such a natural and common thing that I never even questioned if others were having a different experience with computers. This sounds crazy now, but it felt as if everyone was either going to learn to program or already had, not as a career choice but as an essential form of literacy. I mean even the calculators were programmable!
To me, Macs were just "the boring computers" we had at school and what my grandparents bought. They seemed locked down and weird like an appliance. I have no idea what my life would be like now if I had grown up in a different time and with a Mac.
This isn't to hate on Macs, but to tell the story of the dominance of Microsoft at the time and how much culture shifted towards more "dumb" consumerism. By the time the first iPod came out I realized the adults had no interest in any of this more progressive future. Then the iPhone and Windows Vista confirmed it.
I installed Ubuntu on the ThinkPad I had in high school and never really looked back. To this day, I am still baffled by the obsessions people have with AI "replacing jobs" and with Apple devices as status symbols. I think those people miss the point entirely, and I worry about their incomplete worldview being passed down to younger generations. What I see is the masses refusing to participate and technofeudalists taking advantage of them.
I think the timelines are too short for trends to be completely apparent yet. You can typically hire people faster than you can scale your income sources, even in the face of tremendous demand. Right this moment there are factors pushing folks to fire, but I also see some companies delivering more (not a lot, but noticeably more) and seeing increasing sales as a result. Those are in conflict, and we'll see which way the trends push over time.
The author of that post effectively re-defines "memory"/"RAM" as "data", and uses that to say "accessing data in the limit scales as N × √N as N increases". Which, like, yeah? Duh, I can't fit 200 PB of data into the physical RAM of my computer, and the more data I have to access, the slower it'll be to access any part of it without working harder at other abstraction layers to bring the time taken down. That's true. It's also unrelated to what people are talking about when they say "memory access is O(1)". When people say "memory access is O(1)" they are talking about cases where their data fits in memory (RAM).
Their experimental results would in fact be a flat line IF they could disable all the CPU caches, even though performance would be slow.
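For anyone who wants to see the cache effect directly, here's a minimal C sketch (mine, not from the linked post) along the lines of the experiment being discussed: pointer-chasing through working sets of increasing size on a POSIX system. The sizes, iteration count, and RNG are arbitrary illustrative choices. On typical hardware the ns/access figure climbs in steps as the working set outgrows L1, L2, and L3; the point in the parent is that with all caches disabled the line would stay roughly flat (and uniformly slow), whereas the linked post is really about what happens once data no longer fits in RAM at all.

```c
/*
 * Minimal pointer-chasing latency sketch. Compile with e.g.:
 *   cc -O2 chase.c -o chase
 * All parameters below are illustrative assumptions, not anyone's benchmark.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* xorshift64: cheap RNG so we aren't limited by RAND_MAX */
static unsigned long long rng_state = 88172645463325252ULL;
static size_t xrand(void) {
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 7;
    rng_state ^= rng_state << 17;
    return (size_t)rng_state;
}

int main(void) {
    /* working sets from 256 KB to 256 MB (8-byte elements) */
    for (int p = 15; p <= 25; p += 2) {
        size_t n = (size_t)1 << p;
        size_t *a = malloc(n * sizeof *a);
        if (!a) return 1;

        /* Sattolo's algorithm: a random permutation with a single cycle, so
         * every load depends on the previous one and prefetching can't help. */
        for (size_t i = 0; i < n; i++) a[i] = i;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = xrand() % i;
            size_t t = a[i]; a[i] = a[j]; a[j] = t;
        }

        const size_t steps = 20 * 1000 * 1000;
        size_t idx = 0;
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (size_t s = 0; s < steps; s++) idx = a[idx];   /* dependent loads */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
        /* printing idx keeps the compiler from deleting the timed loop */
        printf("%12zu elements (%7.1f MB): %6.2f ns/access (sink %zu)\n",
               n, n * sizeof *a / 1e6, ns / (double)steps, idx);
        free(a);
    }
    return 0;
}
```

The single-cycle shuffle matters: if the accesses were a predictable stride, the hardware prefetcher would hide most of the latency and the curve would look much flatter than the memory hierarchy really is.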
Someone built an archive of GitHub statuses to show aggregate uptime; last month and this month GitHub's uptime is below 90%, not even one "nine" of availability: https://mrshu.github.io/github-statuses/
87% uptime for GitHub in February 2026. They've got to get it together.