People always wring their hands that operating at a new, higher level of abstraction will destroy people's ability to think and reason.
But people still think and reason just fine; they just do it at a higher level that gives them greater power and leverage.
Do you feel like you're missing something when you "cook for yourself" but you didn't plant and harvest the vegetables, raise and butcher the protein, forge the oven, or generate the gas or electricity that heats it?
You also didn’t write the CPU microcode or the compiler that turns your code into machine language.
When you cook or code, you're already operating on top of a very tall stack of abstractions.
Nah. This is a different beast entirely. This is removing the programmer from the arena, so they'll stop having intuition about how anything works or what it means. Not more abstract; completely divorced from software and what it's capable of.
Sure, manager-types will generally be pleased when they ask AI for some vanilla app. But when it doesn't work, who will show up to make it right? When they need something more complex, will they even know how to ask for it?
It's the savages praying to Vol, the stone idol that decides everything for them, and they've forgotten their ancestors built it and it's just a machine.
I agree with your sentiment. The thing is, in the past, the abstractions supporting us were designed for our (human) use, and we had to understand their surface interface in order to be able to use them effectively.
Now, we're driving such things with AI; it follows that we will see better results if we do some of the work climbing down into the supporting abstractions to make their interface more suitable for AI use. To extend your cooking metaphor, it's time to figure out the manufactured food megafactory now; yes, we're still "cooking" in there, but you might not recognize the spatulas.
Things like language servers (LSPs) are a step in this direction: making it possible to interact with the language's parser/linter/etc before compile/runtime. I think we'll eventually see that some programming languages are better suited to efficiently getting working, logically organized code out of an AI; whether that's languages with "only one way to do things" and extremely robust, strict typing, or something more like a Lisp with infinite flexibility where you can make your own DSLs, remains to be seen.
Frameworks will also evolve to be AI-friendly with more tooling akin to an LSP that allows an MCP-style interaction from the agent with the codebase to reason about it. And, ultimately, whatever is used the most and has the most examples for training will probably win...
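For what it's worth, the "structured interface" already exists at the wire level: LSP is just JSON-RPC 2.0 framed with a Content-Length header, so an agent can query a server for symbols, types, or diagnostics instead of re-parsing raw text. A minimal sketch of framing such a request (the file URI and position are made up for illustration):

```python
import json

def lsp_message(method: str, params: dict, msg_id: int = 1) -> str:
    """Frame a JSON-RPC request the way LSP expects:
    a Content-Length header, a blank line, then the JSON body."""
    body = json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params})
    return f"Content-Length: {len(body)}\r\n\r\n{body}"

# Ask a language server what symbol sits at a given position --
# the kind of structured query an agent could make instead of
# guessing from plain source text. (Paths/positions are hypothetical.)
msg = lsp_message("textDocument/hover", {
    "textDocument": {"uri": "file:///project/main.py"},
    "position": {"line": 10, "character": 4},
})
print(msg.splitlines()[0])  # the Content-Length header line
```

An MCP-style tool for a codebase could be little more than a thin wrapper around requests like this one.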