Programming languages evolve slowly because they're more like mathematical notation than technology; someone smart said that, though I can't recall who. For more than half a century, many bright people have been thinking about the problem of how to elegantly tell a computer what to do. With that much brainpower already invested, the low-hanging fruit is gone, and it's understandable that progress is slow now. I doubt there are great programming paradigms waiting in the wings, held back only by technological advances like faster CPUs, more memory, or better graphics.
After staggering expenditures of effort by some of the world's sharpest minds over more than half a century, I doubt there are any more major breakthroughs awaiting us in the field of programming languages. (Unless AI starts to play a major role, in which case I couldn't predict how programming would change.)