Interesting post, and he makes a good point:
I find that all the significant concepts in software systems were invented/discovered in the 15 years between 1955 and 1970. What have we been doing since then? Mostly making things faster, cheaper, more memory-consuming, smaller, dramatically less efficient, more secure, and worryingly glitchy. And we’ve been rehashing the same ideas over and over again. Interactivity is now “event-driven programming”. Transactions are now “concurrency”. Internetworking is now “mesh networking”. Also, we have tabbed browsing now, because overlapping windows were a bad skeuomorphism from the start, and desktop notifications, because whatever is all the way in the corner of your screen is probably not very important. “Flexible view control” is relegated to the few and the proud who run something like herbstluftwm on their custom-compiled GNU/Linux.
In particular, maybe it’s time to bring more of what academia is doing out into the industry. For example, all the benefits of Haskell and its type inference should be available in mainstream languages (Rust is trying to do that; Go kinda failed). Another example is modeling: people are still modeling systems either by drawing boxes and sticks or by coding up small prototypes. Why not try out a modeling language like Alloy? Why not have more contract-based programming?
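To make the two ideas concrete, here is a minimal sketch in Rust (my own example, not from the post): the first part shows Haskell-style type inference arriving in a mainstream language, and the second imitates contract-based programming with plain runtime assertions, since Rust has no built-in contract syntax. The `isqrt` function is a hypothetical illustration.

```rust
// Contract-style sketch: pre/postconditions checked at the function
// boundary with assertions, in the spirit of design-by-contract.
fn isqrt(n: u64) -> u64 {
    let r = (n as f64).sqrt() as u64;
    // Postcondition: r is the integer square root of n.
    assert!(r * r <= n && (r + 1) * (r + 1) > n);
    r
}

fn main() {
    // Type inference: the element type, the closure argument's type,
    // and the collected container's element type are all inferred.
    let xs = vec![1u64, 2, 3, 4];
    let squares: Vec<_> = xs.iter().map(|x| x * x).collect();
    assert_eq!(squares, vec![1, 4, 9, 16]);

    assert_eq!(isqrt(10), 3);
}
```

Real contract systems (Eiffel, Ada/SPARK, or Rust's third-party `contracts` crate) go further by checking such conditions statically or attaching them declaratively, but even assertion-style contracts document intent at the boundary.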