It has been some time since I last wrote about technology and software development. I’ve been professionally developing software for over fifteen years, working for multiple startups, as a consultant, and currently at Stripe. I can trace over my career how my skills and methodologies for developing software have improved, but there is one question that continues to bug me: What’s next?
Now, this isn’t “What’s next for me?” but “What’s next in computing?”. Compared to many other fields, computing and computer science are still quite young, but it is incredible how far they have come in less than 100 years. From room-sized equipment that was programmed by plugging in cords to the phones we carry around in our pockets, from punch cards to languages like Go, Rust, and Elixir, this field has grown by leaps and bounds that no one could have predicted.
That said, I can’t help but also feel that our field has stagnated. Creating new languages has never been easier (including my own experiment), but that next paradigm shift has so far eluded us. Diving into the history of programming languages shows us that every major paradigm in software development, from procedural and imperative languages, through object-oriented, purely functional, and logical, had been fleshed out and well understood by 1980, over 40 years ago. Every language since then has combined and/or improved upon these paradigms in different ways, but I’ve yet to see any language today that makes me think “huh, there’s something new here.”
And if even the legendary Alan Kay, the inventor of object-oriented programming and the Smalltalk language, who spent 16 years researching new paradigms through the Viewpoints Research Institute, didn’t find it (though the papers and the research they did produce are amazing and worth familiarizing yourself with), you start to wonder if there are any more major paradigms to find.
Or maybe we’re looking in the wrong direction. Maybe the next step isn’t going to come from programming language research at all.
Delving deeper into our field’s history, you’ll eventually come across a little thing called the Dynabook, by Alan Kay. The Dynabook was intended to make computing so easy and intuitive to use that it would be a fundamental educational tool, making computing accessible to all. While the iPhone and iPad were heavily influenced by this idea, it’s easily argued that Steve Jobs missed the entire point of the Dynabook, as the iPad and iPhone were designed and built for consumption instead of education.
Millions of people today carry around computational devices in their pockets that put room-sized supercomputers of the past to shame, and yet the true power of these devices is locked away. We rely on others to build the apps we use, whether they be for productivity, education, or consumption (e.g. games, videos, and music). For the vast majority of people, there’s no other way to use these or other devices. Even for those of us who understand software and computing, it’s still significantly harder than it should be to get a computer to do what you want.