Software is terrible

“Web design, the first 100 years” has been making the rounds in my feeds for the past week and is well worth the 30 minutes it’ll take to read it the first time. Despite its title, the talk is less about design as a visual substrate than about design as intent.

Maciej Cegłowski (the man behind Pinboard) takes a stab at explaining why technology is failing people and fooling itself, all the while entertaining the fantasy of its own influence. Why are today’s applications, platforms and hardware generally mediocre despite the great promises made in the early days of the electronic era? Moore’s law hit a wall, the average smartphone barely holds a charge for more than 16 hours, and networks are managed by the likes of Comcast and AT&T. The experience of computing feels increasingly constrained rather than liberating.

Are we done with it, or just about to touch the void?

We started with music and publishing. Then retailing. Now we’re apparently doing taxis. We’re going to move a succession of industries into the cloud, and figure out how to do them better. Whether we have the right to do this, or whether it’s a good idea, are academic questions that will be rendered moot by the unstoppable forces of Progress. It’s a kind of software Manifest Destiny.

To achieve this vision, we must have software intermediaries in every human interaction, and in our physical environment.

But what if after software eats the world, it turns the world to shit?

Consider how fundamentally undemocratic this vision of the Web is. Because the Web started as a technical achievement, technical people are the ones who get to call the shots. We decide how to change the world, and the rest of you have to adapt.

Cegłowski sorts the actors of the Web into three mutually exclusive buckets: you can choose to consider the web a network, a means of control or a threat. The first bucket is happy — this is the web as we know it, the one where you can binge on Wikipedia, write a rant to your church mailing list or create BitTorrent. The second bucket is full of folks eager to have more of it all — the ones who keep “We Put a Chip in It” running. The third bucket is trying to leave Earth because the singularity will otherwise ruin everything.

A long time ago, I think I was in the second bucket. Move fast, code away, break things. But what I’ve learned over the past 10 years of writing software professionally is that writing software should probably be considered a last resort. The cost and pain of writing and using software are likely to trump the cost of the problem it seeks to address. The failure to deliver ever-more powerful hardware on a clock-like schedule revealed an ugly truth.

Software is terrible

Software doesn’t fix broken processes; it exposes them, shedding light on their bizarre edge conditions and their resistance to generalization. Trying to simultaneously design and automate a new process is a great way to get nowhere. Not only do engineers have a finite amount of time and patience, they also get away with a lot. Why would they care about usability or stability when users have been conditioned to reboot their PCs, to routinely work around common bugs, to save and back up their work at every opportunity? And then, just as the user is finally enjoying a working system, it’s time for an upgrade. Because, fuck you user, we know better. Thank you for your purchase, please come back.

A full year after I purchased it, and after dozens of updates, my GPS watch is still incapable of properly syncing wirelessly with my phone – even though the manufacturer ships an app for exactly that. And I have to go through a desktop to update the firmware. Who cares if half the internet’s population doesn’t even own a PC? My 2013 phone is only now getting upgraded to Lollipop, six full months after the manufacturer promised the OTA would land. These are minor examples, but they are infuriating – what other industry would get away with that kind of incompetence and deceitfulness?

Well, the answer is probably the military, or construction, which is a sad state of affairs. Software came with the promise of being better, of being a bicycle for the mind. The fact that it had so many past failures to learn from makes its suckitude inexcusable. A few weeks ago, we reached the point where bad software design can realistically be exploited to remotely take control of a running vehicle. Will DMVs train drivers to reboot their cars when they start misbehaving at freeway speeds?

Software didn’t prevent the Germanwings plane from being smashed into the side of that mountain. It might be hubris to consider that it could have – or maybe not. It certainly warned the pilots of AF447 what the situation was, yet they remained confused about what to do about it. The examples are numerous and the end result is the same, tragic one: even when software is as good as it gets, we choose to let operators remain in control. Weirdly enough, we’re on a fast track to allowing self-driving cars on our roads.

Software is a never-ending illustration of the knowledge gap each human has about their own species. The biggest mistake we routinely make is assuming that everyone thinks like us – and hence has the same problems and needs the same solutions. The second biggest mistake is thinking that no one thinks like us. Software is terrible because it is human. That also makes it great, and surprising, and playful, and it leaves room for quick hacks of genius.

Havens of near-sanity

Not all software is that bad. Google Now is a recent example of a product that never ceases to amaze: a continuous, preemptive, predictive, personal search engine that sits in my pocket all the time. It genuinely attempts to understand me, though that comes at the price of trusting Megacorp with pretty much all of my data. Which is fine, as long as they don’t take my access away.

My DSLR camera routinely impresses me, too. It boots in under half a second and performs the tasks I most commonly use it for swiftly and neatly. It’s incredibly customizable and connects seamlessly to a wide range of third-party accessories. It processes dozens of megabytes of data in a fraction of a second, and is optimized to function for days on a battery the size of two AA cells. It is merely mediocre at formatting a 32 GB memory card – that’s a great tradeoff. I don’t feel any need to upgrade its firmware, though I probably could. The software is correct and gets out of the way.

And then there’s the box of magical little stuff that is weird, beautiful or hacky. It’s not a huge box but its very existence is comforting.

Computing as a resource is an idea that has gotten an incredible amount of traction in just a few years. Startups no longer wonder how to provision hardware — it’s been such a massive and sudden shift that it’s easy to forget how things were done before. The flip side of the coin is that AWS nodes are already an abstraction on top of which containerized applications get shipped. Such containers are deployed on an orchestrator, which is itself managed by a meta-OS.
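To make the shift concrete: provisioning used to mean purchase orders and rack space; now it is, roughly, a single API call. A minimal sketch using the AWS SDK for JavaScript (v3) – the region, AMI ID and instance type below are placeholders, not recommendations:

```typescript
import { EC2Client, RunInstancesCommand } from "@aws-sdk/client-ec2";

// "Provisioning hardware" today: one API call against AWS.
const client = new EC2Client({ region: "us-east-1" });

const result = await client.send(
  new RunInstancesCommand({
    ImageId: "ami-xxxxxxxx", // placeholder machine image
    InstanceType: "t2.micro",
    MinCount: 1,
    MaxCount: 1,
  })
);

// Seconds later, the "hardware" exists, identified by a string.
console.log(result.Instances?.[0]?.InstanceId);
```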

Zero f#*!$ given as a service

When is good enough good enough? When are we going to stop jumping the software shark? There’s huge appeal in making server applications easier to manage, but every second spent dockerizing an app is a second spent not building a feature. Each layer of abstraction drives us further away from the metal. At what point do you get to work instead of figuring out how stuff works? And it’s not just an enterprise problem — as Apple has shifted resources over to iOS, Mac OS has been dangerously stagnating for a few years. I just wish it stayed as is, but private software entities have no economic incentive to simply maintain what exists.

From a programming standpoint, I have a huge bias in favor of tools like Polymer, Quartz Composer or even Yahoo Pipes because they encapsulate the complexity of code and allow the user to literally connect black boxes that are meant to just work. They reason in terms of data flows and signals rather than algorithms. It’s truly software as a service but the runtime is on the client side. Turning on an 8-year-old iPhone is all it takes to realize remote software as a service is still a buzzword.
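Here is a toy sketch of that dataflow style – not Polymer, Quartz Composer or Pipes themselves, just made-up boxes in TypeScript – where each box is opaque and the program is nothing but the wiring between them:

```typescript
// Each "black box" is an opaque transform; the program is just the wiring.
type Box<In, Out> = (input: In) => Out;

// Three hypothetical boxes, Pipes-style: extract, filter, trim.
const extractTitles: Box<string[], string[]> =
  pages => pages.map(p => p.split("\n")[0]); // first line stands in for a "title"
const dedupe: Box<string[], string[]> =
  items => [...new Set(items)];
const take = (n: number): Box<string[], string[]> =>
  items => items.slice(0, n);

// Connecting the boxes: data flows through, no visible algorithm.
const pipeline: Box<string[], string[]> =
  pages => take(5)(dedupe(extractTitles(pages)));

console.log(pipeline(["Hello\nbody", "Hello\nother body", "World\nbody"]));
// -> ["Hello", "World"]
```

The appeal is that swapping, reordering or testing a box never requires opening it.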

At the turn of the 20th century, electricians were living gods: their skill set was considered highly technical and rewarded accordingly. Electricity was disrupting industries, reshaping cities, accelerating the world. Because we collectively recognized the usefulness of free-flowing energy available at all times, we commoditized it – essentially making the job of an electrician dull and repetitive, and moving on to higher-level problems.

Knowing how to code is irrelevant. It merely equates to ordering electrons in a circuit to go either right or left. Knowing when and why the electrons should go right or left is what matters. At some point, someone ten times dumber than present-day software engineers will likely be able to do the same. This is a great thing, and I’m looking forward to the day when software can get out of the way and become the smart substrate it promised to be.

Rather than teach everyone how to code, let’s teach them to think. The coding can come later; it’s easier.

– Rob Pike