I don’t normally use this space for persona grievances but I have to make an exception because I need someone to vent to and my wife is tired of hearing me go on about this subject. So bear with me as I rail incoherently against the idea of continuous user interface redesign.
Back when IBM controlled the digital world, they set strict rules governing the lifecycle of their technology. These rules meant that even if they had an improvement to their CPU or memory access, it would not make it into a new model computer until three years had passed for the old technology. This gave customers a sense of dependability, that they weren’t spending millions of dollars on something that would turn out to be obsolete in six months. It also provided a solid narrative that carried users through the upgrade path.
They tried this in the PC world and we all know how that turned out: Micro-(I know we have 14,000 known defects just ship it anyway)-soft kicked their asses out of the market. And once the scourge of teenage coders was loosed upon the world we quickly went from “build it because it’s better” to “build it because you can.”
Here is a rule programmers should live by but don’t: Never make a change to a well-known interface element unless it will significantly improve the experience. Take the classic word processor WordStar, which I started using back in the CP/M days. When the programmers created what had to be one of the first context-based menus, they hadn’t heard of mnemonics quite yet, so instead of copy and paste being ^C ^P it was something random like ^M ^Q. I don’t remember the exact commands but, trust me, they were random.
Random but well-known. They had been around for a long time and a lot of people, me included, knew those menus like the backs of our hands. However, when the concept of mnemonics came around, it would have been perfectly understandable to offer the option of changing the commands to something that made more sense. Why? Because even though it was a disruptive change, it would have made the user experience better.
Now let’s talk about the ribbons in Microsoft Office. They’ve been around for quite some time now and I have yet to figure out how they improved the experience. They certainly confused it, but I can’t see how they improved it. But in this case I can at least see a reason for the change. They may have been horribly wrong but someone at least thought ribbons would be superior to toolbars and menus.
The same cannot be said for Windows 8. The logic behind slapping this horrible Frankenstein of an operating system on PCs must have gone something like this:
VP: Hey, we make a ton of revenue off of forcing people to upgrade their operating systems and Windows 7 has been out there for a while. What have we got coming down the pike?
Techie: Nothing. All we have is the new OS for tablets and phones.
VP: Can’t we use that on PCs too?
Techie: It wouldn’t make any sense. The apps all run full screen. The start screen is meant to be used with a touch device…
VP: But dollars.
VP: Call it Windows 8 and ship it.
I had my first experience with Windows 8 yesterday when I went to set up a new machine for my mother. I had seen pictures of it, of course, but I couldn’t believe it was as bad as it looked. It is. It’s worse. It is as far from intuitive as you can get without requiring an input device with six fingers.
So what was it about these features that made it imperative for Microsoft to force the world’s PC vendors to install it on all their new machines? You got me. It sort of makes sense when you see it on a phone but for a PC, it’s ridiculous. And it throws out everything we have learned to expect from a Windows operating system over the last two decades.
That’s bad, but it’s Microsoft, so it’s understandable. For them, unnecessary UI “innovation” is about making money. So at least there’s a motive there that any pure-blooded American capitalist can understand.
The worst offender of continuous user interface redesign, though, has to be the iTunes team. I don’t know if they fear they’ll lose their jobs if they don’t overhaul the UI every three weeks or what, but they need to stop. Actually, they need to stop and roll back one revision.
For as long as I’ve had iTunes installed, which has been since 1.0, it has been the app that just works. I never had to read the help. I never had to Google how to do something. It made sense. It was intuitive. And, yes, they changed the interface on a regular basis, but never substantially – until this most recent release, which changed everything. Notice I didn’t say it improved everything. None of the changes are improvements; they’re just changes. Changes that make everything more difficult. For instance, you can no longer drag and drop. This from the company that popularized drag and drop.
As a matter of fact, I would say that this version of iTunes is entirely reminiscent of a Windows 2.0 app.
There’s enough programming that needs to be done out there, children, go do that good work and stop fiddling for the sake of fiddling.
There. I vented.