Four years ago, Bill Gates called forcing users to press the Control-Alt-Delete key combination to log into a PC a mistake, saying it was an IBM engineer who insisted on using the keys at the time. Yesterday, he admitted for a second time that the Ctrl-Alt-Delete function on Windows (and DOS before that) was a mistake. Speaking to Bloomberg, he said he wished he had pushed for a single login key, which is how it works on any Mac — you just press a key and log in.
On Windows today, you barely need to know about the key combination. In Windows 10, for example, you can click anywhere on the screen. That’s not the case in a company setting, though, where many versions of Windows still force you to press those three keys. “If I could make one small edit, I’d make that a single key,” Gates admitted during a panel discussion.
But the curious thing about Bill Gates’ regret is that the key command actually set DOS and Windows apart and gave the company some much-needed quirkiness. It’s a bad key combination…but a good differentiator.
Here’s one reason for that. At the time (the ’70s), a computer was a beige box full of components. Usability? Who cared? The idea of mass-producing a computer, or even an operating system, was still new, and companies like Apple were basically home-brewed, along with the products they produced. Gates is looking back with an eye to editing history a bit, but in my view, Ctrl-Alt-Delete is all part of the PC lexicon — a hallmark of what made Microsoft, and the industry at large, such an interesting, if flawed, endeavor.
Today, no one would ever think of requiring such complex key commands. You could argue that the trend toward more icons and more swipes has gone too far — apps like Snapchat are the exact opposite of the Ctrl-Alt-Delete mentality, and just as confusing (especially to anyone over a certain age). As someone who was actually around when personal computers came to fruition, I remember that sense of being “in the know” about Ctrl-Alt-Delete as a way to log in or to reboot a computer. I took pride in being able to help others learn the obscure commands. We haven’t even talked about using the Command Prompt in Windows back when I was in college, or the macro recorder, or knowing how to type a URL directly into a browser window, or doing any coding at all — even the basic stuff in a simplified programming environment.
It was the entire concept of having some tacit knowledge about computing that birthed the industry, and the information we handed down (sometimes on a floppy disk) or talked about after work that gave Information Technology a name and an identity, and gave me (in the ’90s) a paycheck.
Maybe it is the fact that things were a little archaic and complex that gave computing its mystique in the early days — that same sense you get today when you know how to fly a drone without crashing it or can find some of the hidden features on an iPhone. This is not an argument for making computing more complex on purpose; it’s an argument that a dash of complexity in the beginning was all part of the canon that helped create the industry.
Do you agree? Have an opposing view? Feel free to post your rebuttal or comment on our Twitter feed or on Facebook.