... to happen to computer science: Graphical User Interfaces.
Introduction
You folks have heard me complain enough times about the developments in the Windows user interface. I was vocal about it during the Windows 8 launch. But I'm vocal about it on any operating system. Gnome suffered a radical change to its UI in version 3, and I was as critical of that as I am of Windows 8. In fact, perhaps even more so, since, unlike with Gnome, I don't really care about Metro.
But if you really want to know, I'm generally critical of radical UI changes in any type of software, not just operating systems. I'm not a bitter person, but there's one thing I must put on the table right here and now, so it gets out of the way: I'm against change!
Now that we understand each other and you're sure I'm not trying to hide it, let's get the ball rolling. Why this general mistrust?
Tall Abstractions
In functional terms, an operating system is software that stands between the computer and what we want to do with it. Programs, on the other hand, are the tools that allow us to do what we want.
There's nothing wrong with this relationship in modern computer science. Operating systems offer an essential abstraction of the computer hardware, one that allows programmers to develop programs and users to use those programs.
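To put that abstraction in concrete terms, here's a minimal sketch in C (my own illustration, not taken from any particular system; the file name is hypothetical). The program asks the operating system for a file by name and never touches the hardware itself:

    #include <stdio.h>

    /* Illustrative sketch: the program requests a file from the OS by name.
     * Whether that file sits on an old IDE disk, an SSD or a network share
     * is entirely the operating system's problem -- the program only ever
     * sees the abstraction. */
    int main(void)
    {
        FILE *fp = fopen("example.txt", "r");   /* hypothetical file name */
        if (fp == NULL) {
            perror("fopen");
            return 1;
        }

        char buffer[256];
        while (fgets(buffer, sizeof buffer, fp) != NULL)
            fputs(buffer, stdout);   /* "standard output" is another OS abstraction */

        fclose(fp);
        return 0;
    }

The same program works unchanged no matter what hardware sits underneath, which is exactly the kind of transparency I'm arguing the operating system should limit itself to.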
However, this means that an operating system should be as transparent to the user as possible. For the most part, the operating system is not the user's concern. What we have been witnessing with Microsoft Windows for years, however, is the exact opposite.
Windows, as an operating system, has been consistently claiming space all over the computing landscape, to the point where it has an impact on user productivity, something that should have been the exclusive domain of computer programs. We have been hearing about and experiencing this impact for years now. When we hear that an operating system is "easier and friendlier to use", we know we no longer have just an abstraction of the computer hardware. We have an autocratic, opaque layer that dictates even how computer programs should look and behave.
I want to be an Astronaut
Why this state of affairs on Windows? Why should it dictate so much of the user experience?
A good few years back, Microsoft's operating system was nothing more than a text-based abstraction called MS-DOS. Graphical user interfaces were computer programs that ran on top of the operating system. And they were present from very early in MS-DOS's life. For instance, back in the late 80s, an Amstrad computer with MS-DOS would invariably ship with GEM, a graphical user interface. Microsoft itself had been marketing its own GUI (Microsoft Windows, of course) since the mid 80s.
Close to the end of the DOS era and the birth of Windows 95, Microsoft started the process of hooking an operating system to a graphical user interface and making them one. This was first achieved with Windows NT 3.1 in 1993. Despite the version number, that was in fact the first NT-based operating system available to the public (the NT-based series of operating systems exists to this date, with Windows 8 being NT 6.2).
With the birth of Windows NT, the graphical user interface became an integral part of the operating system. This gave rise to a whole new culture around the concept of Microsoft operating systems. Since then, we have been presented with an ever growing influence of the operating system on the user experience. More and more, the OS has become a determining factor not just in user productivity, but also in how applications should act and behave.
In short, coupling the GUI to an OS is how you move from a mostly transparent operating system to an opaque one, which no longer just serves computer programs and offers basic computer management features to users, but now also influences everything you do with it.
The Best Worst Thing
Is this bad? For the reasons given, yes. It's bad because you take away the user's ability to determine exactly how they want to operate. You force the user into a series of process workflows that may or may not work to their benefit, depending on who they are and what they do with a computer. The problem with GUIs is that they can never, ever answer the needs of every user. Likewise, the general "easier and friendlier" culture around Microsoft operating systems is doing no one a favor in terms of increasing the general population's technological literacy.
It's bad because users will have to add the operating system's user interface to the list of things that affect their productivity. Every change to the operating system's user interface is a change they will have to grapple with.
But it's also good, for all the same reasons (except the last, perhaps). GUIs are responsible for the massification of computer usage. Like no other piece of software, GUIs made "a PC in every home" possible. They did introduce new, more productive ways of doing the things we want done on a computer, but they also greatly contributed to the homogenization of computer programs' look and feel, an important aspect that greatly facilitates the learning process. GUIs made it possible for anyone, whether 8 or 80, to use a computer.
The price we pay, however, is that it also cages us in. We become invariably dependent on the GUI's look and feel. And no change to it is decided by us.
Smoke and Mirrors
Innovation, and particularly technological innovation, has become a misnomer. It's terrible to see the gratuitous use we make of those words for the purposes of product advertisement. We now count such things as removing a Start menu or introducing a ribbon bar as innovation. If you'll take it at face value from someone who has been using computers since the 80s, trust me when I say that something changed between how productive I was then and how productive I am now. But nothing changed between how productive I was with a menu system and how productive I am with a ribbon bar, once I learned how to use both.
The fact of the matter is that it's change that brings confusion. And the thing with change is that it doesn't just happen out of good intentions (hell is full of those). It often happens because there's a need to force users into whatever new paradigm is deemed worthy at that point in history, regardless of how well the benefits of said paradigm have been studied. That's what Metro in Windows 8 is, for instance: forcing onto everyone the consumer paradigm of computer usage first initiated by Apple, and marketed by Microsoft as a true revolution in the way people will use their personal computers, despite there being no scientific support for such an outrageous claim.
If "technological innovation" used to mean something, the software industry (and not just that of operating systems, let's be clear) has made a case of removing it altogether.
Aside from security features, there's really been no tremendous evolution in the way we use our computers since Windows 2000. The levels of productivity haven't changed dramatically. We still do today the things we were doing 12 years ago. And software still presents itself to us in basically the same shape and form as it did then. Despite all the hype around every version of Windows since then being easier and friendlier, computers are still used today the same way they were then.
Metro, however, is different. It tells you the same thing as every other operating system before it: that it has found the solution to all your problems, and that you shouldn't listen to all the other times it told you the previous version was the solution to all your problems. What makes it different is that... it's very different from previous operating systems. It completely breaks from the user interface mold introduced in 1993.
The Best Worst Thing, The End
And this is absolutely terrible. Without cause, without anyone in the user community so much as whispering how upset they were because they couldn't use their computer, Microsoft forces you to completely change the way you use your computer. Not even the option was given to keep the "old" interface as an alternative. No, the company knows best how you should use the computer. Not you.
Change introduces stress. Change wastes time. Change forces you to deal with it even if you don't want to. Change is almost always a bad thing. Pointless, careless change is even worse. It's like driving to work, reaching an intersection, and finding a policeman forcing you to take a detour even though your usual road isn't closed to traffic.
The problem with UI changes is that rarely, very rarely, are they actually needed. In fact, we can get accustomed even to bad UIs and resent a change after that. User interfaces are that powerful as an abstraction. Changes should be made only when there's a concrete case for them and, ideally, introduced gradually into the user's workflow. Everyone who has studied user interface best practices knows this.
What we have with Microsoft Windows 8 and Metro is anything but that. It's a despotic approach to the user interface. And while no one ever asked for this or felt they needed it on a personal computer, Microsoft's marketing approach to it was the oldest trick in the book of product marketing: if there's no need, create one.