The Best Worst Thing

marfig

No ROM battery
... to happen to computer science: Graphical User Interfaces.

Introduction
You folks have heard me complain enough times about the developments in the user interface of Windows. I was vocal about it during the Windows 8 launch. However, I'm vocal about it on any operating system. Gnome suffered a radical change to its UI in version 3, and I was as critical of it as I am of Windows 8. In fact, perhaps even more so, since, unlike with Gnome, I don't really care about Metro.

But if you really want to know, I'm generally critical of radical UI changes in any type of software, not just operating systems. I'm not a bitter person, but there's one thing I must put on the table right here and now, so it gets out of the way: I'm against change!

Now that we understand each other and you are sure I'm not trying to hide it, let's get the ball rolling. Why this general mistrust?

Tall Abstractions
In functional terms, an operating system is software that stands between the computer and what we want to do with it. Programs, on the other hand, are the tools that allow us to do what we want.

There's nothing wrong with this relationship in modern computer science. Operating systems offer an essential abstraction of the computer hardware that allows programmers to develop programs and users to use those programs.
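
To make the point concrete, here is a minimal sketch, assuming nothing beyond the standard C library (the file name is illustrative). The program never touches a disk controller or a filesystem driver; it merely states its intent, and the operating system does the hard work underneath:

#include <stdio.h>

int main(void)
{
    /* The OS (reached through the C library) hides the disk
       controller, the filesystem layout, caching, permissions...
       The program only states its intent. */
    FILE *f = fopen("notes.txt", "w");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }

    fputs("The operating system did the hard work here.\n", f);
    fclose(f);
    return 0;
}

And it's the same few lines on Windows, Linux or anything else with a C library, which is precisely what a good abstraction buys you.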

However, this means that an operating system should be as transparent to the user as possible. For the most part, the operating system is not a concern to the user. What we have been witnessing with Microsoft Windows for years, however, is the exact opposite.

Windows, as an operating system, has been steadily claiming space all over the computing landscape, to the point that it has an impact on user productivity, something that should have been the exclusive domain of computer programs. We have been hearing about and experiencing this impact for years now. When we hear that an operating system is "easier and friendlier to use", we know we no longer have just an abstraction of the computer hardware. We have an autocratic, opaque layer that dictates even how computer programs should look and behave.

I want to be an Astronaut

Why this state of affairs on Windows? Why should it dictate so much about the user experience?

A good few years back, Microsoft's operating system was nothing more than a text-based abstraction called MS-DOS. Graphical user interfaces were computer programs that ran on top of the operating system. And they were present from very early in MS-DOS's life. For instance, back in the late 80s, an Amstrad computer with MS-DOS would invariably ship with GEM, a graphical user interface. Microsoft itself had been marketing its own GUI (Microsoft Windows, of course) since the mid 80s.

Close to the end of the DOS era and the birth of Windows 95, Microsoft started the process of hooking an operating system to a graphical user interface and making them one. This was first achieved with Windows NT 3.1 in 1993. Despite the version number, that was in fact the first NT-based operating system available to the public (the NT-based series of operating systems exists to this date, with Windows 8 being NT 6.2).
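
You can still see that lineage for yourself, by the way. Here is a minimal sketch using the Win32 GetVersionEx call, which reports the NT kernel version hiding under the marketing name:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    OSVERSIONINFO osvi;
    ZeroMemory(&osvi, sizeof(osvi));
    osvi.dwOSVersionInfoSize = sizeof(osvi);

    /* Prints the NT kernel version: 6.2 on Windows 8,
       6.1 on Windows 7, 6.0 on Vista, 5.1 on XP. */
    if (GetVersionEx(&osvi))
        printf("Windows NT %lu.%lu (build %lu)\n",
               osvi.dwMajorVersion, osvi.dwMinorVersion,
               osvi.dwBuildNumber);

    return 0;
}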

With the birth of Windows NT, the graphical user interface became an integral part of the operating system. This gave rise to a whole new culture around the concept of Microsoft operating systems. Since then, we have been presented with an ever growing influence of the operating system on the user experience. More and more, the OS has become a determining factor not just in user productivity, but also in how applications act and behave.

In short, coupling the GUI to an OS is how you move from a mostly transparent operating system to an opaque one, which no longer just serves computer programs and offers basic computer management features to users, but now also influences everything you do with it.

The Best Worst Thing

Is this bad? For the reasons enunciated, yes. It's bad because you take away the user's ability to determine how exactly they want to operate. You force the user into a series of workflows that may or may not work to their benefit, depending on who they are and what they do with a computer. The problem with GUIs is that they can never, ever, answer the needs of every user. Likewise, the general "easier and friendlier" culture around Microsoft operating systems is doing no one a favor in terms of increasing the general population's technological literacy.

It's bad because users have to add the operating system's user interface to the list of things that affect their productivity. Every change to the operating system's user interface is a change they will have to grapple with.

But it's also good, for all the same reasons (except perhaps the last). GUIs are responsible for the mass adoption of computers. Like no other piece of software, GUIs made "a PC in every home" possible. They introduced new, more productive ways of doing the things we want done on a computer, and also greatly contributed to the homogenization of computer programs' look and feel, an important aspect that greatly facilitates the learning process. GUIs made it possible for everyone, whether 8 or 80, to use a computer.

The price we pay, however, is that they also cage us in. We become invariably dependent on the GUI's look and feel. And no change to it is decided by us.

Smoke and Mirrors

Innovation, and particularly technological innovation, has become a misnomer. It's terrible to see the gratuitous use we make of those words for the purposes of product advertisement. We now count as innovations such things as removing a start menu or introducing a ribbon bar. Coming from someone who has been using computers since the 80s, you can trust me when I say that something changed between how productive I was then and how productive I am now. But nothing changed between how productive I was with a menu system and how productive I am with a ribbon bar, once I learned how to use both.

The fact of the matter is that it's change that brings confusion. And the thing with change is that it doesn't just happen out of good intentions (hell is full of those). It often happens because there's a need to force users into whatever new paradigm is deemed worthy at that point in history, regardless of how well the benefits of said paradigm have been studied. That's what Windows 8's Metro is, for instance: forcing onto everyone the consumer paradigm of computer usage first initiated by Apple, and marketed by Microsoft as a true revolution in the way people will use their personal computers, despite there being no scientific support for such an outrageous claim.

If "technological innovation" used to mean something, the software industry (and not just that of operating systems, let's be clear) has made a case of removing it altogether.

Aside from security features, there's really been nothing tremendously evolutionary in the way we use our computers since Windows 2000. The levels of productivity haven't changed dramatically. We still do today the things we were doing 12 years ago. And software still presents itself to us in basically the same shape and form as it did then. Despite all the hype, with every version of Windows since then, about being easier and friendlier, computers are still used today the same way they were then.

Metro, however, is different. It tells you the same thing as every other operating system before it: that it found the solution to all your problems, and that you shouldn't listen to all the other times it told you that the previous version was the solution to all your problems. What makes it different is that... it's very different from previous operating systems. It completely broke from the user interface mold introduced in 1993.

The Best Worst Thing, The End

And this is absolutely terrible. Without cause, without anyone in the user community so much as whispering how upset they were because they couldn't use their computer, Microsoft forces you to completely change the way you use your computer. The option wasn't even given to keep the "old" interface as an alternative. No, the company knows best how you should use the computer. Not you.

Change introduces stress. Change wastes time. Change forces you to deal with it even if you don't want to. Change is almost always a bad thing. Pointless, careless change is even worse. It's like driving to work, reaching an intersection, and finding a policeman forcing you to take a detour even though your usual road isn't closed to traffic.

The problem with UI changes is that rarely, very rarely, are they actually needed. In fact, we can get accustomed even to bad UIs and resent a change after that. User interfaces are that powerful as an abstraction. Changes should only be made when there's a concrete case for them and, ideally, introduced gradually into the user's workflow. Everyone who has studied user interface best practices knows this.

What we have with Microsoft Windows 8 and Metro is anything but that. It's a despotic approach to user interface design. And while no one ever asked for it or felt they needed it on a personal computer, Microsoft's marketing approach to it was the oldest trick in the book of product marketing: if there's no need, create one.
 

Tharic-Nar

Senior Editor
Staff member
Moderator
Modern UI for desktops - change - equals bad. Modern UI for tablets - new - equals good. The interface lends itself well to a new type of user interaction: touch. As a replacement for the current system on the desktop, it's bad, very bad, and not very compatible with the current interface methods of keyboard and mouse. However, the bridging of the two was the result of Microsoft trying to unify the interface, so that people are not confused when switching between devices - ultimately improving the experience in the long run (or that's the idea).

Companies have toyed with tablets before, but they didn't take off because the user interfaces didn't translate over from desktops (and battery life was poor). For something like a tablet, you need a GUI at the OS level because there is no other way to interact with the device. Mobile companies had toyed with larger interfaces too with various smartphones, and things were often hit and miss. Apple comes along and shakes up the whole world, and suddenly we have a new standard. Modern UI is Microsoft creating a new interface for a new device format.

Ultimately, it's just a glorified start menu, one that takes up the entire screen; once you go past it, the interface is almost identical to the current paradigm.

To say that all OS level GUIs are bad is a bit of a stretch. A GUI unifies the interface; it provides a standard for applications to build around instead of each creating a whole new interface itself; it also provides a visual way to manage active applications (flicking between text-based terminals with Ctrl+Alt+F1-F4 is not exactly the best method... or is that deemed a GUI too?). The option to create a custom UI is still there, or to not run one at all. To say that change to a UI is bad is also a stretch - sure, it can be annoying, but if you continue to use the same interface standard for however long and shun everything new, how do we know when something better comes along? Change for the sake of change is bad, and I do agree there. Forcing a UI onto an incompatible format, again, bad. But all OS level GUIs bad? No, not in the slightest.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
marfig said:
For instance, back in the late 80s, an Amstrad computer with MS-DOS would invariably ship with GEM, a graphical user interface. Microsoft itself had been marketing its own GUI (Microsoft Windows, of course) since the mid 80s.

Ahh, I haven't heard about GEM for quite some time. I remember reading into it heavily a while back while I was trying to locate the front-end that my 286 PC had (it wasn't anything like GEM, though... it was a simple front-end).

marfig said:
The problem with UI changes is that rarely, very rarely, are they actually needed. In fact, we can get accustomed even to bad UIs and resent a change after that.

I agree. The one thing that I hear from a lot of people when it comes to Windows 8 bashing is that "you're not open to change", when that's hardly the truth for a lot of people. I don't dislike the Modern UI because I dislike change... I dislike it because it's way less convenient for what I do. Sure, it looks pretty, I guess, but it makes certain things more challenging to do. Then there's the whole "how to power off" deal.

Yes, I've grown accustomed to some of this, and while I plan to stick with Windows 8, I still find some of Microsoft's design decisions downright bizarre. I found the same thing with GNOME 3, but somehow, I even prefer that over Windows 8's additions.

Tharic-Nar said:
Change for the sake of change is bad, and I do agree there. Forcing a UI onto an incompatible format, again, bad. But all OS level GUIs bad? No, not in the slightest.

I do agree there. But like Mario said, it's unlikely Microsoft actually did testing with people to see whether Modern UI is something people are bound to enjoy. Apple has the same sort of mentality... who gives a damn what people want; we want it, and it's what we want the market to want. Enough said.

*slams hammer*
 

marfig

No ROM battery
Tharic-Nar said:
However, the bridging of the two was the result of Microsoft trying to unify the interface, so that people are not confused when switching between devices - ultimately improving the experience in the long run (or that's the idea).

And this is essentially what killed Windows 8 for me. There really needs to be a limit to this insane notion that unless things are made easier to use, people won't use them.

I'm constantly thrown back to this movie: Idiocracy.

People will use complex things too. Long before "ease of use" took over the mentalities in Redmond, there was already a "Windows in every home". The PC market was already a mass market long before Vista, and even XP. We are stretching things so far that a point will come when we are all dumb, for lack of anything to stimulate our intelligence. There's a culture running through Microsoft that can only be described as anti-intellectualism.

There would be nothing wrong in offering Metro as an optional part of the Windows 8 experience. In fact, we had been promised exactly that in the early stages of development. We were told it would be optional. Not enforced.

That would have resulted in the company having to support and maintain two GUI versions, which Microsoft clearly didn't want to do. The company was caught off-guard by all the tablet craze, and this is the answer it found. Masked behind the pure marketing concepts (note that these aren't anywhere near being technical concepts) of "easier" and "friendlier", we are given an operating system interface no one asked for. We didn't need it! Grandma has been using Microsoft OSes just fine since Windows 95.

What's worse, it fundamentally changes the way we work with computers. Why on earth do that? What on earth possesses someone to think that they somehow have the intellectual capacity to come up with the solution to the UI dilemma that will fit everyone?

Tharic-Nar said:
Apple comes along and shakes up the whole world, and suddenly we have a new standard. Modern UI is Microsoft creating a new interface for a new device format.

There's no new standard. Where's that standard? There's a different device for a different purpose that does indeed share some of the earlier personal computer's tasks. This device was invented and put on the market. People accepted it and bought it. It started taking its rightful place in the market and, because it intersected with the personal computer in certain usage areas, it rightfully took its share of the PC market. Done deal. End of story.

This device needed a different UI paradigm. One was created. It became a standard for that device. And that device only. Tablets didn't ask the PC to create their standard. Likewise, it should not have become a standard for the personal computer. Microsoft is the only company that did this. Apple itself didn't (in fact they've been thinking of killing OS X for some time, but that's another matter).

Did the hybrid engine define a new standard for combustion engines? Did the cellphone define a new standard for fixed phones? Did the portable cooler define a new standard for refrigerators? Do not dismiss these comparisons so readily. They all share a common background with the tablet UI vs. PC UI question. These are all devices that took market share from their earlier cousins while introducing a new interface and a new way of doing things.

The markets, however, panicked amid the half-assed opinions of so-called technology experts predicting the death of the PC. The most ridiculous idea I've heard in the last 30 years. A new device takes its rightful share of the PC market and suddenly the PC is going to die. I mean, for Pete's sake! How insanely stupid can one be?

Tharic-Nar said:
To say that all OS level GUIs are bad is a bit of a stretch.

I didn't say that. But playing ball, let me tell you that in fact all operating systems are bad, not just their UIs.

I'll elaborate on this thought in a later reply to this thread.

Tharic-Nar said:
To say that change to a UI is bad is also a stretch - sure, it can be annoying, but if you continue to use the same interface standard for however long and shun everything new

The idea that I'd be shunning anything new is contrary to what is being defended here. I said I hate change. But that should be understood in the context of rapidly evolving technology. It has nothing to do with introducing new concepts and new technology. It has everything to do with forcing me to use them.

Rob Williams said:
I agree. The one thing that I hear from a lot of people when it comes to Windows 8 bashing is that "you're not open to change", when that's hardly the truth for a lot of people.

For some reason, somewhere around the late 90s, we -- the users -- started to take an almost religious attitude towards the software industry. This has been leading the industry and its market into a truly dystopian vision of the world. We exercise our personal preferences as if they had to be defended and fought for. The software industry has become one gigantic vanity fair.

What those people are doing is immediately going into defensive mode when someone says they don't like what they like. It can't be that you simply don't like Windows 8, no. There must be something wrong with you. You don't like change. You are a stupid and ignorant person.

It's quite extraordinary how our mentalities are growing (or perhaps one should say shrinking) around the advancements in technological offerings.


The one interesting point in that article is made at the end: since Windows' market share is over 95%, compared to the paltry OS X share, how relevant is it, indeed, that all Microsoft could do in 4 days with Windows 8 was to double the download numbers of the latest OS X update?

I don't necessarily wish for Windows 8 to fail for the sake of failing. I want it to fail in a clear context, where the failure can be traced to a defective (PC-wise) UI that should never have existed. Unfortunately, of one thing we can be sure: that won't happen.

The days of the start button are over. I'm positive Microsoft is moving away from it, for whatever reason I cannot fathom. The failure of Windows 8 will be answered with core changes to Metro, and Microsoft won't go back. That type of "failure" would be catastrophic for its shareholders, who panic even at small dead mice.
 

marfig

No ROM battery
marfig said:
I didn't say that. But playing ball, let me tell you that in fact all operating systems are bad, not just their UIs.

I'll elaborate on this thought in a later reply to this thread.

No to Operating Systems

Not of the Family
Software operating systems are in fact a necessary evil. We would be much better off without them. We may feel tempted to thank them for having allowed us to actually make use of a computer architecture that would otherwise have been impossible to operate; writing a letter, playing games, or watching a movie by manually sending streams of bits to the processor isn't fun.

But that feeling quickly fades, at least for me, when we think of how much more complicated they have made our lives by not being homogeneous. Every operating system has its own answer to how it interfaces with the machine. With that comes a whole baggage of learning and adapting that, despite any rewards at the end, cannot constitute anything more than a huge time waster.

Changing operating systems isn't an easy decision. To some it may even be impossible. Moving from Windows to Linux is something I've been trying to do for well over 10 years. Not because I'm particularly dumb, but because my work invariably forces me to spend months on end concentrating on Windows development, and I end up having to forgo my Linux learning. When I come back, I have to start over. This is because the two operating systems are different to the core.

Likewise, the lack of homogeneity has been (while a source of income for some) a large money sink for businesses around the world. Having to cope directly or indirectly with different operating systems increases costs and entropy in any business. In the software industry in particular, the slow progress of cross-development solutions and tools (marred even more by copyright barriers) clearly mirrors the difficulties faced.

It's somewhat ironic that we do everything we can to make operating systems look, operate, and behave differently, only to then spend hundreds of thousands of collective hours trying to port programs between them. I'm sure mankind in the future will read its history books, learn all about the beginnings of the communication society, and giggle at how inefficient, how wasteful, we were.

Careful What You Wish For
So then someone comes and says it's all for the best. We have more than one way of doing things and different people can do things as they like best or as they feel more comfortable doing them.

I was born in a time when there were no personal computers around. People still did things. Then personal computers came, and people did things on them. They didn't have to adapt to different ways of doing them. This was the beginning. There weren't many different ways of doing things. People were happy.

Naturally some wanted to do things differently. And so different operating systems were born and different programs that did the same thing were born. All in the name of diversity. Diversity is good. Is it?

It is. But it comes with a cost. The cost isn't only measured in training hours, or in the financial costs of a business running two different operating systems on its servers. Diversity makes uniformity more difficult. It forces a system to become more diverse than it needed to be. As an example: while it seems illogical, there are actually core reasons why Windows, Unix, and RISC OS based systems all use different newline sequences, reasons deeply tied to design choices of each operating system, not because someone just wanted a different newline sequence.
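
Here is a minimal sketch of that mismatch in C (file names illustrative). The C runtime has to paper over the difference: in text mode it silently translates '\n' into the host convention, and only binary mode writes the bytes verbatim:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* One logical line ending, several on-disk encodings:
       Unix/Linux uses LF ("\n"), Windows/DOS uses CR LF ("\r\n"),
       and RISC OS spooled output historically used LF CR ("\n\r"). */
    const char *line = "hello\n";

    /* Text mode: the runtime translates '\n' into the host's
       convention, so this file's bytes differ between systems. */
    FILE *text = fopen("text.txt", "w");
    if (text) { fputs(line, text); fclose(text); }

    /* Binary mode: no translation; the bytes travel verbatim. */
    FILE *bin = fopen("binary.txt", "wb");
    if (bin) { fwrite(line, 1, strlen(line), bin); fclose(bin); }

    return 0;
}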

In The Paradox of Choice, psychologist Barry Schwartz digs a knife deep into our collective blind trust in the maxim "freedom of choice is good". You can get a general feel for the book in the author's TED talk. Controversial as it may be to some, we should always question our need for diversity as much as we question, on a daily basis, the lack of uniformity. Operating systems should be no exception.

A Foggy Crystal Ball
If I were asked to predict the future, I would demand to give two answers. I cannot in all honesty speak of what I believe will be the future of computer science as regards operating system uniformity. But I suspect two different realities.

Either we keep to the current path of providing individual solutions to the same problem, hoping this satisfies our taste for freedom of choice, as well as hoping it answers some irrational desire to do things in a different way.

Or, at some point in the near future, we slowly start the process of uniformization that will one day invariably end the reign of software operating systems, and we will have hardware-based operating systems defined on a computer architecture basis.

The first future doesn't look good to me. As software development becomes increasingly more complex, so will cross-development. Fewer and fewer developers will offer ports of their software. Users wanting to do things in a different way will be forced to move to another operating system. This doesn't sound good to me. Since when should an operating system determine the software programs available on it?

The second future seems better to me. A future where software programs, not operating systems, dictate how we function as a technology driven society and how productive we are. Software development will then have a chance to go beyond the limitations imposed by heterogeneous operating systems and reach a larger market at a smaller cost.

It also seems the most likely future to me. We are just starting. The current personal computer architecture is also the first to have been commercially successful. It's also still advancing at a rapid pace, on account of being so new and having so many things waiting in line to be discovered or invented. As computer-based technologies mature and the dust starts to settle, a need for uniformity will invariably set in. Demands will start being made by a more mature and accustomed generation of users, who will not feel so enthralled with the idea of being partisans for a company or a concept.
 

Tharic-Nar

Senior Editor
Staff member
Moderator
Wasn't this the whole idea behind Java? Platform agnostic - not OS or Hardware, but platform. You create one set of code that will run on all 'compatible' devices. It's just a shame that functions are deprecated so fast and the only people creating the runtime environments charge an arm and a leg for it.
 

marfig

No ROM battery
For sure. Java's most fundamental principle is its platform agnosticism. So it is with C and C++ and quite a few other general-purpose programming languages like these.

There's a lesson to be taken from this: it is possible to develop agnostic programming languages that go beyond different operating system kernels and allow development even across different computer architectures. The software operating system thus reveals, again, its weaknesses. To develop compilers across different systems, one has to contend not only with the actual system specifics, but with the various operating systems each system supports.
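
Here is a minimal sketch of what that contention looks like in practice; nothing exotic, just putting a program to sleep for one second. The language is the same everywhere, but the moment the program touches the OS, conditional compilation creeps in:

#include <stdio.h>

#ifdef _WIN32
#include <windows.h>   /* Sleep(), takes milliseconds */
#else
#include <unistd.h>    /* sleep(), takes seconds (POSIX) */
#endif

int main(void)
{
    puts("waiting one second...");

    /* Same language, same intent, two operating system APIs. */
#ifdef _WIN32
    Sleep(1000);
#else
    sleep(1);
#endif

    puts("done.");
    return 0;
}

Multiply that by every file, socket, thread and window a real program needs, and you have the porting cost I was describing above.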
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
marfig said:
That would have resulted in the company having to support and maintain two GUI versions, which Microsoft clearly didn't want to do.

This is a horrible excuse on Microsoft's behalf, though. The desktop GUI of the OS has had similar functionality dating back to Vista, so Microsoft could have retained the desktop experience from 7 1:1 and no one would have minded. Plus, the company -did- infuse the desktop portion with a bunch of aesthetic and functionality updates. The only thing missing from this equation is the Start menu.

marfig said:
What's worse, it fundamentally changes the way we work with computers. Why on earth do that?

You and madstork91 in a debate would be fun to see. He's always looking for the next big thing in computing, though it normally focuses on hardware, not software. His view is that we don't KNOW that better solutions are possible, and the reason we don't accept those that come about is because we're too stuck in our ways, our thinking. Essentially, forcing ourselves to move over might help us.

marfig said:
For some reason, somewhere around the late 90s, we -- the users -- started to take an almost religious attitude towards the software industry. This has been leading the industry and its market into a truly dystopian vision of the world. We exercise our personal preferences as if they had to be defended and fought for. The software industry has become one gigantic vanity fair.

I think that mostly boils down to the fact that we all use these operating systems a LOT, literally hours and hours each day. So to become tied to it on a greater level is no surprise. After a while it's going to sink into your brain and stay there, and even minor change might be a little annoying.

It's the same thing with companies though. Look how many people vehemently defend Apple, or even NVIDIA, AMD and so forth. People sure do become passionate about the products they use. It's almost silly when you think about it.

marfig said:
I don't necessarily wish for Windows 8 to fail for the sake of failing. I want it to fail in a clear context, where the failure can be traced to a defective (PC-wise) UI that should never have existed. Unfortunately, of one thing we can be sure: that won't happen.

It won't happen, because people will take it up the ass. I'm guilty of this also. I don't like the Start screen, but I'm willing to accept it because, at the end of the day, it really doesn't bother me. I've gotten used to it enough that I can launch apps from it super-fast, so it's quickly becoming something I don't even notice anymore. At this point I think I'd be fine with either solution, because both allow me to open apps fast.

So, I guess you can consider me one of the people that hated the change, but just took it anyway. There are other things that stand out as being more of a problem than the Start screen itself, and those mostly have to do with shutting down and accessing things like the Control Panel. Both of those things I could do in Windows 7 five times faster than I can in Windows 8. And both are a direct result of the tablet focus.

What's REALLY mind-blowing to me is that the server solutions feature the exact same implementations. I'm just in awe over that. I could see it maybe where someone is walking around with a tablet, and using that sort of interface, but not the server itself. Sheesh. It's a SERVER.

IT'S A SERVER!!!

I agree with most of what you say in your "No to Operating Systems" section, although I still can't picture not using an OS at all ;-) I agree also on the cross-platform aspect, but soon enough we'll all be rocking OSes that are more similar to mobile platforms so perhaps that'll make things easier. People will start developing using languages that can be easily ported. There already exist tools that can export to multiple different platforms (look at Game Studio for example).

Sigh.
 

marfig

No ROM battery
Rob Williams said:
I agree with most of what you say in your "No to Operating Systems" section, although I still can't picture not using an OS at all ;-)

You would still be using an operating system. Just not a software one. The operating system would (I maintain that in the future it will) reside in the hardware as an integral part of the computer architecture, clearly defined by standards making sure no two chip developers provide fundamentally different OSes.

Of course, I don't believe this will happen within the current computer architecture iteration. Our current model concerns itself exclusively with the holy trinity (device-bus-processor). Later computer architecture models can and eventually will incorporate the OS layer. Human nature, and technology in general, is friendly towards standardization.

Rob Williams said:
I agree also on the cross-platform aspect, but soon enough we'll all be rocking OSes that are more similar to mobile platforms so perhaps that'll make things easier. People will start developing using languages that can be easily ported. There already exist tools that can export to multiple different platforms (look at Game Studio for example).

You won't see many cross-development solutions that try to completely abstract away the operating system and/or the computer architecture without making serious concessions to memory consumption or execution speed, or both. In fact, you won't see any.

Fortunately, our computers keep getting faster, so we don't experience these limitations as clearly. But they are there. And for "real" development, where minute concessions accumulate in great numbers, solutions like these are always limited in scope. That's the reason you won't see many complex games like BF3 or Skyrim being developed in XNA Game Studio. Ever.
 