Microsoft Announces Windows 8 Editions

Rob Williams

Editor-in-Chief
Staff member
Moderator
Microsoft this week announced four editions of Windows 8 that we can expect to see at launch. These include the standard "Windows 8", "Windows 8 Pro", "Windows 8 Enterprise" and "Windows 8 RT" - the latter of which is designed to come pre-installed, and only on ARM devices. There's no mention of Windows 8 Starter, but if this launch matches Vista's and 7's, it'll become available soon after launch (but not for off-the-shelf consumption).


Read the rest of our post and then discuss it here!
 

OriginalJoeCool

Tech Monkey
When's Windows 8 being released?

* checks *

October 2012.

I wonder if this release will be a success?
 

Kougar

Techgage Staff
Staff member
Glad to hear someone finally had the impetus to go back to Home & Pro versions. Now if they had simply done away with 32bit like they should've...
 

OriginalJoeCool

Tech Monkey
We'll still need 32-bit for quite a while ... not sure quite how long. I saw a rumour somewhere Microsoft was working on 128-bit. :D
 

marfig

No ROM battery
Naturally we are closer today than we ever were. But it might take some 10 more years for the complete displacement of 32-bit software. I wouldn't know. No one knows, in fact. We will only know when we suddenly realize it has already happened.

It took "just" 10 years to displace 16-bit platforms: between 1985 and 1995, if we count from the first commercially successful 32-bit processors (Motorola's 68020, which dominated UNIX systems, and the first ARM, which dominated embedded systems) to the first commercially viable mainstream 32-bit system, Windows 95.

And "just" carries a double meaning here. On one hand we can consider that a long time. 16-bit systems were still young when 32-bit systems first came up, and the computer market was much smaller than it is today, so one could expect the transition to have taken less time than it actually did. On the other hand we can consider it a short time, looking at how long the 64-bit displacement of 32-bit platforms and software is apparently going to take.

The fact, however, is that the first consumer-grade 64-bit processor appeared only in 2003, and the industry only reacted the following year with a proper 64-bit instruction model. So 2004 it is. That was just 8 years ago. And we have far more obstacles to adoption than we had back when the 32-bit architecture made its stand.

Let's see:

One of the biggest mistakes people make, in my opinion (and one that leads to the general misconception that displacing 32-bit should be a simple affair), is looking at 64-bit as backwards compatible with 32-bit and therefore a no-brainer. Why would this ever be a problem? Well, because that's not the whole picture. Far from it.

Many things conspired towards the relatively fast adoption of the 32-bit architecture and the displacement of 16-bit systems (which are still in use in places today, but that's another matter).

  • For one, the computer market was smaller back in 1985. Today it is vast!
  • The development model imposed by the new architecture actually simplified software development tremendously. This led to many software developers being thankful to finally be rid of the whole mess that was memory management under 16-bit systems.
  • 32-bit offered a sorely needed larger address space at the time. Conversely, 64-bit still only offers an advantage in limited scenarios, none of which have much to do with mainstream computer usage.
  • 16-bit systems were still relatively young. 64-bit systems, by contrast, were introduced into an already fully established 32-bit market with millions of systems and applications already developed and a massive industry with millions of developers and tens of thousands of development companies.

The fact that the 64-bit architecture is backwards compatible, as the 32-bit one was, doesn't really constitute much of an advantage in terms of adoption. It simply guarantees that old software can run on new machines, not that new software will be developed for the new platform.

And that really is the problem with the 32-bit displacement. Software developers aren't, by and large, developing for the 64-bit platform. Most software being developed is still 32-bit, and only occasionally do we get 64-bit versions of that same software tagging along. Very few, extremely few, develop titles exclusively for the 64-bit platform.

There are profound and important reasons for this, which I'll be glad to discuss if anyone shows an interest. But at large, both hardware manufacturers and operating system developers have done their part in adopting 64-bit systems. The real problem here is the software development companies and authors that are holding back full displacement of 32-bit systems and applications, and that keep forcing both hardware and operating system suppliers to take their hand off the 32-bit nuke button.

I foresee a few more years of 32-bit software dominance. I'd risk even something like 10 more years until the slow pace of 64-bit software development gains enough weight to force a shift in the industry. As for users, with due exceptions for specialized software or specialized uses of the computer, we really couldn't care less whether 64-bit systems take another 10 years to force themselves on us. The fact of the matter is that most of us don't need the added address space, and most mainstream commercial software is still a long way from ever taking advantage of it.
 

Kougar

Techgage Staff
Staff member
marfig said:
You probably said that just to get a reaction. You can't be serious.

Actually, I fully am serious!

The average Joe isn't capable of realizing they need to check which version of the OS they're getting when they buy a computer. Many still try to upgrade their RAM past 4GB when the OS can't even use the 4GB they have. I don't see the need for the confusion to be perpetuated for another five years. On top of that there are plenty of security reasons why consumers should be using 64bit OS's now.

Games are notably bad... in the last six years I've played a dozen various games that would exceed the 2GB RAM allocation barrier and crash because of it. Such is life when playing at XHD or especially QHD resolutions with graphics details on max. If the game hadn't been written as 32bit it would never even have been an issue.

Games like Supreme Commander, Civilization V, and others have been hitting the RAM limit (and exceeding it) for many years. Civ V was actually UNPLAYABLE for me when it first launched unless I played at lower resolutions and detail settings for this reason. Because at 1920x1200 on max details, by mid game there would be enough units and AI usage to exceed the RAM allocation and crash the game every single time. I could watch it happen in Task Manager too. There were lots of end user mods going on to the SupCom exe, such as enabling PAE, but the game itself was hard-coded for that 2GB limit regardless. As for Civ V, it took many patches and a great deal of back and forth with their tech people before the game was patched well enough to not exceed the artificial RAM limitations imposed on it. That I had to endure all of this despite my own system having 6-12GB of RAM available is just ludicrous.

Even Minecraft, as basic a game as that is, has this problem. Unless users install 64-bit Java they cannot use the "Far" distance view without exceeding allocatable RAM limits and crashing the game. And we're talking about Minecraft, of all things!


From another angle... what would happen if a person bought a 4GB GTX 680 card for a 32bit OS? Did you know manufacturers sell $80 GT 430 cards with 4GB of RAM on them too? Newegg always has a few available, so do many B&M stores. I'm not even sure what would happen really, I'd guess the system would only recognize between 512-1GB of the GPU memory?

Most GPUs come with 1.5, 2, or 3GB of onboard memory, and 32bit Windows is limited to 3.15GB as it is. Combine that with the fact that almost all desktops and laptops ship with a base 4GB of RAM by default, and 32bit OS's 3.15GB limitation is just beyond the absurd.

I'm not a software engineer nor can I claim to know anything about it. But I can't see the problem here? As you've said 32bit software does work on 64bit machines. Furthermore, for backward compatibility anyone can fire up a 32bit virtual machine inside a 64bit OS. Between both options I think that pretty much covers things?

Hardware support is finally here. Except for a few Atom SKUs, all processors are finally 64bit capable. The 16-bit BIOS is finally all but phased out, replaced by 64bit UEFI. From a hardware standpoint, things are ready. The sooner Microsoft dumps 32bit Windows, the sooner software developers will begin writing their code so it can actually make use of system resources that've been going unused for half a decade, often to the detriment of users who don't truly understand why their program or game is constantly breaking or performing so poorly.
 

marfig

No ROM battery
Hey Kougar. Was hoping you'd chime in. My question was as much a provocation as a real desire for a debate on this matter. I think we are at a very important turning point in computer architecture, development practices and usage patterns, and very little is said about it. However, I'm abroad and roaming on a mobile network. Costs are high, and even though it's my employer who's paying, I'll try to keep this short.

Kougar said:
Actually, I fully am serious!

In a day and age when we ask OS developers to broaden their architecture reach, you can't be serious. Not when all you can come up with is your desire to play games at full settings. You and the other 23 people in the world using 1920x1200 monitors.

We want Windows on ARM, Linux on PPC, Mac OSX on Intel. And you want to do the exact opposite? Really? Might as well complain to Intel and AMD and tell them to stop doing 32bit processors. You'd get me there and I'd have a harder time arguing with you. But at the OS level? No way.

Kougar, it is not up to the OS developer or hardware manufacturer to dictate when trends stop. They create them; they don't (or shouldn't) pull the rug from under our feet until we are ready. The implications of Microsoft or Linux simply ceasing to supply 32-bit versions would be devastating in the current market. Only two "people" define when the market is ready: consumers and software developers. And currently we have hope, since the push towards 64-bit is starting to become noticeable in both fields. But there's still a whole lot to do.

There's a complicated triangle of interests you need to manage in order for a successful transition to take place. You have developers that don't want to just start doing 64bit versions of their applications because they would not be selling to the 32bit user base. You have OS developers that don't want to stop supporting 32 bit processors because 32bit hardware is still being made. You have hardware manufacturers that don't want to stop making 32bit processors because there's still concrete demand for them, as continuous software development proves to them on a daily basis. The road to transition is built by pushing hardware manufacturers and OS developers to incrementally reduce 32bit support.

But how can you do that when software developers still largely ignore 64 bit programming? Let me say again, 64 bit backwards compatibility only ensures it can run 32 bit programs, it does not guarantee people will develop 64 bit applications. And this is what needs to change. Not some OS developer suddenly deciding to pull the plug on what's still more than half of their user base! Be patient.

Kougar said:
Even Minecraft, as basic a game as that is, has this problem. Unless users install 64-bit Java they cannot use the "Far" distance view without exceeding allocatable RAM limits and crashing the game. And we're talking about Minecraft, of all things!

Minecraft has that problem because of its development strategies. Not because 3 GB can't handle it. In other words Minecraft is bad code. Give me a break.

That said, I fully appreciate your point. And I agree. We are reaching the limits of 32bit computing in the computer gaming industry. Not because of poorly executed Java games that try to go well beyond the programming language's inherent limitations, but because of what other games like Civ V, Skyrim, and many other top AAA titles have been demonstrating.

But you think the gaming market alone should dictate what computers we will be using from now on? There's this tendency we have to build bubbles around our interests and situate ourselves inside them, pretending those are the borders of our world. But the fact of the matter is that, for good or bad, the gaming market is still just a fraction of the interests concerning the computing world. In other words, gaming alone won't get you there. And you want more evidence than the fact it hasn't yet? The computing market is still largely dominated by businesses. These are the largest customers. A typical household in a western country has 1 or 2 computers. But mom and dad each use 1 computer at work, Jimmy can access 20 computers in the school library, and little Sandy was sick last week and at the hospital where she was checked there were 200 computers, some of them interfacing with 32bit and 16bit medical devices and machinery.

You want ubiquitous 64 bit computing? You'll have to deal with this market too. It won't happen just because you were happy to spend 300 USD and up on a large monitor for your gaming.

And speaking of which...

You'll also want to convince hardware assemblers to stop considering 4GB as the default RAM on their middle range computers. That's not how you are going to sell 64 bit machines and expect anything useful to come out of it. We are currently having the exact same problem we had a few years ago when 2GB was considered the default RAM on a 32bit computer. It wasn't enough. Your game will almost always run better as a 32-bit executable built with Large Address Aware enabled than as a 64-bit executable on a machine with 4 GB of RAM. And for a simple reason: 64-bit executables use memory address pointers twice the size of those in 32-bit executables. Because this is volatile (dynamic) memory it doesn't mean the executable uses twice the memory, but it is indeed heavier on RAM than its 32-bit sibling. A 32-bit executable built with LAA will make use of all 4GB of RAM and consume less of it.
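To make the pointer-size point concrete, here's a tiny sketch of my own (not taken from any game, and the sizes are just what a typical compiler produces). The very same structure grows when compiled as 64-bit, purely because its pointers double in size; pointer-heavy data structures therefore eat more RAM for the same number of elements. (Enabling LAA itself is a linker option, /LARGEADDRESSAWARE with the Microsoft toolchain, not something done in code.)

```cpp
// Minimal sketch: the same node gets bigger in a 64-bit build because its
// pointers double in size, so pointer-heavy structures need more RAM for
// the same number of elements.
#include <cstdint>
#include <iostream>

struct Node {
    std::int32_t value;  // same size on both architectures
    Node* next;          // 4 bytes in a 32-bit build, 8 bytes in a 64-bit one
    Node* prev;
};

int main() {
    std::cout << "sizeof(void*) = " << sizeof(void*) << " bytes\n";
    std::cout << "sizeof(Node)  = " << sizeof(Node) << " bytes\n";
    // Typically 12 bytes in a 32-bit build, 24 in a 64-bit build (padding included).
    return 0;
}
```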

Kougar said:
From another angle... what would happen if a person bought a 4GB GTX 680 card for a 32bit OS? Did you know manufacturers sell $80 GT 430 cards with 4GB of RAM on them too? Newegg always has a few available, so do many B&M stores. I'm not even sure what would happen really, I'd guess the system would only recognize between 512-1GB of the GPU memory?

Most GPUs come with 1.5, 2, or 3GB of onboard memory, and 32bit Windows is limited to 3.15GB as it is. Combine that with the fact that almost all desktops and laptops ship with a base 4GB of RAM by default, and 32bit OS's 3.15GB limitation is just beyond the absurd.

Video memory is on a different address space. A 32 bit machine can make use of roughly 3.5 GB of GPU RAM, while still maintaining its ~3.5 GB of on-board RAM.

But yes, no 32 bit machine can make use of >=4GB video cards.

But not all is roses. And I hope you do realize the implications of large resolution gaming and high memory capacity video cards. I'm not saying these aren't desirable. I'll always vote yes for moar in graphics. But don't expect it to come without any pain. Your immediate biggest problem is the current compression algorithms and the fact that much higher resolution textures would become very tempting (and actually desirable). You could expect double (or more) the size of current textures. And because game assets make up the bulk of a game's size on disk, you can safely expect games to start shipping as 16, 20, 30 GB downloads. All for you to download on Steam in the comfort of your home, whether or not you are on a bandwidth-limited plan with your ISP, whether or not you have a fast internet connection.

Kougar said:
Hardware support is finally here. Except for a few Atom SKUs, all processors are finally 64bit capable. The 16-bit BIOS is finally all but phased out, replaced by 64bit UEFI. From a hardware standpoint, things are ready. The sooner Microsoft dumps 32bit Windows, the sooner software developers will begin writing their code so it can actually make use of system resources that've been going unused for half a decade, often to the detriment of users who don't truly understand why their program or game is constantly breaking or performing so poorly.

You are giving Microsoft more credit than it deserves, as if it were on its shoulders whether the world switches to 64 bit computing overnight. That time passed long ago, in 1995. The market was younger, much smaller, and 32 bit computing really introduced serious advantages at a time they were needed the most (I hope you read my previous post). It's not up to Microsoft anymore. Its hands are as tied as yours. And I could probably wager their frustration doubles yours. It's them who have to code, maintain and support two versions of an operating system for the exact same processor. You think they like it!?
 

Kougar

Techgage Staff
Staff member
marfig said:
In a day and age when we ask OS developers to broaden their architecture reach, you can't be serious. Not when all you can come up with is your desire to play games at full settings. You and the other 23 people in the world using 1920x1200 monitors.

In 2006 that would've been true, but today? 1920x1080 is one of the most common resolutions on 20-24" displays. I am no longer a small subset of gamers. Even cheap 21" sets come with it. As for such niche gamers you reference, they have already upgraded to cheap 27" QHD sets that use 2560x1440. And let's not ignore every single 30" monitor owner out there either.

marfig said:
We want Windows on ARM, Linux on PPC, Mac OSX on Intel. And you want to do the exact opposite? Really? Might as well complain to Intel and AMD and tell them to stop doing 32bit processors. You'd get me there and I'd have a harder time arguing with you. But at the OS level? No way.

ARM is a good point to bring up. ARM is NOT 32bit or 64bit because it is not even x86 capable. So building a 64bit OS has no effect on this whatsoever, because ARM chips can't use it regardless. Secondly, by your own argument, if this is such a high workload to begin with, then why not lighten it by moving coders away from 32bit OS development entirely, so they can focus on ARM or 64bit deployment? As for the CPU vendors, ignoring older Atom models, both already stopped making 32bit-only processors. Even VIA's Nano CPU is a 64bit processor these days. So there's no need to ask, they already did it.


marfig said:
There's a complicated triangle of interests you need to manage in order for a successful transition to take place. You have developers that don't want to just start doing 64bit versions of their applications because they would not be selling to the 32bit user base. You have OS developers that don't want to stop supporting 32 bit processors because 32bit hardware is still being made. You have hardware manufacturers that don't want to stop making 32bit processors because there's still concrete demand for them, as continuous software development proves to them on a daily basis. The road to transition is built by pushing hardware manufacturers and OS developers to incrementally reduce 32bit support.

I fully agree there's a complicated overlap of interests. But there is nothing stopping vendors from keeping their code 32bit based, even on 64bit systems. That's what most people have actually done already! If Microsoft released only a 64bit OS it wouldn't force them to write 64bit code; they could delay several more years until most of their legacy customers have upgraded.

As for the hardware you speak of, I've already covered this point. AMD doesn't make 32bit anything anymore, even VIA's Nano processor family is 64bit. And old Atom SKUs aside, Intel doesn't make them either. So the hardware is no longer there and has moved on. Find some 32bit chips currently being made and I'll reconsider, but it's been ages since Pentium M's and Semprons were made.

marfig said:
But how can you do that when software developers still largely ignore 64 bit programming? Let me say again, 64 bit backwards compatibility only ensures it can run 32 bit programs, it does not guarantee people will develop 64 bit applications. And this is what needs to change.

That's where we disagree. Again, they can code for 32bit as long as they want on a 64bit OS; that's each vendor's own prerogative, as each one must serve its own customers. What I am trying to get at is that other vendors that COULD make the switch will not until an outside influence brings about the change. In my opinion, until Microsoft stops releasing a 32bit OS, that influence for change will not happen.

marfig said:
Minecraft has that problem because of its development strategies. Not because 3 GB can't handle it. In other words Minecraft is bad code. Give me a break.

Nope. 32bit Java will run out of allocatable memory. When it happens, the screen will go grey. Running a 64bit Java install will fix this. Furthermore, the game tries to be aggressive about paging stuff out of memory to keep from overrunning this limit, which greatly impacts game performance due to all the memory loading/unloading constantly going on. Most games suffer this problem because they must tightly cap their memory usage. In a world of HD textures, this is a huge problem.

marfig said:
But you think the gaming market alone should dictate what computers we will be using from now on? There's this tendency we have to build bubbles around our interests and situate ourselves inside them, pretending those are the borders of our world. But the fact of the matter is that, for good or bad, the gaming market is still just a fraction of the interests concerning the computing world. In other words, gaming alone won't get you there.

Ah, but I never made any claims about the gaming market being the sole dictator of this. It's just the easiest example to pick from, and one that I can personally relate to, as can any other XHD and QHD gamer out there.

So, to pick an example outside the gaming market... just look at the iPad's Retina display. Intel forecasts such high DPI displays to reach the computing sector starting next year... a la this slide. No offense, but when desktop programs begin working at this resolution, memory requirements will explode. In some cases the memory footprint required for displaying the program will outright double or even triple depending on the resolutions involved. If a 21" desktop is supposed to have a "3840x2160" resolution, then 32bit software is going to be a huge problem. Most games won't scale that high without breaching the limit. And 32bit Photoshop will likely break when users begin exhausting its memory limits while working with native-resolution files!

marfig said:
You want ubiquitous 64 bit computing? You'll have to deal with this market too. It won't happen just because you were happy to spend 300 USD and up on a large monitor for your gaming.

Last I checked, 1920x1080 monitors can be had for $90 in a store, less if on sale. Any 21-24" display these days uses it.

marfig said:
You'll also want to convince hardware assemblers to stop considering 4GB as the default RAM on their middle range computers.

It's already happening. 8GB is standard on many systems already, or offered as a "free" upgrade. Checking right now, the cheapest Dell laptop ships with a single 4GB SODIMM, allowing users to buy a second module for a quick and easy upgrade. Higher end models already start at 8GB. Given that a person can buy 16GB of 1600MHz, low voltage 1.35v, DDR3 RAM for $82 without a rebate or sale discount, I think we're already past the inflection point you're discussing.

marfig said:
Your immediate biggest problem is the current compression algorithms and the fact that much higher resolution textures would become very tempting (and actually desirable). You could expect double (or more) the size of current textures. And because game assets make up the bulk of a game's size on disk, you can safely expect games to start shipping as 16, 20, 30 GB downloads. All for you to download on Steam in the comfort of your home, whether or not you are on a bandwidth-limited plan with your ISP, whether or not you have a fast internet connection.

This may be why Max Payne 3 will ship with a 35GB installation footprint. There are too many games using 15-20GB of space to even bother listing them, and they have been doing so since 2009. Many are indeed hosted on Steam. Which is an excellent point to bring up with any ISP that contemplates 5GB monthly download caps.

marfig said:
It's them who have to code, maintain and support two versions of an operating system for the exact same processor. You think they like it!?

That's a very interesting point. I'm sure they don't, and they've indicated it requires diverting resources away from other development to do so, which in itself is obvious enough.

If nothing else, even if they couldn't ditch 32bit with Windows 8 (as some industry experts indicated they could very well do), they could at least have made the 32bit version available only as a special SKU, while all store units and mainstream systems shipped 64bit only. Businesses that wanted to stick to 32bit could order it as an option, but by default consumers would be getting 64bit. It would also serve a second purpose: signaling that they plan to finally move away from 32bit with Windows 9, and making it easier for them to do so.
 

marfig

No ROM battery
Kougar said:
In 2006 that would've been true, but today? 1920x1080 is one of the most common resolutions on 20-24" displays. I am no longer a small subset of gamers. Even cheap 21" sets come with it. As for such niche gamers you reference, they have already upgraded to cheap 27" QHD sets that use 2560x1440. And let's not ignore every single 30" monitor owner out there either.

Of course 1080p is a small number; it only represents half of the monitors on Steam. But again, you are looking at the gaming sub-market, not at the whole universe that comprises Windows customers.

Kougar said:
ARM is a good point to bring up. ARM is NOT 32bit or 64bit because it is not even x86 capable. So building a 64bit OS has no effect on this whatsoever, because ARM chips can't use it regardless.

The instruction set architecture of a processor is integral to its design. Every processor carries an instruction set architecture that defines its supported memory address space and integer size. ARM is a 32 bit processor.

Some processors are hybrid processors. The still ubiquitous Z80 is an 8 bit processor capable of addressing 16 bit memory addresses and performing 16 bit arithmetic thanks to register coupling.

ARM only announced a 64 bit version of its instruction set last year. It hasn't been implemented in silicon yet. Windows is going to support ARM processors starting with Windows 8. That's a 32 bit Windows OS, whether you like it or not. If and when 64 bit ARM is implemented, 64 bit Windows versions for those new 64 bit ARM processors will necessarily be developed.

Also, my point was that reducing support for processor architectures is really not what we should want from an OS vendor. Ideally, we would want them to support every single processor out there. That's what a good operating system should do, and that's why there has historically been consumer demand for vendors to broaden the scope of their support.

Kougar said:
As for the CPU vendors, ignoring older Atom models, both already stopped making 32bit-only processors.

No, they haven't. The newly released Ivy Bridge still supports 32 bit hardware operation mode. Same with the Bulldozer architecture. This is called the legacy mode, as opposed to the 64 bit operation mode, dubbed long mode.

They have to. This is what guarantees that vendors can offer 32 bit backward compatibility on the 64 bit versions of their operating systems without having to do complex data marshaling which would impact greatly on 32 bit software performance.

Kougar said:
I fully agree there's a complicated overlap of interests. But there is nothing stopping vendors from keeping their code 32bit based, even on 64bit systems. That's what most people have actually done already! If Microsoft released only a 64bit OS it wouldn't force them to write 64bit code; they could delay several more years until most of their legacy customers have upgraded.

This is unfortunately only one side of the story. As described above processor manufacturers will need to keep 32 bit support for quite a while still. All the way until 32 bit development is fully displaced on desktop computers. By telling developers they can keep their 32 bit software counterparts, you are in fact extending the period of this support. Exactly what you don't want to happen.

So what needs to happen is to convince them to migrate their software to 64 bit versions, and to offer 64 bit versions only. And that is where we are at this point in history.

Why are developers being slow in their acceptance of 64 bit computing? Apart from the need they feel to supply the still prominent 32 bit consumer base, there are also very real programming considerations when moving to 64 bit development. I'll be making a post about it in this thread tomorrow. Suffice to say for now that moving to 64 bit development can be very difficult for some software developers at this point and can incur great costs.

Kougar said:
Find some 32bit chips currently being made and I'll reconsider, but it's been ages since Pentium M's and Semprons were made.

Other than what I said above, also remember that the 64 bit instruction set removed 16 bit support. 16 bit support is given only through the legacy mode I talked about above. This is yet another reason why 64 bit processors for the desktop computer will need to keep supporting the 32 bit operation mode for quite a while: the 32 bit instruction set is the one that carried 16 bit support.

Why is this important? Because a vast number of peripherals in the industry still use 16 bit chips and understand only 16 bit operations. From medical devices to home security systems and nuclear plants.

Kougar said:
Nope. 32bit Java will run out of allocatable memory. When it happens, the screen will go grey. Running a 64bit Java install will fix this.

And yet, why do you think much more complex games than Minecraft can allow players to see further away, or have more elements on screen, without failing miserably? Let me put it this way: if Minecraft had been developed in C++ and not in Java through a memory-hogging graphics library (LWJGL), its memory problems would not show up so soon.

Let's be clear, Java (and LWJGL if you want) are great tools. This isn't an attempt of mine at a jab. Quite the contrary, I respect the programming language. But Minecraft pushes Java to its limits and reveals the limitations of this programming language. Other programming languages are more suitable for what Minecraft intends to be. It's only because the programmer chose Java that you have this problem. Not because 32 bit isn't enough to run a game like Minecraft. It's also bad code from the developer, for the reason I explain below.

I'll give you a better example: Skyrim or Fallout 3 (including New Vegas, of course). They both reveal the problem you are talking about. In order not to go over the memory allocation limits of a 32 bit application (2 GB), these games can't render detailed textures further away, and even remove elements from the background. For instance, Fallout removes trees, houses, and other elements that are far away and only renders them as the player approaches.

Those I believe are better examples because they employ a programming language that can make the most of the available memory allocation. Also the programmers don't do the stupid thing of showing you a grey or black scene because they ran out of memory to put things on the screen.
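Just to sketch the idea of what those games are doing (a toy example of my own, nothing to do with how Bethesda actually implements it), the trick is simply to keep detailed assets resident only for objects near the player, so the total working set stays under the 2 GB cap:

```cpp
// Toy sketch of distance-based streaming: only objects near the player keep
// their detailed assets resident, one way a 32-bit game stays under its cap.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct WorldObject {
    Vec3 position;
    bool detailLoaded = false;  // stands in for hi-res meshes/textures
};

static float distanceBetween(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Load detail for nearby objects, drop it for distant ones.
void streamDetail(std::vector<WorldObject>& world, const Vec3& player,
                  float radius) {
    for (WorldObject& obj : world) {
        const bool isNear = distanceBetween(obj.position, player) <= radius;
        if (isNear && !obj.detailLoaded) {
            obj.detailLoaded = true;    // would load hi-res assets here
        } else if (!isNear && obj.detailLoaded) {
            obj.detailLoaded = false;   // would free them to reclaim memory
        }
    }
}

int main() {
    std::vector<WorldObject> world(3);
    world[0].position = {10, 0, 0};
    world[1].position = {500, 0, 0};
    world[2].position = {40, 0, 0};
    streamDetail(world, {0, 0, 0}, 100.0f);  // only objects within 100 units
    return 0;
}
```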
 

marfig

No ROM battery
So, as promised, here's a more detailed explanation of the complications of 64 bit software development, along with a conclusion at the end wrapping up my thoughts on the matter. I hope I don't end up making too long a post that becomes boring to read.

Key concepts
Here I'm introducing some key thoughts to keep in mind, while you read what's below:

  • 64 bit operating systems can run 32 bit applications. But 64 bit applications cannot run on 32 bit operating systems under any circumstances, except one (1). To be clear, if I develop a 64 bit only application, it cannot be installed by users who are still running 32 bit operating systems (be it Windows, Linux, or Mac OS X, to name the most prevalent).
  • WOW64, the 32 bit emulation layer on 64 bit Windows, cannot execute 32 bit device drivers or run certain 32 bit applications that try to access pure 64 bit processes. This means, for instance, that 32 bit add-ons for Internet Explorer, Office, Windows Explorer, and any other software that exposes a 64 bit plugin or extension architecture won't work. What this means is that 32 bit support on 64 bit Windows is actually limited (a small way of seeing WOW64 from code is sketched right after this list).
  • All major operating system vendors still offer 32 bit versions: Microsoft, Apple and the Linux kernel community. One of them removing support for their 32 bit OS branch may, or may not, have an impact on users' decisions to move to another operating system. Also, each of them in their own different ways still faces the occasional 64 bit support issue that doesn't exist under the equivalent 32 bit version. This may or may not act as a deterrent to some users or decision makers working in mission-critical environments.
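As an aside, and purely as a sketch of my own (assuming Windows and any C++ compiler that ships the Win32 headers), a 32 bit program can actually ask the system whether it is running under WOW64 rather than on a true 32 bit Windows; the Win32 call IsWow64Process exists for exactly that:

```cpp
// Minimal sketch: a 32-bit build asking whether it runs under WOW64
// (i.e. as a 32-bit process hosted on 64-bit Windows).
#include <windows.h>
#include <iostream>

int main() {
    BOOL underWow64 = FALSE;
    // Reports TRUE only for a 32-bit process running on 64-bit Windows.
    if (IsWow64Process(GetCurrentProcess(), &underWow64)) {
        std::cout << (underWow64 ? "32-bit process on 64-bit Windows (WOW64)\n"
                                 : "native process\n");
    }
    return 0;
}
```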

Introduction
This post concentrates only on the complications of 64 bit programming and on the effect it has been having on the software development "community", from the hobbyist at home to the major software development company.

Most of this is written from my own experience and my observations of the industry. As such, most of this is academic ramblings at best, flat out wrong at worst.

Why 64 bit development?
Let us turn the relevant question upside down. Instead of asking ourselves why we are still doing 32 bit applications, I'll ask what's advantageous about 64 bit development.

We know the advantage of 64 bit platforms: we can have more memory. That's a pretty big deal in these days when the complexity of what can be done on a computer means larger programs demanding access to larger portions of our memory. 64 bit computers are today pretty much a requirement to run virtual machines, to host servers, or to satisfy power users who need a hefty number of heavy-duty applications running at the same time on their computers.

Conversely, the question of whether 64 bit applications run faster than their 32 bit counterparts is purely circumstantial and has no bearing on a discussion that we want factual. Some applications will work faster (applications making heavy use of the processor registers), others will see no meaningful effect (applications which don't). Some will work slower if the developer didn't do their job correctly. Which, unfortunately, many don't. So for all purposes, the address space of 64 bit computers is the real and palpable advantage here. The rest is too debatable and incidental to be of any use.

Does this necessarily translate to a desire in developing for the 64 bit architecture? It depends.

As far as software developers are concerned, the 64 bit platform offers them two advantages over the 32 bit one. They can work on an extended allocated memory space for their applications. And they can work on larger in-memory data sets.

This is the rundown of application memory allocation limits on both architectures, in Windows:

32 bit
  • Static memory: 2GB
  • Dynamic memory: 2GB
  • Stack memory: 1 GB

64 bit
  • Static memory: 2GB
  • Dynamic memory: 8TB
  • Stack memory: 1 GB

It's interesting to note that only dynamic memory benefits. This is by design of Windows itself. Maybe one day Microsoft will introduce a new PE (Portable Executable) format that implements 64 bit fields. But until then, no dice. And because that format would be incompatible with 32 bit executables, 32 bit backwards compatibility on 64 bit Windows as it exists today would be more difficult.

In any case this is a tremendous advantage to programs that either require large amounts of memory or that wish to work with data sets larger than 2GB.
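Just to illustrate the dynamic memory difference with a sketch of my own (the 3 GiB figure is arbitrary): a single allocation above the 2 GB mark is simply impossible for a 32 bit build, while a 64 bit build takes it in stride as long as the machine has the RAM and pagefile to back it.

```cpp
// Minimal sketch: try to allocate a single 3 GiB buffer. A 32-bit build will
// always land in the bad_alloc branch (a default 32-bit process only has 2 GB
// of user address space); a 64-bit build succeeds if enough RAM/pagefile exists.
#include <cstddef>
#include <iostream>
#include <memory>
#include <new>

int main() {
    const std::size_t threeGiB = 3ull * 1024 * 1024 * 1024;
    try {
        std::unique_ptr<char[]> buffer(new char[threeGiB]);
        buffer[0] = 1;                  // touch both ends so the allocation is real
        buffer[threeGiB - 1] = 1;
        std::cout << "3 GiB allocation succeeded\n";
    } catch (const std::bad_alloc&) {
        std::cout << "3 GiB allocation failed (expected on a 32-bit build)\n";
    }
    return 0;
}
```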

This is the real advantage of 64 bit development under Windows. One that is crucial for the evolution of certain industries like gaming (as discussed in this thread), but also for some scientific software, server software (including, obviously, databases), virtual machines, which have been experiencing a boom in recent years, video and imaging software, and even certain key business administrative tasks. For instance, it's not unheard of for businesses to need to work with very large Excel sheets, above the 2 GB boundary.

So, why not 64 bit development?
Well, if you don't require more than 2GB of allocatable memory in your application, you don't really need to develop for the 64 bit architecture. What's worse, doing so can be a problem under certain circumstances, adding a new set of concerns to an already long list of programming challenges.

The reality of it all is that 64 bit programming simply translates to access to more memory. It isn't a fundamental architectural event like the 32 bit revolution was. It finds its proponents in those areas that can take real advantage of it and that depend on it for their evolution. Every other programmer can go pretty much "meh!" on it without fear (for now, at least).

This has the disadvantage that, contrary to most of the history of 32 bit adoption, 64 bit has fragmented the developer community between those that need to develop for it and those that don't. Those that don't usually delay their adoption and keep doing 32 bit development, unknowingly extending the period of support that hardware and operating system vendors need to give to the 32 bit platform.

The question is still unanswered though. Why not just do the bloody program as a 64 bit executable?

  • 1st problem: Porting software can be a lot of work.

    For years developers have known that "if it works, don't change it" is a good maxim. This is a culture that ensures good, working and dependable software. But it's also the reason why so many legacy systems exist to this day. It's a two-sided stick: with one side you fend off bugs and all the problems that could arise from messing with code that just works; with the other side you are poking yourself in the gut, forcing the industry to somehow maintain, or have to deal with, legacy systems. From Internet Explorer 6 to old 16 bit systems still operating in companies, we see this culture at work in both the consumer and developer markets.

    The reason is that a program can rarely be ported to a newer architecture without any work and without failing or introducing new bugs. The work involved in dealing with the new problems is often not acceptable when the work involved in maintaining the old system is already too much. On the other hand, the system in question (here meaning the combination of hardware and software) can be a mission-critical system where one cannot afford any risks, particularly in cases where it's difficult or takes a whole lot of time to do proper testing.

    64 bit development introduced changes to the programming data models and function calling conventions that can easily introduce new bugs when porting 32 bit programs to 64 bit platforms. Bugs that can be hard to spot. Despite all the documented best practices, many developers take for granted the size of a variable in memory and perform operations (bit shifting and even simple integer arithmetic) that will fail when the program is compiled in 64 bit (see the sketch after this list). Depending on which compiler they use and how it is set up, these errors can creep in invisibly and result in a program crashing (which is good) or producing wrong results (which is disastrous and the absolute worst that can happen).

    Likewise, the changes to calling conventions in x86-64 mean that many DLLs or executables that communicate with DLLs written in other programming languages, or that have been designed to be easily ported to other systems, may require extensive changes to the source code.

  • 2nd Problem: Available input data formats and boundaries

    The x86-64 architecture poses a problem to systems that rely on existing data tailored specifically for the 20-year-old 32 bit architecture. If data exists on a system outside the developer's control (another DLL, a database, a file) that has been formatted according to the specifications of a typical 32 bit integer, or that makes use of data sets with typical 32 bit boundaries that are no longer correct under 64 bit development, more or less complex data marshaling routines have to be developed to correctly read and write this, now old, data format. Many of these can have an impact on application performance.

    Many examples of this problem exist. A common one among companies is the Jet Database Engine that powers Microsoft Access and was (and is) used extensively in Visual Basic projects. Until last year, Microsoft didn't make available a 64 bit version of this engine, and WOW64 didn't support it either. 64 bit applications couldn't access mdb files. The only way was to use third-party marshaling software installed under WOW64. Software that meant a decrease in database access performance.

  • 3rd Problem: Library (or API) availability

    This is particularly problematic for in-house software development, or for developers working with APIs or libraries that are no longer maintained, or with older versions kept for compatibility or stability reasons.

    A source library is a collection of code that gives developers manageable access to certain programming language features, or facilitates development by providing pre-written classes, routines and specifications. For the purposes of this debate, consider libraries as just a form of API if you don't want to care about the actual differences.

    The fact of the matter is that libraries and APIs with no, or limited, 64 bit compatibility still exist in great numbers. Because these represent key components of the development stack, it's simply not possible to develop a native 64 bit application accessing these libraries without once again incurring performance-expensive data marshaling.

    Too many examples exist once again. To pick one -- and to get away from Windows for a bit -- consider the Carbon API for the Macintosh. Developers who have been relying on it will need to migrate to Cocoa if they wish to develop for 64 bit, since only the non-GUI portion of the old API was carried over to 64 bit. What's worse, Cocoa is an Objective-C API, so former Carbon application developers need not only to learn the new Cocoa API, but also to learn to program in Objective-C. Want to know why Final Cut Pro X had fewer features than the previous version? Well, that's the more than likely reason, since the application was entirely re-written for the Cocoa API. We are talking about having to rewrite extensive areas of the application, if not everything, in an entirely new programming language.

    This is an extreme example of what can go wrong with the lack of library or API support, but one that illustrates well why some developers may be so resistant to the idea of moving to 64 bit development.

  • 4th Problem: Porting software on diverging data models

    64 bit development introduced a new problem for those wishing to develop software for multiple platforms (usually Windows, Linux and Mac). Until now, the most used data types had the same size on all systems. Not anymore. 64 bit Windows uses the LLP64 data model, in which both int and long remain 32 bits wide; Unix-based systems use the LP64 data model, in which long (and pointers) become 64 bits wide.

    The complexity of porting 64 bit software resides in the fact that 64 bit Windows now requires the long long data type (or a dedicated pointer-sized type such as intptr_t) to store pointers and other data entities that represent 64 bit values, whereas the good ol' long remains big enough for these on Unix-like systems.

    Because historically developers have consistently made the mistake of relying on data type sizes (despite all the well documented best practices) to perform many arithmetic or bit shifting operations, this rarely means just changing the variable data type and recompiling. More often than not it means carefully going through the code (code that was likely written by someone else, or many years ago) trying to find hard-to-spot bugs that the change will introduce.

    Of note also is the fact that this is a problem even if one is not porting software to other systems, but simply porting 32 bit code to 64 bit on the same operating system. Here, Linux and Mac developers have it the worst because their long data type has silently changed size, while Windows developers may relax somewhat, having to concern themselves mostly with the fact that their pointers no longer fit in a 32 bit integer (see the sketch below). They must, however, check all their pointer arithmetic just in case. There's a whole lot of bad practice there too.
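To make the first and fourth problems a bit more concrete, here's a small sketch of my own (hypothetical code, not lifted from any real project) with two of the classic porting mistakes: stuffing a pointer into a 32 bit integer, and assuming long is big enough to hold one. Both look harmless in a 32 bit build and both quietly break under 64 bit Windows, where the LLP64 model keeps long at 32 bits.

```cpp
// Minimal sketch of two classic 64-bit porting bugs (hypothetical code).
#include <cstdint>
#include <cstdio>

void classic_mistakes() {
    int value = 42;
    int* p = &value;

    // Bug 1: storing a pointer in a 32-bit integer. Fine on 32-bit builds,
    // but can silently truncate the address in any 64-bit build.
    unsigned int handle = (unsigned int)(std::uintptr_t)p;

    // Bug 2: assuming 'long' is pointer-sized. True under LP64 (64-bit
    // Linux/Mac), false under LLP64 (64-bit Windows), where long is 32 bits.
    unsigned long maybe_truncated = (unsigned long)(std::uintptr_t)p;

    // The portable way: uintptr_t / intptr_t are defined to be able to hold
    // a pointer on every platform, whatever the data model.
    std::uintptr_t safe = (std::uintptr_t)p;

    std::printf("sizeof(long) = %zu, sizeof(void*) = %zu\n",
                sizeof(long), sizeof(void*));
    std::printf("%u %lu %llu\n", handle, maybe_truncated,
                (unsigned long long)safe);
}

int main() {
    classic_mistakes();
    return 0;
}
```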

Conclusion
This is an addendum to my previous post in this thread. Here I try to explain the complexities of 64 bit development, particularly where it concerns porting 32 bit code to run on the x86-64 architecture in long mode (as a native 64 bit executable).

For the impact this has on the requirements on vendors to keep supporting 32 bit versions of their operating systems, see that post.

The fact of the matter is that 32 bit development is still an in-your-face reality. It's going to take some time yet. The transition cannot be imposed by an operating system vendor the way it was before, with the 32 bit adoption.

Thankfully we are walking that road. And every new day is one day closer to that goal. It's only when consumers and software developers signal that their adoption of the 64 bit platform is complete, and only a fringe still holds on to old practices, that vendors will feel it's time to pull the plug.

If there is one corollary to all this, it is this: it's not because Microsoft stops developing the 32 bit version of Windows that developers will stop doing 32 bit software. Precisely because 64 bit Windows is mostly 32 bit compatible, it won't change anything. The issues with 64 bit development are real and impose themselves on many developers and businesses. It's developers that need to deal with the transition and eventually stop developing 32 bit applications, or at the very least start supporting 64 bit versions of their software.

It's a bit like expecting the death penalty will stop crime. You stop crime by dealing with the criminal directly.

Want to argue with anyone? Argue with your favorite game developer, if they don't give you a 64 bit version of their game.

(1) A 32 bit operating system running on a 64 bit processor capable of legacy mode (Intel's and AMD's, for instance) that also supports hardware virtualization can host a 64 bit guest operating system in a virtual machine and run 64 bit applications natively (on the actual hardware).
 