It has nothing to do with that; it has to do with the fact that it's tied into the Windows operating system and cannot run on alternate OSes like Linux and Mac OS X.
Why should DirectX be developed for other OSes? From my understanding, this is the equivalent of taking the transmission out of one model of truck and expecting it to work in a competing model. DirectX is a coding API designed to integrate the OS with the hardware and to set standards that make code (i.e., game) development easier, since developers can expect the feature set and OS support to be there. It's an integral component of the OS: Windows wouldn't work without it, and DirectX won't work without Windows, at least not without some serious modifications, due to interdependencies such as Microsoft's COM.
With the advent of PC gaming, the majority of 3D games were written for OpenGL, because it was the only real standardized API for hardware calls between applications and the OS at the time. Microsoft saw that OpenGL was not being developed for 3D gaming; instead, the standard was oriented toward professional 3D rendering programs. Microsoft filled the gap by writing DirectX specifically for gaming, and eventually, through better optimizations and more gaming-oriented feature sets, they began getting it right and it was adopted by more game devs. Even John Carmack, the guy behind id Software and the last major game-engine holdout for OpenGL, praised DirectX 9 and said it was as good as or better than OpenGL by that stage.
I'm talking about the choice game developers have... they can code their games in OpenGL today for Windows 7 should they wish, but only id Software and small indie developers seem to bother doing so anymore.
I'd be willing to bet that Apple would never even ponder implementing DirectX support into the OS, but I could see it being a possibility on Linux, or at least an option so the FOSS die-hards can remain happy.
Apple is Apple; they don't tend to follow cutting-edge hardware and software that closely. For example, here's
Ars Technica's article on OpenGL 4, released five days ago. Apple doesn't even support OpenGL 3 yet, so it's no wonder they're in no apparent rush to adopt high-end GPUs in their products; without the OpenGL support to take advantage of the new feature sets in the hardware, it would be pointless.
You're right about OpenGL sucking for gaming, but that's the problem. There's one real option, and it's tied into Windows. That sounds like a monopoly to me.
It's an opt-in monopoly by default, is my point here. Nothing is stopping anyone from coding their games in OpenGL 3 for 64-bit Windows 7. The only problem is they lose features and can't do the same tricks in hardware that they can with DirectX, because OpenGL doesn't push new feature sets. As one writer put it, the problem with open standards is that they take significantly longer to be developed, changed, and added to... and that is exactly why OpenGL has been playing catch-up, belatedly copying calls and standards introduced in each release of DirectX since DX9.