NVIDIA 9800GTX and GX2 Get Greenlight

Merlin

The Tech Wizard
I hear all sorts of nasty stories about what the current GS does when my son brings his friends out for a little home LAN action. His best friend used to work for them back in the day before they (GS) got bought out, but he still knows a bunch of people who still work there.

One thing is for sure: if you ever do take a computer in to them to be worked on, you had better remove all personal data from the HD or they will steal it. It happens all the time, and they have no controls to prevent it from happening.
Also, there was an ad that GS would remove porn from computers brought in for repair. A preacher was caught with pictures of boys they thought were too young; the outcome was never revealed. But if you have porn on there, burn it to CD and then remove it before GS does repairs... I guess if you can.
Hell yeah I look at women, I just don't save it to file... :eek:

Also
http://www.technibble.com/geeksquad-caught-stealing-porn-from-customers-computer/
 

b1lk1

Tech Monkey
You are not doing yourself any favors judging this card by the first set of benchmarks released. I am not impressed by this card either, but I also know better than to judge it this fast.
 

sbrehm72255

Tech Monkey
I'll wait to judge the card till a few more sites report their results.................;)

I didn't like the original X2 card from Nvidia (7950GX2) and I really don't like this one (the design, really) either. I think that AMD/ATi took the smarter route by placing both cores on the same PCB. Maybe Nvidia was trying to save a bit of R&D cash by reusing their old design.
 

Kougar

Techgage Staff
Staff member
I'd agree one review isn't enough to base an opinion on... however there are three or four thorough reviews out already that I've looked at. A few more won't change anything now... ;)

Some driver updating may improve the results, but there've been enough reviews to indicate pretty well where this card falls.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
I won't ever look at TweakTown for more than one reason, but the main one is that I find it ridiculous that regardless of the GPU being looked at, the review is 19 pages. Screw that.

I'll wait for a review from a more credible source (Techgage doesn't have one at this point in time).
 

sbrehm72255

Tech Monkey
I also tend not to read extremely long reviews that hash over the same old stuff time and time again. Short, sweet, and to the point is what I like reading, but then again I only have a 2nd-grade education...........;) J/K'ing.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Hah. I have a far different interpretation of what makes a good review than others, so how we do things will differ a lot from other sites. In my opinion, if it takes a visitor less than ten seconds to finish a page off, there is something wrong.

I just live by that from experience. I read numerous sites before beginning Techgage, and reviews like that pissed me off, but it's certainly common (obviously to bump up ad views).

To each their own, I guess.
 

Kougar

Techgage Staff
Staff member
When most of the pages are nothing but graphs, it doesn't seem that lengthy, at least from my point of view.

They tested everything in XP and also Vista, and then a few high AA modes. The only thing missing was a pair of G92s in SLI to compare against the GX2.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
This review could be an exception, but if it's a preview, why perform EVERY test? That seems like a complete waste of time, when newer drivers could improve on things all around.

But that's the thing to me... most pages are graphs and nothing but. Like I said, many people will disagree with me, and that's fine. I just like to know that our visitors can visit a page and not immediately hit the next-page link. Sure, more ad views would be great, but doubling our review pages is a cheap way to get it done.

Boy, have I swerved this topic right off the tracks.
 

sbrehm72255

Tech Monkey
This is what the [H] had to say about the GX2 against the ATi X2..............:eek:

Seems that they liked the card and it's performance...........:D Only issue they have with it is the extremely high price, but we already knew that...........;)

9800 GX2 vs. ATI Radeon HD 3870 X2

The ATI Radeon HD 3870 X2 is around $200 cheaper than the GeForce 9800 GX2 and its performance is drastically slower. In Crysis we had to play with all the in-game settings at “Medium” which reduced the gameplay experience. The Radeon HD 3870 X2 was drastically slower in Jericho supplying a tremendous difference in the gameplay experience in this shader intensive game.

http://enthusiast.hardocp.com/article.html?art=MTQ3NSw5LCxoZW50aHVzaWFzdA==
 

Kougar

Techgage Staff
Staff member
Looks like y'all were right after all... apparently the drivers used were a major issue. All in all the card looks more respectable in the launch reviews than in that TT pre-launch review.

I still say anyone that spends $650 on a GX2 is going to want to maim something when the true next-gen stuff comes out Q3-ish for that same price. :D
 

sbrehm72255

Tech Monkey
As stated before, these current cards are just a stop-gap thing for Nvidia, sort of a waste of cash unless you need something right now.

Just OC the poop out of a GTS (G92) and you'll have a 9800GTX.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
For -most- people, I think the GX2 is a waste of time. However, for those using 1920x1200 or 2560x1600, the card would prove invaluable. Right now, no other single-card configuration boasts this kind of power. You could say, "Just go SLI," but with the GX2, you are not forced into a specific motherboard purchase.

Despite the GX2 being a stop-gap, it still excites me. I'm unsure of the performance of dual GX2s (although TweakTown has another "preview" I refuse to look at), but I think that, done well, two of those cards would be fantastic for those using 30" displays.

Just imagine Crysis on a 30" with dual GX2s! With power like that, the game should be able to run at Very High detail (considering two 8800 GTS cards in SLI can handle High for the most part).
 

sbrehm72255

Tech Monkey
I guess Crysis at 1920x1200 might take advantage of the GX2, but UT3 at that rez plays great at max settings on a little ole 8800GT. Still, seeing the way the gaming industry is going, demanding super high-end hardware to play their new games, maybe it would be a good buy to future-proof for a little while.

But sooner or later these super-high hardware requirements are going to kill the computer gaming industry.
 

Kougar

Techgage Staff
Staff member
I don't think things are that bad...

We've been using the same GPU core that was launched in November 2006; the only thing Nvidia has done since is launch die shrinks and overclocks of it. It's soon to be going on two years and it still offers better performance than anything else... Nvidia launched the 7800 series, 7900 series, and 8800 cards inside the same two years from 2005-2006, not to mention ATI had some nice cards as well.

My point is just that had GPUs continued to launch at the same pace as they did back then, with each new release pushing the envelope like normal, then Crysis would have just seemed like a normal (if the most demanding) game to arrive yet. It wouldn't have been any worse than Oblivion was, I might bet.

Whenever NVIDIA decides to finally launch a completely new core, there will probably be a good bit of headroom again for games to use. Even my 320MB 8800GTS will run Crysis at an average of 16 FPS at 1920x1200 with a stock CPU, although I've admittedly not played it except for benchmarking yet.
 

b1lk1

Tech Monkey
I can happily say I am glad I bought the 8800GTS 512MB card. I paid just under $300 for it and it plays everything I throw at it @ 1920x1200 with some AA/AF thrown in. While the GX2 is of course a fast card, it is nothing to lose sleep over. I could simply add another GTS to my system for roughly $275-300 (thank you, XFX price drops) and match the performance for less money.

In my opinion, even two 9800GX2s will not allow high-res Crysis at very high details/settings. In fact, I want to see if they can even get Quad SLI working this time, since they never released an official driver for the 7950GX2s as it didn't work.
 

Greg King

I just kinda show up...
Staff member

Yeah, NVIDIA really screwed the pooch the first time around with quad-SLI.

I agree with you 100%. The 8800 GT I have is perfect for the resolutions I play at in most games, and with them being as cheap as they are now (mine was under $200 after rebate for a factory-overclocked card from XFX), a second card is more of a when than an if at this point.
 

sbrehm72255

Tech Monkey
My 8800GT (BIOS volt-modded and slightly OC'd) does me just fine as well for what I play (UT3, Bioshock). Haven't even tried Crysis yet; I may get around to it one of these days.
 