Burning my GPU up and what the heck happened to my 3dmark?

RainMotorsports

Partition Master
Alright, I've decided to crank every last bit of performance out of this POS. I have a nearly 3-year-old ASUS G50VT-X5. The processor was swapped long ago for a T9600 (2.8 GHz Core 2 Duo, 6 MB L2 cache) that I clock daily to 3.1 GHz.

The video card is the 9800M GS. I have been running the GTS clocks for 2 years: 600/1500/800 core/shader/memory over the stock 530/1325/800. At GTS clocks the card is essentially a 9600 GT desktop card in 59-watt form, with the same memory bus and shader count. Memory is a little behind in speed.

I flashed a bumped voltage of 1.15 V and clocks of 620/1550/850. Running stable at 640/1600/900, just shy of the 96-watt 9600 GT's clocks.

I ran 3DMark06 and bested my last Vista/7 score of 9713 with a 9868. My XP score is a nice round 10086, prior to my current changes in clocks. All at 1280x1024, of course.

My problem is with 3DMark Vantage. I couldn't get the last test to run, and after googling I tried to install the latest PhysX System Software from NVIDIA, but it told me I was already running something newer. I uninstalled that, installed what I downloaded from NVIDIA, and the test worked. But I got a damn 3, and it seems comparable cards without an overclock were getting 40 to 70 on that test. I had PhysX set to GPU in the control panel.

Anyone had any trouble with Vantage like that? Do I need to roll back to an older driver? Prior to today I was running a clean install of the latest drivers, after having run the latest beta drivers for the BF3 Alpha. My results according to the health checker in ORB are above comparable hardware overall; it's just the last test that's absurdly low, it seems.
 

RainMotorsports

Partition Master
Well, I stepped outside of my safe zone this morning. Bumped to 1.2 V and pushed clocks to 660/1650/900. After burning in a bit I found it to be unstable. 650/1625, which is my target, was stable, but 1.2 V is too hot and not worth 10/25 MHz.

So I rest my madness at 640/1600/900 @ 1.15 V; time will tell if the card will hold. A decent bump over the previous clocks, and 110/275/100 over stock, which is about a 20% overclock on everything except the RAM.
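For anyone checking my math, the percentages work out like this (just quick arithmetic on the clocks listed above):

```python
# Quick arithmetic on the clocks above (core/shader/memory in MHz).
stock = {"core": 530, "shader": 1325, "memory": 800}
oc    = {"core": 640, "shader": 1600, "memory": 900}

for domain in stock:
    gain = oc[domain] - stock[domain]
    pct = 100.0 * gain / stock[domain]
    print(f"{domain:<7} +{gain:>3} MHz  ({pct:.1f}% over stock)")

# core    +110 MHz  (20.8% over stock)
# shader  +275 MHz  (20.8% over stock)
# memory  +100 MHz  (12.5% over stock)
```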

Off to see if I can figure out my 3DMark Vantage issue.
 

2Tired2Tango

Tech Monkey
This is just a question... don't take it as anything but...

How much real world performance increase does all that get you?

Not benchmark scores... difference that you notice in day to day computing... Do videos play better, are your games more responsive, etc...
 

RainMotorsports

Partition Master
This is just a question... don't take it as anything but...

How much real world performance increase does all that get you?

Not benchmark scores... difference that you notice in day to day computing... Do videos play better, are your games more responsive, etc...

Here is the problem with "not benchmark scores". I don't have issues watching 1080p video, which has to be resampled to 1366x768. I have games that are CPU-bottlenecked to the point where the randomness of what needs to be executed makes it impossible to tell what's what. Now, if you include real-world benchmarking, say playing through a level with its random, varied gameplay for long fixed periods as a show of performance, then we can talk.

It's been a while since I was at dead stock. In non-CPU-bottlenecked games it certainly made a difference, and I have benches somewhere to show for it. I actually used to have a thread on NBR covering this model range's performance in games, tested on various CPUs and CPU/GPU overclocks.

This last overclock is probably only good for a couple FPS on average in Bad Company 2, which is CPU-bottlenecked in multiplayer. The game uses 80 to 95% of the CPU at any given time, then add in that PunkBuster can eat up to 30% when it wants, and TeamSpeak and other things like to eat cycles too; it causes mad framerate issues.

On a semi-related note, I actually benched my CPU overclock in relation to frame performance in Bad Company 2 today:
2.8 GHz: Min 25, Max 82, Avg 42.24
3.15 GHz: Min 31, Max 92, Avg 50.897

I can definitely tell when my CPU isn't overclocked while playing Isla SQDM; since I play it so much, the input lag and artificial network lag created by the processor being tied up are very noticeable.

I will do some game benchmarks, stock versus overclocked GPU, sometime, but the GPU was on GTS clocks for the last couple of years. I really only play Bad Company 2 at this point, though Saints Row 2 is a GPU-bottlenecked game for me that's barely playable at all. 20% isn't anything to laugh at, however.
 

2Tired2Tango

Tech Monkey
Ok... thanks anyway...

Let me give you an example of what I meant...

I have an Atom-based HTPC running XP... It will not decode and play 1080p content at high bitrates unless I overclock the CPU...

My desktop machine also has a 1080p display and will play the same videos with such ease that the AMD "Cool and Quiet" feature actually throttles the CPU back...

What I was looking for was something like "My games hesitate and lag if I don't overclock"... Not some benchmark number of frame rate... Seriously, how much difference do 5 extra FPS make on a 60 Hz monitor? Is 75 FPS really better than 70?
 

RainMotorsports

Partition Master
Problem is we're not talking 70 to 75 FPS. We're talking about 20 to 25 and 25 to 30 in many games most of the time. If it ever drops below 24, which it does, we're talking a bad time. A boost in performance doesn't have to mean better framerates; it can mean turning a setting up to look better at similar FPS. I am playing games on low settings here, it's not a joy ride.

If the CPU bottlenecks the performance, the FPS no longer tells the story at all, because it's a much worse situation. The bench I posted was under ideal circumstances and doesn't reflect PunkBuster going crazy or any other factors. My original changes years ago made Crysis mostly playable versus not really playable at all.

We are talking about trying to fix the worst situations and just dealing with the fact that it might not be needed for other games. The laptop is end-of-life for me, so I decided to give it one last push. I haven't really played my other games this year, so I don't have much input on the good and the bad at the moment, but I do own games that are borderline or can't be played at all. In 6 weeks it will be replaced, and with the battery being 3 years old and showing signs of no longer holding 2-hour charges, I can't say I am afraid of the consequences. Sorry about the lack of formatting, I am on my cell.
 

Kougar

Techgage Staff
Staff member
Those are some insane boosts, Rain! Take care the GPU isn't running too hot when you ramp up the clocks and volts together like that... laptop cooling is pretty substandard to begin with, and it is easy to cook the hardware to the point it will fail, especially if the GPU and CPU share the same cooling fan. Even when the GPU core temps are fine, the VRM and core are both sandwiched against the laptop mainboard, and that gets hardly any cooling at all.

What I was looking for was something like "My games hesitate and lag if I don't overclock"... Not some benchmark number of frame rate... Seriously, how much difference do 5 extra FPS make on a 60 Hz monitor? Is 75 FPS really better than 70?

His FPS numbers were below the 60 Hz / 60 FPS display cap, so yes, it does actually matter because the difference can be noticed.

When playing an FPS (let's use Crysis, since that constantly played below 30 FPS for me back in the day), the difference between a minimum framerate of, say, 15 and 20 felt huge when actually playing. The difference in game fluidity is stark for me below 30 FPS, and at least at that point I can notice even an incremental improvement in frame rates.

If I'm playing any FPS that's running smooth at 60+ and suddenly the game's framerate drops below the 40-50 range just for a few seconds, I can tell. Gaming at an average of 30 just feels laggy, and the game events themselves are jerky and totally not realistic. Average FPS is important, but minimum FPS is just as crucial.
 

2Tired2Tango

Tech Monkey
Thanks Kougar... that's part of what I was wondering about.

You may find this hard to believe, given my involvement in home theatre setups, but I have never once played a "computer game"... Well, except for solitaire... I tried Doom (I think it was) and as soon as I realized that everything was about shooting and killing... I never gave it a second glance.

Anyway, I realize that slow CPUs and low-end GPUs often have trouble with movies, and I've gotten pretty good at sorting those issues out... Now I know it also extends to gameplay.

Thanks.

Does the contrary apply... is there noticeable improvement beyond the monitor's refresh rate?
 

RainMotorsports

Partition Master
You want to talk about substandard cooling, lol. Talk about when ASUS put a quad core and a GTX 260M in this same chassis, called it a G51, and didn't even change the cooler (they did on the original G60). People saw 90°C idles and 115°C running temps even on properly pasted units; things practically melted their own solder.

Two hard drives in a 15-inch only bought me one fan, so yeah, the CPU-heated air cools my GPU, and when I swapped my low-volt P-series for my T9600 the temps went up. I used to get into the 80s, but a good baseline for the GTS clocks is GRID, which I can max out with 16xAA at 1366x768, and that pushes temps to 93°C. The thermal underclock trigger is at 105°C, the design max, and I've hit it one time, way back when I forgot to plug the fan back in after a cleaning.

I am hitting up to 96°C at this point, and even for laptop guys that is scary hot, but the way I see it I've seen worse, and once again lifespan isn't much of an issue; I've been pushing into the 90s for 2 years. The temps are a bit more critical at the higher voltage, I do realize that. It's been since February that I cleaned and pasted the machine, so I think it's about due.

Tango, I can give you the only sign I've seen of my GPU's video playback capability. My stepmother had me rotate a 1080p video she shot, and after I did she was not able to play it back on her i3 laptop with the Intel M4500HD integrated graphics. But my laptop had zero issues playing it downscaled to fit the screen. Pretty odd task, having the computer play something 1920 pixels tall on a 768-pixel-tall screen. Other than that I've never tested it; it does its job, and like yours it does it at the lower clock speed setting.
 

RainMotorsports

Partition Master
When playing an FPS (let's use Crysis, since that constantly played below 30 FPS for me back in the day), the difference between a minimum framerate of, say, 15 and 20 felt huge when actually playing. The difference in game fluidity is stark for me below 30 FPS, and at least at that point I can notice even an incremental improvement in frame rates.

Does the contrary apply... is there noticeable improvement beyond the monitor's refresh rate?

I can't say I see a difference between a 30 FPS minimum and getting 60 FPS, nor do I put much stock into it being necessary.

But here is something people tend to ignore. As Kougar brought up: minimum framerate. Using something like FRAPS to show your framerate only shows averages over a period of a second or a bit less.

If you're only getting 30 FPS average, there is a good chance in gaming that your minimums are falling below 24, and anyone with eyeballs will notice it getting choppy, assuming it's a game with camera movement like an FPS, racing, etc.

Assuming no major problems, if you're always getting 60+ FPS average, chances are you're not falling below 25 to 30 minimums and everything looks fluid. But I could lock a game down that I get 100 FPS in to a solid 30, and I personally can't tell the difference between a solid 30 and a solid 60 FPS. There are instances where vsync or other techniques to limit framerates can introduce input lag, but I don't see that in anything I play.
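To illustrate what I mean about averages hiding dips, here's a rough sketch with made-up frame times (not how FRAPS actually computes its numbers):

```python
# Rough illustration (made-up frame times, not FRAPS's actual method) of how
# an averaged FPS number can hide short dips you still feel in-game.
frame_times = [1/60.0] * 50 + [1/15.0] * 5 + [1/60.0] * 10  # seconds per frame

total_time = sum(frame_times)
avg_fps = len(frame_times) / total_time        # what an FPS counter reports
worst_fps = 1.0 / max(frame_times)             # slowest single frame

print(f"average FPS over the sample : {avg_fps:.1f}")   # ~48.8
print(f"worst instantaneous FPS     : {worst_fps:.1f}") # 15.0
```

The counter reads roughly 49 FPS, but the slowest frames ran at 15 FPS, and that brief stall is the choppiness you notice even when the number on screen still looks fine.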

I'm gonna run the same bench I did for my CPU overclock, but on stock GPU clocks, soon. For the sake of my K/D I'll make a new name, cause I predict this is gonna suck really bad.
 

RainMotorsports

Partition Master
Alright, results of the last bench are in:
Stock CPU clock, stock GPU: Minimum 20, Maximum 85, Average 41.6
Stock CPU clock, GPU OC: Minimum 25, Maximum 82, Average 42.24
CPU at 3.15 GHz, GPU OC: Minimum 31, Maximum 92, Average 50.897

Well, the average was better than I thought. Don't let the maximums fool you, because something like staring at a wall can cause those sorts of spikes. Mind you this is very dynamic gameplay, and while I sampled about 15 minutes each time, nothing is ever scientific in multiplayer.

The minimums dropped below 24 and it was noticeable. The round started out very well and I had 30 kills, 30 deaths, which is 1:1; my average is 0.78, so that's better than my average but not better than I do these days. It was playable but not quite enjoyable.

I would need a lot more sample time to really back the data up, and it's just not that important. There are better games to test the GPU overclock, as this game's problems center around CPU usage.
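To put those deltas in relative terms, here's the same data as percentages over the fully stock run (straight arithmetic on the numbers above, nothing new measured):

```python
# Relative changes computed from the three benchmark runs above.
runs = {
    "stock CPU, stock GPU":  {"min": 20, "avg": 41.6},
    "stock CPU, GPU OC":     {"min": 25, "avg": 42.24},
    "3.15 GHz CPU, GPU OC":  {"min": 31, "avg": 50.897},
}
base = runs["stock CPU, stock GPU"]
for name, r in runs.items():
    min_gain = 100.0 * (r["min"] - base["min"]) / base["min"]
    avg_gain = 100.0 * (r["avg"] - base["avg"]) / base["avg"]
    print(f"{name:22s} min {r['min']:2d} ({min_gain:+5.1f}%)  avg {r['avg']:5.1f} ({avg_gain:+5.1f}%)")

# stock CPU, GPU OC      min 25 (+25.0%)  avg  42.2 ( +1.5%)
# 3.15 GHz CPU, GPU OC   min 31 (+55.0%)  avg  50.9 (+22.4%)
```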
 

2Tired2Tango

Tech Monkey
Ok, one last question...
If you can't see the difference between 30fps and 60fps... why do you need 100?
Is that just to maintain a minimum above 30?

Also... if your CPU and/or GPU is running particularly hot, you may be getting what they call "microstuttering"... That is: the chip hits its thermal limit, throttling cuts in, the chip cools, full speed resumes... but it all happens on a fraction-of-a-second timebase. This is generally an indication that you are right on the very edge of a chip's capabilities and need to back it down one notch. I first became aware of this phenomenon when a customer brought me a machine that was used for very long mathematical computations... "Why does this take longer when I overclock?"... I don't mind telling you I did some serious head scratching over that one, and my friend's machine now has a two-radiator water cooling system with a 1-litre tank in it...
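To put rough numbers on that effect (a toy duty-cycle model with made-up figures, not measurements from that customer's machine):

```python
# Toy model (numbers made up): why a long computation can finish slower when
# overclocked, once thermal throttling starts cycling the chip up and down.

def effective_ghz(full_clock, throttled_clock, fraction_at_full_speed):
    """Average clock when the chip only spends part of its time at full speed."""
    return fraction_at_full_speed * full_clock + (1 - fraction_at_full_speed) * throttled_clock

work = 10_000.0  # arbitrary "GHz-seconds" of work to chew through

stock_ghz = effective_ghz(2.8, 2.8, 1.0)   # stays cool, never throttles
oc_ghz    = effective_ghz(3.4, 1.6, 0.5)   # half its time stuck in a throttle state

print(f"stock:       {work / stock_ghz:6.0f} s")   # ~3571 s
print(f"overclocked: {work / oc_ghz:6.0f} s")      # ~4000 s -- slower overall
```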
 

RainMotorsports

Partition Master
Ok, one last question...
If you can't see the difference between 30fps and 60fps... why do you need 100?
Is that just to maintain a minimum above 30?

Let me extract the part where I brought up 100 FPS.

But I could lock a game down that I get 100 FPS in to a solid 30, and I personally can't tell the difference between a solid 30 and a solid 60 FPS.

I grabbed the number of 100 just to point to a game where I'd probably have 30+ as a minimum, yes, but not because I need to hit that. I get 100+ in older games like Doom 3 and F.E.A.R. with the settings at their highest, because they don't require much power these days.

But the thought went to World in Conflict, which is a real-time strategy game. It's not the kind of game where a slight graphical stutter is gonna get you killed. The game features the ability to lock the framerate at 60, 45, 30, or 15 FPS. Mind you, I would never try to play it at 15 FPS. But to mitigate heat and save energy, if I am getting higher than 60 FPS then why not lock it down to a lower rate?

I was basically trying to say that if my minimum FPS is guaranteed to be 30, I have no real issue with running it at a solid 30 FPS, though I would like 60.
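The idea behind that kind of in-game cap is basically a sleep-until-the-next-deadline loop. A minimal sketch of the concept (my own Python, not anything from World in Conflict; run_capped and render_frame are made-up names):

```python
import time

def run_capped(render_frame, cap_fps=30):
    """Render no faster than cap_fps by sleeping out the leftover frame time."""
    frame_budget = 1.0 / cap_fps
    next_deadline = time.perf_counter() + frame_budget
    while True:
        render_frame()                              # draw one frame
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)                   # idle time = less heat and power
            next_deadline += frame_budget
        else:
            next_deadline = time.perf_counter() + frame_budget  # running behind, don't try to catch up
```

With cap_fps=30, a GPU that could otherwise push 100 FPS ends up idle for roughly two-thirds of every frame, which is exactly where the heat and power savings come from.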

Also... if your CPU and/or GPU is running particularly hot, you may be getting what they call "microstuttering"... That is: the chip hits its thermal limit, throttling cuts in, the chip cools, full speed resumes... but it all happens on a fraction-of-a-second timebase. This is generally an indication that you are right on the very edge of a chip's capabilities and need to back it down one notch. I first became aware of this phenomenon when a customer brought me a machine that was used for very long mathematical computations... "Why does this take longer when I overclock?"... I don't mind telling you I did some serious head scratching over that one, and my friend's machine now has a two-radiator water cooling system with a 1-litre tank in it...

The GPU downclocks at 105°C, and with temps 10 to 15°C below that, while close, I can't say I have been seeing it. I have run artifact testing while artificially heating the GPU with another program, to temps higher than I get in games. No recorded drops in frequency, and at the stable speeds no worsening in performance.

The CPU happens to also be a 105°C part, and since I've never gone anywhere past 75°C on a dirty heatsink, I can't say I would even worry about it.
 

RainMotorsports

Partition Master
Ok... thanks a lot guys... this has been a big help to a non-gamer....

I sent you a couple of PMs, basically to go over what it's like to play a racing game while not getting a 24 FPS minimum or better. As I say, it's a lot like input lag. You tend to overcompensate and it gets you into trouble.

My laptop's a POS this day and age. It was mid-range in desktop terms the day it came out. It's getting replaced; most of this was a last hurrah. I will probably still mobile-game on it, but chances are there won't be too many new games on the list. By mobile I mean "where's the nearest wall outlet".

Bad Company 2 isn't in need of a video card overclock as much as it's in need of a quad core. It is a playable game, and everything past playable is a bit of a bonus. If we were trying to analyze the performance of my overclock, I would probably bring about 25 games into the testing.

I have found that, with my 3DMark Vantage's last test now working, the results are actually where they should be. I just had not mentioned it today, so I thought I would do so now.
 

Kougar

Techgage Staff
Staff member
Does the contrary apply... is there noticeable improvement beyond the monitor's refresh rate?

My first answer was going to be a straightforward no, but then I remembered something. There's a side effect I've noticed in some games when my own FPS rates are super-high (I'm thinking of TF2 in particular, as I get 200+ FPS highs): at some FPS levels the game looks great... but if the FPS climbs above that point (I'm guessing 300, but I have no real idea what the levels are) I'll start noticing screen tearing, where part of the frame on screen is replaced by a different frame within the same image at a given time.

Vsync specifically prevents this problem, but in doing so it also displays older frames first, even though they are already "old" and not representative of what's going on just "now" in the game. (This is one of the factors in how people seemingly get shot through walls or around corners.) So I've taken to using a middle-ground alternative, triple buffering. It minimizes any tearing and displays the newest completed frames first.
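To make the double-buffer versus triple-buffer difference concrete, here's a rough toy model (my own simplified numbers, not how any actual driver schedules frames):

```python
# Toy model (my own simplification, not any real driver's behavior).
# The GPU renders much faster than the 60 Hz display refreshes; how stale is
# the frame that actually reaches the screen?
refresh = 1000.0 / 60    # ~16.7 ms between screen refreshes
render  = 1000.0 / 200   # ~5 ms per frame for a GPU pushing 200 FPS

# Double-buffered vsync: the frame finishes early, then sits in the back
# buffer waiting for the next refresh before it can be swapped in.
double_buffered_age = refresh - render        # ~11.7 ms old when displayed

# Triple buffering: the GPU keeps rendering into the spare buffer and the
# refresh grabs the newest completed frame, on average half a render old.
triple_buffered_age = render / 2              # ~2.5 ms old on average

print(f"double-buffered vsync: frame ~{double_buffered_age:.1f} ms stale on screen")
print(f"triple buffering:      frame ~{triple_buffered_age:.1f} ms stale on screen")
```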

It's not exactly what you meant by your question, I know, but in a way the side effect of 100+ FPS rates can actually cause image quality differences, depending on the user's GPU driver and game settings. But again, to the point of your question: do I notice any improvements from extra frames when above 60 Hz? Nah. I can't even tell when TF2 is rendering at 150 FPS or 300 FPS. I only notice the minimums when they dip below 60, as it interrupts the game's fluidity.
 