Poll: GPU Overclocking Question


VincentX3

New member
Jun 30, 2009
Something's been sitting in the back of my head lately, and no matter how much I search, I can't seem to find a direct answer. So Escapist, let's hear your answers.

That "something" is: what damages graphics cards more?
Most of you will say heat, which is true. But hear my story first.

For about three years now I've had a Gigabyte 9600GT (passively cooled).
I love this GPU personally. I know it's not the "TOP" card, but it can run the latest PC games at 60FPS with no problems.


That includes (all max settings with 4xAA/8-16xAF):
-Resident Evil 5 (benchmark @71.4FPS)
-Sims 3
-Devil May Cry 4 (Benchmark @95.3FPS)
-Fallout 3
-Left 4 Dead 2
-StarCraft II


Now, I'm no newb to overclocking. I've done it many times and studied it well. But the more you study, the more questions appear.

The question is: what "technically" hurts the graphics card more?
In my case, here are my stock settings:

-GPU Clock = 720
-Shader = 1800
-Memory = 1008

Here are my overclocked settings (Performance Mode):

-GPU Clock = 775
-Shader = 1900
-Memory = 1024
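
For perspective, the jump from stock isn't huge. Here's a quick sketch (the clock values are the ones listed above; the helper function and labels are just my own for illustration) that works out the percentage increase for each clock domain:

```python
# Stock vs. overclocked clocks (MHz), taken from the settings above.
STOCK = {"core": 720, "shader": 1800, "memory": 1008}
OVERCLOCK = {"core": 775, "shader": 1900, "memory": 1024}

def percent_increase(stock: int, oc: int) -> float:
    """Return the overclock as a percentage above the stock clock."""
    return (oc - stock) / stock * 100

for domain in STOCK:
    delta = percent_increase(STOCK[domain], OVERCLOCK[domain])
    print(f"{domain}: +{delta:.1f}%")
```

That comes out to roughly +7.6% on the core, +5.6% on the shaders, and +1.6% on the memory, so this is a fairly mild overclock.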

In the past I've overclocked further than this. Of course I ran into artifacts, and I handled them with no problem. At the current settings I've yet to see any artifacts.

What I want to know is:
- While an artifact is actually occurring, can that alone be enough to break the card, or damage it in a way that renders it useless, even if it's not overheating?

I've yet to get a straight or detailed answer to this question and would appreciate any discussion.

PS: Sorry for the long read, I tend to go into a lot of detail.
 

VincentX3

New member
Jun 30, 2009
1,299
0
0
For those that are curious, here are my Idle and Load temperatures:
Idle: 38°C
Load: 54-56°C