• Index » 
  • Community » 
  • Tech » 
  • Freezer's guide to overclocking your graphics card using RivaTuner
Freezer7Pro
I don't come here a lot anymore.
+1,447|6168|Winland

Due to all the recent threads about overclocking graphics cards, I've decided to make this guide, in which I'll show you step by step how to overclock your graphics card. It's aimed at more novice users who aren't used to making these kinds of tweaks. Note that this can potentially damage your card, and if you fry it from overclocking and it's painfully obvious, you probably won't get friendly responses if you RMA it. But if you do it right and keep the temps under control, there's no major risk of frying it. (Thanks for pointing that out, kyle)

First of all, download and install RivaTuner. It's a great and easy-to-use overclocking utility that works with the vast majority of nVidia and ATi graphics cards, and outperforms the utilities provided by the manufacturers. Now that that's covered, let's get to the point.

When you start up RivaTuner, it should look something like this:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner1.png

You'll see some general information about your setup: driver info, what card you have, what kind of RAM it has, and so on. To get started with the clocking, click the arrow to the right of "Customize..." on the bar that reads "ForceWare detected" in the example:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner2.png

Click the picture of a video card, the one furthest to the left. That will take you to a window looking somewhat like this:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner3.png

This is the main overclocking window, where you do the real overclocking. Now, before you go ahead and do any clocking, you should make sure that your cooler can handle it. Most GeForce 6/7 series cards and ATi X800/X850/X1600/X1800/X1900 series cards with stock active coolers can handle some level of overclocking. You should blow out the cooler with some compressed air or a straw before you do anything, however; there's often more dust in there than you would imagine. Never use a vacuum cleaner to clean out your computer, as it produces massive amounts of static electricity and WILL damage your components.

I don't have much personal experience with the 8800/HD series cards and their coolers. From what I've done, all I've come to know is that the 320MB G80 8800GTS shouldn't be overclocked on the stock cooler. If someone's got some info on this that they think would fit in this guide, please let me know, and I'll add it and give credit.

You should also install Everest or SpeedFan to check your temperatures, or use RivaTuner for that too. Run a 3D app like 3DMark06 (I strongly recommend it, since it's a very good benchmark for seeing how effective your clocking has been) and check the temps after a minute or two. If they're over 65-70C, I wouldn't recommend overclocking your card with the current cooler, as it might run a tad too hot for comfort, which will counteract your overclocking and possibly harm the card. You should get an aftermarket cooler if that's the case; they're usually not that expensive, and will usually be both cooler and quieter.
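If you'd rather script the sanity check than eyeball it, here's a rough sketch (Python, purely as an illustration; how you actually read the temperature is up to you and your monitoring tool, and the 70C cutoff is just the guideline from this guide, not a hard hardware limit):

```python
# Sketch of the "is my stock cooler good enough?" rule of thumb.
# Read the load temperature with your tool of choice (RivaTuner's
# hardware monitoring, Everest, SpeedFan); this only applies the
# 65-70C guideline from the guide.

def cooler_has_headroom(load_temp_c: float, limit_c: float = 70.0) -> bool:
    """True if the load temperature leaves room for overclocking."""
    return load_temp_c <= limit_c

print(cooler_has_headroom(62))  # True: fine to start clocking
print(cooler_has_headroom(78))  # False: get a better cooler first
```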

With that covered, let's go on with actually overclocking your card.

The first thing you need to do is check the "Enable driver-level hardware overclocking" box at the top of the window. This will unlock the sliders under it, as well as show a drop-down box on the right:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner4.png

The drop-down box is set to "Standard 2D" by default. You'll need to set it to "Performance 3D" to see any effect on anything other than your desktop, which really needs a lot of graphics performance. Well, Vista's does.

Now we get to the trial-and-error part of overclocking a video card... which is, well, overclocking the video card. Depending on the card and cooler, it might gain 400MHz or only 100MHz; it's very, very card-specific. In general, lower-end cards clock better than higher-end ones, since they're usually just (basically) downclocked versions of the more powerful cards based on the same core. For instance, the GeForce 6600 will usually clock to around 560MHz, while the 6600GT, which has a stock clock of 500MHz (as opposed to 300MHz on the 6600), usually also tops out at about 560MHz. The 7600GS has a stock core frequency of 450MHz and will go up to 600-610MHz, while the 7600GT, at 575MHz stock, will go to 620-630MHz.
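To put those examples in perspective, the relative headroom is easy to work out (a quick sketch; the clock figures are just the typical values mentioned above, not guarantees for your card):

```python
# Relative overclocking headroom for the example cards above.
def oc_percent(stock_mhz: int, max_mhz: int) -> float:
    """Percentage gain from stock clock to typical maximum stable clock."""
    return (max_mhz - stock_mhz) / stock_mhz * 100

for name, stock, top in [("6600", 300, 560), ("6600GT", 500, 560),
                         ("7600GS", 450, 605), ("7600GT", 575, 625)]:
    print(f"{name}: {oc_percent(stock, top):.0f}%")
# 6600: 87%, 6600GT: 12%, 7600GS: 34%, 7600GT: 9%
```

Which is the point about lower-end cards in numbers: the 6600 has almost seven times the relative headroom of the 6600GT, despite both topping out around the same frequency.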

Anyhow, back to the clocking. This is both the hardest and the easiest part. It's hard because you need to know what your card can handle, and easy because it's just moving a slider to the right. If you don't know what your card can handle, which you probably don't, we'll have to find out by overclocking in small steps.

We'll start with the core clock, as it's the one with the most impact, and the one that can usually be raised the most without artifacts. In many cases, the RAM can barely be clocked at all, as is the case with the card in this guide.

I'd recommend that you start with a rather large step, and then continue with smaller steps, running a 3D app in between.

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner5.png

50MHz is usually a pretty good start, since most cards can take it, and it saves the time of going in smaller steps right from the beginning. If you want to be completely on the safe side, you can go in 10-20MHz steps from the start.

After moving the slider, click Test, then Apply, and run a 3D app. I usually run the first test in 3DMark06, looking for the artifacts that appear if you clock the card too high, or it runs too hot. (The Test button isn't in the example pictures, since RivaTuner can't test my card/driver combination.)

https://i205.photobucket.com/albums/bb8/Freezer7Pro/artifacts.png
Click for full-size (1024x768)
Artifacts caused by too high a memory clock. (Running 500MHz DDR1 at over 600MHz, lulz.)

Memory-caused artifacts are usually much more severe than those caused by the GPU. I tried to create some GPU artifacts for reference, but the system would crash before they appeared. They usually consist of faulty models, like giant "spears" sticking out from the ground, or endless holes, unlike the memory-caused ones, which usually consist of texture errors.

After the large first step, and running 3DMark's first test to check for artifacts (and coming out clean), you can keep going in a couple (2-3) of 20MHz steps, and then 10MHz at a time, running a 3D app after every click on the Apply button. Remember to check the temps immediately after running the 3D app, to make sure your card isn't overheating. Modern cards can withstand very high temperatures (over 100C), but once they run over 80C, I wouldn't push them any hotter. You don't want to boil water on it.
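The stepping routine above, spelled out as a sketch (Python for illustration only; the step sizes are the ones from this guide, and the cap is simply whatever target you're aiming for, so stop earlier the moment you see artifacts or high temps):

```python
# One big 50MHz jump, a few 20MHz steps, then 10MHz at a time.
# Run your 3D app and check temps after every entry in this list.
def clock_schedule(stock_mhz, cap_mhz, big=50, medium=20, small=10, n_medium=3):
    """Core clocks to try, in order, following the guide's step sizes."""
    clocks = [stock_mhz + big]
    for _ in range(n_medium):
        if clocks[-1] + medium > cap_mhz:
            break
        clocks.append(clocks[-1] + medium)
    while clocks[-1] + small <= cap_mhz:
        clocks.append(clocks[-1] + small)
    return clocks

# e.g. a 7600GS at 450MHz stock, aiming for roughly 600MHz:
print(clock_schedule(450, 600))
# [500, 520, 540, 560, 570, 580, 590, 600]
```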

Sometimes, it might happen that you run out of slider space, but your card has passed that speed fine, and you want to go further. This requires a little tweaking in RivaTuner.

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner6.png

To fix this, close the overclocking window and go to the "Power User" tab and expand RivaTuner/Overclocking/Global, as illustrated in the picture:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner7.png

Double-click the grey field to the right of "MaxClockLimit" and enter a number. It doesn't have to be high, as it isn't the clock frequency limit itself; 150 is usually more than enough.

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner8.png

This will raise the slider's ceiling to a higher value that depends on your card's stock frequencies. I don't know exactly how it's calculated, but it will be enough to push the card to its limits.

Now, with that tweak applied, the overclocking window should look something like this:

https://i205.photobucket.com/albums/bb8/Freezer7Pro/Tuner9.png

From here, just keep going, 10MHz at a time (test, apply, run a 3D app), until you get frequencies and temps that you are happy with. Do the same with the memory, but start off with 10MHz steps, because the memory usually can't be clocked as much as the GPU.

If you have any questions, feel free to contact me, and I'll be glad to help with any problems.

© FREEZER7PRO 2008

Last edited by Freezer7Pro (2008-03-02 03:57:20)

The idea of any hi-fi system is to reproduce the source material as faithfully as possible, and to deliberately add distortion to everything you hear (due to amplifier deficiencies) because it sounds 'nice' is simply not high fidelity. If that is what you want to hear then there is no problem with that, but by adding so much additional material (by way of harmonics and intermodulation) you have a tailored sound system, not a hi-fi. - Rod Elliot, ESP
CommieChipmunk
Member
+488|6541|Portland, OR, USA
Dude.. well done. +1
Stimey
­
+786|6091|Ontario | Canada
Very nice guide, gonna do this on my soon to be new rig.
kylef
Gone
+1,352|6464|N. Ireland
Noteworthy: this can potentially damage your graphics card. Overclocking is not standard procedure, and can void warranty.

Nice guide, Freezer7Pro.

Last edited by kylef (2008-03-01 11:55:45)

Brasso
member
+1,549|6601

Nice guide, but I spotted a little error.  In the part where you're talking about too high a core overclock, you actually mention memory...or so it seemed to me.  Needed a little clarification.

Freezer7Pro wrote:

Memory-caused artifacts are usually much more severe than those caused by the GPU. I tried to create some (GPU core clock errors) for reference, but the system would crash before they appeared. They (GPU core clock errors) usually consist of faulty models, like giant "spears" sticking out from the ground, or endless holes. Unlike the memory-caused ones, which usually consist of texture errors.
Oh, and maybe you could include something about changing fan speed?

+1

Last edited by haffeysucks (2008-03-01 12:05:37)

"people in ny have a general idea of how to drive. one of the pedals goes forward the other one prevents you from dying"
Freezer7Pro
I don't come here a lot anymore.
+1,447|6168|Winland

haffeysucks wrote:

Nice guide, but I spotted a little error.  In the part where you're talking about too high a core overclock, you actually mention memory...or so it seemed to me.  Needed a little clarification.

Freezer7Pro wrote:

Memory-caused artifacts are usually much more severe than those caused by the GPU. I tried to create some (GPU core clock errors) for reference, but the system would crash before they appeared. They (GPU core clock errors) usually consist of faulty models, like giant "spears" sticking out from the ground, or endless holes. Unlike the memory-caused ones, which usually consist of texture errors.
Oh, and maybe you could include something about changing fan speed?

+1
Yeah, that was a bit unclear. Thanks for pointing it out, added a little clarification.
BluRR33
Member
+27|6358|Sweden
Well done! Great work!
nukchebi0
Пушкин, наше всё
+387|6295|New Haven, CT
Do you want me to send you pictures with a real graphics card?

(You should also mention the config file edit needed to work this with 8800GTs and 8800GTS 512 MB.
Freezer7Pro
I don't come here a lot anymore.
+1,447|6168|Winland

nukchebi0 wrote:

Do you want me to send you pictures with a real graphics card?

(You should also mention the config file edit needed to work this with 8800GTs and 8800GTS 512 MB.
These pictures are for illustration purposes only, not tech showoff. These pictures will do fine. EDIT: No shader clocks, though, but it'll do. They're usually linked anyhow.

And I don't know about any config tweak. As I said, I don't have much experience with the 8 series cards. Do tell, and I'll add it.

Last edited by Freezer7Pro (2008-03-01 17:53:24)

Little BaBy JESUS
m8
+394|6119|'straya

Freezer7Pro wrote:

nukchebi0 wrote:

Do you want me to send you pictures with a real graphics card?

(You should also mention the config file edit needed to work this with 8800GTs and 8800GTS 512 MB.
These pictures are for illustration purposes only, not tech showoff. These pictures will do fine. EDIT: No shader clocks, though, but it'll do. They're usually linked anyhow.

And I don't know about any config tweak. As I said, I don't have much experience with the 8 series cards. Do tell, and I'll add it.
Then again, I didn't link my clocks because I found I could push my shader a lot further than my core...
Little BaBy JESUS
m8
+394|6119|'straya
Well, this is what an 8 series card looks like in the overclocking section of RivaTuner.

It will link your shader and core clocks, but I suggest doing them separately, as I have found you can overclock your shader quite a bit more than the core...


Also, is your 6600 actually 500MHz effective? Or is it 1000MHz effective like my 8600 is?
https://i192.photobucket.com/albums/z43/howdoulikemen0w/card.jpg

Last edited by Little BaBy JESUS (2008-03-01 18:28:26)

ghettoperson
Member
+1,943|6620

It might be worth mentioning that you're being rather conservative with your temperatures. Most GPUs from what I've seen will usually idle at 65 or so with a stock cooler, and can easily get up to 90-100 at load with no ill effect. I'm not saying you should push it that high, but if you're seeing temps well over 70 at load with stock clocks, it isn't too much to worry about.
Little BaBy JESUS
m8
+394|6119|'straya

ghettoperson wrote:

It might be worth mentioning that you're being rather conservative with you temperatures. Most GPU's from what I've seen will usually idle at 65 or so with a stock cooler, and can easily get up to 90-100 at load, with no ill effect. I'm not saying that you should push it that high, but if you're seeing temps well over 70 at load with stock clocks it isn't too much to worry about.
Yeah, agreed.

Anything under 90 with stock clocks should be fine.
Flaming_Maniac
prince of insufficient light
+2,490|6678|67.222.138.85
sticky

Really excellent post Freezer, good work.
Freezer7Pro
I don't come here a lot anymore.
+1,447|6168|Winland

Little BaBy JESUS wrote:

Well this is wat a 8 series card looks like in the overclocking section of RivaTuner.

It will link your shader and core clocks but i suggest doing them sperately as i have found you can overclock your shader quite a bit more than core...


Also is your 6600 actually 500Mhz effective? or is it 1000 Mhz effective like my 8660 is?
http://i192.photobucket.com/albums/z43/ … w/card.jpg
Yeah, I should add that picture at the end with a little note.

And the 6600 is 500MHz effective. It's DDR.

ghettoperson wrote:

It might be worth mentioning that you're being rather conservative with you temperatures. Most GPU's from what I've seen will usually idle at 65 or so with a stock cooler, and can easily get up to 90-100 at load, with no ill effect. I'm not saying that you should push it that high, but if you're seeing temps well over 70 at load with stock clocks it isn't too much to worry about.
Yes, I know I'm on the low end here, and I know that the newer HD/8 series cards can take really high temps. But the 7 series and X series won't fare as well at those temps, and I don't want to be responsible for tipping people to run at sky-high temps and frying their cards.

Flaming_Maniac wrote:

sticky

Really excellent post Freezer, good work.
xGj
Official lame Crysis fanboy.
+84|6342|Netherlands tbh
Good work Freezer, +1 to you
Btw, I thought I might contribute to the thread; here's how to set up RivaTuner so it monitors your temperature, rather than using SpeedFan etc.:

1. Open up RivaTuner and select 'Hardware monitoring' as shown in the following image:

https://i27.tinypic.com/2efjjih.jpg

2. Click yes on this eventual pop-up:

https://i32.tinypic.com/25p34fl.jpg

3. You will now see the 'Hardware monitoring' screen, showing all kinds of green graphs. If you see fewer graphs than in my picture, stretch the window:

https://i25.tinypic.com/54bxn9.jpg

4. Click the red 'record' button on the bottom left:

https://i31.tinypic.com/11sz1v8.jpg

5. Right click the 'Core temperature' grid, and select 'Setup' as shown here:

https://i29.tinypic.com/2jed5sn.jpg

6. Under 'Tray icon settings', select the options you prefer. I like to show the temperature of my graphics card in the tray icon, in the tooltip, and in a bar chart rather than in plain text. But choose whatever you like:

https://i25.tinypic.com/2ypbrsl.jpg

7. An optional, but really useful, step. Close the 'Setup' and 'Hardware monitoring' windows, and open up the main RivaTuner screen again. Select the 'Settings' tab, and under the user interface preferences, tick the box 'Send to tray on close'. This means if you close RivaTuner, it will minimize to your tray while it keeps showing your temps. Very useful, because you close the tiny RivaTuner window a lot (well, I do).

https://i31.tinypic.com/21b3sif.jpg

Freezer, feel free to add this to your post, thought I'd just contribute this so it's easier for people to manage their OC settings

Last edited by xGj (2008-03-02 03:44:52)

ghettoperson
Member
+1,943|6620

Freezer7Pro wrote:

ghettoperson wrote:

It might be worth mentioning that you're being rather conservative with you temperatures. Most GPU's from what I've seen will usually idle at 65 or so with a stock cooler, and can easily get up to 90-100 at load, with no ill effect. I'm not saying that you should push it that high, but if you're seeing temps well over 70 at load with stock clocks it isn't too much to worry about.
Yes, I know I'm in the lower figures here, and I know that the never HD/8 series cards can take reall high temps. But the 7 series and X series won't do as fine with those temps, and I don't want to be responsible for tipping people to run at sky-high temps and frying their cards.
Nah, I think temps can just be quite varied; I know my 6600 used to get up to the high 80s/low 90s at load.
Brasso
member
+1,549|6601

Little BaBy JESUS wrote:

It will link your shader and core clocks but i suggest doing them sperately as i have found you can overclock your shader quite a bit more than core...
Seriously? I always linked mine; I should try unlinking them...
Sydney
2λчиэλ
+783|6814|Reykjavík, Iceland.
OC'ing my 8800GTX has been nothing but trouble with stock cooling, and I still haven't found any aftermarket GPU coolers for that card here in Iceland...
Brasso
member
+1,549|6601

PBAsydney wrote:

OC'ing my 8800GTX has been nothing but trouble with stock cooling, and I still haven't found any aftermarket GPU coolers for that card here in Iceland...
I'm sure any Zalman will work, no?
Sydney
2λчиэλ
+783|6814|Reykjavík, Iceland.

haffeysucks wrote:

PBAsydney wrote:

OC'ing my 8800GTX has been nothing but trouble with stock cooling, and I still haven't found any aftermarket GPU coolers for that card here in Iceland...
I'm sure any Zalman will work, no?
Did a bit of research: no, they won't. However, the Thermaltake CL-W0153 is made for the 8800GTX, but it's not available here, and if I have to order a cooler from the internet, it's not worth OC'ing my card at all, IMO.
Little BaBy JESUS
m8
+394|6119|'straya
Just thought I might as well mention that you can make links in the launcher section to make it easier/quicker to get to stuff...

I dunno if other people do it, but that's what I did... just chose the ones I use most... easy to access.

https://i192.photobucket.com/albums/z43/howdoulikemen0w/ask.jpg

Last edited by Little BaBy JESUS (2008-03-03 02:10:43)

RDMC
Enemy Wheelbarrow Spotted..!!
+736|6536|Area 51
Just a question on the fans: it's set to 65% for performance 3D apps; is it safe to change this to, for example, 80%? Or even 100%?

EDIT: Meanwhile I changed my clock speeds from 560/700 to 602/740 and I scored 3141 3DMarks. Used to be 2950 or something, so that's like what, a 7% increase?

I've now increased it to 641/740, and my GPU reached a temp of 62 degrees Celsius while doing 3DMark; that is a normal temperature, right? Oooh, and I now got 3243 3DMarks.
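For what it's worth, the exact percentage on that score jump is easy to check (quick Python aside, nothing more):

```python
# 2950 -> 3141 3DMarks after the first clock bump
old, new = 2950, 3141
print(round((new - old) / old * 100, 1))  # 6.5
```

So a bit under 7%, but close enough.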

Last edited by RDMC (2008-03-03 03:52:07)

Little BaBy JESUS
m8
+394|6119|'straya

RDMC wrote:

Just a question on the fans, its set to 65% for performance 3D apps, is it safe to change this to example 80%? Or even 100%?

EDIT: Meanwhile I changed my clock speeds from 560/700 to 602/740 and I scored 3141 3DMarks. Used to be 2950 or something, so that like what, a 7% increase?

I've now increased it to 641/740. And my GPU while doing 3DMark reached a temp of 62 Degrees Celsius, that is a normal temperature right? Oooh and I now got 3243 3D Marks
First of all, what is your card? That temperature is fine for any new card, so no problems there...

Your fan will be fine at 100%; the only thing you will notice is the sound of your little fan working its heart out.

IMO, depending on your card, I'd say you could easily take it another 20-50MHz, but it's up to you.

Last edited by Little BaBy JESUS (2008-03-03 03:58:46)

RDMC
Enemy Wheelbarrow Spotted..!!
+736|6536|Area 51

Little BaBy JESUS wrote:

RDMC wrote:

Just a question on the fans, its set to 65% for performance 3D apps, is it safe to change this to example 80%? Or even 100%?

EDIT: Meanwhile I changed my clock speeds from 560/700 to 602/740 and I scored 3141 3DMarks. Used to be 2950 or something, so that like what, a 7% increase?

I've now increased it to 641/740. And my GPU while doing 3DMark reached a temp of 62 Degrees Celsius, that is a normal temperature right? Oooh and I now got 3243 3D Marks
First of all wat is ur card? that temperature is fine for any new cards so no problems there...

ur fan will be fine at 100% the only thing u will notice is the sound of ur little fan working its heart out

imo depending on ur card id say u could easily take it another 20-50 Mhz but its up to you.
It's a 7600GT 256MB. I now have it at 662/740, still no artifacts, and still running at 62 degrees under load.

Well, it's now at 672/740, and I'm done for now; it crashed at 680 and the temperature is still at 62 degrees.
Final score: https://img215.imageshack.us/img215/3104/3dmarkscore03032008an8.png

Last edited by RDMC (2008-03-03 04:40:20)
