Recently, after switching from an 800x600@120Hz CRT to a 1920x1080@120Hz LCD, I noticed that Quake Live doesn't feel as smooth on the LCD. To be more precise: while the reported fps stayed constantly at 125, it felt like something between 90 and 120 fps, with quite annoying "microstuttering", even though I am not running an SLI/CF setup (just a single-GPU Sapphire HD 6850). I could reproduce a similar effect in other games like Warsow.
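To illustrate what I mean by "feels like 90-120 fps at a reported 125": the fps counter is an average, so it can hide uneven frame delivery. Here is a rough sketch (Python, with made-up frame times rather than measured data) of how the same average can come from very different pacing:

```python
# Sketch: same average fps, very different pacing (illustrative numbers, not measurements).
even   = [8.0] * 125                          # 125 frames at 8 ms each -> smooth 125 fps
jitter = [6.0, 6.0, 6.0, 14.0] * 31 + [8.0]   # occasional 14 ms hiccup, shorter frames "catch up"

def report(frametimes_ms):
    total = sum(frametimes_ms)
    avg_fps = 1000.0 * len(frametimes_ms) / total
    return avg_fps, max(frametimes_ms)

for name, ft in (("even", even), ("jitter", jitter)):
    fps, worst = report(ft)
    print(f"{name}: avg {fps:.0f} fps, worst frame {worst:.1f} ms")
```

Both sequences report 125 fps on average, but the second one has 14 ms spikes, which is exactly what microstutter looks like on a frame-time graph.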

At first I thought the LCD was to blame, but when I tested QL at 800x600 and the old Q3 at 1920x1080, both ran without issues, so I started looking for other causes. After some investigation (carefully studying GPU load and core/memory clocks in Afterburner) I found the problem is indeed related to my VGA. I am pretty confident that this "microstuttering" occurs when the GPU core is frequently downclocked and upclocked again. While playing QL in full HD, the core frequency kept switching from 600 to 750 MHz and back, along with memory downclocking, since the game is not exactly among the most demanding.
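For anyone who wants to check their own card: I spotted this by eyeballing the Afterburner graphs, but something like the following sketch could flag the clock transitions automatically. It assumes the monitoring data has been exported to a CSV file with "time" and "core_clock" columns; the column names and file name are just my assumption, so adjust them to whatever your export actually produces:

```python
import csv

# Assumed CSV layout: one row per sample with "time" (seconds) and "core_clock" (MHz).
# Adjust the column names / filename to match your actual monitoring export.
def clock_transitions(path):
    transitions = []
    prev = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["time"])
            clk = float(row["core_clock"])
            if prev is not None and clk != prev:
                transitions.append((t, prev, clk))
            prev = clk
    return transitions

if __name__ == "__main__":
    for t, old, new in clock_transitions("gpu_log.csv"):
        print(f"{t:8.2f}s  core clock {old:.0f} -> {new:.0f} MHz")
```

If the timestamps of those transitions line up with the moments the game hitches, you are most likely seeing the same thing I am.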

After I increased the load by setting 8xFSAA, 1920x1080 ran smoothly, because the core stayed at 750 MHz most of the time (no significant drops in clock). On the other hand, the same setting made 800x600 "microstutterish": now the GPU could no longer handle the game entirely at the lower clocks and kept switching between low <-> high clocks.

Then I found another good way to fix it, by exploiting another VGA (driver) bug: while a YouTube video is open in the browser, the core stays fixed at 300 MHz, no matter whether the card is in 2D or 3D mode. So with a paused YouTube video and a constant 300 MHz (still constantly at 125 fps), the game feels smooth - much smoother than with the core frequency oscillating between 600 and 750 MHz. On the other hand, I think this behavior, and having to use such hacks to work around these bugs, is quite absurd...
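Why a fixed 300 MHz can feel better than oscillating 600-750 MHz: as long as every frame still fits in the 8 ms budget of com_maxfps 125, a constant clock gives perfectly even pacing, while each clock switch seems to add a hiccup. Here is a toy model of that idea (the per-frame costs and the transition penalty are assumptions for illustration, not measurements from my card):

```python
# Toy model: frame cost depends on core clock; a clock switch adds an assumed penalty.
CAP_MS = 8.0                              # frame budget at com_maxfps 125
COST = {300: 7.0, 600: 4.5, 750: 3.5}     # assumed render cost (ms) at each core clock
SWITCH_PENALTY_MS = 12.0                  # assumed hiccup when the clock changes

def frame_times(clocks):
    times, prev = [], clocks[0]
    for clk in clocks:
        cost = COST[clk]
        if clk != prev:
            cost += SWITCH_PENALTY_MS
        times.append(max(cost, CAP_MS))   # capped at 125 fps, never faster than 8 ms
        prev = clk
    return times

fixed = frame_times([300] * 250)
oscillating = frame_times(([750] * 40 + [600] * 10) * 5)

for name, ft in (("fixed 300 MHz", fixed), ("600<->750 MHz", oscillating)):
    avg_fps = 1000.0 * len(ft) / sum(ft)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(ft):.1f} ms")
```

In this model the pinned (slower) clock delivers a flat 8 ms every frame, while the oscillating (faster) clocks produce occasional long frames, which matches what I feel in-game.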

I just want to ask:

1.)
Are there other people who have experienced a similar problem (or are at least able to reproduce it)? Primarily with the HD 6850, but possibly with other cards as well.

I think that if you can see the difference between com_maxfps 90 and com_maxfps 125 (which is pretty obvious) and your setup is affected by the mentioned bug, you should be able to spot it without problems. Whether the bug actually shows up may vary depending on your visual settings (e.g. I was using vertex lighting and some post-processing).

Note that I am using Catalyst 10.11 with the 10.10e hotfix.


2.)
Is there some way to force the GPU core frequency to its maximum whenever the card is in 3D mode,
WITHOUT the automatic downclocking based on current GPU load?

I couldn't find anything. Could this be addressed by a future driver update?