Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: Ghastly on November 22, 2010, 01:05:44 PM
-
It'll be interesting to see how this plays out...
http://www.zdnet.com/blog/hardware/nvidia-accuses-amd-of-catalyst-driver-questionable-optimizations-to-improve-benchmark-results/10494?tag=nl.e539
<S>
-
At least it's not as blatant as this. (http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177-drivers)
-
I guess Nvidia is still smarting over the time AMD caught Nvidia pulling the same crap a couple of years ago. I am, though, kind of surprised that Nvidia is calling out AMD: as I mentioned, Nvidia got caught doing the same thing, and it is common knowledge (and one can verify by looking at the drivers themselves) that both companies routinely optimize their drivers to return highly inflated, artificial benchmark scores. Which is why using benchmarks like 3DMark is a joke; you'll never get true results with either AMD's or Nvidia's drivers.
ack-ack
-
Actually, this is worse than the linked Nvidia/PhysX one, because while that one just fudged numbers, this one impacts the user experience with lower-quality images.
However, that doesn't change the fact that both of them do it all the time. My "favorite" was when they both let the GPUs get several frames ahead, buffering data so the CPUs could plow forward, helping benchmarks but adding major latency to the player experience.
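To put rough numbers on it: with a render-ahead queue, every buffered frame adds about one frame-time of input-to-display delay. A back-of-the-envelope sketch in Python (the queue depths and the 60 fps figure are illustrative, not measured values):

# Rough sketch: extra input-to-display latency from a render-ahead queue.
# Queue depths and the 60 fps figure are illustrative, not measured values.

def added_latency_ms(queue_depth, fps):
    """Worst-case extra delay: one frame-time per buffered frame."""
    return queue_depth * 1000.0 / fps

for depth in (0, 1, 3):
    print(f"{depth} buffered frames at 60 fps -> "
          f"~{added_latency_ms(depth, 60):.0f} ms added latency")

So a three-frame queue at 60 fps is roughly 50 ms of extra lag, which benchmarks never show but players feel.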
-
Well of course, if you have a long memory, you'll also remember...
http://www.geek.com/articles/games/nvidia-still-cheating-even-with-latest-3dmark-build-2003069
And even:
http://www.hardocp.com/article/2001/10/23/optimizing_or_cheating_radeon_8500_drivers
If it results in greater frame rates for you, it's optimizing. If it results in greater frame rates at the cost of image quality for them, it's cheating.
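Neither company documents how its detection works, but the pattern both have been accused of amounts to keying driver behavior off the name of the running executable; that's essentially what the Quake3/Quack renaming test in the HardOCP link exposed. A hypothetical Python sketch of the idea (the profile names and settings are invented for illustration):

# Hypothetical illustration of executable-name detection in a driver's
# settings layer. Profile names and settings are invented; neither vendor
# documents how its real detection works.

BENCHMARK_PROFILES = {
    "3dmark03.exe": {"texture_filtering": "low", "use_static_clip_planes": True},
}
DEFAULTS = {"texture_filtering": "high", "use_static_clip_planes": False}

def settings_for(exe_name):
    """Return quietly degraded settings when a known benchmark is running."""
    return {**DEFAULTS, **BENCHMARK_PROFILES.get(exe_name.lower(), {})}

print(settings_for("quake3.exe"))    # stock quality settings
print(settings_for("3DMark03.exe"))  # quietly degraded for a better score

Rename the benchmark binary and the "optimization" vanishes, which is exactly how these things keep getting caught.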
-
As if Radeon users are incapable of shifting the performance slider above "default" if they find the image quality unsatisfying. (AMD only lowered the default IQ setting; the higher IQ settings are still there, available to the user via the CCC.) And I've not heard of even ONE HD6870/HD6850 user who's complained of "worsened image quality."
-
see what i mean 'bout nV grasping at straws?
It is true that AMD has "degraded" the default setting in the control panel, seemingly to match nVidia's default setting. If the settings are now matched, nVidia has no grounds for the claims it is making.
So, are the 10.10 Catalyst IQ default settings equal (in terms of image outcome) to the Forceware ones?
According to guru3d they are equal.
Ergo, ATI lowered the default settings in order to match nVidia's.
/source (http://semiaccurate.com/forums/showpost.php?p=84107&postcount=5)
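For what it's worth, IQ comparisons like guru3d's ultimately come down to diffing screenshots captured at each vendor's defaults. A minimal sketch of one objective way to check, assuming two same-size captures of the same scene and Pillow/NumPy installed (the filenames are placeholders):

# Minimal sketch of an objective image-quality comparison between two
# screenshots taken at each vendor's default settings. Assumes Pillow and
# NumPy are installed; the filenames below are placeholders.
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    """Peak signal-to-noise ratio; higher means the images are closer."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # pixel-identical captures
    return 10 * np.log10(255.0 ** 2 / mse)

print(f"PSNR: {psnr('catalyst_default.png', 'forceware_default.png'):.1f} dB")

If the two defaults really produce equal output, a diff like this comes back at or near identical, and the whole argument is over nothing.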