Further NVIDIA optimizations for 3DMark03?
Where can the user get these drivers???
This topic was started by Degger*.
If you're old like me, you may remember the day when ATI was caught optimizing its Radeon 8500 drivers for Quake III Arena timedemos. The trick was simple: ATI's drivers looked for an executable named "quake3.exe" and turned down texture quality when "quake3.exe" started. Kyle Bennett at HardOCP renamed "quake3.exe" to "quack3.exe" and ran some benchmarks. ATI was busted.
In a funny twist of fate, I got a tip earlier this week about NVIDIA's Detonator FX drivers. The allegation: if you rename 3DMark03.exe to something else and run the benchmark with anisotropic filtering enabled in the drivers, test scores drop. In other words, NVIDIA appears to be using the same lame technique ATI did way back when: keying on the program's filename in order to trigger benchmark "optimizations." In this case, those optimizations appear to be a lower quality form of texture filtering than the anisotropic filtering method selected in the driver control panel. Many review sites like us benchmark cards with anisotropic filtering and edge antialiasing turned on, so these things do matter.
Read the whole story over @ http://tech-report.com/etc/2003q2/3dmurk03/index.x?pg=1
Responses to this topic
Embarrassing. Not just for NV but for ATI and all the other cheats as well. Damn, you can't trust anything anymore. Gotta wipe 3DMark off the hdd. :oops:
No. Change the filenames. Hell, you could rename the file to "bhuauigkojas.exe" and THEN see your scores plummet...
Optimizing by reducing quality gives a false benchmark result. It may only be 2%, but you are still cheating people. Plus, 2% might not matter today, but it will once the cards start to age.
I still think basing my opinion on the scores of several real game tests is far superior to placing all my faith in one single synthetic test app that is supposed to represent future games in the opinion of the people at Futuremark. Even if they believe in it, they are human, and they don't have to be right.

The fact remains that the Doom 3-like game test works nothing like the Doom 3 engine, so who's right: the upcoming game, or the synthetic testing application's estimate? From what I gather, they stress different parts of the graphics card, and the 3DMark game test is said to be rendered in a very inefficient way.

From what I understand of the gaming world, game devs and GPU makers work together and optimize the game code and drivers so they work together in the best way possible. So why isn't this the way in 3DMark? Maybe there should have been separate code paths for different cards from the beginning, since that seems to be the way the industry works.

I feel a bit like 3DMark, '03 in particular, is living in a world of its own, cut off from the real world. The GeForce FX 5200, for instance, does well in 3DMark03, but in real games it is beaten by the GeForce4 MX at times. Sure, it supports DX9, but it can't power it, so the higher 3DMark score says it's a good card, yet real-world games will run better on a GeForce4 Ti 4200 or sometimes even an MX.

I'm not defending anyone here: optimizing a game/app in a way that has a negative impact on the visuals is bad, unless it's a selectable option so you can choose to sacrifice some eye candy for a higher frame rate. But not in a benchmark.
That is because 3DMark tests your GRAPHICS card only (or 90% of the test is based on it).
The GeForce4 MX beats the FX 5200 only because there are only DX8 games available today; when DX9 games come out, the GeForce4 MX will be crushed.
I still believe 3DMark should only give scores at default settings, so all cards run the same tests and you can really see what they do.
The GF4 MX won't run DX9 games at all if the DX9 functions aren't optional. And the FX 5200 can hardly handle today's games; though it supports the DX9 feature set, you probably wouldn't get decent frame rates at 640x480 with it...
Besides, the GF4 MX doesn't even support DX8, since it lacks shaders... it's basically a GF2.
As I said... "CRUSHED". :P