Further NVIDIA optimizations for 3DMark03?

Where can the user get these drivers???

This topic was started by



621 Posts
Location -
Joined 2001-08-26
If you're old like me, you may remember the day when ATI was caught optimizing its Radeon 8500 drivers for Quake III Arena timedemos. The trick was simple: ATI's drivers looked for an executable named "quake3.exe" and turned down texture quality when "quake3.exe" started. Kyle Bennett at HardOCP renamed "quake3.exe" to "quack3.exe" and ran some benchmarks. ATI was busted.

In a funny twist of fate, I got a tip earlier this week about NVIDIA's Detonator FX drivers. The allegation: if you rename 3DMark03.exe to something else and run the benchmark with anisotropic filtering enabled in the drivers, test scores drop. In other words, NVIDIA appears to be using the same lame technique ATI did way back when: keying on the program's filename in order to trigger benchmark "optimizations." In this case, those optimizations appear to be a lower-quality form of texture filtering than the anisotropic filtering method selected in the driver control panel. Many review sites, ours included, benchmark cards with anisotropic filtering and edge antialiasing turned on, so these things do matter.
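To make the allegation concrete, here is a minimal sketch, in C for Win32, of what keying on a filename looks like in general. To be clear, this is an illustration of the technique, not NVIDIA's actual driver code (which is closed); the "3DMark03.exe" match is the only detail taken from the story, and every identifier here is made up.

#include <windows.h>
#include <string.h>

/* Hypothetical user-mode driver helper: decide whether the process
 * that loaded us is a known benchmark, by filename alone. */
static int is_known_benchmark(void)
{
    char path[MAX_PATH];

    /* Full path of the current process's executable. */
    GetModuleFileNameA(NULL, path, sizeof(path));

    /* Strip the directory part, keeping just "3DMark03.exe" etc. */
    const char *exe = strrchr(path, '\\');
    exe = exe ? exe + 1 : path;

    /* Keying on the filename - exactly the match the rename test
     * defeats: rename the .exe and this check silently fails. */
    return _stricmp(exe, "3DMark03.exe") == 0;
}

/* Elsewhere, a driver could use this flag to substitute a cheaper
 * filtering mode than the one selected in the control panel. */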

Read the whole story over @ http://tech-report.com/etc/2003q2/3dmurk03/index.x?pg=1


Responses to this topic



14 Posts
Location -
Joined 2003-06-03
Embarrassing. Not just for NV, but for ATI and all other cheats as well. Damn, you can't trust anything anymore. Gotta wipe 3DMark off the HDD. :oops:


614 Posts
Location -
Joined 2003-03-21
No. Change the filename. Hell, you could rename the file to "bhuauigkojas.exe" and THEN see your scores plummet...
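For anyone who wants to reproduce the rename test, here is a rough sketch of the idea in C using plain Win32 calls. The install path is an assumption, so adjust it for your system; the random name is just the one from the post above.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Assumed install location - change to match your machine. */
    const char *orig    = "C:\\Program Files\\3DMark03\\3DMark03.exe";
    const char *renamed = "C:\\Program Files\\3DMark03\\bhuauigkojas.exe";

    /* Copy rather than rename, so the original stays intact. */
    if (!CopyFileA(orig, renamed, FALSE)) {
        fprintf(stderr, "copy failed: %lu\n", GetLastError());
        return 1;
    }

    /* Launch the renamed copy; run the same tests and compare the
     * score against a run of the original executable. */
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    if (!CreateProcessA(renamed, NULL, NULL, NULL, FALSE, 0,
                        NULL, NULL, &si, &pi)) {
        fprintf(stderr, "launch failed: %lu\n", GetLastError());
        return 1;
    }
    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}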


20 Posts
Location -
Joined 2003-06-01
Dudes, I don't know what to tell you. If the program gets higher scores, that's better performance, but nobody said anything about quality. If the quality stays the same or drops by 2%, I really don't care. And don't forget, that cheating is called "optimization" :]


614 Posts
Location -
Joined 2003-03-21
Optimization by reducing quality gives a false benchmark result - it may only be 2%, but you are still cheating people. Plus, 2% might not matter today, but it will once the cards start to age.


13 Posts
Location -
Joined 2003-06-06
I still think basing my opinion on the scores of several real game tests is far superior to placing all my faith in one single synthetic test app that is supposed to represent future games in the opinion of the people at Futuremark. Even if they believe in it, they are human, and they don't have to be right.

The fact remains that the Doom 3-like game test works nothing like the Doom 3 engine, so who's right: the upcoming game, or the synthetic testing application's estimate? From what I gather, they stress different parts of the graphics card, and the 3DMark game test is said to be rendered in a very inefficient way.

From what I understand of the gaming world, game devs and GPU makers work together and optimize the game code and drivers so they work together in the best way possible. So why isn't this the way in 3DMark? Maybe there should have been separate codepaths for different cards from the beginning, since that seems to be the way the industry works.
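For what it's worth, here is a hedged sketch of what a per-vendor codepath can look like under Direct3D 9: query the adapter's PCI vendor ID and branch on it. The vendor IDs are the well-known PCI assignments; the path names and the surrounding structure are made up for this example (real engines key on capabilities too, not just vendor).

#include <windows.h>
#include <d3d9.h>
#include <stdio.h>

/* Illustrative render paths - names are hypothetical. */
enum render_path { PATH_GENERIC, PATH_NV30, PATH_R300 };

static enum render_path pick_path(IDirect3D9 *d3d)
{
    D3DADAPTER_IDENTIFIER9 id;

    if (FAILED(IDirect3D9_GetAdapterIdentifier(d3d, D3DADAPTER_DEFAULT,
                                               0, &id)))
        return PATH_GENERIC;

    switch (id.VendorId) {
    case 0x10DE: return PATH_NV30;    /* NVIDIA */
    case 0x1002: return PATH_R300;    /* ATI */
    default:     return PATH_GENERIC; /* anything else */
    }
}

int main(void)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;
    printf("selected render path: %d\n", pick_path(d3d));
    IDirect3D9_Release(d3d);
    return 0;
}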

I feel a bit like 3DMark, 03 in particular, is living in a world of its own, cut off from the real world. The GeForce FX 5200, for instance, does well in 3DMark03, but in real games it's beaten by the GeForce4 MX at times. Sure, it supports DX9, but it can't power it, so the higher 3DMark score says it's a good card, while real-world games will run better on a GeForce4 Ti 4200 or sometimes even an MX.

I'm not defending anyone here; optimizing a game/app in a way that has a negative impact on the visuals is bad, unless it's a selectable option so you can choose to sacrifice some goodies for a higher frame rate. But not in a benchmark.


20 Posts
Location -
Joined 2003-06-01
Yeah, it's like when I put my GF3 in my K6-2 machine and ran 3DMark: I got 1400 points, but when I tried to run a game like Half-Life it ran very slowly because my CPU sucked. Before that I had a Riva TNT2 in that computer, and the score was 400-550.


614 Posts
Location -
Joined 2003-03-21
That is because 3DMark tests your GRAPHICS card only (or 90% of the test is based on it).

The GeForce4 MX beats the FX 5200 only because there are only DX8 games available today - when DX9 games come out, the GeForce4 MX will be crushed.

I still believe 3DMark should only give scores at default settings, to give all cards the same tests and really see what they do.


13 Posts
Location -
Joined 2003-06-06
Quoting the post above:

"That is because 3DMark tests your GRAPHICS card only (or 90% of the test is based on it). The GeForce4 MX beats the FX 5200 only because there are only DX8 games available today - when DX9 games come out, the GeForce4 MX will be crushed. I still believe 3DMark should only give scores at default settings, to give all cards the same tests and really see what they do."


The GF4 MX won't run DX9 games at all if the DX9 functions aren't optional. The FX 5200 can hardly handle today's games, and though it supports the DX9 features, you would probably not get decent framerates at 640x480 with it... Besides, the GF4 MX doesn't even support DX8, since it lacks shaders... it's basically a GF2.


363 Posts
Location -
Joined 2002-04-11
Is there any game using the Cg language except Morrowind? The technology seems to be moving too fast for developers... isn't it?!


13 Posts
Location -
Joined 2003-06-06
Cg in Morrowind? Are you sure? I thought Cg came much later...