New Evidence of Application Detection in nVidia Drivers

Sounds like a ridiculous report to me. I don't see why an app or DLL designed to measure shader performance would NOT want to identify what app/sub-app was running. Posted as much on EB... see what happens.
Sounds to me like the clowns at Beyond3D and EB need to get a girlfriend. They obviously have way too much time on their hands. It's sad, really. I feel sorry for them.
Indeed - How dare a tech hardware site post news about tech hardware!!! >(
A breath of sanity in an otherwise truly mindless first level of the thread. All the posts that appear under the story just made me sad that people could still be so bloody ignorant. :( (I haven't read all the sub-comments yet; I have a feeling there'll be a bit of biatch-slapping being done to these fanboys. ;) )
She seems to think the idea of me having a girlfriend would be bad for our children, but since YOU think it's a good idea.... :hmm:
What in the hell are all these nVidiots blathering on about? This is hard evidence that nVidia is still cheating using application detection, and in the case of 3DMark03 nVidia has STATED that they do not, so this is proof that they are lying. Granted, I don't think it's that big a deal either, since we've all known for quite some time how nVidia does their business, but it IS proof and it IS news. If you fanboys can't digest and accept it, I suggest y'all go back to Guru3D, Anand, and Tom's and go rah-rah yourselves into a fury. :roll:
'If most of you put your effort into something that actually MEANT something to the world, then maybe it wouldn't be in the crap-ass situation it is today.' Let us know when you collect your Nobel Peace Prize, so we can start listening to this kind of post rather than just ignoring it as the empty rhetoric to deflect from the issue that it is...
...I'm hoping my children will graduate from college and pursue the lives they want to; it's kind of what I spend the majority of my time on... I'm a full-time stay-at-home Dad to a 4- and 6-year-old. I kind of consider that "worthwhile"; my wife does... your mileage may vary. :lol:
Here comes the 'You're a naive nVidiot fanboy, nVidia are cheats and liars, and will poison you in your sleep' script; another damn rerun. Question: can the nVidia driver optimisations determine whether UT2004 or other games (with a built-in benchmark) are running as a benchmark or in a normal game? If so, I'd love to see Elitebastards give us the details of how, probably just as much as ATI's driver writers would love to be able to do game-based optimisations themselves. (Without affecting IQ, of course!) But you'll be comparing screenshots in both modes, of course... won't you? That emphasises why simulated benchmarks like 3DMark03 have to be dropped by video card reviewers: if the drivers can't tell what's running, there can be no confusion. Oh, and Digitalwonderer and Dakisha... get a room, chums. And what is 'biatch' anyhow? Cool new kiddie slang or just bad spelling? Non-fanboy disclaimer: Radeons have better IQ and PS2.0 support!!! And I wish I had one, mmmm.
Okay, let me sum it up then. Are they "optimizing" their drivers? I hope so. I couldn't care less about benchmarks anymore; the only ones who still care hardcore about benchmarks are the boys who love to post their specs in hardware forums. If that's how they get their kicks, so be it. I prefer to play games myself. I have a 5900FX, and it runs and looks great in games like UT2004, Far Cry, Raven Shield, Call of Duty, and Colin McRae. The games look great with my 5900. Am I getting 120FPS? Who knows; I don't. All I know is that it runs and looks great. Maybe I'm only getting 70-90FPS. If optimizing helps, great! I liked it when ATI optimized their drivers as well, back when I was using an 8500. And nobody cares about any 3DMark scores any longer; you don't play 3DMark. I'm all for ATI and nVidia optimizing their drivers.
Another piece of b.s. made by skels and liceheads. FX Composer lets you simulate running your shaders on NV3x and whatnot, and gives you a performance report. It can optimize for all FX-series chipsets or just the one you choose. You can write code optimized for nVidia hardware. Nothing wrong with that WHATSOEVER. ATI's RenderMonkey has routines that optimize exclusively for the R9700 chipset and others, too. I don't know who these people are; maybe just homeless bums.
Look at nVidia's 60.72 driver; you ATI people can only dream of such driver support. Get lost!
"SHOW us proof" the graphics are corrupted or clipped or messed up about you say anything about "cheat".