After the recent 3DMark, ShaderMark, etc. benchmark scandals and [H]ardOCP's Benchmarking Right article, it's time for the next step: now that we know that current synthetic benchmark results are 'rippled' by vendor-specific optimizations in video drivers, where are we heading with interpretations of the results we see? Can we be objective with these results? Only to a very small degree, as these applications merely stress-test specific functions of the hardware. Kyle Bennett has his second editorial up and running. Worth the read:

"The current synthetic benchmarks are useless in terms of delivering a measurement metric. To put together a utility that would truly be forward-looking, we would need to involve key game developers, the ones writing the shaders in the games we will be playing one day. We have an idea that the gaming and hardware communities need to come together and form a not-for-profit organization to work on this issue as a team. Benchmarking for money is simply out of the question, as Futuremark has proven that business model to be full of holes, to say the least. Our communities need to self-regulate and decide how best to serve the needs of the consumers who ultimately pay all our bills."