Ok, found this over at Hardwarecanucks: NVIDIA is claiming that AMD is increasing performance but lowering image quality to achieve it!!!!
"Sit back and grab the popcorn boys, this is likely to get interesting. NVIDIA wrote a blog post this morning, reporting the findings of multiple independent third party review websites that claim changes in AMD’s Catalyst 10.10 default driver settings cause an increase in performance at the expense of decreased image quality.
While individual games and benchmarking suites have their own image quality settings, graphics cards, via their drivers, apply standard optimizations and filters that allow consumers to get the best possible visual experience at a modest cost to frame rates. For the most part, it has been widely accepted that both AMD and NVIDIA use similar default IQ (image quality) settings, resulting in a relatively identical picture.
NVIDIA alleges that with AMD’s changes in the Catalyst 10.10 drivers, benchmarks between the rivals’ cards are no longer an apples-to-apples comparison. AMD has supposedly lowered the default driver IQ settings in order to increase frame rates – in some cases gaining up to a 10% performance advantage – but is delivering an overall lower-quality gaming experience.
Now, the drivers aren’t locked to this setting, so users can obtain image quality closer to NVIDIA’s and to previous Catalyst drivers by setting the Catalyst IQ setting to “High”. This obviously comes at a cost to frame rates, which NVIDIA says is more indicative of the cards’ actual performance. Given this, NVIDIA asks that consumers take image quality into account when comparing video card frame rates from the two companies, and encourages testers to use the “High” IQ Catalyst setting when comparing video cards.
It does seem like NVIDIA has a case here. Ultimately, for the consumer, games are more than just a number: they’re all about the quality that can be achieved while still maintaining a satisfactory level of frames. As NVIDIA so aptly put it in their post, “if [image quality] were not important, we’d all be playing at 10×7 with no AA!”
The NVIDIA blog post goes into plenty more detail so you can evaluate the evidence for yourself; and if you get a chance, test it out and let us know your findings.
http://blogs.nvidia.com/ntersect/2010/1 ... ality.html"
What do ya think? Will this make you switch to NVIDIA?