

All times are UTC - 4 hours [ DST ]




Post new topic Reply to topic  [ 4 posts ] 

Will this get you to choose NVIDIA over AMD???
yes 30%  30%  [ 3 ]
no 40%  40%  [ 4 ]
maybe 30%  30%  [ 3 ]
Total votes : 10
 Post subject: NVIDIA Challenging AMD’s Image Quality !!!!
PostPosted: Sat Nov 20, 2010 9:07 pm 

Joined: Wed Mar 03, 2010 5:36 pm
Posts: 182
Location: Whites Lake, Nova Scotia
Ok, I found this over at Hardwarecanucks: NVIDIA is claiming that AMD is increasing performance but is lowering image quality to achieve it!!!!

"Sit back and grab the popcorn boys, this is likely to get interesting. NVIDIA wrote a blog post this morning, reporting the findings of multiple independent third party review websites that claim changes in AMD’s Catalyst 10.10 default driver settings cause an increase in performance at the expense of decreased image quality.

While individual games and benchmarking suites have their own image quality settings, graphics cards, via their drivers, set standard optimizations and filters that allow consumers to get the best possible visual experience with a modest cost to frame rates. For the most part, it has been widely accepted that both AMD and NVIDIA have similar default IQ (image quality) settings, thus resulting in a relatively identical picture.

NVIDIA alleges that with AMD’s changes in the Catalyst 10.10 drivers, benchmarks between the rival’s cards are no longer an apples-to-apples comparison. AMD has supposedly lowered the standard driver IQ settings in order to increase the frame rates – in some cases providing up to a 10% performance advantage – but it is delivering an overall lower quality gaming experience.

Now, it isn’t locked to this setting so users can obtain a higher and more similar image quality to NVIDIA and previous Catalyst drivers if they set the Catalyst settings to “High”. This however obviously comes at a cost to frame rates, which NVIDIA says is more indicative of the cards’ actual performance. Given this, NVIDIA asks that consumers take the image quality into account when comparing video card frame rates from the two companies, and encourages testers to use the “High” IQ Catalyst settings when comparing video cards.

It does seem like NVIDIA has a case here. Ultimately, for the consumer, games are more than just a number: they are all about the quality that can be achieved while still maintaining a satisfactory level of frames. As NVIDIA so aptly put it in their post, “if [image quality] were not important, we’d all be playing at 10×7 with no AA!”

The NVIDIA blog post goes into plenty more detail so you can evaluate the evidence for yourself; and if you get a chance, even test it out and let us know your findings.

http://blogs.nvidia.com/ntersect/2010/1 ... ality.html"

What do ya think? Will this make you change to NVIDIA?

_________________
ASUS Maximus IV Gene-Z , Intel i5 2500K O/C 4.2 ghz. Sapphire AMD Radeon HD 7950 Vapor-X. 8 gigs ddr3-1600 ram, 500 gig HD. ANTEC Gamer 900 tower. Windows 7 Pro. 610 silencer PSU.


 Post subject: Re: NVIDIA Challenging AMD’s Image Quality !!!!
PostPosted: Sun Nov 21, 2010 10:49 am 

Joined: Mon Dec 24, 2007 7:52 am
Posts: 2711
Location: HMCS Athabaskan
If NVIDIA makes a better product that uses less power, produces less heat, and can run three monitors off one card, that may get me to switch off AMD.

Typical NV blah blah, trying to steal any press from the upcoming Cypress launch.

Apparently this is a response from an AMD Australia guy:
Quote:
ATI/AMD FP16 Demotion Response
AMD Australia Technical Manager Garrath Johnson

Thank you for contacting us about statements made by our competitor regarding disabling Catalyst AI for proper performance testing. It has been brought to our attention that NVIDIA has accused us of cheating, claiming that we have optimized some aspects of Catalyst to improve our benchmark scores. Given that these same claims at one point referenced a specific member of the media as the source of this information, and that same member of the media has denied ever suggesting that we admitted to optimizing drivers for the sake of benchmark performance, the NVIDIA claims would appear to lack any substance. Nonetheless, we’d like to confront this issue head-on and provide you with the facts for you to decide.

The alleged “optimization” is the selective use of the HDR format R11G11B10 at times when the memory cost of the FP16 HDR format would otherwise impact game play. Given that in their own documents, NVIDIA indicates that the R11G11B10 format “offers the same dynamic range as FP16 but at half the storage” it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative. Additionally, we rigorously test image quality before implementing this format change and only apply it if the difference is imperceptible.

The R11G11B10 format is similar to the HDR format used in the XBOX 360, the game platform for which many titles mentioned in a list provided by NVIDIA were initially designed. In many cases, the use of the HDR format with the smaller storage requirement provided the performance headroom to implement other IQ features such as anti-aliasing, in fact improving the overall appearance of the game.

With respect to the list of titles highlighted by our competitor as proof of our apparent “cheat”, the titles themselves and the time frame these titles span serve as a good indication that this was not done for “benchmarking purposes”. Also, as mentioned above, it is selectively used on a game by game basis. For example, this optimization was enabled for Elder Scrolls IV: Oblivion yet it was not something that was enabled in Fallout 3 which featured the same engine.

To conclude, I would encourage you to think twice before disabling Catalyst AI. Doing so will disable numerous other features which would make such a comparison meaningless. In this situation you would have to also disable all NVIDIA optimizations in their driver, something that is made even more difficult by the fact that they do not provide an option to do so in their control panel.

Below are a few questions you might be asking in light of what has been presented to you.

Atomic) Is FP16 format replacement with R11G11B10 format activated in every gaming title?
AMD) No. It is enabled in a few specific DirectX 9 titles, as indicated in NVIDIA’s list.

Atomic) What are the requirements for enabling the R11G11B10 format?
AMD) The requirements are that we verify with QE testing that no visible image quality changes occur when it is enabled. In general we do not apply this optimization to DirectX 10 or DirectX 11 titles, since the R11G11B10 format is available to developers in those APIs and titles can therefore already take advantage of it where appropriate. The format is not exposed to developers in DirectX 9, so we do the legwork for them. This is similar to the legwork we sometimes do to enable MSAA in game titles that don’t otherwise support it. In this case we also do the legwork for our end users, enabling additional optimizations such as this one that may not be available to developers through the API.

In the end, if an end user does see any visual quality differences, they always have the option to disable the optimization using Catalyst AI. This is what that option in the control panel is there for. However, we don’t see why anyone would ever want or need to disable this optimization in the select number of titles in which it has been applied given the benefits they receive from its presence.

Atomic) Does this format replacement reduce graphical workloads in applications at the expense of image quality?
AMD) No. In the select cases where it has been enabled, there is no apparent reduction in image quality for the end user. If there is sufficient precision in the R11G11B10 surface for the level of HDR that the game is using, then there is no benefit to anyone in using the higher precision surface. This is merely an unwarranted cost in terms of memory usage and bandwidth. Conversely, some games use much wider HDR gamuts, exceeding the precision available in the R11G11B10 format. In these cases the optimization is not applicable and is therefore not applied.

Atomic) Does AMD support the use of NVIDIA’s “AMDDemotionHack” in benchmark testing?
AMD) If this is something that will be made available to their end users then, yes. Otherwise this will be a pure benchmarking effort and no end user will benefit from information gathered from these tests. Also, confirming that there is no visible effect on image quality would be vital to confirming the validity of these results.
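To put rough numbers on the “half the storage” claim from the quote above, here's a quick back-of-the-envelope sketch (my own illustration, not code from either vendor), assuming a single 1920×1080 render target:

```python
# FP16 vs R11G11B10 storage for one 1080p HDR render target.
WIDTH, HEIGHT = 1920, 1080

# FP16 ("half float") HDR: four channels (RGBA), 16 bits = 2 bytes each.
fp16_bytes = WIDTH * HEIGHT * 4 * 2

# R11G11B10: three float channels packed into one 32-bit word
# (11 + 11 + 10 bits), no alpha channel.
r11g11b10_bytes = WIDTH * HEIGHT * 4

print(fp16_bytes // (1024 * 1024))       # -> 15 (about 15.8 MiB)
print(r11g11b10_bytes // (1024 * 1024))  # -> 7 (about 7.9 MiB)
```

So per render target the packed format does use exactly half the memory (and half the bandwidth), which is where the performance headroom comes from; the trade-off is the lost alpha channel and reduced mantissa precision in each colour channel.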

_________________
Gaming -Obsidian 800D-
ASUS Sabertooth 990FX R2.0 | FX 9590 | H70 | 32GB G.Skill DDR3 | 256GB Vertex 4 | 7770 | 2560x1080 UWS Asus MX299

Server -Jesusbox Tx Mozart-
ASUS M5A97 R2.0 | FX 8350 | H50 | 16GB | 1TB Velo +19TB | 5770


 Post subject: Re: NVIDIA Challenging AMD’s Image Quality !!!!
PostPosted: Mon Nov 22, 2010 10:40 am 

Joined: Thu Apr 26, 2007 10:34 am
Posts: 1117
Location: Eastern Passage
I think I would be more disappointed in ATI/AMD if they didn't include an option to set the IQ where it should be, but even still, I see this as a pretty sneaky move that I don't agree with. I wouldn't say it would stop me from buying an AMD video card, but it would make me double-check reviews to make sure they are comparing these cards against nVidia's with similar settings, to get a better indicator of performance.


 Post subject: Re: NVIDIA Challenging AMD’s Image Quality !!!!
PostPosted: Mon Nov 22, 2010 10:49 am 

Joined: Wed Jun 17, 2009 11:06 am
Posts: 1860
Location: Halifax ( From Newfoundland)
Nvidia is helping ATI by slamming them - ain't no such thing as bad publicity.

Also, they're only pissed because they didn't think of it. Because if they could have, they WOULD have, and not said anything at all - just to get that higher bench.

Also, I've been saying for years that most benchmarks have unfairly favored Nvidia due to PhysX. So now that ATI can step it up, they bitch?

I have bought and owned both brands. I go with price vs. performance, with performance taking into account frame rate, power draw, etc. As I have stated before, in this day and age the amount of power Nvidia cards draw is nothing short of ridiculous. In some cases you get a better frame rate, but it's not worth it to me when the card is so inefficient. That's a failure no matter how ya slice it. I hope they do better, along with AMD/ATI.


Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group