I just did some thinking...
32-bit colour... 4 billion colours... an untold amount of which we never see. DX9 has more colour precision, therefore more of those 4 billion get used. Radeon cards have 96-bit colour precision, FX cards have 128. More precise may also mean weirder colours, ones we never see in reality, get used... so maybe that's why Radeon cards are faster: fewer colours are used, because the colours used are NORMAL. FX cards: more colours are used, but those colours are WEIRD, so it gets slowed down AND it looks yucky.
I looked at a leaf... it reminded me of an NVIDIA demonstration of an FX card vs a GeForce4 Ti. The FX leaf looked weird; the Ti card had a more greeny green. Amazing how people think...
Lol! That's got to be one of the more *creative* explanations I've heard in a while!
[1] 32-bit color != 4 billion discrete colors.
32-bit color = 24-bit color + an 8-bit alpha channel, or some other such combination with alpha. E.g., on the Parhelia-512 it's 10-10-10-2 (10 bits for red, 10 bits for green, 10 bits for blue and 2 bits for alpha). In this case it's color depth I'm referring to; the same term can also be used for color precision. Mathematically, using all 32 bits to represent color would give a theoretical 4 billion possible combinations, but that's not how it works: the alpha bits encode transparency, not color, so the number of distinct colors is much smaller.
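If you want to see the arithmetic, here's a minimal sketch in plain Python. The bit depths follow the 8-8-8-8 and 10-10-10-2 formats described above, but the exact field order within the word and the function names are just my illustration, not any card's documented layout:

def pack_8888(r, g, b, a):
    # Pack 8-8-8-8 RGBA into one 32-bit word (field order is illustrative).
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_1010102(r, g, b, a):
    # Pack 10-10-10-2 (Parhelia-style depths) into one 32-bit word.
    return (a << 30) | (r << 20) | (g << 10) | b

# Distinct *colors* in each format -- alpha is transparency, not color:
print(2 ** 24)   # 8-8-8-8:    16,777,216 colors
print(2 ** 30)   # 10-10-10-2: 1,073,741,824 colors, but only 4 alpha levels
print(2 ** 32)   # the naive "4 billion" figure counts the alpha bits as color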
[2]
The question of which is faster depends heavily on how the card is used, among other things. That's where you start off being biased. Claiming it's because of the color precision used is also WRONG, given the many other variables involved and the lack of empirical/statistical/logical evidence to prove it. Precision may or may not be a factor, and its effect, if any, may or may not be significant; I doubt anyone can say. Besides, with all the other factors at play, it makes little difference either way.
Image quality depends on many, many variables... among them the quality of the video card's RAMDAC, the drivers and of course the hardware capability of the chipset itself.
"Radeon cards have 96 bit colour prescision, FX cards have 128"
Before the image is output, the card must first render it internally. That's where the term precision comes into play. Various color manipulations (mathematical processing) have to be done to the image before final output. To maintain 24-bit color accuracy, the internal calculations have to be done at a higher accuracy, because rounding errors and the like accumulate across operations. That's why modern cards render internally at such high precision even though the final output is only "24-bit".
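Here's a rough sketch of the rounding problem in plain Python. The 0.1x darken / 10x brighten pair and the sample values are made up for illustration; real pipelines do different math, but the principle is the same:

def to_8bit(x):
    # Round and clamp to the 0..255 range of an 8-bit channel.
    return max(0, min(255, int(x + 0.5)))

for v in range(100, 110):
    # Low-precision pipeline: round back to 8 bits after every operation.
    dark = to_8bit(v * 0.1)            # intermediate result is quantized
    low_prec = to_8bit(dark * 10.0)    # the lost fraction never comes back
    # High-precision pipeline: keep floats internally, round once at output.
    high_prec = to_8bit(v * 0.1 * 10.0)
    print(v, low_prec, high_prec)

Ten distinct input levels collapse to just two (100 and 110) when the intermediate value is quantized, while the float pipeline returns every input unchanged. That collapse is exactly the banding you see when a card's internal precision is too low.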
The final output depends on many other variables. Nonetheless, it's better to work with more precision than less, as long as there's no significant performance degradation.
Getting back to why ATI's image quality is typically superior... it's a combination of drivers (ATI stuck to image quality over performance) and the quality of the hardware itself.
It's generally accepted that ATI's colors are more "vibrant".
On a plus for Nvidia, their image quality has come a long way from the early GF2 days... but that's mostly due to higher-quality hardware. The drivers are still tweaked for performance over image quality.
Also take note that one man's perception of quality is not always the same as another's. Who knows, in the Nvidia camp they might actually think their image quality is higher than ATI's.
That said, Matrox still has the best 2D. ATI is a close second, though.
---------------------------------------------------------------------
An interesting related quote:
"It is not possible to show 30-bit images on hardware that is only capable of
displaying 24-bit color."
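The quote is easy to verify with a little plain Python (the 10-to-8-bit mapping below is just the standard linear rescale; nothing card-specific is assumed):

def ten_to_eight(v10):
    # Map a 10-bit level (0..1023) onto the nearest 8-bit level (0..255).
    return round(v10 * 255 / 1023)

# Four adjacent 10-bit levels land on a single 8-bit level:
for v in (512, 513, 514, 515):
    print(v, ten_to_eight(v))          # all four print 128

print(len({ten_to_eight(v) for v in range(1024)}))   # 256 distinct levels

A 30-bit image has 1024 levels per channel; 24-bit hardware can show only 256, so roughly three out of every four levels are merged away no matter how cleverly you convert.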