Quote:
if ATI has drivers better than NVIDIA by the next time i do a complete upgrade, ill vote with my money for ATI.
I would do that anyway. I am not really convinced that nVidia have "better drivers": yes, they seem faster in 3D and come with an overclocking tool, but taken as a whole, are they really better?
The last benchmarks I saw showed negligible differences in average framerate across a range of applications. These guys on geek benchmark sites treat a 3fps or 11fps lead as some kind of "proof that the card is better", but numbers like that are misleading: past a certain point there is simply very little real gain.
Remember that if you have a 75Hz monitor and you are running a game at 100fps, you will only ever see 75 of those frames on the screen. The only case where the extra frames help is a game which does its major calculations in the gaps between frame rendering, without any kind of time adjustment; and in that case, it's simply a badly coded game.
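To illustrate the "time adjustment" point: a toy sketch (the `advance` helper is hypothetical, not from any real engine) showing that once movement is scaled by per-frame delta time, running at 150fps instead of 75fps changes nothing about what the game simulates:

```python
def advance(position, velocity, frame_times):
    """Move an object using delta-time scaling: each frame advances the
    simulation by the real time that elapsed, so the total distance
    depends only on elapsed time, not on how many frames rendered it."""
    for dt in frame_times:
        position += velocity * dt
    return position

# One second of game time at 75fps vs 150fps, velocity 10 units/sec:
slow = advance(0.0, 10.0, [1 / 75] * 75)    # 75 frames
fast = advance(0.0, 10.0, [1 / 150] * 150)  # 150 frames
# Both end up ~10.0 units along - the extra frames change nothing.
```

A game that instead moved a fixed amount per frame would literally run faster on faster hardware, which is the badly-coded case described above.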
Saying "I get 3% more performance with nVidia than with ATI", or "once I have attained the perfect framerate for my display, nVidia lets me run 50% faster, whereas ATI is ONLY at the perfect framerate"... it's ridiculous.
If ATI drivers average 76fps and nVidia drivers average 150fps, please tell me how those extra 74 frames, none of which ever reach the screen, somehow improve your gaming or productivity. You can't, really.
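The arithmetic is trivial enough to write down (hypothetical `visible_fps` helper, assuming the display shows at most one frame per refresh cycle):

```python
def visible_fps(rendered_fps, refresh_hz):
    """A monitor can only show one frame per refresh cycle, so any
    frames rendered above the refresh rate are simply discarded."""
    return min(rendered_fps, refresh_hz)

# The figures from the post, on a 75Hz monitor:
assert visible_fps(76, 75) == 75   # "ATI" card
assert visible_fps(150, 75) == 75  # "nVidia" card - identical on screen
```

On screen, both cards deliver exactly the same 75 frames per second.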
The hot question now about ATI and nVidia drivers is support for SLI and CrossFire. Personally I think this is another benchmarking fallacy: teaming graphics adapters that cost $500 each and doubling your system's power consumption just to double the framerate (from 150fps to 290fps perhaps!!) is just as ridiculous. ATI and nVidia know there are plenty of neophiles out there who would sell a grandparent to be able to brag about those numbers, but I don't think it is possible to use this as a real business case for which card is better, or which drivers are better quality.