Nvidia cynically fiddled video benchmarks
And suckered the press to boot
By Charlie Demerjian in San Francisco: Tuesday 18 September 2007, 17:10

THERE HAS BEEN quite a bit of talk lately about Nvidia fiddling with certain video tests, but that is only the tip of the iceberg. Understanding the way it manipulated the benchmarks only hints at the problem; the real damage is revealed in how it did so - through sleazy, underhanded playing of the media.

The problem is simple: Nvidia can't do video decode well, and the current media darling benchmark, HQV - especially the HD version - is eating its hardware alive. On quality, Nvidia regularly loses about 25 per cent of its score on a noise reduction test - enough to more or less take it out of that game.

So, what do you do when you are blown out of the water? Cheat with a plausible workaround, and then spin your ass off hoping not to get caught. Nvidia seems to have done both.

First, a little background. Noise is basically data that is out of place - bits that are errors or other artifacts of the process. On sound recordings, you hear it as exactly that: noise, popping and clicking. In video, you see pixels that are anomalous and short-lived - they pop up, flash, and degrade image quality. You see the noise.

One of the ways you deal with it, and quite effectively, is a time-based filter called Temporal Noise Reduction (TNR). Filters that smooth edges and the like work across the image, looking at the pixels around a given pixel to decide whether it is out of place. Time-based filters do that, but also look across frames of video.

If you have a background that is black for five frames, then a couple of white pixels floating about on one frame, then five more completely black frames, chances are you have noise. The filter catches things that pop up briefly without any accompanying pixels. This is grossly oversimplified, and we just made filter engineers die a little inside with that explanation, but there is a rough sketch of the idea in code a few paragraphs down for anyone who wants slightly more than hand-waving.

In the end, it can be very effective: you average across time, and high-frequency pops go away. The problem is that the more you turn it up, the more average everything becomes. You end up with washed-out images, blurred edges, and detail that should be there - especially on fast-moving things - getting really ugly. High-frequency detail is also a lot of what makes an image look 'sharp' instead of blurry.

The other problem you see is a kind of ghosting. If you turn the TNR filter up too high, then on transitions between light and dark scenes, especially cuts, you see ghosting for a few frames. If you have a white image for ten frames and then go to black, the first few black frames are classified as anomalous and get whitened or softened. Again, this is a vast oversimplification of the process, but you see washed-out, lightened transitions. It looks like this.

Let me be the first to say that of the four pics I took, this was the best, and it is still pretty awful. That said, you can still see the problem, and in person it looks far worse. Let me give you a bit more explanation.

These pics were taken on identical Dell 30-inch monitors; the ATI 2600XT-based box is on the left, the NV 8600GTS on the right. Throughout the tests, the NV machine had lighter images - a benefit in some places, a downer in others. This is not a good or bad thing in itself, it is just a difference between the cards.
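Here is that promised sketch. To be clear, this is a minimal illustration of the general idea of a temporal filter - a simple exponential running average across frames - and not Nvidia's driver code; the temporal_nr function and its strength knob are names made up for this example. It shows both effects described above: a one-frame noise pixel gets knocked down hard, and a hard white-to-black cut leaves lightened 'ghost' frames when the strength is cranked up.

```python
import numpy as np

def temporal_nr(frames, strength=0.7):
    """Blend each frame with a running average of what came before.

    strength = 0.0 -> filter off (output equals input)
    strength near 1.0 -> heavy averaging: noise vanishes, but so does change.
    """
    out = []
    avg = frames[0].astype(float)
    for frame in frames:
        # Time-based average: mostly the history, a little of the new frame.
        avg = strength * avg + (1.0 - strength) * frame
        out.append(avg.copy())
    return out

# Ten black frames with a single white 'noise' pixel popping up on frame 5.
noisy = [np.zeros((4, 4)) for _ in range(10)]
noisy[5][2, 2] = 255.0
filtered = temporal_nr(noisy, strength=0.7)
print("noise pixel, before vs after:", noisy[5][2, 2], "->", round(filtered[5][2, 2], 1))

# A hard cut: five all-white frames followed by five all-black frames.
cut = [np.full((4, 4), 255.0) for _ in range(5)] + [np.zeros((4, 4)) for _ in range(5)]
ghosted = temporal_nr(cut, strength=0.7)
# The first frames after the cut are no longer black - the washed-out,
# lightened transitions (ghosting) described above.
print("first black frames after the cut:", [round(g[0, 0], 1) for g in ghosted[5:8]])
```

Turn the strength down and both effects shrink: the noise comes back, but so do clean scene cuts, which is exactly the trade-off the driver default is playing with. Now, back to the screenshots.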
This picture was taken with the test paused to show detail; what you are seeing is a static image, not motion effects. If you look at the NV screen, it is more washed out, but there is also a lot of banding and blotching where there should be solid black. Some of this is down to the lighter picture, but most appears to be from the TNR filter. If it were only the lighter picture, you would not get the blotching, and if it were a problem with the monitors, both screens would be blotchy. I apologise for the poor image quality, but all I have is an old camera and a new phone to take pics with, and in this case they don't quite cut it.

Getting back to the problem, using TNR is not a bad thing - in fact it can be a very good thing, and it does improve image quality. But if you overuse it, you degrade image quality in certain ways, and that is where the cynical gaming of the press took place.

When NV put out its 163.11 drivers, it turned the default TNR setting way up, just in time to hand them out to reviewers. With TNR turned down, NV basically fails the noise reduction test in HQV. With it cranked up, it passes with flying colours. Fair enough, this is what the filter is designed to do, and it looks like it does its job, right?

Yup, perfectly. The first problem, though, is that HQV's noise reduction test has very little movement in it - it is a flower swaying gently to and fro in the breeze. With next to no motion, you won't see how much TNR munges the edges and blurs things out. On the other tests, TNR does not affect the outcome, because each of them measures a specific performance parameter; HQV is a synthetic benchmark that measures one dimension at a time.

The end result is that TNR makes a single test score much better and does not affect the others. You would think this qualifies as a job well done, and if the world were limited to watching HQV on a loop, you would be right. I don't know about you, but I watch other videos on my computer too, and therein lies the rub.

What Nvidia did with TNR washes out video and notably degrades quality. Blacks are no longer black during transitions, and things blur and lose sharpness. In general, it is a mess. It does reduce noise in those videos, though, so if you have a noisy high-def video setup, this is the driver release for you. It all comes at a price: the drivers are broken elsewhere, and they are broken in order to game a benchmark. Reviewers were not told this, and dutifully reported that the video problems were fixed.

When asked about it, NV will probably respond that the setting is under user control, and you can set it however you want. That is 100 per cent true, but I don't know about you - I want my video devices to work, not to need adjusting at every scene change. I want my video driver to reduce noise without making things look like Michael Jackson's privates; a manual slider change at every scene change is unacceptable. It may be under your control, but it is still broken and ineffective.

The cynical part is that this was done in a beta release meant for members of the press, and handed out to them. The next version, 163.44, vastly lowered the default value, and presumably it will vanish under the waves with successive releases until a new card comes out - rinse and repeat. If I ever get an NV card, I will keep an eye on this.

If you are confused as to how this benefits NV, think about it this way. They released 'benchmark special' drivers, and the reviews were done on those.
Headlines screamed, or maybe footnotes briefly mentioned, that NV scored XYZ in HQV, which is much better than the XYZ minus 25 per cent it would have got without the TNR.

By the time someone spots the problem, weeks have gone by - as is the case now - and it is written off as a setting in a beta that was never meant for public use. That would be fair, but guess which number goes on the marketing presentations, the slides and the boxes? Guess which one people read in the magazines (remember them?) at the newsstand? A retraction four months later on page 173 is not the same as a cover article with the newest GPU on it.

So, from my point of view, NV duped the press, engineered fake numbers and got away with it. It is a real pity, because if you buy a card you are not getting the performance you read about in the skewed reviews - you are getting a scammed set of numbers. This is surreptitious at best and plain old cheating at worst. The firm deserves to be called out, otherwise it will do it again the next time the opportunity arises.

So it falls to us to do the calling. µ
Nvidia kills 8800GTS 320MB, relaunches 640MB
Boxed in
By Theo Valich: Friday, 05 October 2007, 1:00 PM

We knew that Nvidia was planning to kill the 8800GTS 320MB in order to make room for the 65nm die-shrink that the world has come to know as G92. But it turns out that you cannot order 320MB versions any more, since the card has been pronounced an EOL (End of Life) product.

Next in line for a change is the 8800GTS 640MB, which is being tweaked up so that it can survive the arrival of the 512MB and 256MB versions of G92. Nvidia has decided to raise the specs by another 16 scalar shader units, so the 8800GTS will now feature 112 scalar shaders, 16 fewer than the 8800GTX/Ultra. Clock speeds remain the same, as do thermal and other specs.

But there are a lot of miffed Nvidia partners crying foul over the situation. Imagine the surprise of AIBs that have thousands of printed retail boxes with the old specs.

If Nvidia ever says it is thinking about greening the planet, or trots out any of the other environmental chit-chat manufacturers like to use these days, ask it how many trees it killed with just this sudden announcement. µ
Nvidia admits defeat in G92 vs. RV670 battle
3DMark wars
By Theo Valich: Friday, 05 October 2007, 9:39 AM

IN THE FAR EAST, 3DMark is everything. You can say whatever you want about canned benchmarks, but nobody can dodge the influence of 3DMark06.

It has been the same story with previous generations of graphics cards, and the same will happen with the next, DX10-only workout - when that is coming out, only Futuremark knows.

But something very significant happened in this round of the war, at least according to our highly-ranked sources.

This time around, Nvidia did not tout its G92_200 series as the fastest thing since Niki Lauda, but rather admitted defeat in this all-popular synthetic benchmark at the hands of a yet-unnamed Radeon HD part.

A reference board from Nvidia is capable of scoring 10,800 3DMarks, while a reference board from ATI will score around 11,400 3DMarks - an advantage of roughly 600 points.

This is a massive leap over previous-gen parts. The current generation's high-end performer from Nvidia, the 8800 Ultra, scores 12,500 points. Seeing a mainstream, $250 part score barely a thousand points less than a current $599 card only makes us wonder how the owners who coughed up so much will feel.

As for ATI's part, you know what to expect in this synthetic benchmark: outscoring the Radeon HD 2900XT is the default mode of operation for the RV670XT, at least at lower resolutions.

Partners are less than happy with Nvidia's board politics as well, but that is the subject of another story. µ
The Good
This is a fixed G80. 'Nuff said. It beats the 2900XT, the 8800GTS 320MB and 640MB, and even the 8800GTX in some cases. This card has enough horsepower for games, but do not expect full-HD performance from a single card in every game. A brilliant piece for Crysis, Hellgate and UT3.

The Bad
Drivers are green at this time; install the ones shipped on the CD, since the rushed Crysis-demo beta driver does not guarantee stability. The 256-bit bus could become a bottleneck in the next batch of games, and the video processor could have been the newer VP3 rather than VP2.

And the Ugly
Single-slot cooling may be cool, but our board heated up significantly. The temperature was always in the high-60s (Celsius) range, and overclocking was limited by the single-slot cooler.