Nvidia G80 meets DX 10 spec with 'dis-unified' shader
Horses for courses
By Fuad Abazovic: Monday 03 July 2006, 09:50

WE HEAR Nvidia has been beavering away to meet the DirectX 10 specification. And the firm has decided it doesn't need a unified Shader for its upcoming G80 chip. Instead, it has decided that you will be fine with twice as many pixel Shaders as geometry and vertex Shaders. As we understand it, if an Nvidia DX10 chip ends up with 32 pixel Shaders, the same chip will have 16 Shaders able to process geometry instancing or vertex information.

ATI's R600 and its unified Shaders work a bit differently. Let's assume that ATI hardware has 64 unified Shaders. This means ATI can process a total of 64 Shader lines per clock, in any mix. That might be 50 pixel and 14 vertex and geometry lines per clock, or 40 vertex, 10 pixel and 14 geometry lines per clock. Any ratio that adds up to 64 will do. I hope you get this maths. The Nvidian chippery is limited to 32 pixel and 16 vertex and geometry lines per clock, which might be a winning ratio, but it is still too early to say.

We don't know who will win the next-generation hardware game and whose approach is better: ATI's unified or Nvidia's two-to-one ratio. DirectX 10 actually doesn't care how you do your Shaders, as you speak to an abstraction layer and the hardware can handle its pixel, vertex and geometry data the way it wants. It simply serves up the information to DirectX 10, which processes it in the end. Nvidia's G80 is fully DirectX 10 capable as well as Shader Model 4.0 capable, but it won't be unified, according to our sources. In the end, people care about frames per second, and that will ultimately decide who wins the next-generation graphics hardware race. µ
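As a rough illustration of the maths above, here is a minimal sketch, in Python (which the article does not use; the functions and the third example mix are purely illustrative), that checks whether a given per-clock mix of pixel, vertex and geometry work fits the two rumoured designs: a unified pool of 64 Shaders versus a fixed split of 32 pixel plus 16 vertex/geometry Shaders.

    # Illustrative toy model only: the per-clock Shader budgets described in
    # the article, not any real driver or hardware interface.

    def fits_unified(pixel, vertex, geometry, total_units=64):
        """A unified design can spend its units on any mix, as long as the
        mix does not exceed the total pool (e.g. ATI's rumoured 64 units)."""
        return pixel + vertex + geometry <= total_units

    def fits_split(pixel, vertex, geometry, pixel_units=32, vg_units=16):
        """The rumoured G80 split: a fixed pixel budget plus a separate,
        smaller budget shared by vertex and geometry work."""
        return pixel <= pixel_units and (vertex + geometry) <= vg_units

    # The article's two example mixes, plus one (ours) that fits both designs:
    for mix in [(50, 14, 0), (10, 40, 14), (32, 8, 8)]:
        p, v, g = mix
        print(mix, "unified:", fits_unified(p, v, g), "split:", fits_split(p, v, g))

Under this toy model, the article's 50-pixel mix fits the unified pool but not the fixed split, which is exactly the trade-off the two camps are betting on.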
Nvidia's G80 has 32 pixel pipes
16 vertex & geometry Shaders
By Fuad Abazovic: Thursday 06 July 2006, 10:15

IT TURNS OUT that the fancy Nvidia G80 chip has taped out and, in its working silicon stage, it will have 32 pixel Shaders and, as predicted, 16 vertex and geometry Shaders. Nvidia wants to stick with a two-to-one ratio and assumes that the games of tomorrow will need twice as many pixels as they will need vertices and geometry information. ATI believes in a different religion. ATI believes that every Shader should become one, united and beloved. No more segregation into pixel and vertex Shaders. If ATI makes a chip with 64 Shader units, all of them can do vertex, pixel or geometry work all the time. You can check the stories at the bottom for a better explanation. We don't know the clock speed of the upcoming performer, but we don't believe Nvidia can get more than 700MHz out of it - we could be wrong about that. µ
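For a back-of-the-envelope sense of what those numbers could mean, here is a tiny sketch in Python (purely illustrative; the 700MHz figure is only the article's guess, and the one-operation-per-unit-per-clock assumption is ours, not anything Nvidia has confirmed) that turns unit counts and a clock speed into a theoretical shader-operation ceiling.

    # Rough arithmetic only: theoretical shader operations per second,
    # assuming (our assumption) one operation per unit per clock.

    def theoretical_ops_per_second(units, clock_hz, ops_per_clock=1):
        return units * clock_hz * ops_per_clock

    clock_hz = 700e6  # the article's guessed ceiling of 700MHz
    print("pixel:", theoretical_ops_per_second(32, clock_hz) / 1e9, "billion ops/s")
    print("vertex/geometry:", theoretical_ops_per_second(16, clock_hz) / 1e9, "billion ops/s")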
Nvidia's G80 has problems
SLI and HDCP broken for now
By Charlie Demerjian: Monday 23 October 2006, 09:03

NVIDIA IS GIVING out G80 cards to editors' day attendees but no real drivers. Why on earth would Nvidia need to do that? What is broken? Well, it seems G80 has a lot broken: some of it will be fixed before launch, and some may be harder to fix.

Drivers due on the 25th of October will fix SLI, which, as we understand it, is pretty badly borked in the current release. How badly? Enough to crash almost every time, said one source, and so basically unusable. We have a lot of confidence this will be fixed before launch, possibly in the driver set we told you about earlier. I would only rate this as a corporate embarrassment that will hopefully never leak due to "strong and respected NDAs". The "editors" are wearing chastity belts designed to preserve the new Dawn's virginity.

The more troubling news is that we hear HDCP is badly borked in the first rev. Since we are not sure whether this problem is hardware or software, if you care about jumping on the DRM infection train, you might want to look long and hard before you buy a G80. Nvidia has something of a history of shipping broken functionality and not so much as correcting it on the box. Wait for independent reviews from people not wearing chastity belts who test this before you buy.

Last up, we hear that in the forthcoming 95-series drivers the classic display panels are gone for good. We plan to hold a candlelight vigil for them at One Inquirer Tower on All Hallows Eve. Punch and pie will be served, along with mournful singers and a goth/emo kid moping in the corner. µ
Asus, the web-site claims, will release two versions of the GeForce 8800-series graphics cards: the EN8800GTX/HTDP/768M and the EN8800GTS/HTDP/640M. The naming scheme implies that there will be two versions of the G80, the GeForce 8800 GTX and the GeForce 8800 GTS, which will differ not only in performance, but also in the amount of installed memory and the memory interface: the GTX boards will carry 768MB on a 384-bit interface, whereas the GTS cards will have 640MB on a 320-bit interface. The higher-end 8800 GTX model will have its chip clocked at 575MHz and GDDR3 memory operating at 1.80GHz.
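Those bus widths and memory clocks translate directly into theoretical memory bandwidth. A quick sketch in Python of the standard back-of-the-envelope arithmetic (the GTS memory clock is not given above, so only the GTX figure is computed; the helper function is ours):

    # Theoretical memory bandwidth = (bus width in bits / 8) * effective memory clock.
    # Numbers come from the spec above; this is just the usual rough sum.

    def memory_bandwidth_gb_s(bus_width_bits, effective_clock_hz):
        return (bus_width_bits / 8) * effective_clock_hz / 1e9

    # GeForce 8800 GTX: 384-bit bus, GDDR3 at an effective 1.80GHz
    print("8800 GTX:", memory_bandwidth_gb_s(384, 1.80e9), "GB/s")  # ~86.4 GB/s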
G80 is big, hot and sports new SLI
G80, the remaining facts
By Fuad Abazovic: Wednesday 25 October 2006, 10:36

G80, GEFORCE 8800 GTX and GTS are old news now. We have learned all about the cards, including the fact that the chip has around 700 million transistors, that, as we said many moons ago, it is based on a 90 nanometre process, and that it has dual power plugs. The G80 chip needs a lot of power and is the biggest desktop graphics chip so far. I think the image from Tom's Hardware Italy speaks for itself, here.

Another thing that we learned is that Nvidia uses a new SLI connector. We saw it here, originally spotted at nvnews.net. The new SLI is dual-rail stuff. It is ironic, as ATI was the first to introduce such a connector, one that lets reads and writes be performed at the same time.

As for the rest, no one yet knows how many real pipelines this card has, but you can rely on the 1.5GHz scalable clock rates. This doesn't mean much to us, as in theory this card is super fast, but we will be anxious to see it in action. G80, the Geforce 8800 GTX and GTS, are the two biggest graphics cards ever, and here is the link for the retail Asus card. Samples should be distributed this week, and some lucky chaps got the cards back at Nvidia's editors' day. µ