Author Topic: Geforce 8xxx series  (Read 36416 times)

Offline Crixx_Creww

  • Akatsuki
  • *****
  • Posts: 9057
  • Country: 00
  • Chakra -12
  • ANBU OF THE HIDDEN VILLAGE FOAK
    • Atari 2600.
  • Referrals: 11
    • View Profile
    • www.crixxcrew.com
  • CPU: Intel Q6600 @3.2 Ghz
  • GPU: Nvidia Xfx geforce 9800GTX+
  • RAM: 8 Gigs Mixed kingston and corsair ddr2
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #60 on: November 07, 2006, 03:26:10 PM »
why did they use amd processors for that test???


Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #61 on: November 07, 2006, 04:00:34 PM »
It's called using what was available at the time, crixx. That aside, there are a lot of ppl out there that want to know the performance of these beasts on the AMD platform.

Offline Arcmanov

  • Administrator
  • Akatsuki
  • *****
  • Posts: 10642
  • Country: tt
  • Chakra 126
  • Gamer/Enthusiast
    • :n64: :gcn: :dreamcast: :xbox360: :computer:
  • Referrals: 1
    • View Profile
    • Arcmanov's rig
  • CPU: Intel Core i7 5820K @ 4.3 GHz
  • GPU: 2 x [Gigabyte] GeForce GTX 1070
  • RAM: 4 x 8GB G.Skill.TridentZ RGB DDR4-2400
  • Broadband: :flow:
  • MBL: LG V20
  • Origin ID: Arcmanov
  • PSN: Arcmanov
  • Steam: Arcmanov
  • XBL: Arcmanov
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #62 on: November 07, 2006, 05:13:47 PM »
Ah boy.  Things lookin up for ye  brethren of AMD money.  ^_^
Not that I will be able to afford that monster (in a hurry) in any case,
but I like what I just read there.

It seems the 8800 will pwn all, whether by Intel or AMD.

Niceness.
Systems United Navy - Accipiens ad Astra


Offline .:Jedi:.

  • Chunin
  • **
  • Posts: 251
  • Chakra -3
    • Xbox, PS3
  • Referrals: 0
    • View Profile
  • CPU: AMD Athlon X2 4800+ @3.1GHz
  • GPU: PNY GeForce 8800GT (740/1850/2260)
  • RAM: 2x1GB GSkill + 2x512MB Kingston Hyper-X
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #63 on: November 07, 2006, 10:42:15 PM »
LAUNCH TOMORROW FELLAS!!! (is still tomorrow right?) i cant wait for OFFICIAL benchmarks.....
 

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #64 on: November 08, 2006, 09:06:34 AM »
Well it is officially tomorrow... I mean today... I mean NOW... either way here is a SPEW of benchies.. so hold on tight this is gonna get CRAZY!!!!
Quote
Leadtek, EVGA Geforce 8800 GTX tested, even in SLI

First INQpression The fastest graphic thing on earth

By Fuad Abazovic: Wednesday 08 November 2006, 11:09

WE TALKED a lot about G80. It has been in the news for a while and today it is finally ready to launch. It had its last minute tweak-up, as Nvidia says that one of its resistors was too weedy. But the firm and its partners fixed that and now the cards that you are about to buy should be trouble-free.

The G80 is a totally new chip, 90 nanometre, with 681 million transistors and manufactured at TSMC. It is the largest graphics chip built to date, shielded in a big metal heat spreader. The fastest version is clocked at 575MHz while the GTS version works at 500MHz.

Nvidia did a good job and it is the first DirectX 10 chip on the market. It is also the first to support the 'unified' marchitecture. Nvidia claims that its chip is completely unified: the same pipeline calculates vertex, pixel and geometry Shader information. Nvidia claims 128 processing streams for the faster version of G80, the Geforce 8800 GTX, while the second in line, the Geforce 8800 GTS, has 96.

The Geforce 8800 GTX has a core working at 575MHz. The GDDR3 memory works in a rather odd 384-bit mode across twelve chips, for a total of 768MB of memory. While we all expected 1024MB would be the next step, Nvidia decided to go for 768MB due to its different memory controller. Nvidia clocked the memory at 1800MHz for a respectable total bandwidth of 86.4 GB/s and a fill rate of 36.8 billion texels a second. The Geforce 8800 GTX is 28 centimetres long and is the longest card we've had in our hands. Only the Geforce 7900 GX2 was longer, but we never managed to get one of those.
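The quoted figures follow directly from the bus width and clocks; a quick sanity check (the 64 texture-filtering-unit count is my assumption for the fill-rate arithmetic, not a number from the article):

```python
# Memory bandwidth: 384-bit bus at an effective 1800 MT/s GDDR3 data rate
bus_width_bits = 384
effective_rate = 1.8e9                       # 1800MHz effective
bandwidth_gbs = bus_width_bits / 8 * effective_rate / 1e9
print(bandwidth_gbs)                         # 86.4 GB/s, matching the article

# Texture fill rate: assumed 64 filtering units x 575MHz core clock
texture_units = 64                           # assumption, not stated above
fill_rate = texture_units * 575e6
print(fill_rate / 1e9)                       # 36.8 billion texels a second
```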

Nvidia's unified parallel Shader design has 128 individual stream processors running at 1.35GHz. Each processor can be dynamically allocated to vertex, pixel, geometry or physics operations. We don't have any idea where this 1.35GHz number comes from, but the card renders fast and that's all you should care about for now. And of course it supports DirectX 10, a Vista exclusive.



Nvidia finally made a chip that works with FSAA and HDR at the same time. It supports the new 128-bit HDR, which handles 32 bits per component and produces better quality. The Geforce 8800 GTX has 24 ROPs (Raster Operation Units) or, should we say, can render 24 pixels per clock, while the Geforce 8800 GTS can render only 20. You'll need a 450W or higher PSU that can deliver 30A on the 12V rail, so a quality 450W unit will do. Our 8800 GTX SLI setup worked with a 700W OCZ GameXstream PSU, so we can recommend that one.

Two six-pin power connectors mean that the card gets 2x75W from the cables plus an additional 75W from the PCIe bus. This brings total power consumption to an earth-heating 225W.
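The 225W figure is just the sum of the three feeds named above, each capped at 75W by the PCIe electromechanical spec:

```python
# Power budget arithmetic for the 8800 GTX's three power feeds
pcie_slot_w = 75                  # PCIe x16 slot delivers up to 75W
six_pin_w = 75                    # each 6-pin connector delivers up to 75W
total_w = pcie_slot_w + 2 * six_pin_w
print(total_w)                    # 225W, the quoted total
```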

The Geforce 8800 GTX has a dual-slot cooler: massive and heavy, but it does the job. The card always worked at around 55 Celsius in 2D mode and a bit higher in 3D.

We will tell you more about the Geforce 8800 GTS in a separate part to follow. We had two cards on test.



You guys do realize the ramifications of this right? DIFFERENT BRAND SLI???

The first one to arrive in our hands was an EVGA Geforce 8800 GTX with a brand-new ACS3 cooler. EVGA dared to change Nvidia's holy cooler and made some modifications to it. It added a huge metal heat spreader on the back of the card and covered Nvidia's heatpipe cooler with a massive EVGA-branded piece of squared metal. It is as long as the card itself, but you won't have any trouble connecting the power cords (the card has two of them) or the two SLI connectors.

Nvidia decided to put two SLI connectors on the top of the card, and the G80 works with the existing bridges that you get with your motherboard. The only trouble is that Nvidia doesn't bundle an additional SLI bridge, so if you have a single motherboard you will end up with a single bridge. As we had two boards in the lab, we just borrowed one from the other board. The cards take two slots each and you need a longer SLI bridge to connect them. We tried it, and it works in SLI too.

The ACS3 cooler is actually more efficient than Nvidia's reference cooler, as the card works at 55C in 2D mode while the reference cooler only gets the card down to 60 Celsius. This is what makes the difference between the EVGA card and the rest of the cards based on the reference cooler design.

The retail box is super small. We could not believe that EVGA managed to pack the card in such a small package, yet you still get all the CDs and cables you need.

Second to arrive was a Leadtek Geforce 8800 GTX card and the moment we got it we just had to test it in SLI, of course. As we don’t have any support from Nvidia we had to figure out how it works, but we did it.

The Leadtek Geforce 8800 GTX is packed in a nice wide retail box. It includes two games, Spellforce 2 and Trackmania Nation TMN, along with a driver CD and a bonus software pack including Adobe Reader and Power DVD 6 ORB. It has a single DVI-to-VGA dongle and an S-video-to-HDTV cable including S-video out. It has two Molex-to-six-pin power connectors, and that is all you need. The card supports HDCP and the driver CD also includes Vista drivers, though we didn't try those yet. Leadtek adds an extra touch and at least brands the Nvidia drivers as Leadtek Winfast ones; they work flawlessly. We were told the temperature in 2D mode is around 55 Celsius for the EVGA and 60 Celsius for the Leadtek. The Leadtek card heated up to 75 Celsius in an open, out-of-case environment, while the EVGA ACS3 works at around 70 Celsius after a heavy 3D load.

The second, smaller chip on the PCB is TMDS display logic, as Nvidia either could not put it in the G80 or had problems with the integrated TMDS. We believe the chip would have been too big to include it all, so Nvidia settled for a cheaper two-chip approach.

The driver in both cases has a few new features, including support for 16X Anisotropic filtering and 16X FSAA in a few modes. The big news is that Nvidia finally supports FSAA 8X and 16X. Nvidia's Luminex is a marketing name for incredible image quality that includes support for 16X FSAA, 128-bit HDR and support for 2560x1600 resolution with a high frame rate.

The new FSAA modes include 8x, 8xQ, 16x, and 16xQ. The 8xQ and 16xQ modes are brand new, but we don't know what the Q stands for; we will try to play with it at a later date. We didn't get the cards early enough to test everything we wanted.



Benchmarketing
We used:
Foxconn C51XE M2aa Nforce 590 SLI motherboard
Sapphire AM2RD580 with SB600 board for Crossfire
Athlon FX 62 2800 MHz 90 nanometre Windsor core
2x1024 MB DDR2 Corsair CM2X1024-6400C3 memory
Seagate Barracuda 7200.9 500GB SATA NCQ hard drive
Thermaltake Mini Typhoon cooler (for Athlon 64/X2/FX and Intel CPUs)
OCZ 700W GameXstream power supply

For ATI cards we used the 6.10 drivers, the most current available; the G80-based 8800 GTX cards used driver version 9.6.8.9 supplied on CD, and the Gainward 7950 GX2 card used the 91.47 drivers.

The power supply was enough for SLI as well. We plugged both cards into the Foxconn C51XE motherboard and we have to mention that the retention mechanism on this board sucks. It works flawlessly, but it is very hard to unplug a card as long as the 8800 GTX.

We patched the cards together with two SLI connectors from two different boards, installed the drivers for both cards, restarted and, voilà, it works. SLI is enabled and we did a few benchmarks. The scores could be even higher with a Core 2 Duo or Quad CPU, but at press time we decided to use our reference platform based on the FX62 CPU.

Composite figures

Card (core/memory clocks)                    3DMark03    3DMark05    3DMark06
ATI X1950XTX (650/2000 MHz)                     19741       12348        6283
ATI X1950XT Crossfire (650/2000 MHz)            32951       15725        9909
Gainward BLISS 7950GX2 PCX (500/1200 MHz)       29758       13633        8083
EVGA eGeforce 8800GTX ACS3 (575/1800 MHz)       30485       15210        9814
EVGA+Leadtek 8800GTX SLI 2x (575/1800 MHz)      49390       16374       10974
--> Game benchmarks removed (layout problems in the cut and paste) see here for full details

We started with 3DMark03 and already saw the light. A single EVGA G80 card, the EVGA eGeforce 8800GTX ACS3 575/1800MHz, scores 30485. It is just slightly faster than the 7950 GX2 card, about 2500 slower than Crossfire, and more than 10000 faster than the single fastest X1950XTX card.

It gets even better, as the EVGA-Leadtek 8800GTX SLI 2x 575/1800MHz combo scores 49390, almost 50K. You'll need a faster or overclocked CPU for 50K; it is as simple as beans. SLI is sixty-two per cent faster than a single card. A single G80 is fifty-four per cent faster than an X1950XTX. It scores 194 frames in game 3, just a bit slower than two ATI cards patched together. It is three times as fast as the Crossfire setup in the Pixel Shader 2.0 test: 1125.5 FPS versus Crossfire's 373.3, or 301 per cent of the Crossfire score. Amazing, isn't it? The Vertex Shader test is twice as fast on the G80 as on ATI's faster card.
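The percentage claims above check out against the composite table; a quick verification:

```python
# Verify the "per cent faster" claims from the 3DMark03 scores quoted above
scores = {
    "X1950XTX": 19741,       # single fastest ATI card
    "8800GTX": 30485,        # EVGA eGeforce 8800GTX ACS3
    "8800GTX SLI": 49390,    # EVGA + Leadtek pair
}

def pct_faster(a: int, b: int) -> int:
    """How much faster score a is than score b, in whole per cent."""
    return round((a / b - 1) * 100)

print(pct_faster(scores["8800GTX SLI"], scores["8800GTX"]))  # 62
print(pct_faster(scores["8800GTX"], scores["X1950XTX"]))     # 54
```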

A single G80 beats ATI by almost 3000. SLI beats Crossfire by only 600 marks, but it is still faster, and 16300 is a great score for this test. The complex Vertex Shader test is almost twice as fast as on ATI's card.

Now to 3DMark06: you score 10974 with two cards or 9814 with a single card. This test is heavily CPU-bound. We will do some testing with a Core 2 Duo and overclocking, as we believe we should be reaching 13500 or more with a better or overclocked CPU.

The EVGA eGeforce 8800GTX ACS3 575/1800MHz is eighty-four per cent faster than ATI's X1950XTX in the Shader Model 2.0 test and seventy-one per cent faster in Shader Model 3.0 / HDR testing. It really smashes the competition.

Doom 3 scores around 135 FPS at the first three resolutions and drops to 125 at 20x15; even then SLI scores 135, so this is clearly CPU-limited. The EVGA eGeforce 8800GTX ACS3 575/1800MHz is almost 80 per cent faster at 2048x1536. Doom 3 with effects on scores 130 FPS at the first two resolutions and then starts to drop, but is still faster than the 7950 GX2 card at all times. SLI doesn't drop at all at the first three resolutions and only slightly drops at 20x15.

FEAR scores are sky-high, with the weakest score of 95 FPS at 20x15: faster than Crossfire in the last two resolutions and than the GX2 and X1950XTX at all times. It is up to 66 per cent faster than the X1950XTX and 68 per cent faster than the Gainward 7950 GX2 card. SLI is again 68 per cent faster than Crossfire, a massive difference.

The EVGA eGeforce 8800GTX ACS3 575/1800MHz scores 53 FPS even at the highest resolution with all the effects on plus 4X FSAA and 16X Aniso, and much more at lower ones. Crossfire beats a single card by three frames at 16x12 and eight at 20x15, but a single card loses by some forty per cent. SLI is twice as fast as the 7950GX2 and 57 per cent faster than Crossfire.

Quake 4 runs up to forty-seven frames faster on the G80, and SLI improves the score further, though not by much. The G80 is always faster than the GX2 and Crossfire. Quake 4 with FSAA and Aniso runs some forty per cent faster than ATI's fastest card, and SLI G80 is 30 per cent faster than Crossfire.

With effects on, Far Cry performance is matched between the G80 and X1950XTX, while SLI can outperform both at 20x15.

Serious Sam 2 is faster on ATI at the first two resolutions by three frames, while the EVGA eGeforce 8800GTX ACS3 575/1800MHz wins by eight to fifteen frames at higher resolutions. SLI is 23 per cent faster than a single G80 and 43 per cent faster than the X1950XTX.

With FSAA and Aniso on, Serious Sam 2 always runs faster on the EVGA eGeforce 8800GTX ACS3 575/1800MHz card, but not by much, some nine to ten per cent, while SLI is 54 per cent faster than a single card and sixty-eight per cent faster than ATI's card.

We decided to reintroduce Oblivion, measured with Fraps, and let me conclude with it. We tested the Nvidia cards in 8X FSAA mode in our custom-developed test, a super-intensive routine with all the settings on. At 8X FSAA + 8X Aniso and HDR, SLI only makes a big stand over a single card at 16x12 and especially at 20x15, where it is 21 and 67 per cent faster respectively.

ATI could not do more than 6X FSAA, and even then it runs slightly slower than a single card. At 20x15 it is unplayable at a few FPS, but at 4X FSAA it scores 27.26 FPS. This is still the game that can bring the G80 to its knees, and SLI can make it barely playable with all effects on.

Before we finished up testing, we decided to do a quick Quake 4 16X and 16XQ FSAA test. Q mode is always about 11 per cent slower than 16X mode, so maybe that Q stands for Quality. We're not sure, as we didn't get any documentation; we will figure it out later. As for the 16X scores, the G80 is two or more times faster and drops to 29 FPS at 20x15, but if you ask me that should be enough. If you want 60 FPS in Quake 4, the G80 scores that much up to 12x10, the resolution of most 19-inch TFT gaming displays out there.

In Short
Basically, at this time I can say the G80 rocks. I love it, it's great and I don't have anything bad to say about it. Pick up your wallet if you want to spend €/$600 on a graphics card and enjoy it. You can whine that it's long and can get warm, but that's all nonsense, as this is the fastest card on planet Earth and you have to live with the heat and size. It is super stable, which is what matters most, and we didn't have any issues with stability.

You can set the quality to 16X anisotropic filtering and 16X full-scene anti-aliasing and have 128-bit HDR at the same time. It looks great, and FSAA and HDR even work together on this card, something that didn't work on any Geforce 7 series part.

Nvidia can run 16X. It affects your performance, but it is still playable, at least at lower resolutions. We will look at the picture quality in the next few days, as we are far from done analysing the G80.

Overall, both the EVGA Geforce 8800GTX ACS3 575/1800MHz and the Leadtek Winfast PX8800 GTX TDH are great cards, and you can get either of them. The only advantage is that the EVGA's temperature can get a bit lower, but just a bit. And the good thing is that they both work in SLI together. So maybe you can buy one EVGA and one Leadtek and be like us. :)

If you want the fastest card on the planet then you have to buy a G80. If you want the fastest gaming setup on the planet then you have to buy two Geforce 8800 GTX cards and put them together in SLI. Not much more to say. Nvidia did it, and this is the fastest card for a while, at least until the beginning of next year. µ

Reviewed and tested by Sanjin Rados and Fuad Abazovic

And NOW the NEW chick for the Geforce 8xxx series... introducing Adriane... you may know her as Playboy's Feb 2006 cover, or better known from the VH1 reality TV show 'My Fair Brady'. Either way, like her or not, the rendering is amazing; have a look...



And lastly... with all good things there is ALWAYS a pinch of salt, to be fair of course. Here is a slight downside to the more workstation-type applications that Nvidia has apparently been promising for a long time now, a la multi-display in SLI mode...
Quote
Nvidia's dual-display SLI implementation is bonkers

Buy three graphic cards!!

By Theo Valich: Wednesday 08 November 2006, 10:33

WE HAVE BEEN asking the Graphzilla about SLI and multi-monitor support ever since it came out with the SLI technology.

The continuous mantra previously growled at us was that it was going to be solved in future revisions of the drivers. Then the 7800GTX series came out and the mantra changed to: watch out for future revisions of hardware. And now, with the 8800 series coming out, it is changing to: buy a third graphics card.

Nvidia is launching its third-generation SLI-capable hardware, and users can still forget about multi-monitor support when SLI is enabled. We know the problem lies not in the driver, but in the way two displays are being set-up, GPU-wise.

The presentation about the Nforce 680i chipset, which is being launched today, contains an explanation as to why Graphzilla is now offering three long PCIe slots for placing three graphics cards on the motherboard.



The possibilities offered by the third PCIe slot are six monitors (which is nice), SLI Physics (marked as "Future Support") and SLI + Dual Display support - also marked "Future Support".

Now, wait a bit: Dual Display support and three graphics cards? We won't go into the power consumption debate, but this seems rather excessive and expensive.

As you can see in the picture above, two of the graphics cards are today's babies, the 8800GTXs, while the one in the middle is a 7900GT, probably playing the role of a future DX10 mainstream graphics card. And in the future, dual-display support just may be added, as SLI Physics will.

Of course, since the third PCIe connector will be a feature of the 680i only, you really need to forget about lower-end chipsets such as the also-introduced 650i. Bear in mind that the third graphics card will be pretty much idle, since this is not for enabling gaming on the second monitor, but rather for running the Windows desktop. You know, heavy fillrate-hitting stuff like Windows Explorer, Skype, ICQ, MSN and, oh my, the incredibly complex TeamSpeak screen. At least, according to the powerpointery.

Nvidia should either stop talking about dual display support or pull a George Broussard and utter the words: when it's done. µ

Please do visit the link this was taken from here, as there are game FPS benchies in the most popular and taxing titles presently out. You'll get an appreciation for the actual in-game performance and the RIDICULOUS resolutions that these beasts in SLI can run at, EVEN with FSAA + Aniso + 128-bit HDR and all the bells and whistles a game has to offer.
« Last Edit: November 08, 2006, 09:13:34 AM by W1nTry »


Offline Spazosaurus

  • Dr. Herp Derpington
  • Administrator
  • Akatsuki
  • *****
  • Posts: 7685
  • Country: tt
  • Chakra 52
  • Referrals: 3
    • View Profile
    • The Awesome Company
  • CPU: i5 3470
  • GPU: GTX 780
  • RAM: 8GB Corsair
  • Broadband: Blink 2Mb + Flow 20Mb
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #65 on: November 08, 2006, 10:48:29 AM »
I said GODDAMN!!!! LOOK AT THEM NUMBERS!!!! WTFOMGPWNAGE.

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #66 on: November 08, 2006, 07:35:51 PM »
www.techreport.com has a review. DAMN, Nvidia came good; ATI gonna have to pull off a small miracle to beat them. Also shows Nvidia actually cares a bit about us: the die is so huge, 650 million transistors, that they only yield 80 per wafer, but they don't charge anything stupid for it.
     Image quality beats ATI, a lot faster, good power. If ATI doesn't come better I'll see about getting one of these next year when they move to 65nm and get small enough to fit in meh case.
   Only flaw: that chick not so hot :P .
http://freetrinipoetry.blogspot.com/

Core 2 duo E6600
Asus mobo
Radeon HD 4770
2 gigs DDR2 667 + 2 gigs DDR 800 OCZ

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #67 on: November 08, 2006, 08:36:31 PM »
Well Trini, a lot of the things you said are a no-brainer. Don't you suppose the 'next gen' part would beat out the current gen? DUH.. making the point of better image quality and ATI having to come good is like saying humans need to breathe. NO $H!T... geezus. As for how good ATI has to come, many suspect that R600 will beat it based on preliminary specs; personally I will wait and see. I will hang on to my faithful rig until K8L and R600, cause at the least the prices of these ridiculous performance parts will drop by then. But I eh go wait much beyond that... as it is that FAR off.

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #68 on: November 08, 2006, 08:54:07 PM »
It just keeps getting better and better:
Quote
First watercooler for Geforce 8800 arrives

And its a beauty

By Theo Valich: Wednesday 08 November 2006, 17:40

SOME CRAZY GERMANS from AwardFabrik.de have shown the world's first waterblock for the yet-to-be-launched 8800 series.

The prototype comes from a company named SilenX, and it looks menacing. Preliminary testing has revealed that the temperature of the GPU drops sharply by 30 degrees Celsius, from 81-85C down to the 50C range. Overclocking scores also show significant improvement.



Before the waterblock was installed, the GPU was running in the high 90s at a 625MHz clock. With the watercooler you can enjoy a 25degC lower temperature and a stable clock of 660MHz for the GPU, which is almost 100MHz faster than the default clock.

Of course, getting the GPU to run at 660MHz yields a shader clock of over 1.5GHz (a 90nm part with almost 700 million transistors has parts running at 1.5GHz), and doing that without AMD's or Intel's transistor technologies is a tribute to the engineers at TSMC and Nvidia, who created a small semiconductor miracle.
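That shader-clock figure follows if the 1.35GHz shader domain scales linearly with the core clock, which is an assumption on my part; the article gives no scaling documentation:

```python
# Estimate the shader clock at the 660MHz overclock, assuming the
# shader domain scales linearly with the core clock (an assumption)
stock_core_mhz, stock_shader_mhz = 575, 1350
oc_core_mhz = 660
oc_shader_mhz = stock_shader_mhz * oc_core_mhz / stock_core_mhz
print(round(oc_shader_mhz))    # ~1550 MHz, comfortably over 1.5GHz
```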



The reason the cooler works so well is its backside. While the front looks pretty much like many of the waterblocks out there, turning it over reveals the German precision in designing this monster. All of the elements on the PCB are cooled: the G80 GPU, the Samsung GDDR3 memory, the nV I/O chip and the power regulation.

This enables the G80 PCB to become a warm-to-the-touch instead of scorching-hot board, especially on the backside. The values we mentioned earlier are for the GPU alone, while the Windows thermal monitoring did not show a sharp decline in the temperature of the power regulation; cooling it nonetheless greatly reduces the stress on the PCB and will probably lead to a longer life for the board.

The cooler will be available soon, and can be ordered using our complementary L'INQ. We're welcoming a test unit, so that we can either confirm or negate the scores in independent testing. µ

BUT WAIT THERE IS MORE BEAUTY!!!!

Quote
Sparkle gives the G80 a cold reception

First INQpression Sparkle's Calibre P880+ the first Peltier Junction cooled GF 8800GTX card

By Nebojsa Novakovic: Wednesday 08 November 2006, 18:13

BIRTHDAYS ARE FUNNY affairs, sometimes full of surprises. My birthday this year had no surprises, it just happened to be the official birth date of Nvidia's G80, also known as GeForce 8800 (GTX the preferred variety).

By now, all is known about its Lumenex engine with unified shaders and DirectX 10 Shader Model 4 support, the GigaThread array of 128 streaming processors, and (finally) Nvidia's own "Quantum Effects" physics acceleration, not to mention the somewhat odd 384-bit memory bus with 768 MBytes of GDDR3 RAM there, providing some 86 GB/s of bandwidth, 7x that of the Athlon AM2 socket or an Intel Core 2 overclocked to FSB1600. Overall, indisputably the world's fastest GPU right now, at least until DAAMIT's R600 comes out in, hopefully, a few months.

Sparkle has also joined the fray in announcing a new G80 series of GF8800GTX and GTS cards; however, in this case, the company is coming up at the very start with a (slightly) modified offering as well. There are two GeForce 8800GTX models coming from it: one a standard reference model, the other the high-end Calibre model with a different cooling design (see photo). Sparkle claims that the Calibre P880+ 8800GTX cooler has better cooling performance yet lower noise than the reference design and, once Nvidia allows G80 family overclocking, it might be of more help there, too.



Looks-wise, I'd say that the reference design is more polished, but again, the huge heatsink on the Calibre P880+, with its dual fans, gives an image of sheer power, doesn't it? But it's what is under those two fans that gives these cards the extra oomph (once unlocked) and lower noise with cooler operation: an active Peltier junction cooling engine, with its own 4-pin power supply (on top of the two 6-pin feeds for the card itself!). Co-developed by Sparkle and TEC, the system is second only to the best water-cooling systems when it comes to heat removal efficiency and reduced-noise operation; after all, you don't want the card's fan noise to overpower even the machine-gun fire in your favourite 3-D shootout.

Inside the huge cooling system are a thermoelectric cooler, quad heatpipes and (quite slim) dual 8cm fans. The Peltier effect uses a transducer to produce a cold source that lowers the temperature of the GF8800GTX GPU. A transistor sensor placed near the GPU detects its temperature, while software monitors the overall temperature on the video card. When a certain temperature is reached near the GPU, the transducer turns on the cold source. When the GPU is in idle mode, the transducer turns off automatically. The dual 8cm fans push the hot air to the exterior of the card via the quad heatpipes, helping to eliminate the heat remaining in the video card system. The fans have an exhaust cover to minimise the noise.
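The threshold-switched behaviour described above can be sketched as a simple hysteresis controller; the temperature values here are illustrative assumptions, not Sparkle's actual thresholds:

```python
# Minimal sketch of threshold-switched TEC control with hysteresis;
# the two temperature thresholds are illustrative assumptions
TEC_ON_C = 60.0     # engage the cold source above this GPU temperature
TEC_OFF_C = 45.0    # disengage once the GPU has cooled back down

def tec_state(gpu_temp_c: float, currently_on: bool) -> bool:
    """Return the new transducer state for a sensor reading near the GPU."""
    if gpu_temp_c >= TEC_ON_C:
        return True           # hot under load: turn the cold source on
    if gpu_temp_c <= TEC_OFF_C:
        return False          # idle/cool: transducer switches off
    return currently_on       # inside the hysteresis band: keep state

print(tec_state(70.0, currently_on=False))  # True  (under 3D load)
print(tec_state(40.0, currently_on=True))   # False (idle)
```

The hysteresis band is what keeps the transducer from rapidly toggling when the temperature hovers near a single threshold.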

The first sample that I got did not have any heat sinks on the 12 memory chips or the I/O chip on the graphics card. I volunteered and obtained 13 pieces of Zalman's good quality GDDR memory heat sinks and, to avoid removing the whole Peltier assembly, spent over an hour with a pincer, positioning and fixing these heat sinks on the memory chips - under the Peltier cooler! See the photo with the end result.



Now, I ran both cards on two different CPU configurations - one the Intel "Kentsfield" Core 2 QX6700 running at 3.20GHz with FSB1066, whose four cores proved useful in feeding the GPU for 3DMark06 for now at least (till some actual 4-core optimised games come out), and the old Intel "Presler" Pentium XE 965, running at 4.27GHz with its two cores, again on FSB1066. Both of these are the highest frequencies I could run these CPUs on and reliably complete the CPU portion of 3DMark06 repeatedly without a hitch, even when using Zalman 9700 high-end cooler. The CPUs were using the same platform - Intel D975XBX2 with 1GB of Corsair XMS-5400UL memory, enabling 3-2-2-7 low latency at up to around 670 MHz.

The performance was about the same - after all, Nvidia doesn't allow overclocking right now (feel free to share with us your experiences in overcoming this overclocking barrier here) and the GPU and RAM parts are the same. However, there was a definite difference in the noise. Sparkle claims to have reached over 20 deg C heat benefit (60 C working temperature instead of 80C) and 12 decibels less working noise compared to the reference card. I didn't measure the precise sound or heat levels yet, but the card was substantially quieter and, even after several rounds of 3DMark 06 with and without AA, anisotropics and so on, the huge Peltier block stayed coolish to the touch.

Also, I looked at the power consumption during the 3DMark test runs, and there was only a few watts difference between the reference card and the P880+. Few watts more for far less heat & noise, not to mention some 60 grams less card weight? I guess the deal is good, and will be very good once the overclocking is unlocked. Most importantly, neither card took more watts than the X1900XTX running at 700/1600 speed settings, for way higher performance. In all cases, the system consumption peaked at about 230Watts as measured.

In any case, these guys offer both card versions (the reference design under the Sparkle brand) and the Peltier version under the Calibre brand - also, while the Sparkle version gets the Call Of Duty 2 game CD, Calibre has the more 'unique' Painkiller game bundled along, as a full retail set.

Here are the initial 3D Mark06 results on the Kentsfield and Presler:



As you can see, this GPU really makes the difference between the CPUs, even in plain 3-D graphics. If you want to spend upwards of US$ 700 on a card like this, you better allocate a good sum for the right CPU (quad core QX6700 or QuadFather isn't a bad choice) and, of course, the right cooling and the right mainboard. Since these GPUs exert strong pressure on the CPU, PCI-E subsystem and memory bandwidth consumption, I'd never go with a dual GF8800GTX SLI configuration in a dual PCI-E X8 configuration, like say Intel 975X chipset. It simply has to be the new Nforce 680i (or, if you can make it work, one of those rare ATI RD600 chipset based boards).

The Sparkle Calibre P880+ is worth checking out for any high-end enthusiast keen on the G80 - I like the idea of a Peltier cooler on such a high-end card, though I have yet to see how it performs in long-term operation. Trying it in SLI mode with a heavily overclocked, water-cooled QX6700 on the Nforce 680i, with a much faster FSB and memory subsystem to feed the twin G80s, would also be tempting. But that is for next week. µ

Offline .:Jedi:.

  • Chunin
  • **
  • Posts: 251
  • Chakra -3
    • Xbox, PS3
  • Referrals: 0
    • View Profile
  • CPU: AMD Athlon X2 4800+ @3.1GHz
  • GPU: PNY GeForce 8800GT (740/1850/2260)
  • RAM: 2x1GB GSkill + 2x512MB Kingston Hyper-X
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #69 on: November 08, 2006, 09:09:52 PM »
 

Offline .:Jedi:.

  • Chunin
  • **
  • Posts: 251
  • Chakra -3
    • Xbox, PS3
  • Referrals: 0
    • View Profile
  • CPU: AMD Athlon X2 4800+ @3.1GHz
  • GPU: PNY GeForce 8800GT (740/1850/2260)
  • RAM: 2x1GB GSkill + 2x512MB Kingston Hyper-X
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #70 on: November 08, 2006, 10:18:12 PM »
http://www.nvidia.com/page/home.html

HOLY F**K!!! Let it load, don't take your eyes off the page. Look at the woman......
 

Offline Saxito Pau

  • Global Moderator
  • Akatsuki
  • *
  • Posts: 3848
  • Country: tt
  • Chakra 15
  • Worms will never die!
    • Original NES
  • Referrals: 2
    • View Profile
  • CPU: Intel Core i7-3770
  • GPU: EVGA GTX 970 SC ACX2.0
  • RAM: Crucial Tracer 16GB DDR3-1600
  • BattleNet ID: SaxitoPau#1996
  • Broadband: Flow 60Mbps
  • Steam: Saxito Pau
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #71 on: November 08, 2006, 11:14:14 PM »
I'm sorry but I must say the following:

1. I am DEFINITELY going to enlarge my e- penis next year
2. The GeForce 8800 is F***ing VIAGRA (and then some) for your PC
3. I will import TWO and sell one for like TT$7000 (yeah, frickin Wizz style pricing!)
4. The GP-F**king-U will e-RAPE and e-sexually abuse any Conroe left near it!!

WTMC!!
God is dead.

Offline Arcmanov

  • Administrator
  • Akatsuki
  • *****
  • Posts: 10642
  • Country: tt
  • Chakra 126
  • Gamer/Enthusiast
    • :n64: :gcn: :dreamcast: :xbox360: :computer:
  • Referrals: 1
    • View Profile
    • Arcmanov's rig
  • CPU: Intel Core i7 5820K @ 4.3 GHz
  • GPU: 2 x [Gigabyte] GeForce GTX 1070
  • RAM: 4 x 8GB G.Skill.TridentZ RGB DDR4-2400
  • Broadband: :flow:
  • MBL: LG V20
  • Origin ID: Arcmanov
  • PSN: Arcmanov
  • Steam: Arcmanov
  • XBL: Arcmanov
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #72 on: November 09, 2006, 12:17:18 AM »
One from guru3d

http://www.guru3d.com/article/Videocards/391/

That review is THE most extensive I have EVER SEEN for a graphics card.
For ANY component for that matter.
You will not find a more comprehensive review than this one for a while I'm sure.

The performance numbers are simply STAGGERING.
WELL DONE NVIDIA!!!  A THOUSAND 'Hoshi-Toshis'!!!      *bows and kisses Emperor Nvidia's ring*

ALL HAIL THE NEW KING!!!!  ^_^
Systems United Navy - Accipiens ad Astra


Offline .:Jedi:.

  • Chunin
  • **
  • Posts: 251
  • Chakra -3
    • Xbox, PS3
  • Referrals: 0
    • View Profile
  • CPU: AMD Athlon X2 4800+ @3.1GHz
  • GPU: PNY GeForce 8800GT (740/1850/2260)
  • RAM: 2x1GB GSkill + 2x512MB Kingston Hyper-X
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #73 on: November 09, 2006, 01:28:42 AM »
lol Yep, that review was very well put together. I'm still in awe at how much of a performance jump this card manages! And it's on Newegg.......

http://www.newegg.com/Product/Product.asp?Item=N82E16814150205

http://www.newegg.com/Product/Product.asp?Item=N82E16814143075

soon......
 

Offline Crixx_Creww

  • Akatsuki
  • *****
  • Posts: 9057
  • Country: 00
  • Chakra -12
  • ANBU OF THE HIDDEN VILLAGE FOAK
    • Atari 2600.
  • Referrals: 11
    • View Profile
    • www.crixxcrew.com
  • CPU: Intel Q6600 @3.2 Ghz
  • GPU: Nvidia Xfx geforce 9800GTX+
  • RAM: 8 Gigs Mixed kingston and corsair ddr2
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #74 on: November 09, 2006, 06:34:23 AM »
guru 3d site

Anyone else notice that the 7950 is now a sub-US$300 card!

Offline Prowl

  • AdvancedTactics
  • Jonin
  • *
  • Posts: 503
  • Country: 00
  • Chakra -25
  • Referrals: 0
    • View Profile
  • CPU: i7 3930k
  • GPU: Asus 670
  • RAM: ddr3 2000
  • Broadband: FLOW
  • Origin ID: Acurus
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #75 on: November 09, 2006, 06:40:57 AM »
Overall I'm actually not impressed.

Truthfully, what has me disappointed is all the damn heat and power usage lol.

Huge, overly power-hungry card that is overkill. When they have single-slot-cooler 65nm versions I'll think of switching.

Also, ATI has 2.5x the X1950 as its goal, so the uberness of the 8800 won't last too long; ATI now has clear targets to meet and beat.
Lian LI PC70B
Corsair 850w psu
Asus gtx670 direct cuii
16 gigs ddr3 2000
Asus P9X79 WS PRO
Intel i7 3930k @ 3.9
128 gig ssd boot
1 gig programs
3 gig data
2x 1500 gig sata2 raid 1
Windows 7 x64
SUSE x64
Samsung 21" 215TW Wide screen LCD
Dell U2711 27" wide screen LCD

Offline Crixx_Creww

  • Akatsuki
  • *****
  • Posts: 9057
  • Country: 00
  • Chakra -12
  • ANBU OF THE HIDDEN VILLAGE FOAK
    • Atari 2600.
  • Referrals: 11
    • View Profile
    • www.crixxcrew.com
  • CPU: Intel Q6600 @3.2 Ghz
  • GPU: Nvidia Xfx geforce 9800GTX+
  • RAM: 8 Gigs Mixed kingston and corsair ddr2
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #76 on: November 09, 2006, 06:47:50 AM »
Well for one, the 8800 uses less power and produces less heat than an X1900XTX or whatever the a$$ ATI calls that thing.

At first I, like most people, thought the huge length and width was a bad thing, but thinking about it, that will be the future of cards: take up more real estate and do bigger, better things.

The CPUs are getting smaller and the cards are getting much larger; it's not that bad a trade-off.

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #77 on: November 09, 2006, 08:30:39 AM »
Before I go speechless, lemme say the R600 will prolly pwn G80-based cards when it arrives AND should consume less power, as it's an 80nm part that's actually smaller in transistor count yet so far superior in specs. Anyways, the G80 is still nothing to laugh at... now for the speechless part... pictures are worth THOUSANDS of words... and THOUSANDS these are....



OMFGWDFWTFPWNAGE..... My jaw is still on the page I got those from.. you can see them here

I wonder if there are laws against intertechnology marriage... LMAO!!!! PWNED!!!!

Offline Arcmanov

  • Administrator
  • Akatsuki
  • *****
  • Posts: 10642
  • Country: tt
  • Chakra 126
  • Gamer/Enthusiast
    • :n64: :gcn: :dreamcast: :xbox360: :computer:
  • Referrals: 1
    • View Profile
    • Arcmanov's rig
  • CPU: Intel Core i7 5820K @ 4.3 GHz
  • GPU: 2 x [Gigabyte] GeForce GTX 1070
  • RAM: 4 x 8GB G.Skill.TridentZ RGB DDR4-2400
  • Broadband: :flow:
  • MBL: LG V20
  • Origin ID: Arcmanov
  • PSN: Arcmanov
  • Steam: Arcmanov
  • XBL: Arcmanov
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #78 on: November 09, 2006, 08:55:31 AM »
^^^ Thousands you say.....

I can't find any.  I'm truly speechless.
That system is every PC gamer's wet-dream/fantasy.
Systems United Navy - Accipiens ad Astra


Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
Re: Nvidia GeForce 8800 GTX Details Unveiled
« Reply #79 on: November 09, 2006, 09:00:47 AM »
A review of the MUCH MORE pocket-friendly 8800GTS can be found here and here. This looks to be many a gamer's sweet spot, as it's more affordable and still outperforms the current-gen cards to a great extent... oh, and @crixx, that 'trade-off' you refer to ain't all that balanced, as the power these cards consume is increasing faster than the power savings of the CPUs... so it's only a matter of time before these GPUs have to cut back on power like the CPUs did while still maintaining the increase in performance... after all, WTMC yuh gwon do when that light bill hit yuh from using G80 in SLI with a 1KW PSU?
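The light-bill point can be put in rough numbers. A back-of-the-envelope sketch, where the 450 W sustained system draw, the six hours of gaming a day and the TT$0.37/kWh tariff are all illustrative assumptions rather than measured figures:

```python
def monthly_bill_ttd(load_watts: float, hours_per_day: float,
                     ttd_per_kwh: float, days: int = 30) -> float:
    """Energy cost in TT$ for a sustained electrical load over a month."""
    kwh = load_watts / 1000 * hours_per_day * days
    return kwh * ttd_per_kwh

# Hypothetical figures: ~450 W sustained system draw under SLI load,
# 6 hours of gaming a day, TT$0.37 per kWh (assumed tariff).
print(f"TT${monthly_bill_ttd(450, 6, 0.37):.2f} for the month")  # ~TT$30
```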


