Author Topic: ATI is smug but Nvidia's the bug in the rug  (Read 8983 times)

Offline strongton

  • Chunin
  • **
  • Posts: 309
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« on: August 24, 2004, 09:47:55 AM »
Comment The future looks like Nvidia's

By Charlie Demerjian: Monday 23 August 2004, 13:05
I HAVE A KNACK for spotting trends, and the latest one I've spotted is in the graphics industry. For the last 2-3 years, basically since the first R300 benchmarks came out, it was clear that ATI was in the driver's seat. NVidia was simply not delivering at the high end, or just about anywhere else. The next refresh was more of the same. It looked like ATI was unassailable.

Then a funny thing happened. I was walking around the Intel Developer Forum this spring, talking to people who 'knew' things, and asked them about the new crop of graphics chips. At the time, they were weeks away from coming out: the then-new, and at that point only named by us, X800 and 6800. I was fully expecting them to say that ATI was going to extend its lead, as was the word on the street. A strange thing happened: they did not say that.

To paraphrase Yoda, intrigued I was. Really intrigued. Especially when ATI was first out of the gate, and the numbers showed a little advantage for ATI. I was starting to think I was wrong, and several people were more than kind enough to point this out.

When the chips launched, and real people got their hands on real parts, the story changed a little. Nvidia's response was to tout the features that it had, and ATI was lacking. People pointed and laughed, editorials were written, and the collective tech world failed to grasp what was going on. Who cares about PS3.0 when there are no games for it? Who cares about PS3.0 when MS does not have a version of DX that supports it? Geometry instancing? What is that?
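For anyone actually asking "what is that?": geometry instancing lets one draw call render many copies of the same mesh, each with its own transform. A toy count in Python (illustrative only, not any real graphics API; the function names are made up for this sketch):

```python
# A conceptual sketch of what geometry instancing buys you: instead of one
# draw-call submission per object, the GPU is handed the mesh once plus a
# buffer of per-instance transforms.

def draw_calls_naive(n_objects: int) -> int:
    """One submission per object: CPU overhead scales with object count."""
    return n_objects

def draw_calls_instanced(n_objects: int) -> int:
    """Mesh submitted once; per-instance data rides along in a buffer."""
    return 1

army = 10_000  # e.g. a massive-army battle scene
print(f"naive: {draw_calls_naive(army)} calls, "
      f"instanced: {draw_calls_instanced(army)} call")
```

The win is CPU-side: draw-call overhead stops scaling with the number of objects on screen.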

ATI was rather smug, and pointed to benchmarks like Quake3. Yay, 450FPS if you spend $400 on ATI, and you only get 425FPS if you spend $400 on NVidia. Most of the people didn't really grasp the concept that the best monitors out there only hit 120Hz for a refresh rate, and both cards would more than double that with all the eye candy turned on. The point of another 25FPS in an old game with all the eye candy turned on is.......?
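The refresh-rate point is simple arithmetic: a monitor can only show as many unique frames per second as its refresh rate, so anything the GPU renders beyond that is never seen. A back-of-the-envelope check, using the article's own numbers:

```python
# The player can only see up to the monitor's refresh rate; GPU output
# beyond that is discarded (or torn), so both cards look identical here.

def visible_fps(gpu_fps: float, refresh_hz: float) -> float:
    """Frames per second the player can actually see."""
    return min(gpu_fps, refresh_hz)

refresh = 120  # best monitors of the era, per the article
for card, fps in [("ATI ($400)", 450), ("Nvidia ($400)", 425)]:
    shown = visible_fps(fps, refresh)
    print(f"{card}: renders {fps} FPS, displays {shown}, "
          f"{fps - shown} never seen")
```

Both cards cap out at 120 visible frames per second, so the 25 FPS gap is invisible in practice.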

I was almost ready to believe, but then cracks started to show, and they validated my beliefs: ATI missed the boat, and missed it badly. Don't believe me? How about looking at it from the ATI perspective. ATI knew it was behind, and knew there was nothing it could do for a year. What do you do in this case? Spin, dodge and dance.

You'll have to trust me when I say that it tried, and it tried hard; it danced up the proverbial storm. The funniest jig was when they told me with a straight face that there were situations where PS3.0 made programs slower than PS2.0, which ATI proudly featured. Yep, Ferraris get crappy gas mileage too, but I still would not trade one in for a 1990 vintage Yugo, turbo or not.

Why was ATI so afraid? It knew what was coming, and it was all in Nvidia's favor. On the face of it, ATI has HalfLife2 and NVidia has Doom3. Benchmarks show that Nvidia has a commanding lead in Doom3, bordering on the abusive. ATI has a noticeable lead in HL2, but from what I have seen, not as commanding as the lead Nvidia has in Doom.

Why is this so worrisome? Well, for two reasons: licensing and the future. Licensing is the worse one. Doom3 is an absolutely spectacular engine, head and shoulders above anything else out there. It is scalable, beautiful, has a marquee name, and is saddled with amateurish gameplay. Three out of four ain't bad, and luckily for Nvidia, gameplay is not its problem. In fact, only the first three are relevant to video.

So, one marquee game versus another, it would appear to be a tie. The thing is that the Doom3 engine will be licensed by just about everyone under the sun. The HL2 engine may be, or may not be, but people won't be clamouring for a Newell engine as rabidly as a Carmack one. While the proverbial 'people' may be wrong, they are the ones spending the money.

This leads us to the future. HL2, for all its gameplay, is a last-generation engine. It looks good, very good in fact, but it is a year old, and things like Far Cry and Doom have passed it by. This isn't a criticism, it is just time and evolution. I have been playing CounterStrike:Source for a few days now, and it is very, very nice. Newer engines look nicer though.

In the older engines, HL2, Q3, and others, ATI has a slim advantage. On the newer ones, Nvidia rocks. Which one do you think will be more prevalent in the next few months? Which one would you license if you had to lay the money down? Which one do you think will power the next generation in greater number? Yup, ATI has a problem, and it will only grow worse the further out you look. If you want real comedy, look at the requirements for Longhorn.

But it gets worse. ATI has no PS3.0 part, and never will. The problem isn't that big if you look at it from the perspective of the highest end of the market, the pimped-out Ultra-Platinum-Stupidlyexpensive Edition; those cards make up a minuscule share of the market. They cast a great shadow, but I would be surprised if they made money directly. What they do is sell a lot of mid and low end cards.

Here too, ATI is sucking wind, and has been for quite a while. ATI has a crappy lineup other than the high end. This is nothing new; it started in the last generation. The capabilities of the Nvidia low end lineup hit most of the checkboxes that the high end 5950 cards do. ATI just rebadges the previous generation, and uses its numbering scheme to make it sound like it has something that it doesn't. It is more than enough to fool some consumers, but then again, there are a lot of machines sold with Celerons in them.

The point is not consumers, but OEMs. They are more than smart enough to actually look at the capabilities before they buy chips by the hundreds of thousands. Worse yet, many of them sell to Joe Six-Pack by means of spec sheets more than performance. Nvidia could hit those checkboxes with the last generation, but ATI still can't. With the current generation it gets worse: the high end can't offer the features at all, and the mid and low end get worse from there.

As the games start coming out that use PS3.0, and people start looking for those checkboxes, one company will be there. OEMs know this, and buy accordingly. That is where the money is, and it is where ATI isn't and won't be. The last time someone missed this badly, it took Nvidia about 2 years to recover, and 3DFx never did. I'm not implying anything here, simply stating it; just watch and see.

Time will only make the situation worse, and things are falling into NVidia's lap. If you were at E3 and looked at all the must-have games, you would see that they actually do use the features that Nvidia offers. The biggest display in the largest hall was run by EA, and the game that wowed me the most was the Lord of the Rings game. Think massive army combat, effects, and loud music. It was a showcase for geometry instancing and PS3.0 effects. The new crop of games coming out in a few weeks will tell the story loud and clear.

Add in things that do not fall under the domain of the GPU itself, and it only gets worse. Nvidia is really good at making chipsets and platforms, and the next generation of NForce chipsets has some really nice looking features. ATI is still trying to figure out what the term South Bridge means. I am not trying to kick ATI's chipset efforts here, that would be too easy, no sport there. The crushing superiority of the NForce platform allows Nvidia to do something that ATI can't: tune the chipset for the graphics card to get that extra bit of mileage.

While this alone would probably be enough to give Nvidia the crown, it has a secret weapon. Not a little pistol hidden in their boot, more like a tactical nuclear weapon called SLI. If you go to ATI headquarters, you can still see the confused look on their faces when they first heard the whistling of the incoming warhead.

SLI alone will be enough to make Nvidia own every benchmark under the sun by simply abusive margins. The early benchmarks indicate that it really is everything Nvidia said it would be. ATI's response? Well, nothing at first, I think it realises that HL2 benchmarks won't cut it this time.

It gets worse though. There has been no response at all since the SLI announcement, and this is a classic sign of one side getting caught with its pants down. What we know of the R500 and NV50 is that they will be PS4.0 and DX(insert marketing term here) parts. ATI will never have a PS3.0 part. It probably won't have an SLI part in that generation either. How do I know?

Quite simple actually. When someone notices that they have their pants yanked down like ATI just did, they tend to deflect things to any place but where they are and to any time but now. Standard PR operating procedure here is that ATI would deflect the question, and talk about how even though it is not a big deal, their next gen parts would have it. Piffle, tee-hee, there is no man behind the curtain.

When Nvidia announced its parts, you heard ATI crowing about how the R500 would have SLI also, right? Me neither. None of my sources heard it, and no OEMs are talking about it. Said checkbox will probably be blank until R600 time. That is a long long time to wait while you lose every single benchmark on the planet.

That may make you lose some high end sales, but as I mentioned earlier, those sales really don't amount to all that much. The OEM sales matter a lot, and everything looks to be in Nvidia's favour here as well. Of late, a lot of companies have defected from the Nvidia camp to the ATI camp. I think the OEMs will be quite receptive to the marketing pitch of 'You sell ATI, no SLI'.

If you want to make OEMs more receptive to the enemy's tender advances, there is a nice way to do this, screw your partners. While ATI trumpets the number of PCI Express parts it is selling, the OEMs tell a different story, namely that ATI is selling all that it can make, just not to them. When you are competing with your partners as ATI does, and you have a shortage, as ATI does, you don't screw your friends. ATI did. Really, ask any ATI graphics card manufacturer how good their supply of PCI Express parts are, I did. The responses are uniformly not polite.

So, what do you do if you are ATI, and you are looking at a lineup of parts that missed the boat? You deflect things. The last big FUD campaign was the native vs non-native PCI Express problem. ATI was giving out T-shirts proclaiming how Nvidia didn't have a 'native' PCI Express solution. They were using an ('eww' is intoned here, preferably with a sour look on your face) bridge chip.

So, what does this mean? Well, the benchmarks show that it means nothing at all. Common sense says that when you are not using all the bandwidth that AGP 8x offers doubling it won't add much performance. Adding a little latency obviously didn't hurt much either, all the benchmarks I have seen show no or minuscule differences between the AGP version and the PCI Express flavour of Nvidia products.
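For scale, the nominal numbers behind that argument (spec-sheet peak figures from the published AGP 3.0 and PCI Express 1.0 standards; real-world throughput is lower for both):

```python
# Rough peak-bandwidth comparison. AGP 8x: 8 transfers per clock on a
# 66 MHz, 32-bit (4-byte) bus. PCIe 1.0: 250 MB/s per lane per direction.

agp8x = 8 * 66.67 * 4      # MB/s, ~2133
pcie_x16 = 16 * 250        # MB/s per direction, 4000

print(f"AGP 8x:   ~{agp8x:.0f} MB/s")
print(f"PCIe x16: {pcie_x16} MB/s per direction "
      f"(~{pcie_x16 / agp8x:.1f}x AGP 8x)")
```

So PCI Express roughly doubles the pipe, and if games weren't saturating AGP 8x to begin with, that extra headroom buys little, which is exactly what the benchmarks showed.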

The problem for ATI is that it also shows the same thing for the X800 line. So much for the vast superiority of the 'native' way. It almost seems like Nvidia engineers were focused on doing things that mattered rather than marketing slogan engineering. Hard numbers can be a bitch when you are trying to spin.

Looking out, I see ATI with a minuscule advantage that it is clinging to, desperately hoping no one notices that the train left the station in early August with the release of Doom3. The future belongs to Nvidia right now, and the only hope ATI has is in the R500, but that won't be here for a long time. If it manages to catch up card for card, Nvidia still 'only' has a 2:1 advantage. I know where my money will be going, and an X800 seems like money badly spent if you don't plan on buying a new card every three months. µ


Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #1 on: August 24, 2004, 11:01:23 AM »
LOL, time for the ATI people to get their heads out of the sand

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #2 on: August 24, 2004, 01:34:12 PM »
FACT : no game yet uses shader 3.0 other than FarCry, and in FARCRY it delivers no greater IQ and only a tiiiinnny bit more performance, and still leaves ATI the fastest in FarCry.

FACT : few games will use shader 3.0 until ATI has it

FACT : ATI has an uber powerful 'X-880' which it refused to launch with the X-800s for some unknown reason...

OPINION : i think that they did not launch the X-880 because of manufacturing cost [the X-800 is a lot cheaper to make than any 6800] and that the X-800s were FAST ENOUGH to compete with the 6800s

FACT : ATI has shader 3.0 in its next new chip design, the R500 'Fudo', which will be released early next year

FACT : most games to be released next year do not even fully use shader 2.0 yet. [Call of Cthulhu, Advent Rising etc.]
http://freetrinipoetry.blogspot.com/

Core 2 duo E6600
Asus mobo
Radeon HD 4770
2 gigs DDR2 667 + 2 gigs DDR 800 OCZ

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
ATI is smug but Nvidia's the bug in the rug
« Reply #3 on: August 24, 2004, 03:52:33 PM »
Those are some interesting FACKS.... trinireturnscrewthiscrap (note spelling). The point of that article was never to say that right now Nv1dia is besting @TI, but rather to point out that as time progresses the outlook for ATI diminishes. And I have to agree with them.
Firstly, you're touting F@rcry as the only Shader 3.0 game. Truth is, as with all games development, they use new technology, not old, so games currently being developed will tend towards 3.0, and what platform is available for it? Not @TI.

Second, the X-800 architecture doesn't support it from the get-go, i.e. at the architecture level, so it won't be there anytime soon.

Third, Nv1dia has time to improve their performance and architecture with shader 3.0 and their mpeg encoding and decoding, so by the time there is an R5xx their hardware will be tried and tested.

Fourth, the new 6600 process used by Nv1dia is cheap and provides high yields, thus soon all their cards will work on a similar process, so throw costs out the window.

Fifth, Nv1dia's chipset manufacturing will allow them to tweak boards for their cards; @TI is nowhere close to the level their counterpart is.

Sixth The R5xx will not be out early next year.

Seventh, as I recall, until the R3xx @TI touted their features (All-in-W0nder); now Nv1dia is doing that as their performance is MARGINALLY less. History has a habit of repeating itself.

Eighth, Nv1dia's SLI IS, HAS and WILL keep the performance crown in their hands for a while. Think multiple cores on one card.... right... have u checked how prohibitively EXPENSIVE those MILITARY-grade pieces of hardware are? and who's to say Nv1dia can't do it either? can u spell Dual core GPU in SLI mode?

And all this coming from a man with a R350                    

Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #4 on: August 24, 2004, 04:37:10 PM »
i salute u also :P                    


Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #5 on: August 24, 2004, 06:06:43 PM »
The X-800 has a technology called 3Dc, which in itself may be just as important or more so than shader 3.0 [right now]. 3Dc allows 4 to 1 compression of texture maps, so game developers can include 4x MORE detail without any performance loss, or get a performance gain etc. It doesn't require a skilled artist AND programmer to integrate; just compile special texture formats using the FREE ATI 3Dc compiler.
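For what it's worth, the 4-to-1 figure is consistent with public descriptions of the 3Dc block layout. This sketch assumes the commonly documented format (two channels per 4x4 texel block, compared against normals stored in 32-bit texels), not any official ATI tooling:

```python
# 3Dc compresses two-channel normal maps in 4x4 texel blocks. Per public
# format descriptions, each channel stores two 8-bit endpoint values plus
# a 3-bit interpolation index per texel.

BLOCK_TEXELS = 4 * 4

def channel_block_bytes() -> int:
    endpoint_bits = 2 * 8              # min/max reference values
    index_bits = BLOCK_TEXELS * 3      # 3-bit index per texel
    return (endpoint_bits + index_bits) // 8   # 8 bytes per channel

compressed = 2 * channel_block_bytes()   # X and Y channels: 16 bytes/block
uncompressed = BLOCK_TEXELS * 4          # 32-bit texels: 64 bytes/block
print(f"{uncompressed} bytes -> {compressed} bytes "
      f"({uncompressed // compressed}:1)")
```

So the "4 to 1" holds against uncompressed 32-bit normal maps; against a bare two-channel 8-bit map it would be 2:1.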

FACT : no one needs a card with shader 3.0 NOW since NOTHING uses it and NOTHING will use it for a good while

FACT : the X-800 and 9800s can EMULATE advanced pixel shaders using their 'F-buffer', so they will be able to run most Shader 3.0 games in a kinda mixed mode.

FACT : the 6800 series is better if you plan to keep the card well into 2006 [when shader 3.0 will start to be dominant], but the 6800s are more expensive to produce for Nvidia.

FACT : ATI has more than 90% of the PCIe market for OEMs; they have already shipped 1 million PCIe cards in the space of a few months. ATI PCIe products are cheaper and much faster than NVIDIA cards, and will continue to be cheaper and faster until the PCX 6600s come out.. but the deals have already been made for ATI to supply those products... making more money for ATI = more R&D = better ATI products down the line.

YES i admit it, Nvidia is better right now [if u have an expensive PSU, and can take an extra 90 watts of heat in your case], but in 4-6 months time that will change.

Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #6 on: August 24, 2004, 06:33:17 PM »
they say 1 million shipped, but how many sold? OEMs hate to have a backlog of stock

also, you're contradicting yourself by saying nothing uses ps 3.0 when u said FarCry uses it above

no game uses 3Dc as yet; when the FarCry 1.2 patch comes out it will be the first, and then we'll see what it can do in a real game engine

emulation is always slower than hardware

Offline Crixx_Creww

  • Akatsuki
  • *****
  • Posts: 9057
  • Country: 00
  • Chakra -12
  • ANBU OF THE HIDDEN VILLAGE FOAK
    • Atari 2600.
  • Referrals: 11
    • View Profile
    • www.crixxcrew.com
  • CPU: Intel Q6600 @3.2 Ghz
  • GPU: Nvidia Xfx geforce 9800GTX+
  • RAM: 8 Gigs Mixed kingston and corsair ddr2
ATI is smug but Nvidia's the bug in the rug
« Reply #7 on: August 25, 2004, 01:13:45 AM »
the nvidia cards also have 4 to 1 compression, plus they have ps3
so what's your point?

ati is tweaking the hell out of old technology to keep cost down
they are in essence shortchanging consumers

nvidia is most definitely looking ahead
a wonderful strategy, because gamers want to have a card that can play games now
and be able to play games in the future
instead of upgrading every 2 or 3 quarters

Offline vinion2000

  • Genin
  • *
  • Posts: 211
  • Country: tt
  • Chakra 1
    • Xbox 360
  • Referrals: 0
    • View Profile
  • Broadband: Flow
  • PSN: TakeBusHead
  • XBL: TakeBusHead
ATI is smug but Nvidia's the bug in the rug
« Reply #8 on: August 25, 2004, 01:25:33 AM »
personal opinion.... i never like looking at benchmarks and rantings since, as the article said, some of those frame rates amount to squat when using other parts. yet i guess they're good references. i value real-life performance. ATI vs Nvidia is as bitter as AMD vs Intel. frankly, when one company is pushing the envelope and charging you for it, it's always going to end up a big argument. ive used many cards on this system of mine and frankly ATI has always left a bitter after-taste.
ive always gotten better overall enjoyment value (sum of speed, quality and ease of use) from Nvidia. it always sucked being an ATI user. games were never really compatible unless you got a new driver update. they were quirky and some games didnt support your card outright, eg Metal Gear Solid 2. im pretty sure ATI card holders felt bummed having to wait for a Doom3 speed-up driver. those sorts of things never really happen to Nvidia users. ive never really felt a need to upgrade drivers beyond knowing their value in the long run.
as for OEMs, as far as i know most ppl supply Nvidia parts. the PCI Express explosion IMO is a temporary necessity. ATI drivers are bitches and most of the ppl ive worked with tend not to supply them for that reason alone (customer service is a bigger bitch). ppl always like cheering the underdog, but frankly when it comes to money being spent you know where i stand.
If I enjoy eating chicken does that make me a stereotype

or a fat bastard?


Offline rumbelly

  • Genin
  • *
  • Posts: 10
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #9 on: August 25, 2004, 11:04:55 AM »
toms hardware -

NVIDIA and ATI are to the graphics card segment what Intel and AMD are to the CPU market. In the past, NVIDIA was the unchallenged number one. However, ATI has caught up with its products and also managed to put itself in a better position. No one would have thought ATI capable of it just two years ago. We watched NVIDIA slowly turning into a de facto monopoly - just like Intel. But things changed rapidly. ATI's market capitalization is now up to $3.4 billion, whereas NVIDIA has to be happy with $1.6 billion.


thas wah yuh call .. LIX IDMC !!
heh

and that talk bout nvidia having "more features"
nvidia jus using that as a marketing ploy for fellas like alyuh who is nvidia hoes

whats the point in spendin $400 now on a card that gives inferior performance

when ps3 would only be needed sometime nex year, when the card will be cheaper anyway?

you guys go spend $400 on an nvidia now for ps3, i will buy my ATI with ps3 or higher nex year for $250 WHEN IT WILL BE ACTUALLY USED

steups                    

Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #10 on: August 25, 2004, 11:20:40 AM »
lol this is a mad man. the 6600, which starts at $150, has ps 3.0. Market capitalization means nothing; profit means everything. i can have a market cap of 90% but not be making a damn cent. Inferior performance only in hl2; everything else says that the 6800 or 6600 is the better buy

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #11 on: August 25, 2004, 01:34:26 PM »
Quote
the nvidia cards also have 4 to 1 compression, plus they have ps3
so what's your point?

ati is tweaking the hell out of old technology to keep cost down
they are in essence shortchanging consumers

nvidia is most definitely looking ahead
a wonderful strategy, because gamers want to have a card that can play games now
and be able to play games in the future
instead of upgrading every 2 or 3 quarters


this is 4:1 LOSSLESS compression. in Doom III the only diff between Ultra and Medium is that Medium uses 4:1 DXT5 compression that LOSES detail. If DIII had 3Dc, ATI cards would be a lot faster than they are now. The next FarCry 1.3 patch is supposed to include 3Dc, and we will see how much it increases performance with NO loss to IQ.
Nvidia still has no hardware support for N-patches up to now [TruForm]; if they DID, more games would support it, and more games would have higher poly counts without much loss in performance.
Nvidia over-emphasises the importance of what little technologies they acquire, eg. UltraShadow II is what they say gives them the edge over ATI in Doom III. in other shadow-intensive games like Thief and Splinter Cell, ATI takes the lead; the only reason why Nvidia is faster in Doom III is that Carmack's code likes Nvidia. if any of you have heard of the 'humus' tweak, it boosts ATI performance by 20% in DIII by just changing 2 lines of code. This kind of thing, where one card is faster not because it's better but because the code likes it, is supposed to be fixed with the release of DirectX 10 and shader 4.0; by then a 'unified shader architecture' will be introduced, which means that both ATI and Nvidia will have the same rendering engine on their cards, which gives programmers and artists more freedom without having to worry about whether or not X card will play the game well.
ATI still has the lead when it comes to features: TruForm, F-buffer, Fullstream, 3Dc, SmartShader etc.

 
Quote
everyone is blowing PS3.0 out of proportion. the only game that will use PS 3.0 to an extent will be Unreal III in 2006, which needs 1 gb of ram, a 512mb video card and a 4ghz processor for you to get the full experience. no card that Nvidia or ATI has comes with 512mb of RAM yet; buying a 128mb card for PS 3.0 is like buying an Athlon FX and only 256mb of ram



Whats the point of going out of your way for a card with sm3.0 when you're not going to get the full experience when it matters anyhow?? it would be better to get a 512mb ATI X-800 based card with 2.0; ATI's cards will be able to emulate the more popular features of SM3.0 using the F-buffer, so their bases are covered.

By the time Nvidia starts putting 512mb AND sm3.0 on their cards, ATI will be ready to release their SM3.0 cards. ATI has always had the best shading architecture, so it will be better to wait for ATI sm3.0 anyhow

Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #12 on: August 25, 2004, 05:02:27 PM »
there is an nvidia card with 512mb GDDR and ps 3.0 (Quadro 4400), granted they are hugely expensive, but u still wrong (both of u). emulation will always be slower than hardware; the F-buffer will require multiple passes to do ps 3.0, ie slowdown. end of story.

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #13 on: August 25, 2004, 08:26:18 PM »
true: emulation will be slower, but it will still work, and it's not THAT much slower... you'd only lose 10% performance, like the benchmarks in 3dmark where the Geforce 3 had to emulate shader 1.4 using 2 passes.
And what makes you think that the thousand-dollar Quadro FX 4400 will be approaching the affordability of the 6600 or X-700 anytime soon??? the 6600 will be a good card, but Nvidia has no real 'edge' with sm3.0; it's just another semi-useless feature like TruForm. until late 2005/2006 it won't be needed, and by then 128mb of ram will not be able to hold all the textures in games [it can't with DIII NOW], but if ANY game were to enable 3Dc, a 128mb card will be able to hold 512mb of textures with NO loss in IQ, or hold less than 512 and get a performance boost.
You don't NEED SM3.0 for Unreal III; it will work on any DX9 card like a 9600. It's basically DIII with higher res textures, SM3.0 and HDR lighting.
When FarCry 1.3 comes out with HDR and 3Dc, ATI will take a frightening lead in performance, and i say again: WILL

Offline rumbelly

  • Genin
  • *
  • Posts: 10
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #14 on: August 26, 2004, 12:21:03 AM »
no use arguin with fan boys yes

say wha, alyuh buy nvidia

good luck   :)                    

Offline W1nTry

  • Administrator
  • Akatsuki
  • *****
  • Posts: 11329
  • Country: tt
  • Chakra 109
  • Referrals: 3
    • View Profile
  • CPU: Intel Core i7 3770
  • GPU: Gigabyte GTX 1070
  • RAM: 2x8GB HyperX DDR3 2166MHz
  • Broadband: FLOW
  • Steam: W1nTry
  • XBL: W1nTry
ATI is smug but Nvidia's the bug in the rug
« Reply #15 on: August 26, 2004, 09:41:58 AM »
What about the fact that you will be able to place 2 6600GTs in SLI mode and the total cost may very well be below 400US? And as far as SLI being relegated to the expensive server market, the onset of Nv1dia's Nf0rce 4 will rectify that problem, because the same person who buys an SLI rig is the same person that would contemplate paying for a dual core @TI card at a higher price. One more thing: Emulation is, has and always will be slower. Tell me something, as nice as 3dMark and all these synthetic benchmarks are, till the features are tested in the games they run in (UT2004 botmatch etc), the actual in-game performance will vary from system to system even with similar hardware. So ur 10% claim I suspect will change in time. And before I forget, what is up with @TI and their DAMN 24MB driver and control panel files???? Nvidia has like a 13MB file that includes nView etc, while I have to download a Driver, a Control Panel, and if I want the nifty window effect I have to download Hydravision. I think that a company who can write all that into one 13MB file has a slight edge in drivers, wouldn't you say?

Offline unimatrix001

  • Genin
  • *
  • Posts: 211
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #16 on: August 26, 2004, 12:43:51 PM »
ATI and nVidia, always at war :(

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #17 on: August 26, 2004, 02:40:34 PM »
Quote
What about the fact that you will be able to place 2 6600GTs in SLI mode and the total cost may very well be below 400US? And as far as SLI being relegated to the expensive server market, the onset of Nv1dia's Nf0rce 4 will rectify that problem, because the same person who buys an SLI rig is the same person that would contemplate paying for a dual core @TI card at a higher price. One more thing: Emulation is, has and always will be slower. Tell me something, as nice as 3dMark and all these synthetic benchmarks are, till the features are tested in the games they run in (UT2004 botmatch etc), the actual in-game performance will vary from system to system even with similar hardware. So ur 10% claim I suspect will change in time. And before I forget, what is up with @TI and their DAMN 24MB driver and control panel files???? Nvidia has like a 13MB file that includes nView etc, while I have to download a Driver, a Control Panel, and if I want the nifty window effect I have to download Hydravision. I think that a company who can write all that into one 13MB file has a slight edge in drivers, wouldn't you say?
 

you don't have to download all 24MB of driver; that includes the control panel AND video capture AND the new driver. if you go to the download page you are given the option to download only the video driver [which is 8 megs :P]. the control panel stays the same for like 10 versions of the driver, so you don't need it if you already have it :P.
you only have to dload 8 megs with ati as long as you have ANY control panel installed, so i guess ati wins? :P :P :P

size doesn't matter... don't come with that argument..

wait.. size DOES matter... when the nvidia card eats a pci slot and can't fit in some cases :P :P :P :P
http://freetrinipoetry.blogspot.com/

Core 2 duo E6600
Asus mobo
Radeon HD 4770
2 gigs DDR2 667 + 2 gigs DDR 800 OCZ

Offline coldstorm

  • Jonin
  • ***
  • Posts: 567
  • Chakra 0
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #18 on: August 26, 2004, 03:49:29 PM »
By next year we'll have high-end consumer cards with 512MB of memory. Also, whatever they may say, 3Dc has losses: it's lossy compression, so some quality is given up somewhere. And 3Dc takes up processing time :P
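To see why the loss exists: a quick sketch (illustrative, not the exact hardware bit packing) of how a 3Dc-style block compressor handles one 4x4 block of a single channel. It stores just two 8-bit endpoints plus a 3-bit index per texel, so every texel snaps to one of eight palette values, and the round trip is not exact.

```python
# 3Dc/BC4-style compression of one 4x4 texture block, single channel.
# Two endpoints + 3-bit indices means each texel is quantized to one of
# 8 interpolated values -- hence the format is lossy by construction.

def compress_block(texels):
    """Pick min/max endpoints and the nearest 3-bit palette index per texel."""
    lo, hi = min(texels), max(texels)
    palette = [lo + (hi - lo) * i / 7 for i in range(8)]
    indices = [min(range(8), key=lambda i: abs(palette[i] - t)) for t in texels]
    return lo, hi, indices

def decompress_block(lo, hi, indices):
    palette = [lo + (hi - lo) * i / 7 for i in range(8)]
    return [palette[i] for i in indices]

block = [12, 200, 57, 33, 90, 91, 92, 94, 10, 250, 128, 64, 77, 5, 180, 220]
lo, hi, idx = compress_block(block)
restored = decompress_block(lo, hi, idx)
max_error = max(abs(a - b) for a, b in zip(block, restored))
print(max_error > 0)  # True: the round trip loses information
```

The payoff is size: one block drops from 16 bytes per channel to 8, which is why the fps numbers further down the thread improve even though a little precision is lost.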

Offline TrinireturnofGamez

  • AdvancedTactics
  • Akatsuki
  • *
  • Posts: 3458
  • Chakra 4
  • Referrals: 0
    • View Profile
ATI is smug but Nvidia's the bug in the rug
« Reply #19 on: August 26, 2004, 06:19:10 PM »
http://esprit.campus.luth.se/~humus/3D/index.php < has a lot of programs that make good screensavers too... if you have the hardware to run it!


Work of an ATI employee, the same one who found the DIII tweak:
Quote
This demo illustrates the use of the new 3Dc texture compression format, which is particularly suitable for normal map compression. It lets you compare quality between 3Dc and DXT5 normal maps, and it lets you compare the performance of using 3Dc and DXT compression over using uncompressed textures.

The performance increase of 3Dc and DXT is well worth the effort. Some benchmark numbers:

No compression: 125fps
3Dc: 146fps (+17%)
DXT: 136fps (+9%)
3Dc & DXT: 158fps (+26%)

That's with a fairly advanced shader, and overhead for the shadowmap, which moves a lot of the workload to where textures aren't used. Without shadows the difference is even larger:

No compression: 164fps
3Dc: 210fps (+28%)
DXT: 195fps (+19%)
3Dc & DXT: 239fps (+46%)

Quality-wise the DXT5 is often usable, but in some situations it just won't cut it. 3Dc on the other hand gives very good quality for all normal maps I've tried.


He calls the current standard, DXT, 'usable' and 3Dc 'very good'. Even if there IS loss, the quality is far superior to what we're used to now... and it's faster :P                    
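Two things from the quote can be sketched quickly. 3Dc stores only the X and Y of each tangent-space unit normal; the shader rebuilds Z as sqrt(1 - x^2 - y^2). And the percentage gains quoted above follow directly from the fps figures (e.g. 146 vs 125 is about +17%). A small sketch, assuming unit-length normals with Z pointing out of the surface:

```python
import math

def reconstruct_z(x, y):
    """Rebuild the dropped Z component of a unit tangent-space normal."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# A unit normal (0.6, 0.0, 0.8): store only (0.6, 0.0), recover Z.
z = reconstruct_z(0.6, 0.0)
print(round(z, 3))  # 0.8

def gain(fps, baseline):
    """Percentage speedup over the uncompressed baseline, rounded."""
    return round(100 * (fps / baseline - 1))

# Matches the figures quoted above: +17% and +26% over 125fps.
print(gain(146, 125), gain(158, 125))  # 17 26
```

Dropping Z is what makes a two-channel format like 3Dc a natural fit for normal maps in the first place: both remaining channels get full precision instead of sharing a block with a throwaway component.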
http://freetrinipoetry.blogspot.com/

Core 2 duo E6600
Asus mobo
Radeon HD 4770
2 gigs DDR2 667 + 2 gigs DDR 800 OCZ
