PAX East 2010: NVidia Reveals Its Latest Tiny God of GPUs

bawkbawkboo1

New member
Nov 20, 2008
256
0
0
According to the Wikipedia page, these are made on a 40nm manufacturing process; does anyone know if that can be confirmed? I wish they could have gone to 32nm, which might have reduced the heat output and the absurdly high power consumption mentioned in all the articles...
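A rough back-of-envelope, assuming dynamic power scales as C·V²·f (all numbers below are illustrative guesses, not measured specs):

```python
# Toy model: dynamic power P ~ k * C * V^2 * f.
# Every figure here is an illustrative guess, not a measured spec.
def dynamic_power(cap_rel, volts, freq_ghz, k=250.0):
    """Relative dynamic power in watts (k just scales the units)."""
    return k * cap_rel * volts ** 2 * freq_ghz

p_40nm = dynamic_power(cap_rel=1.0, volts=1.00, freq_ghz=0.70)  # hypothetical 40nm part
p_32nm = dynamic_power(cap_rel=0.8, volts=0.95, freq_ghz=0.70)  # same design after a shrink

print(round(p_40nm), round(p_32nm))  # 175 126: the shrink saves roughly 28% in this toy model
```

Real shrinks rarely deliver the full theoretical gain (leakage gets worse at smaller nodes), but it shows why people were hoping for 32nm.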
 

Signa

Noisy Lurker
Legacy
Jul 16, 2008
4,749
6
43
Country
USA
Agayek said:
Signa said:
Well, this is what I've been waiting for. I'm still running an old 8800 GT, and I wanted to see what Nvidia was bringing to the table before I jumped on a DX11 card. I'll probably have to hold off for a while because my job status has been tentative, and my 8800 is running everything at full settings at 1080p with little fuss. It's hard to justify a $300 price point for that kind of upgrade.
You have no idea how ecstatic I would be if the 480 was only $300.
I was referring to a guesstimate of what the 470 would cost. Top-of-the-line cards are always out of my reasonable price range, at least until the next generation comes out. Hell, I might wait just that long, because of what I wrote in my first post.
 

Callate

New member
Dec 5, 2008
5,118
0
0
On one hand, I have yet to run across an app that my current GTX 280 doesn't run swimmingly.

On the other hand... Drool....
 

JediMB

New member
Oct 25, 2008
3,094
0
0
John Funk said:
GFX 480/470
GFX 480/470 GPU
The GFX 480 and GFX 470
Uh-oh!

Well, typos aside, it sounds like a pretty awesome GPU. Would probably run Arkham Asylum's Scarecrow world pretty smoothly with everything (including AA and PhysX) maxed.
 

Azhrarn-101

New member
Jul 15, 2008
476
0
0
Yay, 10% extra performance for the GTX 480 over ATI's HD 5870 (not even the 5970), at twice the price. Clearly it would be useless if nVidia didn't pay developers to use PhysX.
Much higher power consumption too (almost 120 watts more per card compared to the 5870).

In other words, ATI still has the best bang for your buck, and has had since last year! Eyefinity support on a single card, unlike nVidia's required SLI setup for nV Surround (since their cards only have two video outputs each).
 

Weaver

Overcaffeinated
Apr 28, 2008
8,977
0
0
In terms of pure numbers, Henry compared the new 480 to its predecessor, the GTX 285 - "hands down, it was the best GPU on the marketplace, and something we were very proud of."
I stopped trusting the speaker right there and then.
 

Myoukochou

Black Butterfly
Apr 1, 2009
46
0
0
Be careful.

It's very fast for compute tasks, so if you're doing that, it could be interesting. For gaming, it's got a good minimum framerate, but it is not groundbreaking and ATi is in very close competition, with months of lead time (and still better AF quality). Honestly I expected better for this delay, but that's all it is; a delay, competing alongside ATi's 5xxx series, not necessarily solidly ahead of them.

Also, it's not finished: yield on TSMC 40G-GL is terrible, so they are paper launching, and only a couple of tens of thousands will be made in the first batch, I expect. Worse still, they can't make enough complete GF100s to launch the full Fermi card; these are both cut-down. The 480 is NOT the top of the line; they actually can't launch at the top, because they can't make enough of them well enough.

But even this cut-down Fermi is the hottest, loudest, hungriest single-GPU card around. That silver finned bit is the top of a live, potentially 105°C (and probably 90°C) heat-piped Heatsink of Doom™. Be careful if you touch this card; it will brand you. Seriously. Hook up some watercooling and you could make tea with it. (And it isn't even fully enabled!) Run it in SLI and you will want aftermarket cooling somehow, because you will be sucking well over 800 watts and really shouldn't put the cards next to each other (but you may have to, depending on the motherboard).
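That 800-watt figure is easy to sanity-check on a napkin; the numbers below are guesses of mine, not measurements:

```python
# Rough wall-socket estimate for a two-card SLI rig.
# TDP, system draw, and PSU efficiency are all assumptions for illustration.
gpu_board_power_w = 250   # per-card draw under load (approximate)
rest_of_system_w = 200    # CPU, motherboard, drives, fans (guess)
psu_efficiency = 0.82     # typical PSU of the era under heavy load

dc_load_w = 2 * gpu_board_power_w + rest_of_system_w  # 700 W at the rails
ac_draw_w = dc_load_w / psu_efficiency                # what the wall meter sees

print(round(ac_draw_w))  # 854: comfortably "well over 800 watts"
```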

Note no-one has even tried overclocking. Or re-enabling that disabled core (if it's possible in software). Wouldn't be surprised if the envelope is not only pushed, but already bulging on this one.

And why no DisplayPort? (We still have a VSync? Where's the dynamic update support from anyone, anyway?)

Makes me worry about the reliability. Bad yield on the process necessitating a release with features disabled, hot temperatures, high power? I just don't know. It doesn't add up to an encouraging picture. If you want one, get a VERY good warranty on this puppy; don't accept less than 3 years, and you definitely want the warranty to cover you until you actually replace it.

That said, there's no doubt it's a good card, alongside the top of ATi's best. But it's not a killer. And even if I had the cash, I would not want to be an early adopter on this one.

Read the reviews thoroughly first.
 

dochmbi

New member
Sep 15, 2008
753
0
0
The Radeon 5970 is faster, and the 5870 is only slightly slower but much better value. I was expecting more from NVIDIA, since they are about half a year late.
 

SimuLord

Whom Gods Annoy
Aug 20, 2008
10,077
0
0
"Who says PC gaming is dead?" Nobody. But graphics-intensive, "hey look, cool" PC gaming? That is on life support, with EA and Ubi hooking up the Kevorkian machine with DRM.

But low-requirement, value-priced games that appeal to customers and not pirates? Paradox and Stardock await thee.

Bad sign for nVidia and ATI. Bad sign for EA, Ubi, 2K, and Sega. Great sign for PC gaming.
 

Ralackk

New member
Aug 12, 2008
288
0
0
bawkbawkboo1 said:
According to the Wikipedia page, these are made on a 40nm manufacturing process; does anyone know if that can be confirmed? I wish they could have gone to 32nm, which might have reduced the heat output and the absurdly high power consumption mentioned in all the articles...
The couple of independent reviews I read claim it's 40nm as well, so I'm fairly sure that's what it is. I think I'll pass on this card and see what the future holds; as it stands, all the high-end cards run way too hot for air cooling at the moment, for my taste.
 

Azhrarn-101

New member
Jul 15, 2008
476
0
0
Ralackk said:
bawkbawkboo1 said:
According to the Wikipedia page, these are made on a 40nm manufacturing process; does anyone know if that can be confirmed? I wish they could have gone to 32nm, which might have reduced the heat output and the absurdly high power consumption mentioned in all the articles...
The couple of independent reviews I read claim it's 40nm as well, so I'm fairly sure that's what it is. I think I'll pass on this card and see what the future holds; as it stands, all the high-end cards run way too hot for air cooling at the moment, for my taste.
The first releases are 40nm because, while TSMC's 40nm yields are bad, the 32nm yields were even worse. They can barely meet demand for 40nm chips, with both ATI and nVidia gobbling up all available fab capacity, let alone perfect the 32nm process the Fermi core was designed for.

As it stands, you're looking at about a 10-15% performance increase over an HD 5870 with a GTX 480, but at the cost of an extra ~120 watts of power draw and the heat output of a small sun.
All at a significant price premium compared to the far more mature HD 5870 cards.

In other words, if you must have an nVidia card, wait until they release the proper 32nm Fermis. These 40nm monsters are mainly there to recoup the extra cost of getting to 32nm circuitry. For all other cases, the 5870 is more than enough power up to 1920x1200; above that, the 5970 is a great choice. All at a much nicer price point than the GTX 400 series.
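Putting those numbers together, performance per watt is the clearest way to see it. The 5870 baseline load draw below is an assumption of mine; the rest follows from the 10-15% and ~120 W figures:

```python
# Performance-per-watt comparison, using ~12.5% (roughly the midpoint of 10-15%)
# extra performance and ~120 W extra draw for the GTX 480 over an HD 5870.
# The 5870 baseline load draw is an assumption for illustration.
hd5870_perf, hd5870_watts = 1.000, 190
gtx480_perf = 1.125                      # +12.5% over the 5870
gtx480_watts = hd5870_watts + 120        # per the ~120 W gap

ppw_5870 = hd5870_perf / hd5870_watts
ppw_480 = gtx480_perf / gtx480_watts

print(round(ppw_480 / ppw_5870, 2))  # 0.69: the 480 does ~30% less work per watt
```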
 

Zer_

Rocket Scientist
Feb 7, 2008
2,682
0
0
Azhrarn-101 said:
Ralackk said:
bawkbawkboo1 said:
According to the Wikipedia page, these are made on a 40nm manufacturing process; does anyone know if that can be confirmed? I wish they could have gone to 32nm, which might have reduced the heat output and the absurdly high power consumption mentioned in all the articles...
The couple of independent reviews I read claim it's 40nm as well, so I'm fairly sure that's what it is. I think I'll pass on this card and see what the future holds; as it stands, all the high-end cards run way too hot for air cooling at the moment, for my taste.
The first releases are 40nm because, while TSMC's 40nm yields are bad, the 32nm yields were even worse. They can barely meet demand for 40nm chips, with both ATI and nVidia gobbling up all available fab capacity, let alone perfect the 32nm process the Fermi core was designed for.

As it stands, you're looking at about a 10-15% performance increase over an HD 5870 with a GTX 480, but at the cost of an extra ~120 watts of power draw and the heat output of a small sun.
All at a significant price premium compared to the far more mature HD 5870 cards.

In other words, if you must have an nVidia card, wait until they release the proper 32nm Fermis. These 40nm monsters are mainly there to recoup the extra cost of getting to 32nm circuitry. For all other cases, the 5870 is more than enough power up to 1920x1200; above that, the 5970 is a great choice. All at a much nicer price point than the GTX 400 series.
Furthermore, from what I've heard, nVidia cards take a much more significant performance hit with hardware tessellation. nVidia needs to change their goddamn business model if they expect to win back the average gamer. And considering the 400 series' massive power consumption, I wouldn't be surprised if many enthusiasts went for the HD5xxx series instead. The 5000 series overclocks very nicely in most cases, I dread what you end up with if you try overclocking the GTX400s...
 

Ralackk

New member
Aug 12, 2008
288
0
0
Zer_ said:
Furthermore, from what I've heard, nVidia cards take a much more significant performance hit with hardware tessellation. nVidia needs to change their goddamn business model if they expect to win back the average gamer. And considering the 400 series' massive power consumption, I wouldn't be surprised if many enthusiasts went for the HD5xxx series instead. The 5000 series overclocks very nicely in most cases, I dread what you end up with if you try overclocking the GTX400s...
Probably a heap of slag and silicon where your computer used to be.
 

Azhrarn-101

New member
Jul 15, 2008
476
0
0
Zer_ said:
Furthermore, from what I've heard, nVidia cards take a much more significant performance hit with hardware tessellation. nVidia needs to change their goddamn business model if they expect to win back the average gamer. And considering the 400 series' massive power consumption, I wouldn't be surprised if many enthusiasts went for the HD5xxx series instead. The 5000 series overclocks very nicely in most cases, I dread what you end up with if you try overclocking the GTX400s...
I've heard pretty much the opposite: that the hardware tessellation on the GTX 480 is a bit better than on the HD 5870. But either way, these cards are aimed at enthusiasts only, and they are probably lousy overclockers given their huge power draw and heat output.
 

jamesworkshop

New member
Sep 3, 2008
2,683
0
0
Azhrarn-101 said:
Zer_ said:
Furthermore, from what I've heard, nVidia cards take a much more significant performance hit with hardware tessellation. nVidia needs to change their goddamn business model if they expect to win back the average gamer. And considering the 400 series' massive power consumption, I wouldn't be surprised if many enthusiasts went for the HD5xxx series instead. The 5000 series overclocks very nicely in most cases, I dread what you end up with if you try overclocking the GTX400s...
I've heard pretty much the opposite: that the hardware tessellation on the GTX 480 is a bit better than on the HD 5870. But either way, these cards are aimed at enthusiasts only, and they are probably lousy overclockers given their huge power draw and heat output.
You should see the SLI scaling.




[benchmark chart: GTX 480 SLI scaling vs. a single 480]