New Nvidia Chip Features 3 Billion Transistors

Aedes

New member
Sep 11, 2009
566
0
0
...3 billion?!
Like, I must say, after 100 million, a few zeros more or less are pretty much the same in my mind.

But holy shit! 3 billion! o_O
 

IrrelevantTangent

New member
Oct 4, 2008
2,424
0
0
It's over NINE MIIIILLLLIONNNNN! Sorry. Bad joke.

But in all sincerity, how expensive is a piece of electronics like that going to be? I'm sure its processing power is through the proverbial roof.
 

Signa

Noisy Lurker
Legacy
Jul 16, 2008
4,749
6
43
Country
USA
Fayathon said:
Omikron009 said:
400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
Is it bad that I read that in Marcus' voice?

OT: That's impressive.
I did that too before realizing I was reading a Marcus quote!
 

300lb. Samoan

New member
Mar 25, 2009
1,765
0
0
Omikron009 said:
400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
If it took more than one pass to render all 300 trillion texels, then it wasn't an Nvidia!

When do they start packing in the 3D goggles and transmitter? I'll save all my money to buy a 120Hz TV just so I can watch Avatar on Blu-ray and play Team Fortress 2 in "never-leave-the-house"-o-vision!

PS: Hey Santa, I want one of these in my Xbox!
 

ratix2

New member
Feb 6, 2008
453
0
0
No offense, but you're about three months late with this news. Nvidia announced a lot of this shortly after the Radeon HD 4800s came out.

They also announced that the first boards WOULDN'T be for the consumer sector, but rather for the workstation sector, an area where Nvidia doesn't have a lot of market share due to its limited line of products compared to its competitors.
The_Oracle said:
It's over NINE MIIIILLLLIONNNNN! Sorry. Bad joke.

But in all sincerity, how expensive is a piece of electronics like that going to be? I'm sure its processing power is through the proverbial roof.
Likely not more than around $600. Once these things get to production they aren't very expensive to make; it's recouping the R&D costs that makes them cost so much. But that aside, graphics cards are cheaper than they've ever been.
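For a rough sense of those economics, here's a back-of-the-envelope sketch; every number in it is a made-up assumption, purely for illustration:

```cpp
#include <cstdio>

int main() {
    // Every number here is a made-up assumption, purely for illustration.
    double rd_cost    = 1.0e9;  // assumed R&D spend for the whole architecture
    double unit_cost  = 120.0;  // assumed marginal cost to build one board
    double units_sold = 5.0e6;  // assumed lifetime unit sales

    // Each board has to carry its share of the R&D bill on top of what it
    // actually costs to manufacture.
    double break_even = unit_cost + rd_cost / units_sold;
    std::printf("break-even price per board: ~$%.0f\n", break_even);
    // ~$320 with these numbers; the more units sold, the smaller the R&D
    // share per board, which is why prices fall once volume ramps up.
    return 0;
}
```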
 

grimsprice

New member
Jun 28, 2009
3,090
0
0
Omikron009 said:
400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
Remember! If it took more than one graphics card, you weren't using an Nvidia.
 

Beltaine

New member
Oct 27, 2008
146
0
0
If we weren't talking about a new generation of DirectX here, I'd be happy.

As it is, GTX 295s will finally drop to a reasonable price, but still be behind the tech curve.

32x AA? Why? So I can lose 30% of my framerate and gain 0.003% more fuzziness?

Sometimes I think they just arbitrarily invent new "features" with bigger numbers than before, just because they can. Remember the 16/32/64-bit console wars?
 

LoopyDood

New member
Dec 13, 2008
410
0
0
Whew, can't wait to see the performance of these things compared to ATI's 5800 series, or ATI's response. Price drops all around!
 

IrrelevantTangent

New member
Oct 4, 2008
2,424
0
0
ratix2 said:
The_Oracle said:
It's over NINE MIIIILLLLIONNNNN! Sorry. Bad joke.

But in all sincerity, how expensive is a piece of electronics like that going to be? I'm sure its processing power is through the proverbial roof.
Likely not more than around $600. Once these things get to production they aren't very expensive to make; it's recouping the R&D costs that makes them cost so much. But that aside, graphics cards are cheaper than they've ever been.
Hmm, less than I imagined. If I ever need a new one though, I think I'll look for a cheaper one! :p
 

Doc Incognito

Currently AFK
Nov 17, 2009
166
0
0
Mimsofthedawg said:
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help Macs suck less.
Ow, that hit me hard.
Actually, I'm used to it. I have a friend who often builds computers from spare parts, and he's a Windows supremacist.

Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
 

BloodSquirrel

New member
Jun 23, 2008
1,263
0
0
Greg Tito said:
New Nvidia Chip Features 3 Billion Transistors



Nvidia has announced some of the features of its newest line of graphics chips, based on the Fermi chip architecture.

Nvidia's next line of GPUs is codenamed GF100, and it's the first graphics chip that will utilize the Fermi architecture. Nvidia has been releasing snippets of information about the new cards on Facebook and Twitter [http://twitter.com/NVIDIAGeForce] for the last few weeks, the latest of which came out yesterday. The GF100 will support a new type of 32x anti-aliasing, which basically means the edges of objects in graphics will now look more awesome. Also, 3 billion transistors! By all accounts, that is a veritable crapload of transistors.

Here are all the nuggets of information on Nvidia's new graphics cards:

GF100 is the codename for the first GeForce GPU based on the Fermi architecture!

The GF100 board is 10.5-inches long -- the same length as GeForce GTX 200 Series graphics cards!

GF100 packs in over 3B (billion!) transistors

The GF100 supports full hardware decode on the GPU for 3D Blu-ray

GF100 graphics cards will provide hardware support for GPU overvoltaging for extreme overclocking!

GF100 supports a brand new 32x anti-aliasing mode for ultra high-quality gaming!

Note to nVidia: I want a GF100 for Christmas. Let's make it happen.

Source: Legitreviews [http://www.legitreviews.com/news/7029/]

Now all they need to do is actually get some damn cards in stores. I've generally been loyal to Nvidia, but I want to build a new computer next month, and if their cards aren't available or aren't enough of an improvement to make it worth waiting, I'm going ATI.

Also, GF100? They really need to get back to a consistent, easy-to-understand naming scheme.
 

NoNameMcgee

New member
Feb 24, 2009
2,104
0
0
32x AA? Holy shit. Is there really a need for that?

On some games I can't tell the difference between 4x and anything above it (bearing in mind I'm playing at my native res of 1920x1080). I can very rarely tell the difference between 4x and 8x, and I never go above 8x unless I can absolutely max the game at a constant 60fps, because otherwise there's just no need; it gives no real extra visual quality unless you get your microscope out and check the edges of objects. It's just bragging rights.
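If you want to test this for yourself, here's a minimal sketch of how you might request a given MSAA level and watch the framerate. It assumes GLFW 3 and a working OpenGL driver; the window size and sample count are just placeholders to swap around:

```cpp
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // Ask the driver for an 8x multisampled framebuffer; swap in 2, 4, 16...
    // and rerun to compare how each level looks and what it costs in frames.
    glfwWindowHint(GLFW_SAMPLES, 8);

    GLFWwindow* win = glfwCreateWindow(1920, 1080, "MSAA test", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    double last = glfwGetTime();
    int frames = 0;
    while (!glfwWindowShouldClose(win)) {
        // ... draw your scene here ...
        glfwSwapBuffers(win);
        glfwPollEvents();

        // Crude once-a-second FPS counter: enough to see whether a higher
        // sample count is eating your framerate.
        ++frames;
        double now = glfwGetTime();
        if (now - last >= 1.0) {
            std::printf("%d fps\n", frames);
            frames = 0;
            last = now;
        }
    }
    glfwTerminate();
    return 0;
}
```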
 

theultimateend

New member
Nov 1, 2007
3,621
0
0
Mr.Tea said:
More Fun To Compute said:
I once made a circuit with 3 transistors and agree that 3 billion is quite a lot of transistors. I can't wait to see how good Braid and Dwarf Fortress look with 32x anti-aliasing!
Braid already looks blurred because there isn't even a setting to change the resolution; I can't imagine how much worse it'll be with 32x anti-aliasing...

Abedeus said:
This is what I'm getting for my birthday.

Also, 32x AA? Who the hell needs more than x2 or x4?
I don't get it either... Maybe people like blurriness? I prefer sharpness and higher frame rates; I must be crazy.
What computer are you running that makes high AA blurry?

I've never played Braid, but I can't think of a game I own where turning on high AA results in soap opera romance scene blur.

Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help Macs suck less.
Ow, that hit me hard.
Actually, I'm used to it. I have a friend who often builds computers from spare parts, and he's a Windows supremacist.

Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
I wouldn't be hard on Macs if it weren't for the fact that you can build a system for half the cost and twice the power, with the same level of stability, and still not get viruses.

I'm not even super tech-savvy, and my system has been running for two years without a single crash or virus. Every AV I've used has been free; I'm currently running the Microsoft one. It probably helps that I don't run around opening random emails.

PS: If Macs were modestly priced I'd have one in a heartbeat. They feel good against the bare skin.

AverageJoe said:
32x AA? Holy shit. Is there really a need for that?

On some games I can't tell the difference between 4x and anything above it (bearing in mind I'm playing at my native res of 1920x1080). I can very rarely tell the difference between 4x and 8x, and I never go above 8x unless I can absolutely max the game at a constant 60fps, because otherwise there's just no need; it gives no real extra visual quality unless you get your microscope out and check the edges of objects. It's just bragging rights.
I think the idea is for people who are starting to use massive screens for gameplay, in the hope that there will be massive resolutions to accommodate them.

I assume that at higher resolutions you need better anti-aliasing.
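For a rough sense of the scale involved, here's a quick back-of-the-napkin sketch. It assumes a naive scheme where the work grows with one sample per pixel per AA level; real MSAA hardware is far smarter than this, but the trend is the point:

```cpp
#include <cstdio>

int main() {
    // Naive assumption: anti-aliasing work scales with one sample per pixel
    // per AA level. Real MSAA hardware optimizes this heavily, but the
    // general trend holds: more pixels times more samples = more work.
    struct Mode { const char* name; long w, h; };
    Mode modes[] = { {"1920x1080", 1920, 1080}, {"2560x1600", 2560, 1600} };
    int levels[] = { 1, 4, 8, 32 };

    for (const Mode& m : modes)
        for (int aa : levels)
            std::printf("%s @ %2dx AA: %ld million samples per frame\n",
                        m.name, aa, m.w * m.h * aa / 1000000);
    return 0;
}
```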