New Nvidia Chip Features 3 Billion Transistors

Greg Tito

PR for Dungeons & Dragons
Sep 29, 2005
12,070
0
0

Nvidia has announced some of the features of its newest line of graphics chips, based on its Fermi architecture.

Nvidia's next line of GPUs is codenamed GF100, and it is the first graphics chip that will utilize the Fermi architecture. Nvidia has been releasing snippets of information on the new cards via Facebook and Twitter [http://twitter.com/NVIDIAGeForce] for the last few weeks, the latest of which came out yesterday. The GF100 will support a new 32x anti-aliasing mode, which basically means the edges of objects in games will look even more awesome. Also, 3 billion transistors! By all accounts, that is a veritable crapload of transistors.
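For the curious, the basic idea behind multi-sample anti-aliasing is simple: instead of deciding a pixel's color from a single point, the GPU checks several sample points inside each pixel and blends the results, so hard edges fade smoothly instead of stair-stepping. Here's a toy sketch of that idea (this is an illustration of the general technique, not Nvidia's actual implementation; the circle scene and sample counts are made up for the demo):

```python
def pixel_coverage(px, py, samples_per_axis, radius=3.0):
    """Fraction of a pixel's sub-samples that fall inside a circle
    centered at the origin -- a stand-in for 'how much of this pixel
    the object's edge covers'."""
    n = samples_per_axis
    inside = 0
    for i in range(n):
        for j in range(n):
            # Sample positions spread evenly across the pixel square.
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            if sx * sx + sy * sy <= radius * radius:
                inside += 1
    return inside / (n * n)

# A pixel straddling the circle's edge: with 1 sample per pixel it is
# all-or-nothing (a jagged edge); with 4x4 samples it gets an
# intermediate shade (a smooth edge).
print(pixel_coverage(1, 2, 1))  # 1.0  (fully lit: aliased)
print(pixel_coverage(1, 2, 4))  # 0.5625 (partial shade: anti-aliased)
```

A 32x mode just pushes the same trick further, taking many more samples per pixel for an even smoother edge, at the cost of a lot more work per frame, which is where those 3 billion transistors come in handy.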

Here are all the nuggets of information on Nvidia's new graphics cards:

GF100 is the codename for the first GeForce GPU based on the Fermi architecture!

The GF100 board is 10.5 inches long -- the same length as GeForce GTX 200 Series graphics cards!

GF100 packs in over 3B (billion!) transistors

The GF100 supports full hardware decode on the GPU for 3D Blu-Ray

GF100 graphics cards will provide hardware support for GPU overvoltaging for extreme overclocking!

GF100 supports a brand new 32x anti-aliasing mode for ultra high-quality gaming!

Note to Nvidia: I want a GF100 for Christmas. Let's make it happen.

Source: Legitreviews [http://www.legitreviews.com/news/7029/]

 

john_alexander

New member
Aug 16, 2008
57
0
0
That... That is fucking insane! I probably won't get one (my computer is more than powerful enough, thanks!), but I can't wait to see what this heralds for future development!
 

Doc Incognito

Currently AFK
Nov 17, 2009
166
0
0
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
 

Abedeus

New member
Sep 14, 2008
7,412
0
0
This is what I'm getting for my birthday.

Also, 32x AA? Who the hell needs more than 2x or 4x?
 

dududf

New member
Aug 31, 2009
4,072
0
0
But can it run Crysis?


I think I just came buckets. Lemme check *checks the nether regions* hmmm.... *SPLOSH SPLOSH SPLOSH!*

Yup.

I want this baby for Christmas, alright.
 

More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
I once made a circuit with 3 transistors and agree that 3 billion is quite a lot of transistors. I can't wait to see how good Braid and Dwarf Fortress look with 32x anti-aliasing!
 

Greg Tito

PR for Dungeons & Dragons
Sep 29, 2005
12,070
0
0
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help Macs suck less.
 

Greg Tito

PR for Dungeons & Dragons
Sep 29, 2005
12,070
0
0
Mr.Tea said:
Someone is bound to come in and say "But can it run Crysis?"...
dududf said:
But can it run Crysis?


I think I just came buckets. Lemme check *checks the nether regions* hmmm.... *SPLOSH SPLOSH SPLOSH!*

Yup.

I want this baby for Christmas, alright.
You two have to meet, haha.
 

Fayathon

Professional Lurker
Nov 18, 2009
905
0
0
Omikron009 said:
400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
Is it bad that I read that in Marcus' voice?

OT: That's impressive.