New Nvidia Chip Features 3 Billion Transistors

The Rockerfly

New member
Dec 31, 2008
4,649
0
0
Chaos Marine said:
The Rockerfly said:
Wow... I wish I knew more about computer chips, then it might seem more impressive
What that means is that it can answer yes or no faster. All computer software basically functions on the principle of binary (forgetting hex code and the like), so the more transistors you have, the faster it can answer yes or no and the faster it can process information. Honestly, how that relates to graphics rendering I don't really know, but if I remember correctly, this is essentially like having a P4 processor for a graphics card. Just take a few seconds to let that sink in.
Oh okay, thank you for the summary. Have a cookie

 

Doc Incognito

Currently AFK
Nov 17, 2009
166
0
0
Mimsofthedawg said:
Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help the Macs suck less.
Ow, that hit me hard.
Actually, I'm used to it. I have a friend that often builds computers from spare parts, and he's a Windows supremacist.

Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
haha, I know, I'm mostly kidding.

One of my friends hooked up Mac OS to his Dell computer. It was really funny. The versatility of a Mac with the usability of a PC. Ultimate combo.
Whoa, that sounds awesome. Hmm.... *schemes and plots*
 

Azhrarn-101

New member
Jul 15, 2008
476
0
0
Chaos Marine said:
Marq said:
Fuck yes. This means the 200 series will be dropping in price.

Might get myself a GTX 295.
Don't. Wait for the prices of the next series to come out; they should (seriously, it would be bloody stupid if they didn't) have DirectX 11 support, which is supposed to actually lower system requirements. I just hope it's actually true, unlike DirectX 10.
DirectX 11 incurs a small performance loss compared to DirectX 9, mainly because full-resolution post-processing, high-definition ambient occlusion, hardware tessellation and 64-bit HDR require a LOT more calculations than quarter-resolution post-processing and 32-bit HDR.

It is, however, a much more efficient API than DirectX 10 and 10.1 were, so while it won't actually lower requirements, it will be more efficient to implement than DirectX 10. DirectX 9 is still lighter on the requirements, just nowhere near as pretty.

Add features like DirectCompute (which does basically the same thing as CUDA (nVidia) or OpenCL (ATI) does now, only easier to use on any DirectX 11 and 10.1 card) and you get a very potent combination.
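For anyone curious what that GPU-compute stuff actually looks like, here's a rough sketch of the kind of data-parallel job CUDA, OpenCL and DirectCompute all let you push onto the card. It's written as a CUDA kernel purely for illustration; the "brighten" job, the array size and the names are all made up for the example, not taken from any real game or driver.

// Tiny CUDA illustration: each GPU thread scales one pixel value.
// Everything here (kernel name, sizes, gain) is made up for the sketch.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void brighten(float *pixels, int n, float gain)
{
    // One thread per element; threads past the end do nothing.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pixels[i] *= gain;
}

int main()
{
    const int n = 1 << 20;                       // ~1M pixel values (made-up size)
    float *d_pixels;
    cudaMalloc((void **)&d_pixels, n * sizeof(float));
    cudaMemset(d_pixels, 0, n * sizeof(float));  // dummy data for the sketch

    // 256 threads per block, enough blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_pixels, n, 1.2f);
    cudaDeviceSynchronize();

    cudaFree(d_pixels);
    printf("done\n");
    return 0;
}

DirectCompute's version of this would be an HLSL compute shader rather than a CUDA kernel, but the basic idea is the same: thousands of tiny threads, each doing one small piece of the work.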
 

Charli

New member
Nov 23, 2008
3,445
0
0
I-I can has? ...Please...give..it..to me. Now.

Nghh... urge to do something drastic to obtain it... rising.
 

Chaos Marine

New member
Feb 6, 2008
571
0
0
Mimsofthedawg said:
Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help the Macs suck less.
Ow, that hit me hard.
Actually, I'm used to it. I have a friend that often builds computers from spare parts, and he's a Windows supremacist.

Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
haha, I know, I'm mostly kidding.

One of my friends hooked up Mac OS to his Dell computer. It was really funny. The versatility of a Mac with the usability of a PC. Ultimate combo.
Whoa, that sounds awesome. Hmm.... *schemes and plots*
Just remember, it's highly illegal. According to my friend, Apple will know you did it... but they don't pursue legal action unless you begin to distribute it.

Not that I'm advocating for that kind of thing though...............................
And why is it illegal? Because it harms Apple's monopoly on their software. And people call Microsoft bad; at least with them you can actually run whatever software you want on their OS, or custom-build your computer and still be guaranteed it will work.
 

Doc Incognito

Currently AFK
Nov 17, 2009
166
0
0
Mimsofthedawg said:
Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Mimsofthedawg said:
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Unfortunately, nothing can help the Macs suck less.
Ow, that hit me hard.
Actually, I'm used to it. I have a friend that often builds computers from spare parts, and he's a Windows supremacist.

Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
haha, I know, I'm mostly kidding.

One of my friends hooked up Mac OS to his Dell computer. It was really funny. The versatility of a Mac with the usability of a PC. Ultimate combo.
Whoa, that sounds awesome. Hmm.... *schemes and plots*
Just remember, it's highly illegal. According to my friend, Apple will know you did it... but they don't pursue legal action unless you begin to distribute it.

Not that I'm advocating for that kind of thing though...............................
I wish that I had the ability to do something like that, but currently, my computer-related abilities max out at basic programming.
 

orangebandguy

Elite Member
Jan 9, 2009
3,117
0
41
Wow, 3 billion?

I don't even know what my graphics card is; I've never heard of it before. I miss my old ATI Radeon X300.
 

Jou-LotD

New member
Jul 26, 2009
43
0
0
Seriously, Nvidia needs to put up or shut up. Their CEO has been talking so much trash lately about Intel. I used to be an Nvidia fanboi, but I got snapped out of it over the past couple of months with his BS.

A hint for you people drooling over Fermi: a little over a month ago at N-con, they had to hold up a dummy product because they didn't even have a prototype mold ready. They said the demos were running on Fermi, but the cards looked too ugly. Nobody has had any pre-releases to test out, so this is probably going to be a Q2 2010 card. By then ATI will be even farther ahead, since they aren't focusing on anything but gaming graphics.

Sad to say, but Nvidia needs to step it up and get their head out of their ass. They were also resisting DX11 out of sheer arrogance, and their CEO nuthugs Apple.
 

Nurb

Cynical bastard
Dec 9, 2008
3,078
0
0
All this just to maintain at least 30 fps in the newest games with all the eye candy, for 6-12 months until it starts to choke on the next poorly programmed graphics engine.
 

Radelaide

New member
May 15, 2008
2,503
0
0
Doc Incognito said:
Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Fairly sure they use Radeon. Anyway, graphics cards aren't pretty, are they? Irony!