I did that too before realizing I was reading a Marcus quote!
Fayathon said: Is it bad that I read that in Marcus' voice?
Omikron009 said: 400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
OT: That's impressive.
If it took more than one pass to render all 300 trillion texels, then it wasn't an Nvidia!
Omikron009 said: 400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
Likely not more than around $600. Once these things get to production they aren't very expensive to make; it's recouping the R&D costs that makes them cost so much. But that aside, graphics cards are cheaper than they've ever been.
The_Oracle said: It's over NINE MIIIILLLLIONNNNN! Sorry. Bad joke. But in all sincerity, how expensive is a piece of electronics like that going to be? I'm sure its processing power is through the proverbial roof.
Remember! If it took more than one graphics card, you weren't using an Nvidia.
Omikron009 said: 400% more awesome! Also, Nvidia doesn't make their graphics cards out of freakin' wood!
Hmm, less than I imagined. If I ever need a new one, though, I think I'll look for a cheaper one!
ratix2 said: Likely not more than around $600. Once these things get to production they aren't very expensive to make; it's recouping the R&D costs that makes them cost so much. But that aside, graphics cards are cheaper than they've ever been.
The_Oracle said: It's over NINE MIIIILLLLIONNNNN! Sorry. Bad joke. But in all sincerity, how expensive is a piece of electronics like that going to be? I'm sure its processing power is through the proverbial roof.
Ow, that hit me hard.
Mimsofthedawg said: Unfortunately, nothing can help the Macs suck less.
Doc Incognito said: Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Now all they need to do is actually get some damn cards in stores. I've generally been loyal to Nvidia, but I want to build a new computer next month, and if their cards aren't available, or not enough of an improvement to make it worth waiting a month, I'm going ATI.
Greg Tito said: New Nvidia Chip Features 3 Billion Transistors
Nvidia has announced some of the features for its newest line of graphics chips based on the Fermi chip architecture.
Nvidia's next GPU is codenamed GF100 and is the first graphics chip that will utilize the Fermi architecture. Nvidia has been releasing snippets of information on the new cards on Facebook and Twitter [http://twitter.com/NVIDIAGeForce] for the last few weeks, the last of which came out yesterday. The GF100 will support a new 32x anti-aliasing mode, which basically means the edges of objects in graphics will now look more awesome. Also, 3 billion transistors! By all accounts, that is a veritable crapload of transistors.
Here are all the nuggets of information on Nvidia's new graphics cards:
GF100 is the codename for the first GeForce GPU based on the Fermi architecture!
The GF100 board is 10.5-inches long -- the same length as GeForce GTX 200 Series graphics cards!
GF100 packs in over 3B (billion!) transistors
The GF100 supports full hardware decode on the GPU for 3D Blu-Ray
GF100 graphics cards will provide hardware support for GPU overvoltaging for extreme overclocking!
GF100 supports a brand new 32x anti-aliasing mode for ultra high-quality gaming!
Note to nVidia: I want a GF100 for Christmas. Let's make it happen.
Source: Legitreviews [http://www.legitreviews.com/news/7029/]
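For anyone wondering what that 32x mode actually means on the software side: a game doesn't do anything Nvidia-specific to use it, it just asks the driver for a multisampled framebuffer and the hardware handles the rest. Here's a minimal sketch of that request, assuming a GLFW/OpenGL setup (the library, window size, and sample count here are illustrative; the driver clamps the request to whatever the card supports):

```cpp
// Minimal sketch: requesting a 32x multisampled framebuffer with GLFW/OpenGL.
#include <GLFW/glfw3.h>

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D  // not defined in the ancient gl.h some platforms ship
#endif

int main() {
    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_SAMPLES, 32);  // ask the driver for 32 samples per pixel

    GLFWwindow* window = glfwCreateWindow(1280, 720, "32x MSAA sketch", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(window);
    glEnable(GL_MULTISAMPLE);  // let the rasterizer use those extra samples

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw the scene; edge pixels get resolved from the extra samples ...
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```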
What computer are you running that makes high AA blurry?
Mr.Tea said: Braid already looks blurred because there isn't even a setting to change the resolution; I can't imagine how much worse it'll be with 32x anti-aliasing...
More Fun To Compute said: I once made a circuit with 3 transistors and agree that 3 billion is quite a lot of transistors. I can't wait to see how good Braid and Dwarf Fortress look with 32x anti-aliasing!
I don't get it either... Maybe people like blurriness? I prefer sharpness and higher frame rates; I must be crazy.
Abedeus said: This is what I'm getting for my birthday. Also, 32x AA? Who the hell needs more than 2x or 4x?
I wouldn't be so hard on Macs if it weren't for the fact that you can build a system for half the cost, with twice the power and the same level of stability, and still not get viruses.
Doc Incognito said: Ow, that hit me hard.
Mimsofthedawg said: Unfortunately, nothing can help the Macs suck less.
Doc Incognito said: Wait, don't Macs use nVidia chips? Because if they start using that new one, it might start making the graphics suck less.
Actually, I'm used to it. I have a friend that often builds computers from spare parts, and he's a Windows supremacist.
Also, it's the graphics I'm worried about, because everything else is great. I tried playing Unreal Tournament, and everything moved at half speed.
I think the idea is for people who are starting to use massive screens for gameplay, in the hope that there will be massive resolutions to accommodate them.
AverageJoe said: 32x AA? Holy shit. Is there really a need for that?
On some games I can't tell the difference between 4x and anything above it (bearing in mind I'm playing at my native res of 1920x1080). I can very rarely tell the difference between 4x and 8x, and I never go above 8x unless I can absolutely max the game at a constant 60fps, because otherwise there's just no need; it gives no real extra visual quality unless you get your microscope out and check the edges of objects. It's just bragging rights.
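Rough numbers back that up. Here's a back-of-the-envelope sketch of what the multisample buffers alone cost at 1920x1080, assuming an uncompressed 32-bit color sample plus a 32-bit depth/stencil sample per fragment (real hardware compresses these, so treat it as an upper bound; the assumptions are mine, not from the article):

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    // Multisample buffer footprint at 1920x1080, uncompressed.
    // Assumption: 4 bytes of color + 4 bytes of depth/stencil per sample.
    const long long pixels = 1920LL * 1080LL;
    const int bytes_per_sample = 4 + 4;

    for (int samples : {2, 4, 8, 16, 32}) {
        double mb = pixels * static_cast<double>(samples) * bytes_per_sample
                    / (1024.0 * 1024.0);
        std::printf("%2dx MSAA: ~%.0f MB of sample storage\n", samples, mb);
    }
    return 0;
}
```

That works out to roughly 63 MB at 4x, 127 MB at 8x, and over 500 MB at 32x before the scene itself uses any memory, which is a big chunk of a 1 GB card and a lot of extra bandwidth for edges most people stop noticing around 4x or 8x.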