PC is "Far Superior" to Next-Gen Consoles, Says Nvidia

rofltehcat

New member
Jul 24, 2009
635
0
0
Strazdas said:
rofltehcat said:
Even that third party Steambox they showcased a few days ago had an AMD GPU... I think they are starting to panic. Even if PC gaming does great this console generation, Nvidia will be in for some very hard years.
Very unlikely. Nvidia cards were always better for gaming. AMD cards are better for raw calculation power, which is good when you're doing massive calculations and sensory input processing, but as far as gaming graphics go, Nvidia has had the upper hand for years now. Also, Nvidia has been working with developers, actively and for free, for a decade, thus most optimizations for PC you see will be very Nvidia-centric. Nvidia is in a better position than AMD. And let's not even start on processors, where AMD STILL hasn't shown anything that could rival the i processors.
This is all nice and great and in the end I'd like to see the "good" company do fine on the market, not just the "cheaper" company. However, things do not always work that way.
With both major consoles and even the WiiU using AMD chips, AMD has an incredibly large part of the gaming GPU market cornered and effectively out of the grasp of Nvidia for around the next 10 years. If AMD play their cards right, they can slowly push ahead.
If anything, you can already see them push ahead: the ~$500 Steambox prototype revealed a few days ago is using an AMD chip as well. AMD will continue pushing into PC gaming and, unlike Nvidia, they now have a huge power base to use to their advantage.

About the optimization: Yes, they have been doing a good job in the past and I sure hope they will continue to do so in the future. However, with all of the major consoles using AMD chips (of very comparable properties), optimizing games to run well on AMD hardware will be a lot more feasible.

Considering this is really primarily a GPU vs. GPU standoff, AMD vs. Intel CPUs doesn't really play a role, does it? If anything, the major consoles using AMD CPUs as well gives AMD a great platform to use for pushing on that front as well.

Of course they may play it wrong, spread themselves over too many fronts or even grow too confident and lazy. But Nvidia is in a really tight spot right now and their main competitor looks more powerful than ever.
 

ike42

New member
Feb 25, 2009
226
0
0
Strazdas said:
I love Nvidia and everything else sucks. Slurp slurp. Come join the PC master race because we're so much better than you.
You are wrong and will accept none of the evidence to the contrary. Just because you're an Nvidia fanboy doesn't mean that everyone else is. Dedicated machines that have games made specifically for them will always be a better value overall than constantly having to tweak games to get them optimized on your system, since there is no way that out of the box they will work with all configurations.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Pr0 said:
truckspond said:
PC has a larger library including EVERY previous console generation, and the GTX Titan kicks the crap out of EVERY console's GPU, so it's not really that surprising. If you want high-end gaming just put in 4 Titans in quad-SLI and crank everything up to 11 at 4K+ resolutions. Can you get such power in a console?
Even as a PC Gaming Elitist myself I have to ask if you can get that kind of power in under a 400-dollar package... 'cause last I checked that was about half the price of one Titan.

There is no question that running four Titans in quad SLI will hammer the ever-living dogshit out of anything that any console can do, but you've also essentially spent $4,000 to do it.
One console costs half a Titan. The Titan is extremely overpriced simply because it is the best around and can get away with it. You can buy a card that has 90% of the Titan's power for half the price, and that's a 7th generation one. Also, in order to beat the consoles, from what the consoles are showing, you don't need a Titan. You can build a PC for 500 dollars and it will be on par with the current consoles. However, PC gamers were never aiming as low as console power, thus the price difference for the average PC: the average PC is much faster.

MrHide-Patten said:
I guess I'll just have to use the high-end one at work and endure an Xbox controller.
Not everyone can game at work.

Aeonknight said:
Consoles aren't going anywhere, at least not until public schools start teaching PC architecture (which they should.)
Erm, what? They have been doing that for over a decade here...

Furism said:
Or, ideally, an external GPU that I can connect/disconnect from my laptop. When I'm at home I can get high-end graphics, but then I can still travel and have all my stuff with me without going through the hassle of synchronizing docs and all that. I've seen some prototype Thunderbolt-connected external GPUs (until recently, USB couldn't be as fast as Thunderbolt) but they were not market ready (being prototypes). I think Nvidia should investigate this more than small form factor PCs. Anybody can build small form-factor PCs. Not everybody can make an external GPU.
So far the only option for this I saw is if you made your own power supply for that GPU, because the USB ports can't handle the power load needed to drive the GPU, and they just don't manufacture any with a prebuilt power supply like the larger external HDDs have. I agree that this would be a good thing, I'd love to have something like this to kick my laptop into gear. Seriously, its GPU is running at full load while the rest is at ~20% load, because it's a laptop GPU...

GundamSentinel said:
For me the big advantage of a console is that you can buy it and don't have to worry about it any more. Just put in a game, lie on the couch and play it.
PC cheaper? Maybe, but not for me.
Except that this is not true at all. You can't just buy a console and forget about it, or put in a game and instantly play it. That stopped being true when this generation launched.
Also do you buy games? If yes, then PC is cheaper for you.

Artlover said:
That is not next gen. That is not even current gen. The Xbox360 and PS3 already do 720 @ 60fps with no issues at all and can do 1080 @ 30fps with a bit of effort. Hell, even the Gamecube from the 6th generation can do 1080.

A decent computer from 10 years ago can easily run 1960x1280 @ 60 fps all day long.

Let's ignore GFX for a minute. Let's talk CPU. Let's look at the XB1 as an example. Do you realize that its actual performance is barely any better than the Xbox 360's? Why? Well, the XB1 has 8 cores. 2 are reserved for the OS, only 6 are used for gaming, and they are only running at 1.75 GHz. The X360 was 3 cores at 3.2 GHz. That translates into 10.5 GHz vs 9.6 GHz of processing power. Not much difference, and it shows.

I'm not impressed by what I have seen by either the XB1 or the PS4. Neither gave me any reason to buy them. Every game demo I've seen does nothing but reinforce the idea that the Xbox360 and PS3 are equally as good and I might as well just keep my old systems and my money.
Not true. Both the 360 and PS3 had to upscale to even reach 720. They were never designed to run native 720, let alone 1080.
Yes, my 10-year-old PC runs at 1600x1200 @ 60 fps, but not on modern games. It's mostly used for internet browsing nowadays, but it still works.

CPU power comparisons are never fair when you look at GHz only. A single core of a 3.0 GHz i5 can do twice as many calculations as a 10-year-old 3.0 GHz Pentium core; architectural and manufacturing improvements allowed that. That's why the i series is so revolutionary: same frequency, more power. AMD still has no answer to this other than "throw 8 cores in". The point is that processors evolve in more ways than just GHz.
That being said, the current consoles use underpowered tablet-class CPUs, so I wouldn't really expect that 10.5 GHz to perform like 10.5 GHz. Also, more cores are much harder to program for, so most developers won't bother going past 2-3 cores. In this regard the old consoles were better.
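Just to put toy numbers on why cores x GHz alone is misleading, here is a minimal sketch (the per-clock figures are made up purely for illustration, not real benchmarks):

# Toy model: effective throughput ~ cores * clock (GHz) * work done per cycle (IPC).
# The IPC values are invented for illustration only - the point is that a newer
# core does more per clock, so raw GHz totals hide most of the difference.
def effective_throughput(cores, clock_ghz, ipc):
    return cores * clock_ghz * ipc

old_cpu = effective_throughput(cores=3, clock_ghz=3.2, ipc=1.0)   # older console-style cores
new_cpu = effective_throughput(cores=6, clock_ghz=1.75, ipc=2.0)  # newer cores, hypothetical 2x per clock

print("raw GHz:", 3 * 3.2, "vs", 6 * 1.75)    # 9.6 vs 10.5 - looks nearly equal
print("with IPC:", old_cpu, "vs", new_cpu)    # 9.6 vs 21.0 under these assumed numbers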

rofltehcat said:
Strazdas said:
rofltehcat said:
Even that third party Steambox they showcased a few days ago had an AMD GPU... I think they are starting to panic. Even if PC gaming does great this console generation, Nvidia will be in for some very hard years.
Very unlikely. Nvidia cards were always better for gaming. AMD cards are better for raw calculation power, which is good when you're doing massive calculations and sensory input processing, but as far as gaming graphics go, Nvidia has had the upper hand for years now. Also, Nvidia has been working with developers, actively and for free, for a decade, thus most optimizations for PC you see will be very Nvidia-centric. Nvidia is in a better position than AMD. And let's not even start on processors, where AMD STILL hasn't shown anything that could rival the i processors.
This is all nice and great and in the end I'd like to see the "good" company do fine on the market, not just the "cheaper" company. However, things do not always work that way.
With both major consoles and even the WiiU using AMD chips, AMD has an incredibly large part of the gaming GPU market cornered and effectively out of the grasp of Nvidia for around the next 10 years. If AMD play their cards right, they can slowly push ahead.
If anything, you can already see them push ahead: the ~$500 Steambox prototype revealed a few days ago is using an AMD chip as well. AMD will continue pushing into PC gaming and, unlike Nvidia, they now have a huge power base to use to their advantage.

About the optimization: Yes, they have been doing a good job in the past and I sure hope they will continue to do so in the future. However, with all of the major consoles using AMD chips (of very comparable properties), optimizing games to run well on AMD hardware will be a lot more feasible.

Considering this is really primarily a GPU vs. GPU standoff, AMD vs. Intel CPUs doesn't really play a role, does it? If anything, the major consoles using AMD CPUs as well gives AMD a great platform to use for pushing on that front as well.

Of course they may play it wrong, spread themselves over too many fronts or even grow too confident and lazy. But Nvidia is in a really tight spot right now and their main competitor looks more powerful than ever.
Yes, currently all consoles are using AMD GPUs, because they didn't want to buy the more powerful GPU that Nvidia offered and went for the cheaper and slower AMD option. If AMD is making money on this, that is very good for them, but it does not mean Nvidia is wrong about PCs. Nvidia is better at gaming graphics, and to be honest we have already seen problems with the new consoles' graphical prowess, so AMD hasn't disproved that.
The cheap Steambox is going to use AMD because AMD is cheaper. Power comes at a price, and Nvidia has always seemed to be the kind of company that tries to lead the market by shoving in the most powerful hardware while ignoring the low-end consumers; that's where AMD has always come in and satisfied the need for slow and cheap GPUs.
Nvidia has quite a fair grasp on PC gaming. They help developers optimize games for their cards; AMD doesn't. They are very much PC-centric and were so even before they lost the consoles, they just didn't advertise it that much. There is a reason the Nvidia logo comes up before every second PC game: they actually worked on optimizing it. For their cards, of course.
I agree the CPU is hardly an issue now; however, considering the CPU they are putting into the new Xbox, I think it may become an issue after all. We'll see.
The reason I mentioned CPUs is because Nvidia and Intel effectively sit on the same side, versus AMD and Radeon (ATI rebranded as AMD, because AMD needs everything to be AMD and confuse people). So both GPU and CPU market shares affect both sides. And let's face it, the i series is quite dominant when it comes to PC CPUs.

ike42 said:
Strazdas said:
I love Nvidia and everything else sucks. Slurp slurp. Come join the PC master race because we're so much better than you.
You are wrong and will accept none of the evidence to the contrary. Just because you're an Nvidia fanboy doesn't mean that everyone else is. Dedicated machines that have games made specifically for them will always be a better value overall than constantly having to tweak games to get them optimized on your system, since there is no way that out of the box they will work with all configurations.
Maybe you should do some more research before going around insulting people next time?
The new consoles are as much dedicated machines as PCs were 10 years ago. PCs are more game-dedicated now than consoles, thanks to the consoles being the very same PC but limited. Both new consoles are programmed against the same APIs that you program for on PC, essentially making development extremely close. Heck, the Xbox uses the very same DirectX that the PC uses. No one is programming directly to the hardware; there are probably no more than a dozen people in the world who could. Thanks to the new consoles using PC architecture, any optimization done for consoles will automatically carry over to PC. PC games have been working "out of the box" for a long while now, maybe you should try some.
 

MrHide-Patten

New member
Jun 10, 2009
1,309
0
0
Strazdas said:
Not everyone can game at work.
One of the upsides of being an Indie Dev... granted I only do it at lunchtime once a month for exclusives.

GundamSentinel said:
PC cheaper? Maybe, but not for me.
It's the main reason why I don't/haven't; in AUS they really bend us over the table for the cost of hardware... and downloadable software. It's bad enough that the Government actually had to step in and ask the shits like Adobe 'why?'.

And analysts wonder why the piracy rate in Australia is (one of) the highest in the world.
 

GundamSentinel

The leading man, who else?
Aug 23, 2009
4,448
0
0
Strazdas said:
GundamSentinel said:
For me the big advantage of a console is that you can buy it and don't have to worry about it any more. Just put in a game, lie on the couch and play it.
PC cheaper? Maybe, but not for me.
Except that this is not true at all. You can't just buy a console and forget about it, or put in a game and instantly play it. That stopped being true when this generation launched.
Also do you buy games? If yes, then PC is cheaper for you.
I can't remember a single instance where I had to fuss around to get a console game to work (and I play a lot of them). Patches? Sure, but those take a couple of seconds with my internet and are very much a hands-free affair. The only times I ever got my hands dirty with my console was when I had to set up my internet, NAT and mic properly. Sure, autopatching solves most things with PC games as well, but it's still a difference. With PC games it's very much a YMMV thing depending on your hardware, but even with newer games (or maybe especially with newer games) I often have to struggle to get them working the way they're supposed to. Different people will have different experiences with it, but for me console gaming has always been a stressless breeze.

Also, anyone who says a PC is always backwards compatible is a liar. D:<

The difference between the price of new PC games and new console games is often marginal. Sales and smart shopping can cancel most of the difference out. I agree, PC games are generally cheaper, especially via digital distributors, but not enough for me to make up for the costs of upgrading my hardware. I positively need a decent laptop for my work, so for PC gaming my options are either getting an additional good gaming PC or getting a complete gaming laptop. Both of which are just more expensive than a console, even when considering the additional price for games. And if you shop smartly, console games can be gotten very cheaply as well.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
GundamSentinel said:
I can't remember a single instance where I had to fuss around to get a console game to work (and I play a lot of them). Patches? Sure, but those take a couple of seconds with my internet and are very much a hands-free affair. The only times I ever got my hands dirty with my console was when I had to set up my internet, NAT and mic properly. Sure, autopatching solves most things with PC games as well, but it's still a difference. With PC games it's very much a YMMV thing depending on your hardware, but even with newer games (or maybe especially with newer games) I often have to struggle to get them working the way they're supposed to. Different people will have different experiences with it, but for me console gaming has always been a stressless breeze.

Also, anyone who says a PC is always backwards compatible is a liar. D:<

The difference between the price of new PC games and new console games is often marginal. Sales and smart shopping can cancel most of the difference out. I agree, PC games are generally cheaper, especially via digital distributors, but not enough for me to make up for the costs of upgrading my hardware. I positively need a decent laptop for my work, so for PC gaming my options are either getting an additional good gaming PC or getting a complete gaming laptop. Both of which are just more expensive than a console, even when considering the additional price for games. And if you shop smartly, console games can be gotten very cheaply as well.
You are probably talking about the previous consoles, which for some people worked, yes. The new ones, however, will have all the same problems PC has, and more. On the other hand, the last time I had to do anything with a game to "make it work" was in 2009, if we exclude when I tried running 15-year-old games, which was a compatibility issue, and even then it was solved and I played the game (Dungeon Keeper 2, if interested). PC was always a stressless experience for me, there is no struggle.
PC hardware of equal power costs the same as the new consoles now, but people buy much faster PCs and then say they cost more. Of course a faster PC will cost more. And you would really need to buy only launch-day titles, or only a couple of games a year, for the game savings not to make up the difference.
To game on PC you do not need to buy a PC that is more expensive than a console. If you want to game with better graphics than a console, you do. Or you could buy an expensive PC once and play games for 5 years with better graphics. To each his own, of course, and no one can force you to buy a PC, but PC is no longer the expensive option.
 

Boogie Knight

New member
Oct 17, 2011
115
0
0
I can't help shaking my head whenever Angry Joe goes bananas because his fancy PC gets hit with technical issues caused by the game itself being ass. The hard truth is that the games with the high-end graphics which the glorious PC master race loves to wank off to are almost entirely a complete mess, with more bugs than a sleazy motel and servers which suck more than the servers in shoddy restaurants. The rambling about superior graphics is just subservience to the idiotic mentality which the gaming industry has been hyping to get around the lack of innovation by AAA developers in the aspects of gaming which really matter.

Besides, I kinda hate shooters, or at least most modern shooters, which are entirely focused on multiplayer at the expense of singleplayer campaigns. Last I checked, there aren't too many JRPGs which I can legally (emphasis on legally) acquire on PC. Yeah, yeah, they don't show off the graphical power of the machines they run on, but they are less bug- and glitch-prone than their Western counterparts.
 

Griffolion

Elite Member
Aug 18, 2009
2,207
0
41
Scrumpmonkey said:
The big problem with console performance is that the system has to be on one PCB. It's kind of like the sacrifices made by 'gaming laptops'. They have a much smaller size footprint (or at least should do, the Xbone is freaking HUGE) and this means parts cannot be clocked as highly as their PC counterparts with discrete components. I'm not talking overclocking here, I'm talking stock clock speeds.
That's not entirely true unless qualified with the fact that they can't engineer a cooling solution that goes above budget. But you're generally right.

Scrumpmonkey said:
I really don't buy into the consoles being that much more efficient with their resources, especially as time has moved on. Whatever efficiency gains they do get over time with developers getting more familiar with the architecture are pretty much wiped out by their hardware aging in comparison to new parts that are always being released. The Xbox 360, for instance, uses what is essentially an ATI X1800, the second-best GPU of 2005, and they did pretty well with that. Now in 2013 they can't even break out of the same 720p territory. That's a bit sad. It's early days but this new generation of consoles seems almost as bloated as their PC counterparts.
They actually are, mainly in the OS's memory management. Developers also tend to have lower-level access to the hardware; the less abstraction, the better the performance, at the cost of ease of development.

You generally find the early releases on any next-gen console to be pretty bad in terms of optimisation. At the end of any hardware cycle, the devs are taking full advantage of concurrency in their programming practices. It's a nightmare to debug and has tons of problems if not done right, but when done right it can squeeze out every last iota of performance from your machine. Given that the proverbial floodgates get opened to them in terms of HW performance at the beginning of a new cycle, they see no need to program concurrency into the games anymore, not until next-gen requirements truly start catching up. Hence you never really see any massive leaps in performance from one generation to the next; it takes years of iterative steps to truly tap next-gen hardware. This is coupled with the fact that they have a new architecture/SDK to get familiar with, make a game for, and get it out on a tight schedule. That stuff doesn't just happen.
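For anyone wondering what that concurrency roughly looks like, here is a minimal, purely illustrative sketch (generic Python, nothing to do with any real console SDK): fan per-frame work out across cores, and all the pain lives in how you split the work and merge it back.

# Minimal sketch of fanning per-frame work out across cores with a process pool.
# Purely illustrative - real engine code has to juggle dependencies, shared state
# and frame timing, which is exactly where the debugging nightmare comes from.
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk):
    # stand-in for per-chunk work (physics, animation, culling, ...)
    return sum(x * x for x in chunk)

def simulate_frame(data, workers=4):
    chunks = [data[i::workers] for i in range(workers)]   # naive split of the workload
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))       # merge the partial results

if __name__ == "__main__":
    print(simulate_frame(list(range(100_000))))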
 

Griffolion

Elite Member
Aug 18, 2009
2,207
0
41
Scrumpmonkey said:
We are kind of making the same point :p.
I'm not really too sure we were, but fair enough.

Scrumpmonkey said:
As time moves on the optimization of standardized parts gets better (i agree on the memory management point too).
You're going to have to explain to me what that sentence means.

Scrumpmonkey said:
Look at "The last of Us" that game looked fantastic considering the hardware restrictions of the PS3. But hardware also moves on so by the time we 'unlock the potential' of this generation there will be some new Crysis equivalent burning our GPUs at 4K resolution.
You're right, but you're going to have to explain to me your point about all that. Of course there will be a new engine that's a GPU killer as time goes on.

Scrumpmonkey said:
The thing is that games are more GPU-dependent than ever. The utilization of something like Windows 7 in the background on your GPU is almost non-existent. Also, if you are running something with above 8GB of RAM, the memory footprint of your OS won't be that much of an issue either. Windows Vista may have been heavy on resources back in 2006, but I don't think the GPU/CPU imprint of the operating system is much of an issue anymore.
Yeah, that's true in a sense. The dependency ratio of games has simply shifted more over to GPUs, since advancements in parallel processing and methods of rendering using the multitudes of FMA streams found in GPUs have become more reliable. At one point it was better to send a lot of things to the CPU, but no more. And yes, the OS memory footprint becomes a non-issue at a certain memory level, but what's your point?

(Please don't take any of this the wrong way, I think I'm just missing the point you're trying to make)
 

ToastyMozart

New member
Mar 13, 2012
224
0
0
THIS JUST IN: Experts are claiming that grass is still green, the sky is still blue, and Chris Brown is still an asshole. Stay tuned for our interview with a local man who claims that eating food saved him from dying of starvation.
 

ToastyMozart

New member
Mar 13, 2012
224
0
0
Griffolion said:
Scrumpmonkey said:
As time moves on the optimization of standardized parts gets better (i agree on the memory management point too).
You're going to have to explain to me what that sentence means.
I think he's just stating that the consoles only coming with 1 design permutation means that unlike PC, where you have to optimize for about 6 or 7 CPU architectures and 8 or 9 GPU architectures, each with a number of different versions with different degrees of power, developers working on console titles can focus exclusively on squeezing every last drop of potential out of the available hardware. This is compounded by the fact that they can have their rendering engine interact with the GPU on a very low level, while pretty much all PC games have to go through DirectX or OpenGL, which is easier to work with for supporting a multitude of GPUs, but is much less efficient.

That sort of low-level GPU interaction is also what AMD's trying to achieve with their "Mantle" API (which hopefully catches on and Nvidia starts doing too. Microsoft has REALLY gotten lazy with improving DirectX).

Basically, if Super Mario Galaxy was a PC title, there is no way you would be able to run it as well as the Wii does if you were using the same ATI Hollywood chip in your tower.
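A toy way to see the abstraction cost I'm describing (this has nothing to do with any real graphics API; it's just a made-up example showing that every extra layer of indirection adds per-call overhead, which matters when you're issuing thousands of draw calls per frame):

# Toy illustration of per-call overhead from abstraction layers - not a real
# graphics API, just the same tiny "draw" routed through extra wrapper functions.
import timeit

def draw(x):
    return x + 1                      # pretend this is the actual hardware work

def layer3(x): return draw(x)
def layer2(x): return layer3(x)
def layer1(x): return layer2(x)       # "portable" path: three layers of indirection

print("direct :", timeit.timeit("draw(1)", globals=globals(), number=1_000_000))
print("layered:", timeit.timeit("layer1(1)", globals=globals(), number=1_000_000))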
 

Griffolion

Elite Member
Aug 18, 2009
2,207
0
41
ToastyMozart said:
Oh, that's what he was saying? Wow, I really didn't get that from his sentence.

For PC you don't optimise for multiple architectures unless you have some time and money on your hands. DirectX and OpenGL are abstractions designed to take the hassle out of optimisation at the tradeoff of proximity to the silicon.

Mantle's great, but it's for a very small subset of GPUs, currently only used by one game. What's needed is an open low-level API that both Nvidia and AMD can work on together.

Yes, I'm aware of that. Please don't mistake me for an idiot because I couldn't understand what he was saying. It was more to do with his word choices than the material at hand.
 

Eldritch Warlord

New member
Jun 6, 2008
2,901
0
0
truckspond said:
PC has a larger library including EVERY previous console generation, and the GTX Titan kicks the crap out of EVERY console's GPU, so it's not really that surprising. If you want high-end gaming just put in 4 Titans in quad-SLI and crank everything up to 11 at 4K+ resolutions. Can you get such power in a console?
It's truly pathetic that a $300-$500 console doesn't deliver the same performance as a $4,000 PC.
 

Evil Smurf

Admin of Catoholics Anonymous
Nov 11, 2011
11,597
0
0
This is correct; however, the spokesman seems a little arrogant. Not appearing arrogant is part of good PR, and that is something that would do everyone well.
 

Aesir23

New member
Jul 2, 2009
2,861
0
0
I will admit that I always get a bit confused when PC gamers say that PC gaming is cheaper. I'm well aware that they mean the games are cheaper, but my mind always makes the jump to the cost of the hardware, which is definitely not cheaper.

Regarding the PC vs Console issue, I think a lot of it comes down to personal taste. At the end of the day I just want a machine that plays games. I don't want to fiddle with it to get a game to work and I don't get any satisfaction out of upgrading my PC or delight in how much power it may have. On top of that, I prefer to buy physical copies of games over digital distribution. Considering that the closest Best Buy is half an hour's drive from my house and the two stores within a 20 minute walk (that sell video games) have a pretty pitiful selection of PC games, PC gaming just isn't my thing.

To be on topic, yes, PCs are far superior to consoles and that's like saying the grass is green or, to be current, that snow is white. However, like many have said, this seems to be a bit of a fit that Nvidia is throwing over the fact that the new consoles use AMD cards.