Pr0 said:
truckspond said:
PC has a larger library, including EVERY previous console generation, and the GTX Titan kicks the crap out of EVERY console's GPU, so it's not really that surprising. If you want high-end gaming, just put four Titans in quad-SLI and crank everything up to 11 at 4K+ resolutions. Can you get such power in a console?
Even as a PC Gaming Elitist myself, I have to ask if you can get that kind of power in an under-$400 package... because last I checked, that was about half the price of one Titan.
There is no question that running four Titans in quad-SLI will hammer the ever-living dogshit out of anything any console can do, but you've also essentially spent $4,000 to do it.
One console costs half a Titan. The Titan is extremely overpriced simply because it's the best around and can get away with it. You can buy a card with 90% of the Titan's power for half the price, and that's a 7th-generation card. Also, judging from what the consoles are showing, you don't need a Titan to beat them. You can build a PC for 500 dollars and it will be on par with the current consoles. However, PC gamers were never aiming as low as console power, hence the price difference for the average PC: the average PC is much faster.
MrHide-Patten said:
I guess I'll just have to use the high-end one at work and endure an Xbox controller.
Not everyone can game at work.
Aeonknight said:
Consoles aren't going anywhere, at least not until public schools start teaching PC architecture (which they should.)
Erm, what? They've been doing that for over a decade here...
Furism said:
Or, ideally, an external GPU that I can connect/disconnect from my laptop. When I'm at home I get high-end graphics, but I can still travel with all my stuff without going through the hassle of synchronizing docs and all that. I've seen some prototype Thunderbolt-connected external GPUs (until recently, USB couldn't be as fast as Thunderbolt), but they were not market-ready (being prototypes). I think Nvidia should investigate this more than small form-factor PCs. Anybody can build a small form-factor PC. Not everybody can make an external GPU.
So far the only option I've seen for this requires building your own power supply for the GPU, because USB ports can't handle the power load needed to drive it, and nobody manufactures one with a built-in power supply like the larger external HDDs have. I agree this would be a good thing; I'd love something like it to kick my laptop into gear. Seriously, its GPU runs at full load while the rest sits at ~20% load, because it's a laptop GPU...
GundamSentinel said:
For me the big advantage of a console is that you can buy it and not have to worry about it any more. Just put in a game, lie on the couch and play it.
PC cheaper? Maybe, but not for me.
Except that this is not true at all. You can't just buy a console and forget about it, or put in a game and instantly play it. That stopped being true when this generation launched.
Also, do you buy games? If yes, then PC is cheaper for you.
Artlover said:
That is not next gen. That is not even current gen. The Xbox360 and PS3 already do 720p @ 60fps with no issues at all and can do 1080p @ 30fps with a bit of effort. Hell, even the GameCube from the 6th generation can do 1080.
A decent computer from 10 years ago can easily run 1600x1200 @ 60 fps all day long.
Let's ignore GFX for a minute and talk CPU, with the XB1 as an example. Do you realize its actual performance is barely any better than the Xbox360's? Why? The XB1 has 8 cores, but 2 are reserved for the OS, so only 6 are used for gaming, and they run at only 1.75 GHz. The X360 was 3 cores at 3.2 GHz. That translates into 10.5 GHz vs 9.6 GHz of aggregate processing power. Not much difference, and it shows.
I'm not impressed by what I've seen from either the XB1 or the PS4. Neither gave me any reason to buy them. Every game demo I've seen does nothing but reinforce the idea that the Xbox360 and PS3 are just as good, and I might as well keep my old systems and my money.
Not true. Both the 360 and PS3 had to upscale to even reach 720p; they were never designed to run native 720p, let alone 1080p.
Yes, my 10-year-old PC runs at 1600x1200 @ 60 fps, but not on modern games. It's mostly used for internet browsing nowadays, but it still works.
CPU power comparisons are never fair when you look at GHz only. A single core of a 3.0 GHz i5 can do twice as many calculations as a 10-year-old 3.0 GHz Pentium core; architectural and manufacturing improvements allowed that. That's why the i series is so revolutionary: same frequency, more instructions per clock. AMD still has no answer to this other than "throw 8 cores in". The point is that processors evolve in more ways than just GHz; see the sketch below.
That being said, the current consoles use underpowered tablet-class CPUs, so I wouldn't really expect that 10.5 GHz to work like 10.5 GHz. Also, more cores are much harder to program for, so most developers won't bother going past 2-3 cores; in this regard the old consoles were better.
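To put rough numbers on the GHz-vs-IPC point, here's a quick back-of-the-envelope sketch. The IPC (instructions-per-clock) figures are made-up illustrative assumptions, not measured values for these chips; the point is only the shape of the math.

```cpp
#include <cstdio>

// Very rough throughput model: cores * clock * IPC. Real performance also
// depends on caches, memory bandwidth and how well game threads scale,
// so treat this as an illustration, not a benchmark.
static double effective_throughput(int cores, double clock_ghz, double ipc) {
    return cores * clock_ghz * ipc;
}

int main() {
    // IPC values are illustrative assumptions, not measured figures.
    // X360: 3 in-order PowerPC cores @ 3.2 GHz (assume low IPC).
    double x360 = effective_throughput(3, 3.2, 1.0);
    // XB1: 6 game-usable out-of-order Jaguar cores @ 1.75 GHz (assume higher IPC).
    double xb1 = effective_throughput(6, 1.75, 2.0);

    std::printf("Aggregate GHz:        X360 %.1f vs XB1 %.1f\n", 3 * 3.2, 6 * 1.75);
    std::printf("Estimated throughput: X360 %.1f vs XB1 %.1f\n", x360, xb1);
    return 0;
}
```

With equal IPC the two come out nearly even (9.6 vs 10.5 aggregate GHz, which is the GHz-only comparison above); give the newer core a higher IPC and the gap opens right up. That's why counting GHz alone tells you very little.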
rofltehcat said:
Strazdas said:
rofltehcat said:
Even that third-party Steambox they showcased a few days ago had an AMD GPU... I think they are starting to panic. Even if PC gaming does great this console generation, Nvidia will be in for some very hard years.
Very unlikely. Nvidia cards were always better for gaming. AMD cards are better for raw compute power, which is good when you're doing massive calculations and sensor-input processing, but as far as gaming graphics go, Nvidia has had the upper hand for years now. Also, Nvidia has been working with developers, actively and for free, for a decade, so most PC optimizations you see will be very Nvidia-centric. Nvidia is in a better position than AMD. And let's not even start on processors, where AMD STILL hasn't shown anything that could rival the i-series.
This is all nice and great and in the end I'd like to see the "good" company do fine on the market, not just the "cheaper" company. However, things do not always work that way.
With both major consoles and even the WiiU using AMD chips, AMD has an incredibly large part of the gaming GPU market cornered and effectively out of the grasp of Nvidia for around the next 10 years. If AMD play their cards right, they can slowly push ahead.
If anything, you can already see them pushing ahead: the ~$500 Steambox prototype revealed a few days ago is using an AMD chip as well. AMD will continue pushing into PC gaming, and unlike Nvidia they now have a huge power base to use to their advantage.
About the optimization: Yes, they have been doing a good job in the past and I sure hope they will continue to do so in the future. However, with all of the major consoles using AMD chips (of very comparable properties), optimizing games to run well on AMD hardware will be a lot more feasible.
Considering this is really primarily a GPU vs. GPU standoff, AMD vs. Intel CPUs doesn't really play a role, does it? If anything, the major consoles using AMD CPUs as well gives AMD a great platform for pushing on that front too.
Of course they may play it wrong, spread themselves over too many fronts or even grow too confident and lazy. But Nvidia is in a really tight spot right now and their main competitor looks more powerful than ever.
Yes, currently all consoles are using AMD GPUs because they didn't want to buy the more powerful GPU Nvidia offered and went for the cheaper, slower AMD part. If AMD is making money on this, that's very good for them; it does not mean Nvidia is wrong about PCs, though. Nvidia is better at gaming graphics, and to be honest we've already seen problems with the new consoles' graphical prowess, so AMD hasn't disproved that.
The cheap Steambox is going to use AMD because AMD is cheaper. Power comes at a price, and Nvidia has always seemed like the kind of company that shoves the most powerful stuff into the leading edge of the market and ignores low-end consumers; that's where AMD always came in and satisfied the need for slow, cheap GPUs.
Nvidia has quite a firm grasp on PC gaming. They help developers optimize games for their cards; AMD doesn't. Nvidia is very much PC-centric and was so even before they lost the consoles; they just didn't advertise it that much. There's a reason the Nvidia logo comes up before every second PC game: they actually worked on optimizing it. For their cards, of course.
I agree the CPU is hardly an issue now; however, considering the CPU they are putting into the new Xbox, I think it may become an issue after all. We'll see.
The reason I mentioned CPUs is that Nvidia GPUs usually get paired with Intel CPUs, versus AMD CPUs paired with Radeon GPUs (renamed AMD, because AMD needs to be AMD and confuse people). So both GPU and CPU market shares affect both sides. And let's face it, the i series is quite dominant when it comes to PC CPUs.
ike42 said:
Strazdas said:
I love Nvidia and everything else sucks. Slurp slurp. Come join the PC master race because we're so much better than you.
You are wrong and will accept no evidence to the contrary. Just because you're an Nvidia fanboy doesn't mean everyone else is. Dedicated machines that have games made specifically for them will always be a better value overall than constantly having to tweak games to get them optimized on your system, since there is no way that out of the box they will work with all configurations.
Maybe you should do some more research before going around insulting people next time?
The new consoles are about as dedicated as PCs were 10 years ago. PCs are more game-dedicated now than consoles, since the new consoles are the very same PC hardware, only limited. Both new consoles are programmed against the same APIs you program for on PC, making development extremely similar; heck, the Xbox uses the very same DirectX that PC uses. No one programs directly to the hardware anymore; there are probably not more than a dozen people in the world who could. Thanks to the new consoles using PC architecture, any optimization done for consoles will automatically be done for PC as well. PC games have been working "out of the box" for a long while now; maybe you should try some.
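To illustrate what "same DirectX" means in practice, here's the bog-standard PC-side Direct3D 11 device creation. This is a sketch of the PC call only, taken from the public D3D11 API; the console-side code isn't something I can show, but Microsoft has publicly described the Xbox One as targeting a close D3D 11.x variant of this same interface.

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib") // MSVC: link against d3d11.lib

int main() {
    // Standard PC-side Direct3D 11 device creation. The point of the post:
    // the Xbox One targets a close variant of this same API, so engine code
    // written against D3D11 ports over with comparatively little change.
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                  // default adapter
        D3D_DRIVER_TYPE_HARDWARE, // use the GPU
        nullptr, 0,               // no software rasterizer, no flags
        nullptr, 0,               // default feature levels
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        // ... render ...
        context->Release();
        device->Release();
    }
    return 0;
}
```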