How do you have the patience to multi-quote like that?Strazdas said:It's not broken. Got to remember that in order for the PC version to look like the console version, you need to unlock the secret setting called "ultra low graphic settings".Mr Ink 5000 said:Recommending an 8-core 4GHz AMD CPU, when the consoles will be running an 8-core 1.9GHz AMD APU with 2 cores dedicated to the OS
I think my calculator is broken; that doesn't seem to add up
It worked for Crysis. I remember all the graphics card reviewers using that game as their benchmark. It's like having a heavy, inefficient, poorly made car and saying it needs a jet engine to make it go 60 mph.The Rogue Wolf said:"We can't be bothered to optimize our game for the PC, so we're going to inflate the system requirements so that jerkass elitist gamers think it's 'true next-gen' and can smugly say 'get a job and upgrade your PC' on forums everywhere."
Game-debate already does this sort of check, although with very undetailed benchmarks (it basically tells you whether you'd be playing on high, medium or low settings with your rig) depending on the resolution you enter. The idea with processing power is good in theory; in practice, most people don't even know their clock speed, let alone processing amount. I couldn't tell you my GPU's actions per second by heart either.alj said:This gives me an idea for a website where you bench the game and give the real requirements to play on high, mid or low at common resolutions: 4k, 2k, 1080p and so on.
I am a patient person.Mr Ink 5000 said:How do you have the patience to multi-quote like that?
Ubi seem to be awful at PC optimisation. I remember the last Splinter Cell gave me better frame rates on ultra settings until it was patched. Madness.
I don't think this game will be the milestone for how the new console gen will affect PC requirements.
Crysis was efficient and optimized. It was a great game and STILL IS GREAT for benchmarking. It knows how to use your system to the limit and manages to overload even the newest cards, not with inefficiency but with raw graphical fidelity. Both of the later sequels failed to repeat this graphics-benchmarking feat. Crysis was not a "game"; it was a tech demo of Crytek's new engine that made a perfect benchmark for years to come. Your analogy would be better applied to games like GTA 4. Now that game is unoptimized.Lono Shrugged said:It worked for Crysis. I remember all the graphics card reviewers using that game as their benchmark. It's like having a heavy, inefficient, poorly made car and saying it needs a jet engine to make it go 60 mph
I should have been clearer. Crysis WAS an excellently optimised game and ran quite well on lower-end systems. But at the time it was held up as a point of honour if you could run it at max spec. The fact it was held up as the pinnacle of requirements can't have hurt its sales. They still put the sequels in graphics card bundles. PC specs have plateaued a little from when I started out. I remember being able to run a new game at max spec and 2 years later not being able to run new releases at all. Even being able to run a high-spec new game was considered impressive.Strazdas said:Crysis was efficient and optimized. It was a great game and STILL IS GREAT for benchmarking. It knows how to use your system to the limit and manages to overload even the newest cards, not with inefficiency but with raw graphical fidelity. Both of the later sequels failed to repeat this graphics-benchmarking feat. Crysis was not a "game"; it was a tech demo of Crytek's new engine that made a perfect benchmark for years to come. Your analogy would be better applied to games like GTA 4. Now that game is unoptimized.
Or, if that's what they mean by 8-core, then I'm fine, right? My CPU has 4 cores and 8 threads at 3.4 GHz. As a matter of fact, comparing the two processors on the website linked, there doesn't seem to be a huge difference between the 2600 and the 3770K.verindae said:http://ark.intel.com/products/65523/Intel-Core-i7-3770K-Processor-8M-Cache-up-to-3_90-GHznoobium said:I have a 3770k and I am pretty sure that it is a quad core processor.
Quad core with Hyper-Threading: 4 physical, 8 logical.
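The physical-vs-logical distinction above is just multiplication: SMT (Hyper-Threading on Intel) exposes each physical core as two hardware threads. A toy Python sketch (`logical_cores` is a made-up helper for illustration, not a real API; the core counts are the chips mentioned in the thread):

```python
# Hyper-Threading exposes each physical core as two logical cores
# (hardware threads), so the OS reports twice the physical count.
def logical_cores(physical: int, threads_per_core: int = 2) -> int:
    """Logical core count = physical cores x SMT threads per core."""
    return physical * threads_per_core

# i7-3770K / i7-2600: 4 physical cores, 2-way Hyper-Threading
print(logical_cores(4))     # prints 8
# i5-3570K: 4 physical cores, no Hyper-Threading
print(logical_cores(4, 1))  # prints 4
```

Note that tools like Task Manager count logical cores, which is why a quad core i7 shows up as "8 cores".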
Honestly, if a 6950 can't get it playable at max (1920x1080), I'd be very surprised.Creator002 said:Well, that takes my chances on getting it for PC down a bit.
I have an i7 2600, 16 GB of RAM and a Radeon HD 6950, so I can run it easily, but I like turning everything up to max.
$20?! I was expecting to pay $100+. Thing is, 4chan doesn't exactly seem like a reputable place to get something.Strazdas said:No and no. First of all, this is very likely either fake, or the developers have completely forgotten even basic optimization for CPU processes. Either way, it's not going to be a trend any time soon. Anything that processing-intensive is moved to the GPU anyway, so it's very likely this is fake. This is why the 8-weak-cores solution for consoles will FAIL; no one is dumb enough to program for that.Alex Co said:Does this mean eight-core CPUs will be the norm for AAA PC games this generation? If so, would you rather buy the same games on consoles instead?
Why would I buy it on consoles because of that? Even the minimum settings on that thing still look better than consoles. If graphics are the deciding factor, a 500 dollar PC will always, always win over a console. And if a game is so unoptimized that it actually needs that much processing, don't expect it to run smoothly on consoles to begin with. See, consoles aren't different this generation; they are just prebuilt brand-name PCs with a restricted OS and a custom design that makes any IT specialist scratch his head. If it runs badly on PC, it runs badly on console, always. That is, unless two different developers worked on the versions, which usually only happens with shitty ports, not multiplatform releases.
According to the Steam hardware survey, the average person is still running two processor cores. The reason for this is that most games aren't even capable of using more than 2 cores anyway, and it's all about GPU power in gaming. Which is why, when I built a new PC this year, I chose an i5 over an i7; it's not like I'm going to need that CPU power anyway, and I'd rather spend the money on a better GPU.StHubi said:It would also be interesting to know if these requirements can still be called "high", as the typical gaming PC is probably quite capable by now...
While the GPU they put in the recommended specs could be considered "the generation gamers are about to move on from", the CPU requirements are high. Too high, in fact; considering the other requirements, unbelievably high.
Well, to be honest, at least they try, and their ports actually work.ShakerSilver said:That's probably true. I mean, Ubisoft isn't exactly known for constructing flawless PC ports.
Well, Crytek always knew how to optimize the hell out of their games; they built the engine, after all. I remember playing the original Crysis on high on a frigging laptop with an 8600M GS.Charcharo said:Crysis 3 said a 5770 was a minimum requirement...
I was playing it on medium at 50+ fps... hell, even some of the effects, like water, on high. That is at 900p.
Same with Infinite, where I played on almost all ultra (except AA), and with Dishonored.
The thing is, though: 900p. That's probably the reason. The specs aim for AT LEAST 1080p, and I would be shocked if the developers weren't running at least 1440p when playtesting their game. Going down to 900p frees up a huge amount of resources for other things on your system.
XP hasn't been supported in modern games for years now. What's there to talk about? XP simply can't run DirectX newer than 9.0c, and games have moved past that.RA92 said:Waah waah no XP support.
So instead of running the game fine on perhaps slightly-below-maximum settings, you're getting it on PS4, where you will have it run on "ultra low" settings? How does that make sense?kiri2tsubasa said:So, my i5-3570K is a physical and logical quad core based on the fact it has only 4 threads. Yep, definitely getting this on PS4 then.
You mean, like how the hyperthreaded 6-logical-core 360 CPU did? Oh, no, it didn't.Zac Jovanovic said:How anyone didn't see this coming is beyond me. I've been putting FX8120-8350s into budget gaming rigs for almost a year now.
If both new consoles using 8-core CPUs doesn't make developers start utilizing multithreading, nothing ever will.
Also, the new consoles surely won't, because while there may be 8 cores, they are slow as ass, so they will likely just offload even more onto the GPU.
It's not broken. Got to remember that in order for the PC version to look like the console version, you need to unlock the secret setting called "ultra low graphic settings".Mr Ink 5000 said:Recommending an 8-core 4GHz AMD CPU, when the consoles will be running an 8-core 1.9GHz AMD APU with 2 cores dedicated to the OS
I think my calculator is broken; that doesn't seem to add up
That's just false. Games have gotten more and more GPU-intensive as time went on, because GPUs can handle more calculations more easily nowadays.Vrach said:a) I'm pretty sure that most games these days are far heavier on the processor than the graphics card, and by far at that.
What is the reason for staying with XP? If it's preference, then I can understand perfectly; if it's monetary, you can pick up a copy of Win7 Ultimate for 20 bucks on /r/softwareswap.Xan Krieger said:That's actually my problem; it's why I can't play so many games I want to, like Company of Heroes 2, ARMA 3, Assassin's Creed 4 Black Flag, and many others.RA92 said:Waah waah no XP support.
Funny, considering even the most modern machines built now still discover new ways to make Crysis look even better. In fact, Crysis run on SLI 780s looks better than ANYTHING else on the market. It really was built for machines that didn't exist in 2007.Nurb said:Anyone remember Doom 3's and Crysis' "Made for PCs that don't even exist yet" lines?
Yea, usually just crap code.
So get that Skype working and message me instead of getting contrived?Racecarlock said:I just want SOMETHING to talk about, even if my reasoning ends up contrived and stupid.
See, you said they lied about it being the pinnacle of requirements, but they didn't; it actually was.Lono Shrugged said:I should have been clearer. Crysis WAS an excellently optimised game and ran quite well on lower-end systems. But at the time it was held up as a point of honour if you could run it at max spec. The fact it was held up as the pinnacle of requirements can't have hurt its sales. They still put the sequels in graphics card bundles. PC specs have plateaued a little from when I started out. I remember being able to run a new game at max spec and 2 years later not being able to run new releases at all. Even being able to run a high-spec new game was considered impressive.
What I am saying is that the perception of being able to run such and such a game at max spec feeds into the cliche that PC gamers will buy a game purely for bragging rights. Using the min specs as a marketing trick to try and convince people that this game is the cutting edge of gaming is akin to having a "from the producers of" in a movie trailer. Anyone who knows a little about PC gaming knows that it's all about optimisation.
It ain't the size, it's what you do with it
1080p is 1920x1080 = 2,073,600 pixels.Charcharo said:900p is not that much worse than 1080p, nor THAT much less intensive...
Also, it appears Crysis 3 was on med-high-ultra settings, with 1x MSAA and 50+ fps... well, Crytek ARE good, real good, but is it too much to ask that other devs are at least comparable? Sure, 4A are on Crytek's level, and GSC make buggy games, but at least they have the tech that justifies the difference.
I thought that since last-gen consoles run games at around 680p, the minimum is what it takes to achieve console quality... not what it takes for 1080p?
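The pixel math behind the 1080p-vs-900p-vs-720p back-and-forth above is easy to check. A quick Python sketch, under the simplifying assumption that shading/fill cost scales roughly linearly with pixel count (it ignores geometry and post-processing):

```python
# Pixels per frame at the resolutions discussed in the thread.
resolutions = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "720p": (1280, 720),
}
baseline = 1920 * 1080  # 2,073,600 pixels at 1080p
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.0%} of 1080p)")
# 900p renders ~69% of 1080p's pixels and 720p only ~44%, which is why
# dropping resolution frees up so much GPU headroom.
```

This supports both sides: 900p is "not THAT much less" in image quality, yet it shaves roughly a third off the per-frame pixel workload.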
I was as shocked as you were when I saw it. You can get a key for 20-30 dollars if you're smart/patient.Xan Krieger said:$20?! I was expecting to pay $100+. Thing is, 4chan doesn't exactly seem like a reputable place to get something.
I thought you said you were running at 900p. Now you claim to be running it at 1050p. So which one is it?Charcharo said:Actually, it is 1680x1050, 16:10, so it is more.
Besides, still, running it at 1680x1050 with settings on med-high and hell, even a couple on Ultra, plus SMAA? That still means med at 1080p.
I get that most people are probably 1080p, but should minimum be for CONSOLE quality, not PC 1080p quality?
The Xbox 360 and PS3 could NOT even manage 720p; they were usually 670-690.
Faster cores with fewer logical cores are better in every case, except if they were stupid enough to force multithreading even when not loading the cores fully. And while usually I would just say that's simply impossible for any half-competent programmer, COD: Ghosts happened... NFS: whatever_we_name_it_now happened...DoctorM said:I'm building a new PC this month. I was torn between an Intel i3 4130 and the AMD FX6300. In THEORY the Intel should have a slight edge... but by these specs (if legit), it couldn't run the game at all. (It also eliminates a LOT of older Intel chips, since they tend to favor faster speeds with fewer cores.)
I was considering stretching my budget to an 8-core FX chip, but for the same money it looks like going from 4GB to 8GB of RAM is money better spent, 4GB also being a deal-breaker on this game.
Going Intel, I'd have to upgrade both the RAM and the chip to an i5-3350P just to meet minimum specs.
There's definitely something weird here. I wonder how standard this is going to become for the industry.
"p" stands for progressive (as opposed to "i", which stands for interlaced). It means the image has that many lines of vertical resolution, drawn progressively. Therefore 1600x900 is 900p; 1440x900 is also 900p. Anything with 900 vertical pixels using progressive scan is 900p. I took 1600x900 as the standard 16:9 for TVs/monitors; there are other configurations, of course, but since you didn't provide a resolution at that point, I took the most popular one.Charcharo said:I am stupid and thought that 1680x1050, which is 16:10, would be the equivalent of 900p... I goofed
Yes it is 1680x1050
And I completely disagree. Minimum should be what you need for lowest settings at a lower resolution (console quality, pretty much).
Agreed on the PS4 and Xbone, though... they are anemic consoles. The only thing they have plenty of is RAM; everything else they lack.