Watch Dogs PC Requirements Recommend 8 Core CPU

Dec 16, 2009
1,774
0
0
Strazdas said:
Mr Ink 5000 said:
Recommending an 8-core 4GHz AMD CPU, when the consoles will be running an 8-core 1.9GHz AMD APU with 2 cores dedicated to the OS
I think my calculator is broken, that doesn't seem to add up
It's not broken. You've got to remember that for the PC version to look like the console version, you need to unlock the secret setting called "ultra low graphics settings".
How do you have the patience to multi-quote like that?

Ubi seem to be awful at PC optimisation. I remember the last Splinter Cell gave me better frame rates on ultra settings until it was patched. Madness.

I don't think this game will be the milestone for how the new console gen will affect PC requirements.
 

Revolutionary

Pub Club Am Broken
May 30, 2009
1,833
0
41
I have a 3770K and it's not 8 cores. 8 logical cores is not the same as 8 cores, and I wish people would make the distinction. No game on the planet would get away with needing an actual 8-core CPU... yet.
Even with that, it still worries me that the requirements are that high, especially considering the recent drop in visual fidelity. Optimise your fucking games, Ubisoft; I haven't forgotten about Far Cry 3 at launch.
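The physical-vs-logical distinction is just one multiplication. A tiny sketch (the spec table below is hand-written for illustration, not queried from actual hardware):

```python
# Hand-written spec table (illustrative figures, not a hardware query):
# the 3770K has 4 physical cores, each running 2 hardware threads via
# Hyper-Threading, while AMD's FX-8350 is marketed as 8 cores with no SMT.
CPUS = {
    "i7-3770K": {"physical": 4, "threads_per_core": 2},
    "FX-8350": {"physical": 8, "threads_per_core": 1},
}

def logical_cores(spec):
    # Logical cores = physical cores x hardware threads per core.
    return spec["physical"] * spec["threads_per_core"]

for name, spec in CPUS.items():
    print(f"{name}: {spec['physical']} physical -> {logical_cores(spec)} logical")
```

Both chips come out at 8 logical cores, which is exactly why a requirement phrased as "8 cores" is ambiguous.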
 

Colt47

New member
Oct 31, 2012
1,065
0
0
I just find it funny that it recommends an 8-logical-core CPU. Is it possible that this game might actually beat the Starbound alpha and Terraria on CPU usage?
 

Lono Shrugged

New member
May 7, 2009
1,467
0
0
The Rogue Wolf said:
"We can't be bothered to optimize our game for the PC, so we're going to inflate the system requirements so that jerkass elitist gamers think it's 'true next-gen' and can smugly say 'get a job and upgrade your PC' on forums everywhere."
It worked for Crysis. I remember all the graphics card reviewers using that game as their benchmark. It's like having a heavy, inefficient, poorly made car and saying it needs a jet engine to make it go 60 mph.
 

Slash2x

New member
Dec 7, 2009
503
0
0
Pffft, I had an 8-core 2 years ago. NEXT! Funny thing is that Windows does not really use that many cores effectively. And I am thinking Watch Dogs will not be Linux compatible.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
alj said:
This gives me an idea for a website where you bench the game and give the real requirements to play on high mid low at common resolutions 4k 2k 1080p and so on.
Game-debate already does this sort of check, although with very undetailed benchmarks (it basically tells you whether you're playing on high, medium or low settings with your rig) depending on the resolution you enter. The idea with processing power is good in theory; in practice most people don't even know their clock speed, let alone their processing throughput. I couldn't tell you my GPU's operations per second by heart either.

Mr Ink 5000 said:
How do you have the patience to multi-quote like that?

Ubi seem to be awful at PC optimisation. I remember the last Splinter Cell gave me better frame rates on ultra settings until it was patched. Madness.

I don't think this game will be the milestone for how the new console gen will affect PC requirements.
I am a patient person :)

Well, personally I cannot comment on Ubi optimization, as I stopped using their products after the AC2 fiasco, although I was planning to give them a second chance with Watch Dogs. But from what I saw of others playing, it does not look that bad in comparison to some other companies.

True, in a sense existing games have already shown the limitations.

Lono Shrugged said:
It worked for Crysis. I remember all the graphics card reviewers using that game as their benchmark. It's like having a heavy, inefficient, poorly made car and saying it needs a jet engine to make it go 60 mph.
Crysis was efficient and optimized. It was a great game and is STILL GREAT for benchmarking. It knows how to use your system to the limit and manages to overload even the newest cards not with inefficiency but with raw graphical fidelity. Both sequels failed to repeat this benchmarking feat. Crysis was not just a "game"; it was a tech demo of Crytek's new engine that made a perfect benchmark for years to come. Your analogy would be better applied to games like GTA 4. Now that game is unoptimized.
 

Lono Shrugged

New member
May 7, 2009
1,467
0
0
Strazdas said:
Crysis was efficient and optimized. It was a great game and is STILL GREAT for benchmarking. It knows how to use your system to the limit and manages to overload even the newest cards not with inefficiency but with raw graphical fidelity. Both sequels failed to repeat this benchmarking feat. Crysis was not just a "game"; it was a tech demo of Crytek's new engine that made a perfect benchmark for years to come. Your analogy would be better applied to games like GTA 4. Now that game is unoptimized.
I should have been clearer. Crysis WAS an excellently optimised game and ran quite well on lower-end systems. But at the time it was held up as a point of honour if you could run it at max spec. The fact it was held up as the pinnacle of requirements can't have hurt its sales. They still put the sequels in graphics card bundles. PC specs have plateaued a little from when I started out. I remember being able to run a new game at max spec and, 2 years later, not being able to run new releases at all. Even being able to run a demanding new game was considered impressive.

What I am saying is that the perception of being able to run such-and-such a game at max spec feeds into the cliché that PC gamers will buy a game purely for bragging rights. Using the min specs as a marketing trick to try and convince people that this game is the cutting edge of gaming is akin to having a "from the producers of" in a movie trailer. Anyone who knows a little about PC gaming knows that it's all about optimisation.

It ain't the size, it's what you do with it
 

Creator002

New member
Aug 30, 2010
1,590
0
0
Well, that takes my chances on getting it for PC down a bit.

I have an i7 2600, 16 GB of RAM and a Radeon HD 6950, so I can run it easily, but I like turning everything up to max.

verindae said:
noobium said:
I have a 3770k and I am pretty sure that it is a quad core processor.
http://ark.intel.com/products/65523/Intel-Core-i7-3770K-Processor-8M-Cache-up-to-3_90-GHz

Quad Core Hyper Thread. 4 Physical, 8 logical.
Or, if that's what they mean by 8-core, then I'm fine, right? My CPU has 4 cores and 8 threads at 3.4 GHz. In fact, comparing the two processors on the linked site, there doesn't seem to be a huge difference between the 2600 and the 3770K.
 

ToastyMozart

New member
Mar 13, 2012
224
0
0
Creator002 said:
Well, that takes my chances on getting it for PC down a bit.

I have an i7 2600, 16 GB of RAM and a Radeon HD 6950, so I can run it easily, but I like turning everything up to max.

verindae said:
noobium said:
I have a 3770k and I am pretty sure that it is a quad core processor.
http://ark.intel.com/products/65523/Intel-Core-i7-3770K-Processor-8M-Cache-up-to-3_90-GHz

Quad Core Hyper Thread. 4 Physical, 8 logical.
Or, if that's what they mean by 8-core, then I'm fine, right? My CPU has 4 cores and 8 threads at 3.4 GHz. In fact, comparing the two processors on the linked site, there doesn't seem to be a huge difference between the 2600 and the 3770K.
Honestly, if a 6950 can't get it playable at max (1920x1080), I'd be very surprised.
 

Xan Krieger

Completely insane
Feb 11, 2009
2,918
0
0
Strazdas said:
Alex Co said:
Does this mean eight core CPUs will be the norm for AAA PC games this generation? If so, will you rather buy the same games on consoles instead?
No and no. First of all, this is very likely either fake, or the developers have completely forgotten even basic optimization of CPU processes. Either way, it's not going to be a trend any time soon. Anything that processing-intensive is moved to the GPU anyway, so it's very likely this is fake. This is why the 8-weak-cores solution for consoles will FAIL: no one is dumb enough to program for that.

Why would I buy it on a console because of that? Even minimum settings on that thing still look better than consoles. If graphics are the deciding factor, a 500-dollar PC will always, always win over a console. And if a game is so unoptimized that it actually needs that much processing, don't expect it to run smoothly on consoles to begin with. See, consoles aren't different this generation; they are just prebuilt brand-name PCs with a restricted OS and a custom design that makes any IT specialist scratch his head. If it runs badly on PC, it runs badly on console, always. That is, unless two different developers worked on the versions, which usually only happens with shitty ports, not multiplatform releases.

StHubi said:
It would also be interesting to know if these requirements can still be called "high", as the typical gaming PC is probably quite capable by now...
According to the Steam hardware survey, the average person is still running on two processor cores. The reason for this is that most games aren't even capable of using more than 2 cores anyway, and it's all about GPU power in gaming. Which is why, when I built a new PC this year, I chose an i5 over an i7; it's not like I'm going to need that power in the CPU anyway, and I'd rather spend that money on a better GPU.
While the GPU they put in the recommended specs could be considered "the generation gamers are about to move on from", the CPU requirements are high. Too high, in fact. Considering the other requirements, unbelievably too high.


ShakerSilver said:
That's probably true. I mean Ubisoft isn't exactly known for constructing flawless PC ports.
Well, to be honest, at least they try, and their ports actually work.

Charcharo said:
Crysis 3 said 5770 as a minimum requirement...

I was playing it on medium 50+ fps :p... hell even some of the effects like Water on high. That is 900p :)

Same with Infinite, there I played it on almost all ultra (except AA) and with Dishonored :)
Well, Crytek always knew how to optimize the hell out of their games; they built the engine, after all. I remember playing the original Crysis on high on a frigging laptop with an 8600M GS.
The thing is, though: 900p. That's probably the reason. The specs aim for AT LEAST 1080p, and I would be shocked if the developers weren't running at least 1440p when playtesting their game. Going down to 900p frees up a huge amount of resources for other things on your system.

RA92 said:
Waah waah no XP support.
XP hasn't been supported in modern games for years now. What's to talk about? XP simply can't run DirectX newer than 9.0c, and games have moved past that.

kiri2tsubasa said:
So, my I5-3570K is a physical and logical quad core based on the fact it has only 4 threads. Yep definitely getting this on PS4 then.
So instead of running the game fine on perhaps slightly-below-maximum settings, you're getting it on PS4, where it will run on "ultra low" settings? How does that make sense?

Zac Jovanovic said:
How anyone didn't see this coming is beyond me, I've been putting FX8120-8350s into budget gaming rigs for almost a year now.
If both new consoles using 8 core CPUs doesn't make developers start utilizing multithreading nothing ever will.
You mean, like how the 360's hyperthreaded 6-logical-core CPU did? Oh no, it didn't.
Also, the new consoles surely won't, because there may be 8 cores, but they are slow as ass, so developers will likely just offload everything onto the GPU even more.

Mr Ink 5000 said:
Recommending an 8-core 4GHz AMD CPU, when the consoles will be running an 8-core 1.9GHz AMD APU with 2 cores dedicated to the OS
I think my calculator is broken, that doesn't seem to add up
It's not broken. You've got to remember that for the PC version to look like the console version, you need to unlock the secret setting called "ultra low graphics settings".

Vrach said:
a) I'm pretty sure that most games these days are far heavier on the processor than the graphics card and by far at that.
That's just false. Games have gotten more and more GPU-intensive as time went on, because the GPU can handle more calculations more easily nowadays.

Xan Krieger said:
RA92 said:
Waah waah no XP support.
That's actually my problem; it's why I can't play so many games I want to, like Company of Heroes 2, ARMA 3, Assassin's Creed 4 Black Flag, and many others.
What is the reason for staying with XP? If it's preference, then I can understand perfectly; if it's monetary, you can pick up a copy of Win7 Ultimate for 20 bucks in /r/softwareswap.

Nurb said:
Anyone remember Doom 3's and Crysis' "Made for PCs that don't even exist yet" Lines?

Yea, usually just crap code.
Funny, considering even most machines built now still discover new ways to make Crysis look even better. In fact, Crysis running on SLI 780s looks better than ANYTHING else on the market. It really was built for machines that didn't exist in 2007.



Racecarlock said:
I just want SOMETHING to talk about even if my reasoning ends up contrived and stupid.
So get Skype working and message me instead of being contrived?
$20?! I was expecting to pay $100+. Thing is 4chan doesn't exactly seem like a reputable place to get something.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Lono Shrugged said:
I should have been clearer. Crysis WAS an excellently optimised game and ran quite well on lower end systems. But at the time it was held up as a point of honour if you could run it at max spec. The fact it was held up as the pinnacle of requirements must not have hurt it's sales. They still put the sequels in graphics card bundles. PC specs have plateaued a little bit from when I started out. I remember being able to run a new game at max spec and 2 years later not being able to run new releases at all. Even being able to run a high spec new game was considered impressive

What I am saying is that the perception of being able to run such and such a game at max spec feeds into the cliche that pc gamers will buy a game, purely for bragging rights. Using the min specs as a marketing trick to try and convince people that this game is the cutting edge of gaming is akin to having a "from the producers of" in a movie trailer. The fact that anyone who knows a little about PC gaming knows that its all about optimisation.

It ain't the size, it's what you do with it
See, you said they lied about it being the pinnacle of requirements, but they didn't; it actually was.

PC specs have not plateaued; game requirements have. You can thank consoles for that, as they keep game requirements down, since games have to run on those 8-year-old machines too. Meanwhile, PCs used their still vastly increasing power in other areas: resolution, multi-monitor setups, true antialiasing. You can run a new game on 2-year-old hardware, and you could do the same in the 90s. You don't have a point here at all.

It feels more like their min specs were just tested on machines different from ours, more than anything. If they wanted to trick people into thinking this is cutting-edge gaming, they should go the COD route and bloat the filesize. That always gets talked about a lot, along with "how much it takes to make a big game", when in reality it's most often just uncompressed audio, because the consoles aren't powerful enough to decode compressed audio otherwise.
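The uncompressed-audio point is easy to sanity-check with back-of-the-envelope arithmetic (CD-quality figures assumed here: 44.1 kHz, 16-bit, stereo):

```python
def pcm_size_bytes(seconds, sample_rate=44100, bit_depth=16, channels=2):
    # Raw PCM size = samples per second x bytes per sample x channel count.
    return seconds * sample_rate * (bit_depth // 8) * channels

# One hour of CD-quality stereo PCM, before any compression:
gb = pcm_size_bytes(3600) / 1e9
print(f"{gb:.2f} GB")  # ~0.64 GB per hour per stereo track
```

Multiply that by dozens of localized voice tracks and ambience layers and the install size balloons quickly.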

Charcharo said:
900p is not that much worse than 1080p, nor THAT much less intensive...
Also it appears Crysis 3 was on med-high-ultra settings :D, with MSAA 1x and 50+ fps... well, Crytek ARE good, real good, but is it too much to ask that other devs are at least comparable? Sure, 4A are on Crytek's level, and GSC make buggy games, but at least they have the tech that justifies the difference.

I thought that since last-gen consoles run games at 680p, the minimum is what it takes to achieve console quality... not what it takes for 1080p :(?
1080p is 1920x1080 = 2,073,600 pixels.
The standard 900p is 1600x900 = 1,440,000 pixels.

That means 1080p has 1.44 times as many pixels. Therefore it's almost a third less intensive to run at 900p than at 1080p from pixel count alone, and I'm not even accounting for the lower processing needed for antialiasing at the lower resolution.
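Spelling that arithmetic out (using the same two resolutions assumed above):

```python
def pixels(width, height):
    # Total pixel count of a resolution.
    return width * height

full_hd = pixels(1920, 1080)      # 2,073,600
nine_hundred = pixels(1600, 900)  # 1,440,000

print(full_hd / nine_hundred)     # 1.44 -> 1080p pushes 1.44x the pixels
print(1 - nine_hundred / full_hd) # ~0.31 -> roughly 30% fewer pixels at 900p
```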

Resolution matters a lot in requirements, and that's why requirements alone are no longer enough, as people's resolutions are no longer a single standard like 1600x1200 (remember those? Yeah, we had that pixel count two decades ago, then abandoned it for flatter screens thanks to a TV cartel that was later found out, and huge fines ensued).

Oh, sure, you could take the console-equivalent hardware, set all graphics to minimum, force it to 640x480 (since there is no such option, you would have to do it via console command), and you would have it running just fine as well. That does not mean it's what PC users should strive for. For modern gaming, 1080p is the minimum standard, and it's only logical that that's where the spec requirements are being tested.




Xan Krieger said:
$20?! I was expecting to pay $100+. Thing is 4chan doesn't exactly seem like a reputable place to get something.
I was as shocked as you were when I saw it. You can get a key for 20-30 dollars if you're smart/patient.
And it's not 4chan; I was talking about reddit. For example, this post, submitted 4 hours ago, is offering a key for 10 dollars:
http://www.reddit.com/r/softwareswap/comments/22hb65/h_windows_7_8_81_office_2013_2010_2011_mac_visio/
meanwhile this guy is asking 25 for one
http://www.reddit.com/r/softwareswap/comments/22fbih/h_windows_7_homeproultenterprise_windows_8/

Lurk there for a while and you'll find a lot of good stuff cheap.
 
Apr 5, 2008
3,736
0
0
I'll look forward to seeing how this game looks on my top-spec PC with everything maxed...but I think I'll wait until a Steam sale to do so.
 

DoctorM

New member
Nov 30, 2010
172
0
0
I'm building a new PC this month. I was torn between an Intel i3 4130 and an AMD FX-6300. In THEORY the Intel should have a slight edge... but by these specs (if legit), it couldn't run the game at all. (It also eliminates a LOT of older Intel chips, since they tend to favor faster speeds with fewer cores.)

I was considering stretching my budget to an 8-core FX chip, but for the same money it looks like going from 4 to 8 GB of RAM is money better spent, 4 GB also being a deal-breaker for this game.

Going Intel, I'd have to upgrade both the RAM and the chip to an i5-3350P just to make minimum specs.

There's definitely something weird here. I wonder how standard this is going to become for the industry.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Charcharo said:
Actually, it is 1680x1050, 16:10, so it is more :D.
Besides, still, running it at 1680x1050 with settings on med-high and hell, even a couple on Ultra, and SMAA? That still means med at 1080p.
I get that most people are probably on 1080p, but shouldn't minimum be for CONSOLE quality, not PC 1080p quality?
The Xbox 360 and PS3 could NOT even manage 720p; they were usually 670-690.
I thought you said you were running at 900p; now you claim to be running at 1050p. So which one is it?

No, minimum should not be for console quality. PC gamers would not accept console quality; that's too low. That being said, 1080p should be the minimum requirement for the new consoles to begin with.

Technically, there were a few games on PS3 that ran at 1080p. Things like Pong and such.

There is quite a comprehensive list, actually [http://forum.beyond3d.com/showthread.php?t=46241]

DoctorM said:
I'm building a new PC this month. I was torn between an Intel i3 4130 and the AMD FX6300. In THEORY the Intel should have a slight edge... but by specs (if legit), couldn't run the game at all. (It also eliminates a LOT of older Intel chips since they tend to favor faster speeds with fewer cores).

I was considering stretching my budget to an 8-core FX chip, but for the same money it looks like going from 4 to 8gb of RAM is money better spent. 4gb also being a deal breaker on this game.

Going Intel I'd have to upgrade both the RAM and the chip to an i5-3350p just to make minimum specs.

There's definitely something weird here. I wonder how standard this is going to become for the industry.
Faster cores with fewer logical cores is better in every case, except if they were stupid enough to force multithreading even when not loading the cores fully. And while usually I would just say that's simply impossible for any half-competent programmer, COD: Ghosts happened... NFS: whatever-we-name-it-now happened...

Besides, unless this is the ONLY game you are ever going to play, you're better off with Intel CPUs now. They cost more, but they also deliver more, and considering how poorly optimized multithreading is in most games, they have a clear advantage.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Charcharo said:
I am stupid and thought that 1680x1050, which is 16:10, would be the equivalent of 900p... I goofed :)
Yes, it is 1680x1050.

And I completely disagree. Minimum should be what you need for lowest settings at a lower resolution (console quality, pretty much).
Agree on the PS4 and Xbone though... but they are anemic consoles :). The only thing they have going is RAM; all else they lack.
"p" stands for progressive (as opposed to "i", which stands for interlaced). It means that many progressive lines are generated vertically. Therefore 1600x900 is 900p, 1440x900 is also 900p; anything with 900 vertical pixels using progressive scan is 900p. I took 1600x900 as the standard 16:9 for TVs/monitors. There are other configurations, of course, but since you didn't provide a resolution at that point, I took the most popular one.
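In other words, the label ignores width entirely. A one-liner makes the naming rule explicit (the function name is mine, just for illustration):

```python
def progressive_label(height):
    # The "Np" shorthand counts vertical (progressive) scan lines only,
    # so width and aspect ratio never enter into it: 1600x900 (16:9) and
    # 1440x900 (16:10) share the same label.
    return f"{height}p"

print(progressive_label(900))   # 900p
print(progressive_label(1080))  # 1080p
```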

You have to realize that minimum settings on PC are often higher than console settings (you can't choose there; no options). That is simply because consoles don't have enough processing power. As far as resolution goes, running below native resolution is not advised nowadays, because image quality suffers far beyond what just dropping the resolution implies, and most ports don't really know how to deal with it (for example, playing GTA4 at 800x600, text was unreadable), so it's not really something the requirements should aim for. The min requirements should aim for the minimum normal game experience, and on PC that means higher graphics than console.
Besides, the min requirements are quite low in pretty much all games anyway, and only laptop owners, or people like me who upgrade only every 6+ years, really have to care about them.