Update: Watch Dogs Won't Run on 32-Bit Windows

Arif_Sohaib

New member
Jan 16, 2011
355
0
0
Kahani said:
Arif_Sohaib said:
It's interesting you use the word "virtually" here, because that is exactly how the Core i7 fits in. It's quad-core with Hyperthreading (the i5 is quad-core without Hyperthreading, except one or two models which are dual-core with Hyperthreading), so each of the four cores 'virtually' works as two.
Not particularly interesting, no, since I'm pretty sure everyone already knows this. Pretending to have 8 cores is not the same as actually having 8 cores. If Ubisoft actually meant that a quad-core is fine, that's what they should have said. Your claim that this is what they meant doesn't appear to be supported by any evidence; as far as I can tell it's entirely baseless speculation.

The reason there aren't many actual 8-core ones is that synchronization would be a nightmare; I am sure some PhD designers are working on making it stable and affordable.
Utter nonsense. There are plenty of 8-core processors around; they're just generally only used in servers and workstations because they're completely pointless for the general consumer. There are virtually no programs that can use all 4 cores to their full potential, so what would be the point in having any more? At work, I easily max out a pair of 8-core Xeons. At home, I rarely see any core go over 50% usage. Heating, physical size, and so on are problems for fitting more cores in. Synchronising isn't going to change significantly whether you have 4, 6 or 8, all of which have been commercially available for a long time in entirely stable and affordable packages, just not always ones home users have any use for.
It seems like you are arguing for the sake of argument. From the context, you should have known that I meant home PCs don't have many 8-core CPUs.
Also, check the page linked in the article; it clearly lists an i7 for both Ultra and Recommended.
And I assumed many people didn't know what Hyperthreading meant, as too many comments were saying their i7 is obsolete; my comment was meant to calm them down.
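If you want to see what your own machine reports, here is a minimal C++11 sketch (standard library only; hardware_concurrency() counts logical processors, which is what a quoted "eight core" requirement would actually see). A quad-core i7 with Hyperthreading typically prints 8, while a quad-core i5 prints 4:

#include <iostream>
#include <thread>

int main() {
    // std::thread::hardware_concurrency() reports logical processors,
    // i.e. hardware threads, not physical cores
    unsigned n = std::thread::hardware_concurrency();
    std::cout << "Logical processors visible to software: " << n << "\n";
    return 0;
}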

For those who don't want to click through, here is the Ultra config:

GPU: Latest DirectX 11 graphics card with 2 GB Video RAM or more
CPU: Latest Eight core or more
RAM: 8GB or more

example 1
GPU: Nvidia GTX 670
CPU: Intel Core i7-3930K

example 2
GPU: AMD Radeon HD 7970
CPU: AMD FX-9370 Eight-Core

And the Recommended one:

GPU: DirectX 11 graphics card with 2 GB Video RAM
CPU: Eight core
RAM: 8GB

example 1
GPU: Nvidia GTX 560 Ti
CPU: Intel Core i7-3770

example 2
GPU: AMD Radeon HD 7850
CPU: AMD FX-8350 Eight-Core

And here is the Minimum:

GPU: DirectX 11 graphics card with 1 GB Video RAM
CPU: Quad core
RAM: 4GB

example 1
GPU: Nvidia GTX 460
CPU: Intel Core 2 Quad Q6600

example 2
GPU: AMD Radeon HD 5770
CPU: AMD Phenom X4 9750
 

Victim of Progress

New member
Jul 11, 2011
187
0
0
I like the fact that this game will be released on the ancient Xbox 360 and PS3, and yet somehow it won't support a 32-bit machine.
 

The Lugz

New member
Apr 23, 2011
1,371
0
0
where to begin with this misinformation...
well, a 670 is not equal to a 7970, so whoever made that recommendation has no right commenting on graphics subsystems to begin with, and both are pretty easy to beat with cheaper, older SLI/Crossfire configs. so i fail to see what's particularly 'next gen' about the system requirements when 2-3 year old hardware will suffice. the 2GB of video RAM is something of a sticking point for the 4xx cards or the low-end AMD systems, but if you want graphics you don't buy a $110 GPU to begin with, so... yeah? :S

it's quite sad that hitherto unreleased consoles can't challenge 2-year-old hardware, but open platforms just move too quickly nowadays; what can they do?

also..

Kahani said:
I think you'll find that in Intel the only thing that fits is nothing. There are no 8 core Core processors
for the most part, software doesn't care how many cores you have, and if it's object-oriented programming (protip: it is) then the objects in the code simply wait for messages as required, as the point of most modern programming is portability and scalability across a multitude of software and hardware platforms
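here's a toy C++11 sketch of what that looks like in practice (not from any real engine, just an illustration): the code never hard-codes a core count, it spawns one worker per reported hardware thread and lets them drain a shared job queue, so the same binary scales from a dual-core to a 16-thread Xeon

#include <algorithm>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

int main() {
    std::queue<int> jobs;
    for (int i = 0; i < 100; ++i) jobs.push(i);

    std::mutex m;
    // one worker per hardware thread, whatever that happens to be
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&] {
            for (;;) {
                int job;
                {
                    std::lock_guard<std::mutex> lock(m);
                    if (jobs.empty()) return;
                    job = jobs.front();
                    jobs.pop();
                }
                // do the actual work here; the task never knows or cares
                // how many other workers exist
                (void)job;
            }
        });
    }
    for (auto& t : pool) t.join();
    std::cout << "processed the queue on " << workers << " hardware threads\n";
}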

secondly, there absolutely are 8-core processors (dunno why you're caught up on branding, though? it's just a word... you're aware 'Core' and 'Celeron' share silicon as well, right?), in fact there are 32-thread systems running around on server motherboards with two of these:
http://www.overclockers.co.uk/showproduct.php?prodid=CP-420-IN&groupid=701&catid=6&subcat=1823

Xeon: 8 cores, two threads each = 16 logical cores per socket

if that's too pricey, nothing wrong with a red one:
http://www.overclockers.co.uk/showproduct.php?prodid=CP-337-AM&groupid=701&catid=1967&subcat=1825

it also has 8 cores, and while they share cache they still execute in parallel if that's your thing. but who cares? it's all fluff and marketing hype anyway.
 

Retardinator

New member
Nov 2, 2009
582
0
0
Sgt. Sykes said:
It's probably this, i.e. programmers are used to it. Pretty much every multiplatform engine has to support OGL for PS3 and WiiU, some also for Unix support, while DX is only good for Xboxes and Windows. And I doubt Xboxes can't handle OGL.

It actually doesn't even take that much time to convert between the APIs. Before Rage came out, Carmack kept saying they could switch from DX to OGL at any time without much fuss. Sure, it's Carmack, but still.
On one hand, you have a company just looking to further their profits and influence through manipulative marketing and questionable business practices (as far as consumers are concerned), and on the other you have a company that has had most of their older games ported to a staggering number of platforms, expanding their market reach to ridiculous proportions. I wonder which one of them is more trustworthy.

IMO Carmack usually turns out to be one of the more sensible and knowledgeable people in the industry when he opens his mouth.

Also found an interesting read from the developers of Overgrowth which should pretty much put this off-topic discussion to rest:
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX
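For anyone wondering how an engine could switch APIs "without much fuss", the usual trick is an abstraction layer. Here's a purely hypothetical C++ sketch (the names are made up, this is not id Tech or anyone else's actual code) where only the backend classes know which graphics API exists, so swapping is a contained porting job rather than a rewrite:

#include <cstdio>
#include <memory>

// The game and engine code only ever talk to this interface.
class Renderer {
public:
    virtual ~Renderer() {}
    virtual void DrawFrame() = 0;
};

// One backend wraps Direct3D calls...
class D3DRenderer : public Renderer {
    void DrawFrame() override { std::puts("issuing Direct3D calls"); }
};

// ...and another wraps OpenGL calls.
class GLRenderer : public Renderer {
    void DrawFrame() override { std::puts("issuing OpenGL calls"); }
};

int main() {
    // swapping graphics APIs is a one-line change here; everything that
    // sits above the Renderer interface stays untouched
    std::unique_ptr<Renderer> r(new GLRenderer());
    r->DrawFrame();
}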
 

Disthron

New member
Aug 19, 2009
108
0
0
Actually, a well-designed game should be able to run on a wide range of systems, not just bleeding-edge tech. Especially in this day and age. We aren't programming in DOS anymore, where the programmers often had to write rendering code for specific hardware and only the big teams could afford to cover a wide range.

Ok, maybe you'll have to turn off some of the bells and whistles, but it shouldn't be required. This, to me, reeks of lazy technical design and programming. People need to learn how to optimize again.
 

Rellik San

New member
Feb 3, 2011
609
0
0
Disthron said:
Ok, maybe you'll have to turn off some of the bells and whistles, but it shouldn't be required. This, to me, reeks of lazy technical design and programming. People need to learn how to optimize again.
As I see it, the problem is what the developer thinks are essential bells and whistles for the experience and what aren't. I don't need super-high-resolution shadows, but I do like multiple dynamic shadows, so I save some power there. Likewise, motion blur and AA are big system hogs, but ask the dev and they're integral to the experience, whereas I say "no thanks". Ambient occlusion is something of a hog if done poorly, but I can't argue with the improvement it gives otherwise flat textures. Anisotropic filtering at x8 is excellent, and while there is a difference between x8 and x16, I find x16 a luxury I can happily live without; I bet the game dev can't. Reflections are another tricky one, but I tend to find I'm not too fussed about particle reflections so long as environment and character reflections work well.

It's just a matter of what you want out of the game compared to what the developer thinks you want. Which is why I think any developer worth their salt should release some kind of benchmarking tool for their game just prior to launch, so you can judge graphical fidelity and frame rate yourself and see if your machine is up to snuff.
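To make that concrete, here's a toy C++ sketch (every name in it is invented purely for illustration, it's not any engine's real settings format) of the kind of individual toggles I mean, as opposed to one take-it-or-leave-it "Ultra" preset:

#include <iostream>

struct GraphicsSettings {
    int  shadowResolution    = 2048;  // high-res shadows are usually the first thing I drop
    bool motionBlur          = true;
    int  msaaSamples         = 4;
    bool ambientOcclusion    = true;
    int  anisotropicLevel    = 16;    // x8 vs x16 is hard to spot in motion
    bool particleReflections = true;
};

// my preferences, as opposed to the developer's idea of "integral"
GraphicsSettings MyPreferences() {
    GraphicsSettings s;
    s.shadowResolution    = 1024;
    s.motionBlur          = false;
    s.anisotropicLevel    = 8;
    s.particleReflections = false;
    return s;
}

int main() {
    GraphicsSettings s = MyPreferences();
    std::cout << "shadows " << s.shadowResolution
              << ", AF x" << s.anisotropicLevel
              << ", motion blur " << (s.motionBlur ? "on" : "off") << "\n";
}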
 

Cid Silverwing

Paladin of The Light
Jul 27, 2008
3,134
0
0
Clive Howlitzer said:
People still use 32-bit systems? Come on, get with the program!
There's a reason to not upgrade straight away. Going 64-bit fucks up your retrogaming.

OT: Am I the only one who thinks spec requirements are speeding ahead way too fast in this day and age?
 

Clive Howlitzer

New member
Jan 27, 2011
2,783
0
0
Cid SilverWing said:
Clive Howlitzer said:
People still use 32-bit systems? Come on, get with the program!
There's a reason to not upgrade straight away. Going 64-bit fucks up your retrogaming.

OT: Am I the only one who thinks spec requirements are speeding ahead way too fast in this day and age?
I've never encountered a problem running my old games on 64-bit Windows, and I mostly play games from the '90s and even back into the '80s. I can only think of maybe one or two that I could never find a way to make work.
 

FireAza

New member
Aug 16, 2011
584
0
0
Makes sense. 64-bit-capable processors have been around forever, and there's been broad driver and software support for 64-bit since Windows Vista, making running a 64-bit OS no hassle at all now. Which means a lot of people are now running them; even my family is on 64-bit. Might as well start having games take advantage of this fact.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
Clive Howlitzer said:
People still use 32-bit systems? Come on, get with the program!
That doesn't surprise me.

Outside of PC gaming enthusiasts, nobody is willing to bother upgrading their computer when it works perfectly fine for 90% of their other tasks.