Witcher 2 System Requirements and New Screens Revealed

Andy Chalk

One Flag, One Fleet, One Cat
Nov 12, 2002
45,698
1
0
Pilkingtube said:
Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?
It might be, if the game were being made for consoles. This is a PC exclusive.
 

Wolfram23

New member
Mar 23, 2004
4,095
0
0
Keava said:
Wolfram01 said:
Bleh, PC system reqs are so stupid these days. This is a very dated system for it, and it doesn't even take resolution into account or let you know in the least what settings you could get at recommended or w/e.

I demand a revision of how system reqs are released!

We need a "low" settings spec and a "max" settings spec, and at a specific resolution too! It makes it significantly easier to judge whether it'll run well...
That's called benchmarks, and I doubt any dev studio has computers with all sorts of hardware configurations to provide detailed data on this. Too many variables, considering there are slight differences even between the same models of video card from different manufacturers.
That's BS. If they can say "you need at least these specs to run it", they can at least say at what settings and resolution. Presumably lowest settings, sure, but then what resolution? And for "recommended", how do they come up with that? They MUST do some sort of benching, and the least they can do is say whether that's at medium or high settings, and at what resolution. I'd be willing to bet that with a 1080p monitor and those recommended specs you'd be lucky to get medium, let alone high settings.
 
Apr 28, 2008
14,634
0
0
Woodsey said:
And to think, Fable 3 is coming out on the same day. Microsoft really don't have a clue, do they?

Hopefully Geralt's voice actor has taken some lessons in emoting this time around though, because dear fucking god it was awful last time (and I only played it for 5 hours before succumbing to the terrible start).

Still really looking forward to this though.
Microsoft knows it's going to get thrashed. They set it on this date so that when it does fail, they'll use that as justification for the PC not being a viable market and whatnot.

Maybe not; I don't think Microsoft is that smart.

OT: Hooray! I'll be able to run it! I wants this game. Bad.

The first was a great game, even though the dialogue was hilariously bad and the combat was rather "meh". But still, the fact that you have control over how the story plays out is fantastic, and it's done far too rarely in games. Bethesda/BioWare games ain't got shit on The Witcher's choice/consequence system.
 

Sacman

Don't Bend! Ascend!
May 15, 2008
22,661
0
0
Wolfram01 said:
Bleh, PC system reqs are so stupid these days. This is a very dated system for it, and it doesn't even take resolution into account or let you know in the least what settings you could get at recommended or w/e.

I demand a revision of how system reqs are released!

We need a "low" settings spec and a "max" settings spec, and at a specific resolution too! It makes it significantly easier to judge whether it'll run well...
Yeah, I agree... I was running an ATI Radeon HD 4650, a 2.66 GHz quad-core processor, and 6 GB of DDR2 RAM on my backup PC, so I figured I could run Metro 2033 on at least high settings. But no: even though I met the recommended specs, I could still barely run it on fairly low settings...
 

Keava

New member
Mar 1, 2010
2,010
0
0
Wolfram01 said:
That's BS. If they can say "you need at least these specs to run it", they can at least say at what settings and resolution. Presumably lowest settings, sure, but then what resolution? And for "recommended", how do they come up with that? They MUST do some sort of benching, and the least they can do is say whether that's at medium or high settings, and at what resolution. I'd be willing to bet that with a 1080p monitor and those recommended specs you'd be lucky to get medium, let alone high settings.
And what do you personally consider playable? 25 fps? 30 fps? 35 fps? 40+ fps? I know people who will moan if a game doesn't run on their hardware at 45 fps or better.
Sure they do benching, but they avoid detailed information because people would complain that their low-quality manufacturer's card doesn't pull those promised fps, and they can't put the manufacturers' names in such information because it would be advertising. They would have to add booklets listing every resolution's behaviour with different AA and AF settings...

I'll send you the address where you should deliver the monitor later though, as I'm pretty sure a good old GTX 260 with 1 GB of VRAM, supported by a quad-core CPU, will be able to pull the modified Aurora engine on medium at 25-30 fps or better at 1680x1050, especially if all the bragging CDP did about optimization turns out to be even halfway true. The card pulled nearly 30 fps in Crysis at 1920x1200.
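The fps thresholds being argued over here map directly onto per-frame time budgets, which is what benchmarks actually measure. A trivial sketch (plain Python, using only the framerate numbers thrown around in this thread):

```python
# Per-frame time budget for each target framerate mentioned above.
# A game "holds 30 fps" only if every frame finishes inside its budget.
for fps in (25, 30, 35, 45, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps:>2} fps -> {budget_ms:5.1f} ms per frame")
```

Going from 30 fps to 60 fps halves the budget from about 33.3 ms to about 16.7 ms per frame, which is a big part of why "playable" is such a contested word.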
 

Pilkingtube

Edible
Mar 24, 2010
481
0
0
Andy Chalk said:
Pilkingtube said:
Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?
It might be, if the game were being made for consoles. This is a PC exclusive.
Didn't they say that about Crysis 2? Like, in about 6 press releases... before they made it a cross-platform game? I just don't understand why they'll say the minimum acceptable limit is an 8800 for modern games on PC, then turn around to people on a 7800 console and say it's fine. I'm probably too simple to understand why. :(

You know, I swear I've seen a quote from a senior producer at CD Projekt saying that not only is it possible for The Witcher 2 to work on current-gen consoles, but that they've thought about it and that it's a multi-platform engine.

"It will be possible to do it on current gen consoles. It's not a co-incidence because we have thought about it. It is a multi-platform engine." Tomasz Gop, Senior Producer at developer CD Projekt Red told Made2Game today
Source: http://www.made2game.com/xbox-360/the-witcher-2-ps3-xbox360/

EDIT: Edited because I wasn't really being very clear with the Crysis/Crysis 2 comment >_<
 

Theotherguy

New member
Mar 15, 2011
33
0
0
The Witcher 2 got canceled on consoles due to the financial crisis a while back, and since then I haven't read any news about it, so I guess it just won't come out on PS3/Xbox 360, sadly.
 

Keava

New member
Mar 1, 2010
2,010
0
0
Pilkingtube said:
You know, I swear I've seen a quote from a senior producer at CD Projekt saying that not only is it possible for The Witcher 2 to work on current-gen consoles, but that they've thought about it and that it's a multi-platform engine.

"It will be possible to do it on current gen consoles. It's not a co-incidence because we have thought about it. It is a multi-platform engine." Tomasz Gop, Senior Producer at developer CD Projekt Red told Made2Game today
An engine is not the same as a game. What he meant was that the engine's code is easily portable to consoles; graphics requirements, however, are also influenced by other things, in this case mostly the shaders used for rendering. The GeForce 7 series is limited to Shader Model 3.0, which means lower-quality bump mapping, specular highlights, shading, etc., whereas the 8800 series supports Shader Model 4.0. When they decide to roll out a console version they will simply cut those features down to match the hardware, but why wouldn't they pimp it up a bit for the people who are able to experience the full glory of their work, if they can?
 

Wolfram23

New member
Mar 23, 2004
4,095
0
0
Keava said:
Wolfram01 said:
That's BS. If they can say "you need at least these specs to run it", they can at least say at what settings and resolution. Presumably lowest settings, sure, but then what resolution? And for "recommended", how do they come up with that? They MUST do some sort of benching, and the least they can do is say whether that's at medium or high settings, and at what resolution. I'd be willing to bet that with a 1080p monitor and those recommended specs you'd be lucky to get medium, let alone high settings.
And what do you personally consider playable? 25 fps? 30 fps? 35 fps? 40+ fps? I know people who will moan if a game doesn't run on their hardware at 45 fps or better.
Sure they do benching, but they avoid detailed information because people would complain that their low-quality manufacturer's card doesn't pull those promised fps, and they can't put the manufacturers' names in such information because it would be advertising. They would have to add booklets listing every resolution's behaviour with different AA and AF settings...

I'll send you the address where you should deliver the monitor later though, as I'm pretty sure a good old GTX 260 with 1 GB of VRAM, supported by a quad-core CPU, will be able to pull the modified Aurora engine on medium at 25-30 fps or better at 1680x1050, especially if all the bragging CDP did about optimization turns out to be even halfway true. The card pulled nearly 30 fps in Crysis at 1920x1200.
What are you on about? I never actually said "playable" or mentioned framerates. I understand what you're saying, believe me, but I still think the requirements PC games publish are far too vague to give any reasonable sense of how demanding a game will be. I don't care if they only want to bench at low settings and 1280x1024, but they should at least say so.

"Recommended" doesn't even have a proper definition. You're right: maybe they mean 30 fps, maybe they mean 60. Maybe they mean full AA, or none? It's just too vague. I don't think they need to say "at high settings and 1080p with 4xAA you need THIS setup", because obviously there's too much going on. They can be vaguer than that, but they certainly need to be less vague than they currently are. Maybe something like "1080p, medium settings", and of course they would "recommend" something that isn't going to be choppy, so probably 30 fps or higher (as long as the minimum doesn't dip into the 10-20 fps range)... They don't have to mention AA or anything. But shit, going from a low resolution to a high one can nearly double the GPU work! That's a HUGE thing to ignore when giving out requirements.

I don't think any other industry can really get away with that. I mean, you can't buy a motor without knowing whether it's AC or DC, how much voltage it needs, and what the max power draw is. Well, OK, you can =P but the manufacturers have that data available.
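The resolution point above is easy to sanity-check with raw pixel counts, since fragment work scales roughly with pixels when a game is fill-rate bound. A back-of-envelope sketch (not a real benchmark), using the resolutions mentioned in this thread:

```python
# Pixel counts for the resolutions discussed in this thread.
# When rendering is fill-rate bound, GPU work scales roughly with pixel count.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "1920x1200": 1920 * 1200,
}
base = resolutions["1280x1024"]
for name, pixels in sorted(resolutions.items(), key=lambda kv: kv[1]):
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x vs 1280x1024)")
```

1920x1200 pushes about 1.76x the pixels of 1280x1024, so "nearly double the GPU work" between a low and a high benchmark resolution is the right ballpark, and exactly why a spec sheet without a resolution says so little.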
 

Pilkingtube

Edible
Mar 24, 2010
481
0
0
Keava said:
Pilkingtube said:
You know, I swear I've seen a quote from a senior producer at CD Projekt saying that not only is it possible for The Witcher 2 to work on current-gen consoles, but that they've thought about it and that it's a multi-platform engine.

"It will be possible to do it on current gen consoles. It's not a co-incidence because we have thought about it. It is a multi-platform engine." Tomasz Gop, Senior Producer at developer CD Projekt Red told Made2Game today
An engine is not the same as a game. What he meant was that the engine's code is easily portable to consoles; graphics requirements, however, are also influenced by other things, in this case mostly the shaders used for rendering. The GeForce 7 series is limited to Shader Model 3.0, which means lower-quality bump mapping, specular highlights, shading, etc., whereas the 8800 series supports Shader Model 4.0. When they decide to roll out a console version they will simply cut those features down to match the hardware, but why wouldn't they pimp it up a bit for the people who are able to experience the full glory of their work, if they can?
There is a difference between lowering your barrier to entry and reducing the maximum performance of your game. Why can't they just add that option to the game for those with less powerful hardware and extend their potential PC audience? People playing on some crazy quad-CrossFire system won't be affected at the top end, but people on laptops and older systems would then have access too.
 

Orcus The Ultimate

New member
Nov 22, 2009
3,216
0
0
Game of the Year, anyone?

Frankly, a game like this is really needed as a change of pace when you look back at what the Western RPG studios have been putting out!
 

Enkidu88

New member
Jan 24, 2010
534
0
0
Time to update my graphics card; whenever my card becomes the "minimum requirement" I know it's time to upgrade :p.

The 8800 GTX has served me well, though.

Also, more on topic: what the heck is on Geralt's back in the second picture? His sword is there, but there's some kind of weird wooden apparatus as well. It almost looks like a tripod.
 

IamGamer41

New member
Mar 19, 2010
245
0
0
Keava said:
Wolfram01 said:
Bleh, PC system reqs are so stupid these days. This is a very dated system for it, and it doesn't even take resolution into account or let you know in the least what settings you could get at recommended or w/e.

I demand a revision of how system reqs are released!

We need a "low" settings spec and a "max" settings spec, and at a specific resolution too! It makes it significantly easier to judge whether it'll run well...
That's called benchmarks, and I doubt any dev studio has computers with all sorts of hardware configurations to provide detailed data on this. Too many variables, considering there are slight differences even between the same models of video card from different manufacturers.

Pilkingtube said:
Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?
The game, currently, is not being developed for consoles. It may be in the future, but CD Projekt is a studio that does things properly. They want to make the PC version first and then eventually tone it down for a console release, rather than forcing PC users to suffer because consoles have been outdated for the last few years.
They have said they've had The Witcher 2 running on current-gen consoles with no loss of graphical detail. By current-gen consoles I mean the 360 and PS3. Sorry Wii, no RPG goodness for you.
 

IamGamer41

New member
Mar 19, 2010
245
0
0
Ephraim J. Witchwood said:
Can't wait for this!

Shit, I still need to get through the first one. >_<
Remember to keep your save file after you finish The Witcher 1, because you can import it into The Witcher 2. Things like choices and some weapons/gold carry over.
 

fletch_talon

New member
Nov 6, 2008
1,461
0
0
Enkidu88 said:
Also, more on topic, what the heck is on Geralts back in the second picture? His sword is there, but theres some kind of weird wooden apparatus as well. Almost looks like a tripod.
I think it's a scabbard for the sword he's got in his hand. Remember, he has his silver sword for monsters and steel for everything else.