Pilkingtube said: "Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?"

It might be if the game was being made for consoles. This is a PC exclusive.
Wolfram01 said: "Bleh, PC system reqs are so stupid these days. Very dated system for it. It doesn't even take resolution into account or let you know in the least what settings you could get at the recommended spec, or whatever. I demand a revision of how system reqs are released! We need a 'low' settings spec and a 'max' settings spec, and at a specific resolution too! It makes it significantly easier to judge if it'll run well..."

Keava said: "That's called benchmarks, and I doubt any dev studio has computers with all sorts of hardware configurations to provide detailed data on this. Too many variables, considering there are slight differences even between the same model of video card from different manufacturers."

That's BS. If they can say "you need at least these specs to run it", they can at least say at what settings and resolution it runs. Presumably lowest settings, sure, but then at what resolution? And how do they come up with "recommended"? They MUST do some sort of benchmarking, and the least they can do is say whether that's at medium or high settings, and at what resolution. I'd be willing to bet that with a 1080p monitor and those recommended specs you'd be lucky to get medium settings, let alone high.
Woodsey said: "And to think, Fable 3 is coming out on the same day. Microsoft really don't have a clue, do they? Hopefully Geralt's voice actor has taken some lessons in emoting this time around, though, because dear fucking god it was awful last time (and I only played it for five hours before succumbing to the terrible start). Still really looking forward to this, though."

Microsoft know it's going to get thrashed. They set it on this date because when it does fail, they'll use that as justification for the PC not being a viable market, and whatnot.
Wolfram01 said: "Bleh, PC system reqs are so stupid these days. Very dated system for it. It doesn't even take resolution into account or let you know in the least what settings you could get at the recommended spec, or whatever. I demand a revision of how system reqs are released! We need a 'low' settings spec and a 'max' settings spec, and at a specific resolution too! It makes it significantly easier to judge if it'll run well..."

Yeah, I agree. I was running an ATI Radeon HD 4650, a 2.66 GHz quad-core processor, and 6 GB of DDR2 RAM in my backup PC, so I figured I could run Metro 2033 on at least high settings. But no: even though I met the recommended spec, I could still barely run it on fairly low settings...
Wolfram01 said: "That's BS. If they can say 'you need at least these specs to run it', they can at least say at what settings and resolution it runs. Presumably lowest settings, sure, but then at what resolution? And how do they come up with 'recommended'? They MUST do some sort of benchmarking, and the least they can do is say whether that's at medium or high settings, and at what resolution. I'd be willing to bet that with a 1080p monitor and those recommended specs you'd be lucky to get medium settings, let alone high."

And what do you personally consider playable? 25 fps? 30 fps? 35 fps? 40+ fps? I know people who will moan if a game doesn't run on their hardware at 45 fps or better.
Pilkingtube said: "Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?"

Andy Chalk said: "It might be if the game was being made for consoles. This is a PC exclusive."

Didn't they say that about Crysis 2? Like, in about six press releases, before they made it a cross-platform game? I just don't understand why they'll say the minimum acceptable limit for modern PC games is an 8800, then turn around to people on a 7800-class console and say it's fine. I'm probably too simple to understand why.

Source: http://www.made2game.com/xbox-360/the-witcher-2-ps3-xbox360/
"It will be possible to do it on current gen consoles. It's not a coincidence, because we have thought about it. It is a multi-platform engine." Tomasz Gop, Senior Producer at developer CD Projekt Red, told Made2Game today.
Pilkingtube said: "You know, I swear I've seen a quote from a senior producer at CD Projekt saying that not only is it possible for The Witcher 2 to work on current-gen consoles, but that they've thought about it and that it's a multi-platform engine. 'It will be possible to do it on current gen consoles. It's not a coincidence, because we have thought about it. It is a multi-platform engine.' Tomasz Gop, Senior Producer at developer CD Projekt Red, told Made2Game today."

The engine is not the same as the game. What he meant is that the engine's code is easily portable to consoles; graphics requirements, however, are also influenced by other things, in this case mostly the shaders used for rendering. The GeForce 7 series is limited to Shader Model 3.0, which means lower-quality bump mapping, specular highlights, shading, etc. When they decide to roll out a console version, they will simply cut those features down to match the hardware. But why wouldn't they pimp it up a bit for the people who are able to experience the full glory of their work, if they can?
Keava said: "And what do you personally consider playable? 25 fps? 30 fps? 35 fps? 40+ fps? I know people who will moan if a game doesn't run on their hardware at 45 fps or better. Sure, they do benchmarking, but they avoid detailed information, because then people would complain that their cheaper manufacturer's card doesn't pull the promised fps, and they can't put the manufacturers' names in that information because it would be advertising. They'd have to add booklets listing behaviour at every resolution with different AA and AF settings... I'll send you the address where you should deliver that monitor later, though, as I'm pretty sure a good old GTX 260 with 1 GB of VRAM, backed by a quad-core CPU, will be able to pull the modified Aurora engine on medium at 25-30 fps or better at 1680x1050, especially if all the bragging CDP has done about optimization turns out to be even halfway true. That card pulled nearly 30 fps in Crysis at 1920x1200."

What are you on about? I never actually said "playable" or mentioned framerates. I understand what you're saying, believe me, but I still think the requirements PC games give out are far too vague to give any reasonable sense of how demanding a game will be. I don't care if they only want to bench at low settings and 1280x1024, but they should at least mention it.
Keava said: "The engine is not the same as the game. What he meant is that the engine's code is easily portable to consoles; graphics requirements, however, are also influenced by other things, in this case mostly the shaders used for rendering. The GeForce 7 series is limited to Shader Model 3.0, which means lower-quality bump mapping, specular highlights, shading, etc. When they decide to roll out a console version, they will simply cut those features down to match the hardware. But why wouldn't they pimp it up a bit for the people who are able to experience the full glory of their work, if they can?"

There is a difference between lowering your barrier to entry and reducing the maximum performance of your game. Why can't they just build that scaling into the game for people with less powerful hardware and extend their potential PC audience? The people playing on some crazy quad-CrossFire system won't be affected at the top end, but people on laptops and older systems would then have access too.
Keava said: "That's called benchmarks, and I doubt any dev studio has computers with all sorts of hardware configurations to provide detailed data on this. Too many variables, considering there are slight differences even between the same model of video card from different manufacturers."

They have said they've had The Witcher 2 running on current-gen consoles with no loss of graphical detail. By current-gen consoles I mean the 360 and PS3. Sorry, Wii, no RPG goodness for you.
Pilkingtube said: "Why is the minimum requirement an 8800? The consoles use a 7800, surely that should be the minimum?"

The game, currently, is not being developed for consoles. It may be in the future, but CD Projekt is a studio that does things properly. They want to make the PC version first and eventually tone it down for a console release, rather than forcing PC users to suffer because consoles have been outdated for the last few years.
Ephraim J. Witchwood said: "Can't wait for this! Shit, I still need to get through the first one. >_<"

Remember to keep your save file after you finish The Witcher 1, because you can import it into The Witcher 2. Things like choices and some weapons/gold carry over.
Enkidu88 said: "Also, more on topic, what the heck is on Geralt's back in the second picture? His sword is there, but there's some kind of weird wooden apparatus as well. Almost looks like a tripod."

I think it's a scabbard for the sword he's got in his hand. Remember, he has his silver sword for monsters and steel for everything else.