The Witcher 3's PC System Requirements Revealed

mad825

New member
Mar 28, 2010
3,379
0
0
I'll say this once and I'll say it twice: when a PC game has a console counterpart, the PC version suffers. The specs are doctored.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Strazdas said:
While a Xeon no doubt can play games decently, it's important to remember that it is a server-rack CPU with a nonstandard socket, which means you have to tailor your entire tower around it, while an i7 will just fit in your regular run-of-the-mill mobo and other parts, no fuss.
That's no longer the case. New Xeon processors work on all new motherboards. A Haswell Xeon will work on the same motherboards that an i7 would work on. Check it out: the E3-1231 v3 is quite literally a cheaper i7-4770 with no iGPU. It's clocked 100MHz lower, which is irrelevant.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Charcharo said:
Jupiter Ex was the FEAR 2 engine I think.
SoM's textures, even on Ultra did not impress me that much. Just a decent looking game, nothing more.

Here is the Wolfenstein thing:
http://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review#.VK5R9SuUemU

"Utilizing the "qconsole.log" file Wolfenstein: The New Order outputs we have determined that the game recognizes our video cards support OpenGL 4.4. This is good news in that NVIDIA and AMD have OpenGL 4.4 support. However, further reading the log file we find that this game does not run in OpenGL 4.x mode, instead this game runs under OpenGL version 3.2. That means that this game is on the level of a DirectX 9 (DX9) feature set type game."


We go for the lowest supported in the actual game, ain't it obvious? If it's 800x600, then you make the minimum around 800x600 and list that resolution as the minimum. If the minimum is 1280x1024, same thing. I'm not proposing anything that hard here :p. Hell, you mostly agree with me anyway.
Textures alone do not make a game pretty. Often DefaultMaps do far more than textures in terms of what players notice.

Didn't know about that in Wolfenstein, odd.

There is no "lowest supported" though. You can set any custom resolution you want via the command line in almost any game.

Edit: I want to see this:



Adam Jensen said:
That's no longer the case. New Xeon processors work on all new motherboards. A Haswell Xeon will work on the same motherboards that an i7 would work on. Check it out: the E3-1231 v3 is quite literally a cheaper i7-4770 with no iGPU. It's clocked 100MHz lower, which is irrelevant.
You attributed the quote to the wrong person here.

Anyway, I wasn't aware that the new Xeons did that. In that case these are good options indeed.
 

newwiseman

New member
Aug 27, 2010
1,325
0
0
A 7870 minimum?! Must be a pretty game, but I guess that's for 1080p. Guess I'll find out if my 6850 will handle it if I keep it at 720p, since I play all games on my 720p 32" TV.
 
Dec 16, 2009
1,774
0
0
Charcharo said:
Mr Ink 5000 said:
Charcharo said:
Remember Kids, minimum and recommended specs are ALMOST CERTAINLY a bit exaggerated.
This might VERY well be such a case.


Yet people here take them as fact. Damn. Come on, escapists, don't disappoint me...
Well, I agree that they must be exaggerated; if the game is to be released on XBOne & PS4, the minimum requirement should at most be something slightly more powerful than what the consoles are packing.

The problem is, do you risk a purchase when you're just under minimum?

Shame benchmarks can't be downloaded instead of demos.
Thing is, I am almost never wrong on this.
Watch Dogs wanted 6 GB RAM. Ran on MEDIUM on 4 GB + ATI 5770.
Crysis 3 has 5770 as a minimum. 5770 achieves High.
Call of Duty is Call of Duty and Ghosts wanted 6 GB of RAM, yet it used AT MOST 2. AW is ugly, yet still wants 6 GB. At least it uses them... no one knows what for.

Even Shadow Of Mordor lies in its requirements. It does not need 6 GB VRAM for Ultra. It runs on 4GB RAM + 5770 too... Fairly well even, 60 fps Low 900p.


I agree though, I like benchmarks.
I believe you, but I wouldn't risk it until a deep sale.

I do feel like the minimum is artificially inflated to justify "next gen," as a lot of what I have seen of XBOne and PS4 looks like what I used to get out of my PC with 1GB of VRAM, the only difference being that I had to have AA turned off.
 

DarkhoIlow

New member
Dec 31, 2009
2,531
0
0
I just bought my 980 MSI recently.

My other specs are an i5 2500K @ 4.2GHz & 8GB RAM. I should be good for Ultra, right? Or should I just OC my CPU to 4.5?
 

AnthrSolidSnake

New member
Jun 2, 2011
824
0
0
Looks like I'll be getting more life out of my 780 Ti than I expected. I honestly expected it to be completely out of date within only a couple of years.
My AMD 8350, however? Looks like it's starting to become the standard. Which means that in order to stay ahead of the game I'll have to upgrade my entire motherboard, unless AMD puts out something better on the same socket type. *sigh* Might have to switch back to the more expensive Intel again.
 

Roofstone

New member
May 13, 2010
1,641
0
0
Thank god I am getting it for my playstation, I barely know what any of those words mean.

I can probably guess my way to what a graphic card does, however.
 

spartandude

New member
Nov 24, 2009
2,721
0
0
I nearly spat out my drink. While I meet the requirements (and a little bit more, but not much, maybe one fancy setting), that's insanely high. I mean, holy shit is that high. And I'm sorry, CD Projekt Red, I love you guys, but I'm sure you could compress that 40GB a little bit more.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Mr Ink 5000 said:
the only difference being that I had to have AA turned off
Where is the difference? Consoles either have no AA or have FXAA (which is not actual AA, but a way to blur your visuals so you won't notice the jaggies; I, and the majority of gamers, think FXAA is worse than no AA).
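To be clear about why I call FXAA a blur: it's a post-process pass over the finished frame that looks for luminance edges and blends pixels across them. A toy Python sketch of that idea (this is NOT the real FXAA shader, which does directional edge searches on color in GLSL/HLSL; the grid, threshold, and blend weights here are made up for illustration):

```python
def fxaa_like(img, threshold=0.25):
    """Toy FXAA-style pass on a grayscale image (list of rows):
    blend a pixel toward its neighbours only where local contrast is high."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            contrast = max(n) - min(n)
            if contrast > threshold:  # "edge" detected: blur toward neighbour average
                out[y][x] = 0.5 * img[y][x] + 0.5 * (sum(n) / 4)
    return out

# Hard vertical edge in a 4x4 image: the edge pixels get smeared
# (0.0 -> 0.125, 1.0 -> 0.875), flat areas are left untouched.
img = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
smoothed = fxaa_like(img)
```

Notice it never adds information like real MSAA does; it only smears what is already there, which is exactly the "blur your visuals" complaint.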

Charcharo said:
Lowest supported is the one in the game settings. Example would be 800x600 in Metro 2033. That is the minimum for it.

I know that games *SHOULD* support modding of resolution. Even though I am one of the biggest proponents of modding, I do not think a game has to have it for such basic things. For God's sake, due to a malfunction of my TV AND Monitor, I am using an old 1280x1024 monitor now. That is pretty low end :p I have a CRT somewhere as well...
Oh, I see what you mean. Yeah, that would make sense. But I guess a lot of people would then complain about not being able to run it while meeting minimum specs, because they didn't think about lowering the resolution, since lowering to a non-native resolution on modern screens is yucky.

Most games DO support modding of resolution. Using command-line stuff like -w 800 -h 600 or -width 800 -height 600 will do the trick for most games I've seen. Others give you a way to edit the settings file to put in the resolution you want, similar to how people modify field of view for multi-monitor gaming.
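The settings-file route is just a text edit. A small Python sketch of the idea (the file contents and the r_width/r_height key names are hypothetical; every game uses its own config file and keys, so check yours before editing):

```python
import re

def set_resolution(cfg_text, width, height):
    """Rewrite hypothetical r_width/r_height keys in a game's config text."""
    cfg_text = re.sub(r"(?m)^r_width\s*=.*$", f"r_width = {width}", cfg_text)
    cfg_text = re.sub(r"(?m)^r_height\s*=.*$", f"r_height = {height}", cfg_text)
    return cfg_text

# Example config with a made-up layout; other keys are left alone.
cfg = "r_width = 1920\nr_height = 1080\nvsync = 1\n"
new_cfg = set_resolution(cfg, 800, 600)
print(new_cfg)
```

Same trick people use for FOV: find the key, change the number, and the game happily runs at a resolution its menu never offered.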