Mark Rein: Intel Held Back Innovation On the PC

geizr

New member
Oct 9, 2008
850
0
0
I think we long ago passed the point where higher fidelity, higher texture resolution, and higher polygon counts really make any difference to the gaming experience. Nowadays, actual human art direction, as opposed to raw calculating prowess, seems to be what matters more. Also, as someone above pointed out, sometimes the need for higher-performing hardware isn't because the game engine is doing anything spectacular; it's because the developers were so butt-fucking lazy/incompetent about optimizing their algorithms and code. Come to think of it, relying on higher-end hardware to make great-looking imagery is also lazy, because you're just trying to calculate your way to pretty pictures rather than having any real artistic skill, aesthetics, or sensibilities (any trained monkey can make a highly detailed, pretty picture, but only an artist can make something that truly moves you).
 

Kinitawowi

New member
Nov 21, 2012
575
0
0
SkarKrow said:
An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10 5800K system for under £300 easily. I really can't recommend the i3 to anyone, though, over even AMD's higher options: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clean-cut even in the mid-range to just go with Intel; the FX chips are very well priced for the competition. You can have an 8350 for as much as £40 less than a 3570K, and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but the AMD tends to have the edge in multithreading, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming. It's cheaper, giving you a £40-£80 saving on the CPU and around £70 on a motherboard with the same feature set, and you could spend that on something much better for gaming: the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single-threaded work in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge, though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.
Personally I still do a lot with emulation, so raw clock speed on a single thread is more of a concern (last I checked, MAME still hadn't found a decent way to split work across multiple cores). But yeah, AMD have always been more interested in graphics processing performance than Intel have, so maybe it'll cope better with video processing and multicore in the long run, but... eh.

I'm planning on an LGA 1150 build sometime next month - my old Core 2 Duo build isn't quite cutting it any more (curse my new monitor!)...
 

The White Hunter

Basment Abomination
Oct 19, 2011
3,888
0
0
Kinitawowi said:
SkarKrow said:
An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10 5800K system for under £300 easily. I really can't recommend the i3 to anyone, though, over even AMD's higher options: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clean-cut even in the mid-range to just go with Intel; the FX chips are very well priced for the competition. You can have an 8350 for as much as £40 less than a 3570K, and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but the AMD tends to have the edge in multithreading, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming. It's cheaper, giving you a £40-£80 saving on the CPU and around £70 on a motherboard with the same feature set, and you could spend that on something much better for gaming: the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single-threaded work in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge, though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.
Personally I still do a lot with emulation, so raw clock speed on a single thread is more of a concern (last I checked, MAME still hadn't found a decent way to split work across multiple cores). But yeah, AMD have always been more interested in graphics processing performance than Intel have, so maybe it'll cope better with video processing and multicore in the long run, but... eh.

I'm planning on an LGA 1150 build sometime next month - my old Core 2 Duo build isn't quite cutting it any more (curse my new monitor!)...
Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that, though...
 

Kinitawowi

New member
Nov 21, 2012
575
0
0
SkarKrow said:
Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that, though...
My E6850 on an HD 6770 has served me great... right up until the moment I upgraded from a (nine-year-old and finally deceased) 17" to a 24" monitor, and in turn from 1280x1024 to 1920x1200. Then it started struggling a bit. ;-) It didn't help that it had to drop from 8GB of RAM to 4 because of dead sticks (and there's no real point buying replacement DDR2 for it).

i5-4670K and 7870 is my likely route, although I'll check some specifics nearer the time.
 

The White Hunter

Basment Abomination
Oct 19, 2011
3,888
0
0
Kinitawowi said:
SkarKrow said:
Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that, though...
My E6850 on an HD 6770 has served me great... right up until the moment I upgraded from a (nine-year-old and finally deceased) 17" to a 24" monitor, and in turn from 1280x1024 to 1920x1200. Then it started struggling a bit. ;-) It didn't help that it had to drop from 8GB of RAM to 4 because of dead sticks (and there's no real point buying replacement DDR2 for it).

i5-4670K and 7870 is my likely route, although I'll check some specifics nearer the time.
The 7870 Tahiti cards are utter monsters for the price; they're my default recommendation these days. I'll be considering Haswell chips when they hit, but I'm pretty wary of the platform cost: I can have a great AMD build for under £700, but the equivalent Ivy Bridge build is a good £850 or so.

If Haswell is just a 10-15% performance boost, I'll be ignoring it and waiting to see what Steamroller brings.
 

Crazie_Guy

New member
Mar 8, 2009
305
0
0
This is completely stupid. We aren't going to see super-realistic games by getting better integrated graphics; that's decided by the top-end cards, and when a dev has a vision for a game that won't scale down to integrated PCs, they ignore that demographic. Here's a fact: integrated graphics will NEVER be a match for a dedicated graphics card. If there are devs holding back their own games to accommodate integrated setups, those devs are the problem, not the state of the hardware, and there will always be devs who don't bother and are happy to push ahead with games that are for graphics cards only.
 
Sep 14, 2009
9,073
0
0
Kinitawowi said:
SkarKrow said:
Most people who care won't be playing games on intelgrated graphics, though; they'll seek out a dedicated chip or use a card in their desktop.

If you must use an integrated graphics system, why would you use Intel and not AMD? The difference in performance is day and night; the recent APUs give really solid performance for the budget...
All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean they totally whomp Intel up to about the mid-i3. As soon as you get past that, though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU power end of the market rather than the "just enough graphics to play Angry Birds" end.

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.
Haha, completely agree with this, especially the Windows Experience Index part; had to laugh at that XD
 

Monsterfurby

New member
Mar 7, 2008
871
0
0
9thRequiem said:
"PC innovation suffered for it"
Did it, Mark? Did it? How, exactly? Innovation isn't graphics.
Graphics may not be innovation - but graphics certainly are emotion. [http://www.escapistmagazine.com/videos/view/jimquisition/6945-Emotions-Polygons-and-Ellen-Page]

Also: Erm... Intel builds processors, right? Their onboard GPUs are basically only placeholders in case a system doesn't have a dedicated GPU, right? Now, show of hands: if you build or buy a PC for gaming purposes, how many of you would actually buy one without a dedicated GPU?

*chirp* *chirp*

Yes. I thought so.
 

unacomn

New member
Mar 3, 2008
974
0
0
Mark has been using this excuse for years and years. It's simple, we kill the Batman... no, wait, what I meant to say is: if someone won't dish out money for even the lowest-end dedicated video adapter (they're quite cheap), odds are they don't care enough about games that use advanced graphics to actually be interested in them in the first place.

What has killed innovation in the PC medium is game designers being constantly obsessed with the size of their penis instead of making innovative video games.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Intel does not make graphics cards. (Well, OK, they do have integrated chips, but that's not a graphics card; that's just the ability to see a desktop on a work laptop.) Even laptops use separate graphics cards most of the time.
It is true that Intel does own Nvidia, but I don't think that is what was meant.
 

MeChaNiZ3D

New member
Aug 30, 2011
3,104
0
0
I hope that guy realises he basically said "Graphics = innovation". Which is a remarkably stupid sentiment to have in regards to gaming.

And aside from that, PC gamers in my experience generally care enough to upgrade beyond the integrated graphics.
 

Lunar Templar

New member
Sep 20, 2009
8,225
0
0
I'm confused.

When did 'graphics = innovation' happen? Did I miss a memo, or are they letting the stupid people talk again?