Mark Rein: Intel Held Back Innovation On the PC

Steven Bogos

The Taco Man
Jan 17, 2013
9,354
0
0
Epic Games vice president Mark Rein claims that he tried for years to convince Intel to fix its graphics.

Unless you are playing Crytek's latest tech-demo-disguised-as-a-game, chances are you aren't stressing your PC to the absolute maximum. There are very few PC games these days that force gamers to upgrade their graphics cards, whereas just a few years ago it was quite common for games to really push the boundaries of PC graphics technology. I rode out an Nvidia 8800 GT for something ridiculous like five years before seeing the need to upgrade. Epic Games vice president Mark Rein says that there is more at work here than just low-system-requirement indie games, blaming Intel for directly holding back innovation on the PC.

"For years we tried to convince Intel to fix their graphics," claims Rein via Twitter [https://twitter.com/MarkRein/statuses/335584521485426689], "but their data said they were good enough. PC innovation suffered for it." Rein didn't clarify the exact timeframe these attempts took place in, but later replied to a user that it was a time when "Intel still owned the lions' share of the graphics market with integrated. That's why their data said it was good enough."

This is a very interesting perspective for anyone who knows anything about PC building. Intel dominates the install base thanks to integrated graphics, and for everyday desktop activities like web browsing and video playback its hardware is more than adequate for the majority of users. Simply put, even though graphics giants Nvidia and AMD keep pushing each other with more and more powerful cards, it counts for little while the majority of users are sitting on Intel's integrated chips, which means PC developers have to scale games back to the "lowest common denominator."
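
To make the "lowest common denominator" point concrete, here's a minimal, purely illustrative sketch of how a game might map detected hardware to a quality preset. The tiers and presets are invented for the example, not anything Epic or Intel actually ships; real engines query the driver for far finer detail.

    #include <iostream>
    #include <string>

    // Hypothetical GPU tiers a game might detect at startup (invented for
    // illustration only).
    enum class GpuTier { IntegratedIntel, IntegratedAmd, DedicatedMid, DedicatedHigh };

    // Map the detected tier to a quality preset. Because integrated Intel
    // chips dominate the install base, they define what "Low" has to run,
    // and every asset and effect budget scales up from that floor.
    std::string pickPreset(GpuTier tier) {
        switch (tier) {
            case GpuTier::IntegratedIntel: return "Low";    // the common denominator
            case GpuTier::IntegratedAmd:   return "Medium";
            case GpuTier::DedicatedMid:    return "High";
            case GpuTier::DedicatedHigh:   return "Ultra";
        }
        return "Low"; // defensive fallback
    }

    int main() {
        std::cout << pickPreset(GpuTier::IntegratedIntel) << '\n'; // prints "Low"
    }

The floor, not the ceiling, decides what developers can rely on everyone having.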

These days, Intel has picked up the slack a bit, doubling and tripling the power of its integrated GPUs, but who knows what kind of super crazy realistic graphics we could be looking at if it had gotten its act together back in the day?

Source: Twitter [https://twitter.com/MarkRein/statuses/335584521485426689]

 

The White Hunter

Basment Abomination
Oct 19, 2011
3,888
0
0
Most people who care won't be playing games on integrated graphics though; they'll seek out a dedicated chip or use a card in their desktop.

If you must use integrated graphics, why would you use Intel and not AMD? The difference in performance is night and day, and the recent APUs give really solid performance for the budget...
 

9thRequiem

New member
Sep 21, 2010
447
0
0
"PC innovation suffered for it"
Did it, Mark? Did it? How, exactly? Innovation isn't graphics.

Personally, I'm glad for this. There was a time when it felt like games forced you to upgrade your graphics card ridiculously often, but I've had the same graphics card for much longer and it's still very high end, letting me stick everything on max settings without worrying or paying out large sums.

I think it's better for developers to work under some level of constraint, forcing some engine optimization rather than just upping minimum requirements.

I also think that the biggest thing holding graphics back is actually that most PC games with high-end graphics are multi-platform with consoles.
 

EvolutionKills

New member
Jul 20, 2008
197
0
0
But what is so great about pushing so fast anyway? I had an 8800GT from when it first came out and only replaced it about 6 months ago. And even then, I felt it was not because the games were really pushing the envelope, but because developers have gotten lazy about optimizing their games. Every CoD game has run at 60FPS on the same console hardware with practically the same engine. My rig ran CoD4 and World at War beautifully, and it was all downhill from there. The initial release of Black Ops was almost unplayable until it was patched a half dozen times to fix CPU optimization problems. The same engine on the same hardware: the console version is optimized to run flawlessly, and my PC just gets a messy pile of unoptimized code. I stopped caring after Black Ops, haven't touched the series since...
 

SweetWarmIce

New member
Jun 1, 2009
108
0
0
Polygons are emotions, sorry that slipped out.

While it might be a shame that we could have had a lot better graphics by now, it does mean people get more out of their hardware before having to upgrade.
 

Doom972

New member
Dec 25, 2008
2,312
0
0
There are enough successful games out there that only work on powerful dedicated cards. Sounds like a weak excuse.

If it's because he just wants to attract the widest audience possible, he can quit Epic and go work for Rovio and design the next Angry Birds. I'm sure he won't be missed.

I don't see why non-gamers should have to pay for an expensive GPU that they will never use.

Also: Graphics != Innovation
 

sir neillios

New member
Dec 15, 2012
120
0
0
Why would Epic Games care about Intel integrated anyway? You can't play anything on those chips. No joke: I got an AMD laptop when I was sixteen, nearly five years ago, and it could run Mass Effect 2 at about 20 fps. I lent my copy to my cousin, who had an Intel laptop he'd just gotten that year (2010), and it was unplayable. I suppose it is a shame for people who may want to play games on their regular PCs (i.e. teenagers on their laptops) that the prebuilt PC space is dominated by Intel.
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
Coming from Epic Games... the guys who haven't done anything of note on PC since 2004.

But anyway, anyone who wanted to play higher-profile games knew damn well the cheap Intel shit would not do it, so Intel really had no influence on this. That's also why they were the prime graphics producer: most PCs aren't used for games at all.
 
Apr 28, 2008
14,634
0
0
9thRequiem said:
"PC innovation suffered for it"
Did it, Mark? Did it? How, exactly? Innovation isn't graphics.

Personally, I'm glad for this. There was a time when it felt like games forced you to upgrade your graphics card ridiculously often, but I've had the same graphics card for much longer and it's still very high end, letting me stick everything on max settings without worrying or paying out large sums.

I think it's better for developers to work under some level of constraint, forcing some engine optimization rather than just upping minimum requirements.

I also think that the biggest thing holding graphics back is actually that most PC games with high-end graphics are multi-platform with consoles.
What he said. I also don't mind games not advancing in graphics and not requiring an upgrade every damn year. Whether it's because of Intel or consoles, it's a GOOD thing. It lets more people jump into PC gaming. Again, that's GOOD.

Graphics =/= innovation, and I freaking wish more people got that through their heads.
 

Lawyer105

New member
Apr 15, 2009
599
0
0
We could have better graphics?! Srs?!

We've all seen the result of games pushing the graphics envelope (with the possible exception of Crysis)... by and large they end up being drastically over-budgeted, resulting in massive minimum-sale targets, shallower story/design/mechanics, etc., and often a boatload of bugs because they ran out of money before they had time to polish.

I, for one, am grateful that some studios are starting to realise that there's more to making awesome games than photorealistic graphics!
 

TheComfyChair

New member
Sep 17, 2010
240
0
0
Pretty sure every single person who bought Battlefield 3 on PC, for example, would have had non-integrated graphics. Intel didn't hold back innovation with underpowered integrated chips; it just reduced the potential market share of the PC platform. Did that hold back innovation in gaming? Well, yes, it will have. Now that PC is suddenly a big player, almost certainly the biggest overall (LoL alone dwarfs CoD on all 3 platforms combined in concurrent player count, for example), we're seeing tons of innovation from a thriving indie scene and even Kickstarted AAA titles.

Interestingly, even without most PC users being capable of playing AAA games, PC is still on par with the 360 in market share a lot of the time in financial reports. So, with Intel really gunning for good integrated graphics now, there are going to be a LOT of PCs out there fully capable of playing games. I'll give it 3 years before the upper echelons of integrated graphics (like AMD A8 APUs and Intel's highest-end variant) are on par with a PS4.

That didn't happen last time around - it's only recently (2 years or so) that integrated graphics beat the consoles - so the PS3 and 360 could grow their market share nicely. That shift will provide an interesting marketing problem for console manufacturers: how can they convince people to buy a PS4/Infinity if their supermarket-bought laptop - which they needed anyway for work etc. - is more powerful, has cheaper games, and can be played on the train?
 

Kinitawowi

New member
Nov 21, 2012
575
0
0
SkarKrow said:
Most people who care won't be playing games on integrated graphics though; they'll seek out a dedicated chip or use a card in their desktop.

If you must use integrated graphics, why would you use Intel and not AMD? The difference in performance is night and day, and the recent APUs give really solid performance for the budget...
All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean that they totally whomp Intel up to about the mid-range i3s. As soon as you get past that, though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU-power end of the market rather than the "just enough graphics to play Angry Birds" end.
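
(If you want that rule in code form: the Experience Index base score really is just the lowest subscore, as in this toy sketch with made-up numbers.)

    #include <algorithm>
    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        // Made-up subscores in the style of the Windows 7 Experience Index.
        std::map<std::string, double> subscores = {
            {"Processor", 7.5}, {"Memory", 7.6}, {"Graphics", 6.9},
            {"Gaming graphics", 6.9}, {"Primary hard disk", 5.9},
        };
        // The base score is simply the lowest subscore: the machine is
        // only as fast as its weakest component.
        double base = subscores.begin()->second;
        for (const auto& entry : subscores)
            base = std::min(base, entry.second);
        std::cout << "Base score: " << base << '\n'; // 5.9 - the hard drive
    }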

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.
 

Rellik San

New member
Feb 3, 2011
609
0
0
I believe the point is, though: think of it from a manufacturing standpoint. If Intel's integrated graphics had been kept at an accelerated pace, the Intel HD 3000 would be about equivalent to an Nvidia 8800GT, and considerably cheaper to manufacture to boot. Imagine being able to do mid-end gaming on a £400 laptop instead of a £1000 one; it would have revolutionised the market.

That, I believe, is the point: from a purely speculative standpoint, it's interesting to consider how things could have gone.

Not to mention, if they could have gotten those results cheaply enough, it would also have been good for the console market; good for the goose, good for the gander, etc.

Of course, all this is purely speculation on my part.

So why would the president of Epic Games care about this? Because it would mean that the installed user base of people able to game on PC without spending a fortune would have skyrocketed. Meaning a potentially much bigger market, and more filthy casuals turning to hardcore experiences.

Hypothetically, of course... and as someone who does long-haul travel a lot, being able to get in DX:HR at a decent clip and settings on my lappy would have been welcome.
 

Albino Boo

New member
Jun 14, 2010
4,667
0
0
Rellik San said:
I believe the point is, though: think of it from a manufacturing standpoint. If Intel's integrated graphics had been kept at an accelerated pace, the Intel HD 3000 would be about equivalent to an Nvidia 8800GT, and considerably cheaper to manufacture to boot. Imagine being able to do mid-end gaming on a £400 laptop instead of a £1000 one; it would have revolutionised the market.

That, I believe, is the point: from a purely speculative standpoint, it's interesting to consider how things could have gone.

Not to mention, if they could have gotten those results cheaply enough, it would also have been good for the console market; good for the goose, good for the gander, etc.

Of course, all this is purely speculation on my part.

So why would the president of Epic Games care about this? Because it would mean that the installed user base of people able to game on PC without spending a fortune would have skyrocketed. Meaning a potentially much bigger market, and more filthy casuals turning to hardcore experiences.

Hypothetically, of course... and as someone who does long-haul travel a lot, being able to get in DX:HR at a decent clip and settings on my lappy would have been welcome.
The bread and butter of PC sales is the business market. The cost of upping the integrated chipset would have been passed on to the consumer, but better graphics aren't a selling point to the majority of the market. If you are buying a laptop or desktop for a business, high fps in games isn't a consideration.
 

Exort

New member
Oct 11, 2010
647
0
0
What? By far the biggest performance improvement in Intel's chips over the last few generations has been the iGPU.
 

Gailim

New member
Oct 13, 2009
79
0
0
If they had pushed for better Intel graphics years ago, Nvidia and AMD would be exactly where they are now. They have been pushing each other hard; Intel having better integrated graphics wouldn't change that.

What better integrated graphics means, practically, is that every computer is a gaming computer. If you can play a game like TF2 at medium settings on integrated graphics, that can really boost PC gaming.
 

The White Hunter

Basment Abomination
Oct 19, 2011
3,888
0
0
Kinitawowi said:
SkarKrow said:
Most people who care won't be playing games on integrated graphics though; they'll seek out a dedicated chip or use a card in their desktop.

If you must use integrated graphics, why would you use Intel and not AMD? The difference in performance is night and day, and the recent APUs give really solid performance for the budget...
All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean that they totally whomp Intel up to about the mid-range i3s. As soon as you get past that, though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU-power end of the market rather than the "just enough graphics to play Angry Birds" end.

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.
An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10-5800K system for under £300 easily. I really can't recommend the i3 to anyone over even AMD's higher options: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clear cut even in the mid-range to just go with Intel. The FX chips are very competitively priced; you can get an 8350 for as much as £40 less than a 3570K, and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but the AMD tends to have the edge in multithreaded work, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming: it's cheaper, giving you a £40-£80 saving on the CPU and around £70 on a motherboard with the same feature set, and you could spend that on something much better for gaming: the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single-threaded loads in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.