Valve Boss: New Intel CPU Allows a "Console-Like Experience" on the PC

klasbo

New member
Nov 17, 2009
If it gives 100 FPS in Crysis 2, it's probably ok. Otherwise, meh.

AMD owns ATI, so I'd expect them to do this sort of thing first, not Intel. Considering how utterly crappy all Intel graphics chips have been so far, I'm expecting this to fail.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
GiantRaven said:
What an odd term to use. The only way I can imagine this providing a more console like experience is if it suddenly sprouts a controller to play with.
PC Gaming: Now with RROD.

Personally, I want a more PC experience from consoles. Minus the hardware issues, of course.
 

theriddlen

New member
Apr 6, 2010
Oh, Gabe, it looks like your fat is starting to take control over you. Lose weight, why don't you? Get yourself more bello in the jello, it's very outnumbered right now.
 

Covarr

PS Thanks
May 29, 2009
Copypasting what I posted about this elsewhere:
'The "console-like gaming experience" includes 12 year olds with foul mouths over online multiplayer voice chat, inferior controls leading to a significant disadvantage against mouse+keyboard players, less control over the audio/video settings of your game, and an overall less-fun experience (at least in first-person games).'

Of course, I'm sure this isn't what he meant, but when it comes to first-person games, I've yet to see one that's better on consoles than on PC, in any quantifiable way.

P.S. Thanks
 

CatmanStu

New member
Jul 22, 2008
I don't think the graphics card should be put out to pasture, as there are a lot of people who have paid good money to stay ahead of the curve (not me though). But I do feel that games should be made to run without a card while still conforming to a minimum graphical fidelity (my suggestion would be 720p). If the developers/publishers then outsourced to the modding community, you could have graphical upgrade patches designed for people whose cards can make the most of them.
 

Lacsapix

New member
Apr 16, 2010
Now THIS is why I prefer console gaming to PC gaming: that new CPU is going to cost more than a console itself.
 

Atmos Duality

New member
Mar 3, 2010
By cutting out the bus between the processor and the video processor (and hooking the graphics directly into the L2 cache), yeah, that actually would be a game changer, and one for the better.

It becomes more "Console like" because consoles have their video hooked more closely to the core processor (because that's exactly what they're made to do; play games).

The downside: we're creating another point of failure by eliminating the modularity of a discrete graphics card, though I suppose anyone who's been gaming on a laptop won't notice the difference.

All of those Crossfire/multi-GPU PCIe users would have a bit of a fit, though, if this became the new standard.
 

EHKOS

Madness to my Methods
Feb 28, 2010
Great, when do I get my cube that does everything so I don't have to fuck with tiny-ass components?

(My hands are too big for HDD installation :( )
 

Co3x

New member
Oct 11, 2010
This is interesting, actually; it may be the beginning of CPUs becoming more like GPUs.

I work for a company that develops software (nothing fun, and I'm not a developer), and quite a few of the guys keep going on about how awesome it would be if CPUs were more like GPUs, given the sheer power GPUs can produce by comparison. Though they do say it would be a ***** to program for, so maybe this slow integration might make some developers' wet dream come true?

Who knows, but I do agree it seems a step in the right direction towards replacing graphics cards. They have a long way to go yet, though, plus it takes away half the fun of hunting for a graphics card :(
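For what it's worth, the "hard to program for" bit is because GPUs are data-parallel: instead of one big sequential program, you write a tiny per-element "kernel" and the hardware runs thousands of copies of it at once. A rough sketch of the idea in plain Python (SAXPY is just the classic example operation, a*x + y; the names here are illustrative, not any real GPU API):

```python
# GPU-style "kernel": a tiny function that handles exactly one data
# element. On a real GPU, thousands of these run simultaneously,
# one hardware thread per element.
def saxpy_kernel(i, a, x, y, out):
    # The whole per-thread program: one a*x + y update for element i.
    out[i] = a * x[i] + y[i]

def saxpy(a, x, y):
    out = [0.0] * len(x)
    # On a CPU we fake the parallel launch with a loop; a GPU would
    # dispatch every i at once instead.
    for i in range(len(x)):
        saxpy_kernel(i, a, x, y, out)
    return out

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```

The painful part the devs mean is the restructuring: every element has to be computable independently, with no shared state between "threads", which is a very different way of thinking from ordinary CPU code.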
 

Roboto

New member
Nov 18, 2009
I can only think of the heat issues. If you want something to perform as a powerful GPU and CPU at once, the heat will be extreme. Think of why graphics cards are huge now: it isn't just the many caps, the GPU and the RAM, it's the monster of a cooling device on top. And that's only at 600MHz-900MHz! 1.3GHz? Yikes. Can you say standardized liquid cooling? If not, then the thermals on the chip will have to be kept low, meaning lower output. This is the reason they can't run it at higher resolutions, I assume.
 

RubyT

New member
Sep 3, 2009
klasbo said:
AMD owns ATI, so I'd expect them to do this sort of thing first, not Intel.
You shouldn't. AMD owns ATI. Logic dictates that they have NO incentive to produce well-performing IGPs: they want to sell you discrete graphics cards.
They had to make integrated graphics fast enough to squeeze nVidia out of the chipset game and it seems they've succeeded. That's why the 8xx series used the same lame graphics core as the 780G series that came TWO YEARS before. No more competition.

Intel, on the other hand, have just scrapped their ambitions to create a discrete GPU and thus have no problem pouring whatever GPU know-how they have into their integrated solutions. For them, it's about platform performance. Their CPUs have been dominating since 2006, and now they have the fastest integrated graphics too. Not that it really matters, because you're still stuck with an IGP that can't do modern video games at anything but ridiculously low resolutions with no eye candy.

Don't know what Gabe means by "console experience"?
I think he actually meant to say "Apple Experience". Get everything from one vendor. Want more performance? Sorry, maybe next year. Oh yeah, you have to buy a new CPU too, 'cause we bundle them now.
Intel is pretty insolent about it. They coupled CPU and GPU, so even if you need just one of them, you'll buy both of them. And they've recently changed sockets faster than I do socks. They should start to adopt quirky RAM configurations again (like back with RAMBUS), so that you really have to buy EVERYTHING new.

What I really understand by "console experience" is that Portal 2 will make no effort to push graphics boundaries. Which is no shock at all, since it has to run on the X360 too and that thing is technology from 2005. You know, the year Facebook acquired its domain and Escapist went online...
 

Callate

New member
Dec 5, 2008
I can certainly see the benefit of knowing that every computer past a certain generation will operate to a certain minimal set of specifications. The graphics card really is the bottleneck in modern PC gaming, and the wildcard that makes it difficult for game designers to know what to program towards.

If the new technology prevented us from having to put up with new versions of DirectX every couple of years that offer minimal chrome and multiple opportunities to cast everyone whose computer is more than two years old out into the cold, that would definitely be a good thing.

But "Console-Like"? That seems like a comment aimed towards... luring out console gamers who are hoping PC gamers will explode. ;)
 

mad825

New member
Mar 28, 2010
Eh, right now leave this sort of stuff for laptops, notebooks and mobile phones. This sort of technology is not for dedicated gaming at the moment.

And Gabe, if you so much as dare to advertise this shit in-game I will...
 

DojiStar

New member
Apr 24, 2009
These comments are way too understanding, thoughtful, and non-contentious. I expect more from forums.

I'll try to do better:

Well, I've been getting a pretty "console-like experience" from the fact that all games now are crappy console ports for the brain-dead who can't type or figure out how to use a keyboard and mouse -- sometimes even simultaneously. The PC has been consolized enough, thank you very much. And what's next, making for a painless "toaster-like experience on the console" by reducing button count or cutting functionality even further? Keep PCs complicated for elitists, please. Actually, is a gaming rig any more costly or complex than a console plus an HD home theater system? Pro-console people always seem to omit that bit. Maybe if we stopped pirating, people would even make games for us PC gamers.
 

risenbone

New member
Sep 3, 2010
Well, it's simple really, though not from the end user's point of view so much as from a dev's point of view.

The devs now have a fairly solid new baseline to start their graphical requirements from: every computer with this chipset has this level of graphical capability. From there, you can add code to make the graphics work with the various GPUs for better visuals. As the chipset becomes more widespread, this translates into a wider base for sales. It won't kill the separate GPU market, but it will make PC software development more viable.