Oddworld Creator: Xbox One Already "Getting Comparable" to PS4

144_v1legacy

New member
Apr 25, 2008
648
0
0
Neronium said:
144 said:
Neronium said:
I still think it's a bit early to be making that assumption, as both consoles are not even a year old yet and we don't start getting the games that truly look their best until about a year or two in. For right now, I will say that the games look pretty much identical, but I still say give it at least a year or two before we can see the real differences.

Hell look at how it was last gen:




Why do those four particular games get to represent? To make this sort of point requires a large number of examples. Larger than 1. Not to mention, the Xbox games you've shown are of a radically different artistic style to those from the Playstation. The comparison lacks clarity.
I wanted to choose games from the same series on their console that got sequels on those consoles within a 2-3 year span. Both of the first games were launch titles on their system, and the second image of each is the later sequel. You can notice that there is more detail in the second image than in the first. I wasn't comparing the two systems to each other; the point of my post is that both systems have not been out for a year yet, and we'll see more detail in games later on than at their launch. Plus, after looking at the launch list for the 360, none of them were really cartoony, and if they were, they didn't get a sequel or I didn't see them.

As for more examples, yes I could have posted more, but those two different series were what I was searching for at first.
Fair enough, and your explanation is valid. But the examples required the explanation - otherwise, we get four pictures with dubious implications, and online misinformation is easy to procure. A screenshot of Mario Galaxy could fool someone into thinking that the Wii has power, when in fact it was careful creation on the part of the developers. Now that I hear your reasoning, I can see it in the examples, but a range of screenshots varying in artistic styles and developer budgets would be far stronger. Perhaps it bothers me more than most, but especially in light of recent console warring, I see too many examples of too few examples.
 

Roxas1359

Burn, Burn it All!
Aug 8, 2009
33,758
1
0
144 said:
Fair enough, and your explanation is valid. But the examples required the explanation - otherwise, we get four pictures with dubious implications, and online misinformation is easy to procure. A screenshot of Mario Galaxy could fool someone into thinking that the Wii has power, when in fact it was careful creation on the part of the developers. Now that I hear your reasoning, I can see it in the examples, but a range of screenshots varying in artistic styles and developer budgets would be far stronger. Perhaps it bothers me more than most, but especially in light of recent console warring, I see too many examples of too few examples.
It was honestly my fault, because as you said I should have explained it properly instead of just using the images. Really, for the best results I should have recorded the footage myself and then spliced together a video like I usually do, but I was feeling particularly lazy... well, that and my editor is currently busy with Kingdom Hearts Re: Chain of Memories. XD

But in the end, you can give first, second, and third party developers all the power they want, but if they don't know how to use it properly then there is no point. First party developers usually get it down, but there are times when it falls flat (did you know that not a single Nintendo EAD game on the GameCube actually has 16:9 support?), though generally they learn how to max out a console (Naughty Dog usually proves this as well). Second party is pretty hit or miss, but usually turns out well, and with repeated development they tend to learn how to utilize a console properly (Insomniac is actually second party, as is HAL Labs). Third party is the same as second party in whether they use the console fully or not, and it varies depending on the developer. SEGA, whether you like their games or not, has consistently proved to me that they know how to push a system's limits: on the Wii they did Sonic Colors, and Starlight Carnival really pushed the system. Square Enix actually pushed the PS2 to its limits with FF XII, as the game has very short load times and all enemies on screen, and it really taxes a PS2 when you look at it internally. Rockstar also pushes a system's limits with the GTA franchise, along with L.A. Noire technically, so they know how to use power properly.
 

Grimh

New member
Feb 11, 2009
673
0
0
Yeah it's true that once you get a better understanding of the machine you can probably squeeze more out of the thing.
But isn't that also the case for PS4?

So doesn't it become a situation where you crank the XBone up to be more comparable, while you crank up the PS4 to be, well, just more?
 

VanQ

Casual Plebeian
Oct 23, 2009
2,729
0
0
FalloutJack said:
VanQ said:
Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevant in the now, which in this case it is.
I see the joke made in every Oddworld thread ever here on the Escapist. I dare you to go back through any Oddworld thread that was in the News Room and find me a single one that doesn't have at least one person make that joke.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
VanQ said:
FalloutJack said:
VanQ said:
Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevant in the now, which in this case it is.
I see the joke made in every Oddworld thread ever here on the Escapist. I dare you to go back through any Oddworld thread that was in the News Room and find me a single one that doesn't have at least one person make that joke.
And I'm probably the culprit at least a couple times. Can't possibly be frequent enough. Oddworld doesn't get in the news that often. But I'm busy right now, so if you have a hankering, you go find 'em.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Neronium said:
I still think it's a bit early to be making that assumption, as both consoles are not even a year old yet and we don't start getting the games that truly look their best until about a year or two in. For right now, I will say that the games look pretty much identical, but I still say give it at least a year or two before we can see the real differences.
You are making the incorrect assumption that this console cycle will work like the last one. The reason previous consoles had a warm-up period is that they were unique hardware, in the PS3's case never seen before, which needed developers to re-learn how to program for it. Thus games got better with time as their skill improved.
This is not true of the current-gen consoles. They use standard x86 architecture, you know, the one that has been used in PCs for decades now, so any developer who knows how to program for PC already knows how to program for these consoles. The Xbone even supports regular DirectX, so the game runs pretty much identically.
The reason new games look like shit and run badly is not that developers didn't have time to get used to the hardware; it's just that the hardware is crap (seriously, a console with this much power should have been released in 2010, not 2013).

happy_turtle said:
Cool, this can only be a good thing for all gamers, I own a PS4 but would hate for it to have a monopoly as Sony would then stop giving a crap. More competition can only be a good thing...

Now if only Microsoft would hire someone to follow its employees around and stop them from saying stupid shit on Twitter.
Sorry, when did the gaming market shrink to Xbox and PS4? Last I looked we also had the Wii, PCs, handhelds, tablets and phones for gaming. The market is too diverse for a monopoly to exist.

The Lunatic said:
This generation is more "Slightly modified PC parts".
You got the gist of it, but I'd like to add that the processing unit they are using isn't even "modified PC parts"; they are releasing cell phones with it. It's a phone processor.

faefrost said:
The big thing is the memory size and speed. XBone's DDR3 vs the PS4's DDR5. pretty much any path of optimization you use to squeeze more out of the XBox One will also work on the PS4. So the gap will remain.
DDR5 what? DDR5 does not exist. What they use is VDDR5, which is different. It has higher speed, but lower bandwidth. It is good for holding graphics, terrible for processing. There is a reason all previous systems had both regular RAM and video RAM.

Parshooter said:
Just want to harp on it again: $70 dollar games and the $50 extra means in Canada the XBone is better than the PS4 now.
So a console is faster because you pay less?

AzrealMaximillion said:
Even with modified PC parts, game engines on consoles and PC leave a large amount of room to make amazing leaps in fidelity given enough time.
Any leap made on the new consoles will be applied to PCs as well. And the game engines will run on the same architecture as PCs, so there isn't any "room" left.
 

loc978

New member
Sep 18, 2010
4,900
0
0
Strazdas said:
The Lunatic said:
This generation is more "Slightly modified PC parts".
You got the gist of it, but I'd like to add that the processing unit they are using isn't even "modified PC parts"; they are releasing cell phones with it. It's a phone processor.

faefrost said:
The big thing is the memory size and speed. XBone's DDR3 vs the PS4's DDR5. pretty much any path of optimization you use to squeeze more out of the XBox One will also work on the PS4. So the gap will remain.
DDR5 what? DDR5 does not exist. What they use is VDDR5, which is different. It has higher speed, but lower bandwidth. It is good for holding graphics, terrible for processing. There is a reason all previous systems had both regular RAM and video RAM.
Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad core CPUs and a double-sized GPU to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+GHz range, as opposed to 2-3GHz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds depend on the memory clock as well as the CAS count... they both work out to be in the 10-12ns range. The advantage the XBone's (2133MHz, 68GB/s) DDR3 holds over the PS4's (effectively ~5500MHz, 172GB/s) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).
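If anyone wants to sanity-check that latency claim, here's a rough back-of-the-envelope in Python (just a sketch; the CAS values and exact clocks are ballpark assumptions on my part, not official specs):

```python
# Rough CAS latency maths: absolute latency = CAS cycles / command clock.
# DDR3 moves 2 transfers per command clock, GDDR5 moves 4, so the command
# clock is the quoted transfer rate divided by 2 or 4 respectively.
# CAS and clock figures below are ballpark guesses, not official specs.

def cas_latency_ns(cas_cycles, transfer_rate_mts, transfers_per_clock):
    command_clock_mhz = transfer_rate_mts / transfers_per_clock
    return cas_cycles / command_clock_mhz * 1000.0  # convert us -> ns

print(cas_latency_ns(10, 2133, 2))  # XBone-style DDR3-2133 at CL10   -> ~9.4 ns
print(cas_latency_ns(15, 5500, 4))  # PS4-style GDDR5 ~5500 MT/s, CL15 -> ~10.9 ns
```

However you fiddle with the exact CAS numbers, both land in roughly the same 9-12ns ballpark, which is why the DDR3 "latency advantage" is basically noise next to the bandwidth gap.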
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
loc978 said:
Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad core CPUs and a double-sized GPU to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+GHz range, as opposed to 2-3GHz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds depend on the memory clock as well as the CAS count... they both work out to be in the 10-12ns range. The advantage the XBone's (2133MHz, 68GB/s) DDR3 holds over the PS4's (effectively ~5500MHz, 172GB/s) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).
Well, they are releasing cell phones with these APUs [http://www.techradar.com/news/computing-components/processors/amd-launches-new-mobile-chips-will-they-be-in-your-next-laptop-or-tablet--1153617]
All are based on the same Jaguar cores as the AMD-powered next-gen consoles as well as the new Graphics Core Next (GCN) GPU architecture.
They may be in netbooks, sure, I never denied that.

That may be two of them strapped together, but that hardly makes it twice the power. Processors don't work like that, as I'm sure you know. Besides, they are underpowered, and Microsoft already overclocked theirs.

Yes, GDDR5, not sure why my brain farted about that one. Thanks for expanding on it.

If the console were built properly it would have both GDDR5 and DDR3 and use both, making it run much faster.

You are wrong about being jealous. Well, you can be jealous, but you seem to be jealous for the wrong reasons. It's not a high-end video card; a 160-dollar 750 Ti beats the console in graphical power. You can build a 500-dollar PC that will be faster than an Xbox One or PS4. Consoles have a reason for their popularity, but power is certainly not one of them.
 

loc978

New member
Sep 18, 2010
4,900
0
0
Strazdas said:
loc978 said:
Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad core CPUs and a double-sized GPU to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+GHz range, as opposed to 2-3GHz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds depend on the memory clock as well as the CAS count... they both work out to be in the 10-12ns range. The advantage the XBone's (2133MHz, 68GB/s) DDR3 holds over the PS4's (effectively ~5500MHz, 172GB/s) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).
Well, they are releasing cell phones with these APUs [http://www.techradar.com/news/computing-components/processors/amd-launches-new-mobile-chips-will-they-be-in-your-next-laptop-or-tablet--1153617]
All are based on the same Jaguar cores as the AMD-powered next-gen consoles as well as the new Graphics Core Next (GCN) GPU architecture.
They may be in netbooks, sure, I never denied that.

That may be two of them strapped together, but that hardly makes it twice the power. Processors don't work like that, as I'm sure you know. Besides, they are underpowered, and Microsoft already overclocked theirs.

Yes, GDDR5, not sure why my brain farted about that one. Thanks for expanding on it.

If the console were built properly it would have both GDDR5 and DDR3 and use both, making it run much faster.

You are wrong about being jealous. Well, you can be jealous, but you seem to be jealous for the wrong reasons. It's not a high-end video card; a 160-dollar 750 Ti beats the console in graphical power. You can build a 500-dollar PC that will be faster than an Xbox One or PS4. Consoles have a reason for their popularity, but power is certainly not one of them.
Actually.. twice the cores at the same speed is twice the power... if the software running on it is designed to utilize all of the processing threads available to it. We've had game engines capable of utilizing up to 17 threads since back in 2008, yet most games are still developed only able to utilize 2, occasionally 4... some, brand new games, mind you, can only utilize 1. That's the primary reason my 4Ghz quad-core outperforms the PS4's 2Ghz Octo-core... and it's also the reason that those new consoles are going to see some optimization that PCs by and large won't be able to utilize. We may see the old FX Visheras outperforming everything short of a hyperthreaded i7 as a result.
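To put numbers on that "only if the software uses the threads" caveat, here's a toy model (purely my own simplification: it assumes perfect scaling and ignores per-core IPC differences):

```python
# Toy throughput model: usable performance ~ utilized cores x clock.
# Assumes perfect scaling across threads and identical per-core IPC,
# neither of which is true in practice; it only shows the shape of the argument.

def toy_throughput(cores, clock_ghz, threads_used):
    return min(cores, threads_used) * clock_ghz

print(toy_throughput(4, 4.0, threads_used=4))  # 16.0 - a 4 GHz quad, fully used
print(toy_throughput(8, 2.0, threads_used=4))  # 8.0  - a 2 GHz octo, game only uses 4 threads
print(toy_throughput(8, 2.0, threads_used=8))  # 16.0 - the same chip once the engine scales to 8
```

Which is the whole point: the octo-core only catches up once engines actually spread the work across all eight threads.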

...and the only appreciable advantages DDR3 has over GDDR5 are availability, socket compatibility and price. As I went over before, the latency pans out the same, and GDDR5 gives you roughly twice the bandwidth. I never said I was jealous of their on-chip GPU's processing power, just the bandwidth of the memory in the PS4. I'm aware that $400 worth of new parts I slapped into my old case a few months ago (an A10-7850k and R7-250 with accompanying RAM and mobo) outruns the PS4 by a fair margin... I just wish it was running the 8GB of GDDR5 the PS4 has (hard to find an affordable video card with half of that... the 750ti typically comes with 2GB of the same stuff. My video card has 1, but I got it for under $100). It could be running circles around the thing.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
loc978 said:
Actually.. twice the cores at the same speed is twice the power... if the software running on it is designed to utilize all of the processing threads available to it. We've had game engines capable of utilizing up to 17 threads since back in 2008, yet most games are still developed only able to utilize 2, occasionally 4... some, brand new games, mind you, can only utilize 1. That's the primary reason my 4Ghz quad-core outperforms the PS4's 2Ghz Octo-core... and it's also the reason that those new consoles are going to see some optimization that PCs by and large won't be able to utilize. We may see the old FX Visheras outperforming everything short of a hyperthreaded i7 as a result.

...and the only appreciable advantages DDR3 has over GDDR5 are availability, socket compatibility and price. As I went over before, the latency pans out the same, and GDDR5 gives you roughly twice the bandwidth. I never said I was jealous of their on-chip GPU's processing power, just the bandwidth of the memory in the PS4. I'm aware that $400 worth of new parts I slapped into my old case a few months ago (an A10-7850k and R7-250 with accompanying RAM and mobo) outruns the PS4 by a fair margin... I just wish it was running the 8GB of GDDR5 the PS4 has. It could be running circles around the thing.
Meanwhile, hardly any software is, except programs specifically designed to hog your whole processor for stuff like video rendering.
We have modern games coming out that struggle to use a second core, let alone 16. They would have to reprogram the whole thing to use that APU, and why would they when a single i7 core beats all 16 of those underpowered Jaguar cores? It's just a very stupid choice for a processor, especially when you can put the multitasking on separate cores since you only need to run a single demanding game.

And I seriously doubt you will see anything outperforming i7s soon. They may not look great on the surface, but 3 GHz in an i7 is much more than 3 GHz in an Athlon. It's the reason you need 8 AMD cores to do the same task that 4 Intel cores can handle. Except the problem here is that game developers do not use those 8 cores, and as a result you get 4 cores sitting idle while the game is bottlenecked. That architectural gap is the reason AMD processors never really got on par with Intel's since the Core i generation, and as a result AMD almost went bankrupt, but then the mining craze started and their GPUs saved the day.
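The "3 GHz isn't 3 GHz" point is easier to see if you bolt an instructions-per-clock factor onto a toy cores-times-clock model (the IPC ratios here are made-up illustrative numbers, not benchmark results):

```python
# Toy model with an instructions-per-clock factor bolted on.
# The IPC values are invented for illustration only ("fast core" vs "slow core"),
# not measured figures from any benchmark.

def toy_throughput(cores_used, clock_ghz, ipc):
    return cores_used * clock_ghz * ipc

print(toy_throughput(4, 3.0, ipc=2.0))  # 24.0 - four fast 3 GHz cores
print(toy_throughput(8, 3.0, ipc=1.0))  # 24.0 - eight slow 3 GHz cores, all kept busy
print(toy_throughput(4, 3.0, ipc=1.0))  # 12.0 - the same slow cores when the game only uses 4
```

So the raw clock figure tells you very little; what matters is the combination of IPC and how many cores the game actually keeps busy.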

Yes, I can see being jealous of 8 GB of GDDR5; looks like I misunderstood your point there.

DDR3 has its uses, but DDR4 is around the corner and who knows what that will bring (well, we know what it will bring theoretically, but how it will work in practice, especially for early adopters, is another thing).
 

loc978

New member
Sep 18, 2010
4,900
0
0
Strazdas said:
loc978 said:
Actually.. twice the cores at the same speed is twice the power... if the software running on it is designed to utilize all of the processing threads available to it. We've had game engines capable of utilizing up to 17 threads since back in 2008, yet most games are still developed only able to utilize 2, occasionally 4... some, brand new games, mind you, can only utilize 1. That's the primary reason my 4Ghz quad-core outperforms the PS4's 2Ghz Octo-core... and it's also the reason that those new consoles are going to see some optimization that PCs by and large won't be able to utilize. We may see the old FX Visheras outperforming everything short of a hyperthreaded i7 as a result.

...and the only appreciable advantages DDR3 has over GDDR5 are availability, socket compatibility and price. As I went over before, the latency pans out the same, and GDDR5 gives you roughly twice the bandwidth. I never said I was jealous of their on-chip GPU's processing power, just the bandwidth of the memory in the PS4. I'm aware that $400 worth of new parts I slapped into my old case a few months ago (an A10-7850k and R7-250 with accompanying RAM and mobo) outruns the PS4 by a fair margin... I just wish it was running the 8GB of GDDR5 the PS4 has. It could be running circles around the thing.
Meanwhile, hardly any software is, except programs specifically designed to hog your whole processor for stuff like video rendering.
We have modern games coming out that struggle to use a second core, let alone 16. They would have to reprogram the whole thing to use that APU, and why would they when a single i7 core beats all 16 of those underpowered Jaguar cores? It's just a very stupid choice for a processor, especially when you can put the multitasking on separate cores since you only need to run a single demanding game.

And I seriously doubt you will see anything outperforming i7s soon. They may not look great on the surface, but 3 GHz in an i7 is much more than 3 GHz in an Athlon. It's the reason you need 8 AMD cores to do the same task that 4 Intel cores can handle. Except the problem here is that game developers do not use those 8 cores, and as a result you get 4 cores sitting idle while the game is bottlenecked. That architectural gap is the reason AMD processors never really got on par with Intel's since the Core i generation, and as a result AMD almost went bankrupt, but then the mining craze started and their GPUs saved the day.

Yes, I can see being jealous of 8 GB of GDDR5; looks like I misunderstood your point there.

DDR3 has its uses, but DDR4 is around the corner and who knows what that will bring (well, we know what it will bring theoretically, but how it will work in practice, especially for early adopters, is another thing).
I feel like we're having two different conversations here... I thought it was pretty clear I knew all that, it's the why of it that I was trying to discuss... and the why of it is pretty silly. We've been at the point where advancements in hardware need to go the route of "more cores, more memory channels" for several years now, unless we find a replacement for silicon with better heat tolerance... yet very few software designers are building for that future.

I'm pretty sure these new octo-core consoles are going to represent a paradigm shift in the way games are optimized... which means more cores are going to start being more relevant than single-thread support across a multi-core CPU, which is why, as you said, Intel's 3ghz is faster than AMD's 3ghz. With an application that can utilize 8 processing threads, a 2ghz octo-core is going to outrun a 3ghz quad-core, to say nothing of the old 4.4ghz octo-core vishera. The i7 extreme will still be king, with those 12 threads and insane cache... but the plain quad-cores? Not so much.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
loc978 said:
I feel like we're having two different conversations here... I thought it was pretty clear I knew all that, it's the why of it that I was trying to discuss... and the why of it is pretty silly. We've been at the point where advancements in hardware need to go the route of "more cores, more memory channels" for several years now, unless we find a replacement for silicon with better heat tolerance... yet very few software designers are building for that future.

I'm pretty sure these new octo-core consoles are going to represent a paradigm shift in the way games are optimized... which means more cores are going to start being more relevant than single-thread support across a multi-core CPU, which is why, as you said, Intel's 3ghz is faster than AMD's 3ghz. With an application that can utilize 8 processing threads, a 2ghz octo-core is going to outrun a 3ghz quad-core, to say nothing of the old 4.4ghz octo-core vishera. The i7 extreme will still be king, with those 12 threads and insane cache... but the plain quad-cores? Not so much.
Well, while silicon remains silicon, we have already managed to invent processors that do twice the job of same-frequency processors from a decade ago, so it's not like the frequency wall is that deadly. While more cores and more memory channels is definitely the way things seem to be going, software will always be behind in that department, especially as long as a few cores can still do the same job, provided those cores aren't, ahem, underpowered mobile processors at low frequency.

Octo-core processing will be relevant. Weak 1.6 GHz octo-core processing will not. The programmers who need to go multi-core because a single core isn't enough will look at the powerful multi-cores, not the bottom of the barrel. Therefore console APUs will never be "the thing" for maximum performance. The movement towards multi-cores will be looking at the high-end cores, not the low-end ones, and as long as 2 Intel cores can do the job there is no reason to spend extra money on programming time to make a game run on 16 console cores.
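Amdahl's law is the tidy way to put that: if part of the frame is inherently serial, piling on weak cores stops paying off very quickly. A quick sketch, using a made-up 60/40 parallel/serial split purely for illustration:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# the work that can actually be parallelised and n is the core count.
# The 60% parallel figure below is an arbitrary example, not a measurement.

def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
# 2 -> 1.43, 4 -> 1.82, 8 -> 2.11, 16 -> 2.29
```

Going from 8 weak cores to 16 buys you almost nothing unless the engine is overwhelmingly parallel, which is exactly why developers reach for a couple of fast cores first.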

It's a long stretch, but I think we may even see CPU development slow down significantly as more and more processing is done via GPUs. And they haven't hit the frequency wall yet. Of course they can't replace CPUs, but GPUs could become far more important, to the point that any old single-core CPU is enough if your GPU is good. We already see some of that in games, and it's the GPUs that are mining bitcoins, not the CPUs.