Assassin's Creed Unity Runs at 900p Resolution to "Avoid All the Debates" - Updated

DTWolfwood

Better than Vash!
Oct 20, 2009
3,716
0
0
I wasn't excited for this game to begin with, now this? i'll just skip this one. Maybe something interesting will come out next year. I'm just tired of European setting.
 

Ishnuvalok

New member
Jul 14, 2008
266
0
0
It's amusing that people believe that bottlenecking your CPU doesn't affect your GPU performance. The PS4 and Xbox One have, literally, the same 8-core AMD Jaguar APU, with the Xbox's running at a slightly higher clock rate.
It doesn't matter if the PS4 has a more powerful GPU if you've bottlenecked the CPU.
Is it because of lazy devs or Microsoft cash? Hardly, it's because 8th-gen consoles are a complete joke.

But ask yourself this: would you rather a game run at 900p and stable 30 fps, or at 1080p and an unstable framerate?
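The bottleneck point above can be put in a toy model (all numbers here are hypothetical): a frame is only done when both the CPU and GPU work for it is done, so when the CPU side is the slower one, dropping resolution buys you nothing.

```python
# Toy model: a frame ships only when both CPU and GPU work are finished,
# so frame time is bounded by the slower of the two (hypothetical numbers).

def frame_time_ms(cpu_ms, gpu_ms):
    """Per-frame time when CPU and GPU work largely in lockstep."""
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / frame_time_ms(cpu_ms, gpu_ms)

# Suppose AI/simulation costs 33 ms of CPU per frame on both consoles.
cpu_ms = 33.0

# GPU cost scales roughly with pixel count: 1080p vs 900p.
gpu_1080p = 28.0                                      # assumed ms at 1920x1080
gpu_900p = gpu_1080p * (1600 * 900) / (1920 * 1080)   # ~19.4 ms at 1600x900

print(round(fps(cpu_ms, gpu_1080p), 1))  # CPU-bound: ~30.3 fps
print(round(fps(cpu_ms, gpu_900p), 1))   # still ~30.3 fps: resolution didn't help
```

In this sketch, a resolution drop only raises the frame rate when the GPU term is the larger one, which is the poster's point about the shared Jaguar CPU.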
 

shintakie10

New member
Sep 3, 2008
1,342
0
0
Revolutionary said:
I like how PS thralls never consider the possibility that a console with an APU whose output power is equivalent to an under-clocked GTX 480 just might not actually be able to run a AAA title from a studio who are infamous for their terribly optimised games.

Fucking Morons.
This is basically the crux of it. Not the PS thralls part, but the fact that Ubisoft can't optimize worth shit. Have you all seen the things people say about Black Flag? There are people runnin the tippytop of the top line of oced gpu, the best of the best cpu, the best everythin possible and they still can't run Black Flag without performance drops.

Ubisoft can't code worth a damn and they refuse to learn how to code worth a damn because they don't need to learn how to code worth a damn because people keep buyin their games regardless of how unoptimized they are.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Supernova1138 said:
It's not at all surprising they're hitting a CPU bottleneck with the new consoles, the PC versions of AssCreed only really use 1-2 cores and leave the rest idle. As a result, the PC ports of AssCreed tend not to run well on AMD chips and older Intel CPUs. The low clocked CPU cores on the new consoles would only make this problem worse. The only way to fix this issue would be for Ubisoft to rewrite their game engine, and they're not going to do that when they're churning these things out on an annual basis until people stop buying them.
Modern computing taxes the RAM/GPU well before it hits the CPU. So CPU should seldom be an issue.

There's generally a tradeoff in games. Do you want more numerous or detailed objects or do you want high framerate/resolution? The more complex or numerous the assets get the harder it is to maintain a higher framerate or resolution. Bump Mapping thankfully reduced the demands of object complexity to a degree. But if you wanted to simulate assets with real geometric complexity or if you have a lot of physics and character tracking going on then getting the full frame rate and resolution becomes significantly less feasible.

My guess is that Ubisoft is lying. That they had the opportunity to scale the PS4 higher and decided that it wasn't worth the time to do it just for that console. They probably plan to do it in the future when the method to upscale for the ps4 is in place and easy. They may also just not want to piss Microsoft off.
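The quoted claim that AssCreed only really uses 1-2 cores can be illustrated with Amdahl's law; the parallel fractions below are made-up, but they show why eight slow cores don't rescue a mostly serial engine.

```python
# Amdahl's law sketch: if an engine only parallelizes a small fraction of its
# CPU work, the extra Jaguar cores barely help (illustrative fractions).

def speedup(parallel_fraction, cores):
    """Maximum speedup over one core, per Amdahl's law."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Engine that effectively uses 1-2 cores: say 30% of frame work is parallel.
print(round(speedup(0.30, 8), 2))   # ~1.36x on 8 cores
# Versus an engine rewritten to spread 90% of its work across cores:
print(round(speedup(0.90, 8), 2))   # ~4.71x
```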
 

ryukage_sama

New member
Mar 12, 2009
508
0
0
The Escapist already had an article about how programming AI well is difficult, but processing it is easy. The idea that it's the games NPCs that slow things down is bollox. These lies and half-truths are nothing new, after all, since we're talking about Ubisoft. Hopefully, the Jimquisition will let this pass since he's already beaten this horse to dust.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
ryukage_sama said:
The Escapist already had an article about how programming AI well is difficult, but processing it is easy. The idea that it's the games NPCs that slow things down is bollox. These lies and half-truths are nothing new, after all, since we're talking about Ubisoft. Hopefully, the Jimquisition will let this pass since he's already beaten this horse to dust.
Not to mention that resolution depends mostly on the GPU. VRAM is also very important for higher resolutions, especially when a game has such high quality textures as Unity. This is definitely the Xbone's fault. The PS4 is more than capable of rendering this game at 1080p with its massive 8GB of GDDR5. The Xbone still has that low-bandwidth DDR3. They can try to spin this as much as they want, but I'm willing to bet that Microsoft had something to do with this.
 

svenjl

New member
Mar 16, 2011
129
0
0
Laggyteabag said:
I never really understood the whole parity thing. The PS4 is the better console hardware wise, but why are Ubisoft so scared to use that little extra power? Is it going to be the whole Watch_Dogs thing all over again?

When the Xbox One and PS4 were announced, I was really expecting the industry standard to finally become 1080p with 60FPS on consoles, but it just seems that aside from one or two games, most seems to end up sticking with 720p 30FPS, or something slightly higher. The games are prettier for sure, but the experience is still mostly the same. To me, this is still last gen, just with a glossier coat of paint.
Could it be that Ubisoft couldn't hit 1080p on PS4 without performance issues, while 900p was the sweet spot? Better not to overreach on one console graphically, and aim for the best possible performance benchmark on both. If the PS4 version had run at 1080p but with frame rate issues, there would have been an indignant outcry over that. Ubisoft may have a tarnished reputation in some respects, but they make some fantastic games. I'd count ACIV, Watch Dogs and SC: Blacklist as some of my favourite games recently. I didn't have issues with any of those, and played them all on 360.
 

svenjl

New member
Mar 16, 2011
129
0
0
DTWolfwood said:
I wasn't excited for this game to begin with, now this? i'll just skip this one. Maybe something interesting will come out next year. I'm just tired of European setting.
900p is the nail in the coffin? Or is it a Ubisoft issue? I bet Unity would look amazing at 720p.

Performance and gameplay are paramount, and frankly 900p vs 1080p is one of the dumbest causes of disputes and outrage imaginable to me. We've been gaming at 720p/30FPS for years! In less than a year, we're expecting life-changing experiences?! Resolution isn't the game changer. It should be about ideas, invention, creativity, design, art, mechanics. Great games can be found whether they cost $100 million or $100 thousand (or even much less).
 

Rozalia1

New member
Mar 1, 2014
1,095
0
0
KaZuYa said:
Troll gotta troll, an ignorant one at that.
You're an ignorant troll? Because that is what it sounds like if you're responding to my questions with that. No, you must have called me a troll, as that would make more... "sense". Either way, bad juju on your part.

I was prepared to go into the specifics of the matter but with your confession I think there is no longer a need.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
pilar said:
1080p60 High is really in the GTX 760 or 280x range. I don't think SONY or Microsoft would have much success selling an $800 console. And can you imagine how expensive games would be? Even still, Digital Foundry was really taken aback by the particle engine in Infamous: Second Son. And Roberts of Star Citizen has talked about the possibility of a console port, so they're powerful enough -- a stripped down mid-range PC -- according to him.

How much would you pay for a console?​
A 70 dollar GTX 750 will run 1080p@60fps with high/medium settings just fine. and dont pretend like consoles even reach such settings. you dont need 800 dollar console to run these games, you just have to, gasp, not pick parts made for tablets for it.
 

pilar

New member
Jul 7, 2014
59
0
0
Strazdas said:
pilar said:
1080p60 High is really in the GTX 760 or 280x range. I don't think SONY or Microsoft would have much success selling an $800 console. And can you imagine how expensive games would be? Even still, Digital Foundry was really taken aback by the particle engine in Infamous: Second Son. And Roberts of Star Citizen has talked about the possibility of a console port, so they're powerful enough -- a stripped down mid-range PC -- according to him.

How much would you pay for a console?​
A 70 dollar GTX 750 will run 1080p@60fps with high/medium settings just fine. and dont pretend like consoles even reach such settings. you dont need 800 dollar console to run these games, you just have to, gasp, not pick parts made for tablets for it.

[HEADING=3]It's Apples to Freakin' Oranges[/HEADING]
The PlayStation costs $360 - $380 to assemble because it's an APU: it's designed for optimization. And it's actually quite brilliantly engineered for what it is and what it allows developers to do.

It was never meant to be compared with a discrete GPU -- that's why I find the PC user argument hilarious. They don't even know the console hardware (not to mention having a clue on what the word APU means).

SONY's exclusives have won the industry-wide Game of the Year award 3 out of the last 5 years: so is it more the console's hardware or its policies that turn you off?
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
pilar said:
[HEADING=3]It's Apples to Freakin' Oranges[/HEADING]
The PlayStation costs $360 - $380 to assemble because it's an APU: it's designed for optimization. And it's actually quite brilliantly engineered for what it is and what it allows developers to do.

It was never meant to be compared with a discrete GPU -- that's why I find the PC user argument hilarious. They don't even know the console hardware (not to mention having a clue on what the word APU means).

SONY's exclusives have won the industry-wide Game of the Year award 3 out of the last 5 years: so is it more the console's hardware or its policies that turn you off?
Image is a .jpg. Thus i can dismiss it outright as compression artifacts diminish quality from the start. Secondly, the image resolution is only 768x432; not even something as stupid as IGN does comparisons at such bad resolutions. If you want a comparison start with this even at horrible compression [http://www.digitalstormonline.com/unlocked/images/articles/Nav/Articles/Watch-Dogs-Xbox-one-PC-1.jpg]. Also funny how you picked Watch Dogs, the game that is known to be intentionally gimped on PC to make it look worse.

How about we try shadow of mordor? PC [http://i2.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-pc.png] PS4 [http://i0.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-ps4.png]

Yes, PS4 and Xbox Done use an APU, and that is their downfall - the APU they use does not allow them to meet even the minimum requirements for modern games. Its not brilliant, its an underclocked tablet GPU. its off-the-shelf hardware from AMD that they actually put in other products (tablets) too. Well, maybe except the Xbox One's memory channel that MS invested a billion on, and that seems to be useless in the real world.

Oh, we know console hardware and we know just how awful it is. using APU for gaming is the worst thing you can do, and they went on and did it.

there is no industry wide game of the year award. there are many different sites giving their awards. there are no "oscars" for videogames.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Strazdas said:
pilar said:
[HEADING=3]It's Apples to Freakin' Oranges[/HEADING]
The PlayStation costs $360 - $380 to assemble because it's an APU: it's designed for optimization. And it's actually quite brilliantly engineered for what it is and what it allows developers to do.

It was never meant to be compared with a discrete GPU -- that's why I find the PC user argument hilarious. They don't even know the console hardware (not to mention having a clue on what the word APU means).

SONY's exclusives have won the industry-wide Game of the Year award 3 out of the last 5 years: so is it more the console's hardware or its policies that turn you off?
Image is a .jpg. Thus i can dismiss it outright as compression artifacts diminish quality from the start. Secondly, the image resolution is only 768x432; not even something as stupid as IGN does comparisons at such bad resolutions. If you want a comparison start with this even at horrible compression [http://www.digitalstormonline.com/unlocked/images/articles/Nav/Articles/Watch-Dogs-Xbox-one-PC-1.jpg]. Also funny how you picked Watch Dogs, the game that is known to be intentionally gimped on PC to make it look worse.

How about we try shadow of mordor? PC [http://i2.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-pc.png] PS4 [http://i0.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-ps4.png]

Yes, PS4 and Xbox Done use an APU, and that is their downfall - the APU they use does not allow them to meet even the minimum requirements for modern games. Its not brilliant, its an underclocked tablet GPU. its off-the-shelf hardware from AMD that they actually put in other products (tablets) too. Well, maybe except the Xbox One's memory channel that MS invested a billion on, and that seems to be useless in the real world.

Oh, we know console hardware and we know just how awful it is. using APU for gaming is the worst thing you can do, and they went on and did it.

there is no industry wide game of the year award. there are many different sites giving their awards. there are no "oscars" for videogames.
Modern software development does not target the CPU for processing except for very specific and minor things. It will always hit RAM and GPU first. If your CPU is getting taxed by software then it's likely because the GPU/RAM have maxed out performance. The CPU simply isn't all that good or specialized for video processing. So DDR3 is specialized at transferring smaller amounts of data (keeping track of a large number of small objects) and GDDR5 is specialized for transferring larger amounts of data. Sony did something interesting by reducing the latency GDDR5 usually suffers in order to make it competent at performing both functions. If programmers lean heavily on the GDDR5 RAM as video ram then the performance of the GPU can be greatly augmented.

The CPU portion of the APU? The laptop bit that you're complaining about? That really shouldn't get hit and should not matter except as the method to direct traffic.

Not sure about how the XBO is going to cope.
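The DDR3-vs-GDDR5 contrast in the post above boils down to latency versus bandwidth, which can be sketched with a toy transfer-time model. The latency figures below are pure assumptions; the bandwidths are roughly the oft-quoted 68 GB/s (XBO DDR3) and 176 GB/s (PS4 GDDR5).

```python
# Toy latency-vs-bandwidth model for the DDR3/GDDR5 contrast.
# Latency numbers are assumptions for illustration, not real memory timings.

def transfer_us(size_kb, latency_us, bandwidth_gb_s):
    """Time for one transfer: fixed latency plus size divided by bandwidth.

    1 GB/s = 1e3 bytes per microsecond, so bytes / (GB/s * 1e3) gives us.
    """
    return latency_us + (size_kb * 1024) / (bandwidth_gb_s * 1e3)

DDR3 = dict(latency_us=0.05, bandwidth_gb_s=68.0)    # assumed lower latency
GDDR5 = dict(latency_us=0.10, bandwidth_gb_s=176.0)  # assumed higher latency

small, big = 4, 4096  # KB: a tiny CPU-style access vs a big texture stream

print(transfer_us(small, **DDR3) < transfer_us(small, **GDDR5))  # True: latency dominates
print(transfer_us(big, **DDR3) > transfer_us(big, **GDDR5))      # True: bandwidth dominates
```

Under these assumed numbers, DDR3 wins the many-small-objects pattern and GDDR5 wins the big streaming transfers, which is the specialization the post describes.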
 

DTWolfwood

Better than Vash!
Oct 20, 2009
3,716
0
0
svenjl said:
DTWolfwood said:
I wasn't excited for this game to begin with, now this? i'll just skip this one. Maybe something interesting will come out next year. I'm just tired of European setting.
900p is the nail in the coffin? Or is it a Ubisoft issue? I bet Unity would look amazing at 720p.

Performance and gameplay are paramount, and frankly 900p vs 1080p is one of the dumbest causes of disputes and outrage imaginable to me. We've been gaming at 720p/30FPS for years! In less than a year, we're expecting life-changing experiences?! Resolution isn't the game changer. It should be about ideas, invention, creativity, design, art, mechanics. Great games can be found whether they cost $100 million or $100 thousand (or even much less).
Those are the nails in the coffin. Any and all "controversy" that comes out of PR is just icing on the cake. If i want to play an Assassin's Creed game, i'll play Shadow of Mordor. That game does assassination better than Assassin's Creed.
 

PatrickXD

New member
Aug 13, 2009
977
0
0
They set a target for their games, graphically speaking. Then they reached the target, without going the extra mile in any case. I know nothing about coding, how hard or easy it might be, and I don't care. Good for you, business people, for achieving your business targets. I hope the game is fun.
 

igor2201

New member
Sep 19, 2013
11
0
0
Whatislove said:
Charcharo said:
HOWEVER METRO LAST LIGHT REDUX looks noticeably SUPERIOR to Unity and is 1080p 60 fps on PS4 and 912p on Xbox One. Yet again Ubisoft cant code as well as a low budget eastern European studio...
If only people paid attention when I was talking... that is what you get.
Exactly, and what about Shadow of Mordor? It runs 60fps at 1080p on the PS4 and it looks just as good (if not better) graphically and the AI and CPU intensive tasks are on par or surpass that of Assassin's Creed.

Infamous Second Son runs 1080p/30fps and it is a large open world with plenty of AI.

Sleeping Dogs HD will run at 60fps/1080p. Akiba's Trip on the PS4 (another open world game with lots of AI, though cel-shaded graphics) runs 1080p/60fps. Hell, even GTA V will run at 1080p on the PS4 (though only at 30fps) and that is a much bigger game than ACU, and how about this one for a kicker: ACIV: Black Flag runs at 1080p on the PS4 (30fps)... so they can't even code as well as a previous title in the franchise.

Ubisoft couldn't code themselves out of a wet paper bag.

Honestly, I can understand being restricted to 30fps OR only being able to hit 900p... but both? Really? Even though the last AC was 1080p? And every single game on the PS4 released so far is at least 1080p/30fps, with many being 60fps, with only TWO exceptions... and one of those exceptions happens to be another Ubisoft game, Watch Dogs, which runs 900p/30fps of course... their favourite numbers!
Infamous Second Son cheated on NPCs and AI. It only bothered to render NPCs and process their AI in bulk when you hit the ground, where your view distance is much shorter and crossing the map is much slower, and that gave them the room they needed for those specs. Which is why, when you go across the city on rooftops, you only see a tiny number of non-combat NPCs, and once you hit the ground you see a lot more, most of whom start wandering into your immediate area, implying they were just rendered off-screen for your immediate environment. Assassin's Creed has always been fully populated no matter where you are in the world, with a full set of NPCs in your immediate area. What people often forget is that Sucker Punch flat out stated they had basically hit the roof of what the ps4 was capable of doing even with those tricks.

The fact is if you have a cpu bottleneck, then the power of the rest of your hardware doesn't really matter. The cpu is basically the brain of the APU.

For a metaphor: if someone is in a vegetative state, the heart (GPU) and the spinal column (RAM) may still be working, and yet the person is incapable of doing anything.

No matter how powerful the hardware is, the moment you hit 100% load on the CPU, all the hardware takes a major hit. I know a lot of people won't listen to or comprehend this very real limitation of modern computing technology, but CPU bottlenecks are very real and can cause exactly the issues Ubisoft ran into. Furthermore, the PS4 relies on asynchronous compute on the GPU to relieve stress on the CPU, but AI is exactly the wrong type of computational workload to take advantage of this.

I know it's fun to rant and rave, but basic research would have revealed all of this. Is Ubisoft telling the truth? I don't know, but their statements actually fit with how computing technology works and with the technology inside the PS4 and its very real limitations, and they make far more sense than Ubisoft purposely not making the best game they could, in an era where studios and IPs fail all the time because there is always someone else who did spend the time and energy to make the best game possible, ready to take your place.
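As a rough sketch of the AI-bottleneck argument in this post (the per-NPC cost and crowd sizes below are hypothetical): AI work scales with the number of simulated NPCs, and that CPU bill is the same at 900p as at 1080p.

```python
# Sketch of the AI-bottleneck claim: per-frame AI cost scales with NPC count
# and doesn't shrink when you render fewer pixels (hypothetical costs).

def ai_ms(npcs, ms_per_npc=0.007):
    """CPU milliseconds per frame spent on NPC AI, for a made-up per-NPC cost."""
    return npcs * ms_per_npc

budget_30fps = 1000.0 / 30.0  # ~33.3 ms per frame at a 30 fps target

# A sparse rooftop view vs a dense Unity-style street crowd:
print(round(ai_ms(500), 1))    # 3.5 ms - plenty of headroom
print(round(ai_ms(5000), 1))   # 35.0 ms - alone blows the 30 fps budget
```

Under these assumed numbers, a big enough crowd exhausts the frame budget on the CPU alone, which is why rendering at a lower resolution cannot fix it.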


BrotherRool said:
For some comparison, check out a screenshot of a PS3 launch title vs a PS3 game that came out at the end of the lifespan.

See the difference between a launch game and a real 'next-gen' looking game?
Uncharted was not a launch game nor a launch window game. The launch games actually had worse rendering but were brighter and clearer.
 

Glaice

New member
Mar 18, 2013
577
0
0
Good job Ubisoft, you sparked debate and other things because of this dumb move of yours.

If you're gonna make any version reasonable, do it for the PC because of the countless configuration types you will have to deal with.
 

pilar

New member
Jul 7, 2014
59
0
0
Strazdas said:
pilar said:
[HEADING=3]It's Apples to Freakin' Oranges[/HEADING]
The PlayStation costs $360 - $380 to assemble because it's an APU: it's designed for optimization. And it's actually quite brilliantly engineered for what it is and what it allows developers to do.

It was never meant to be compared with a discrete GPU -- that's why I find the PC user argument hilarious. They don't even know the console hardware (not to mention having a clue on what the word APU means).

SONY's exclusives have won the industry-wide Game of the Year award 3 out of the last 5 years: so is it more the console's hardware or its policies that turn you off?
Image is a .jpg. Thus i can dismiss it outright as compression artifacts diminish quality from the start. Secondly, the image resolution is only 768x432; not even something as stupid as IGN does comparisons at such bad resolutions. If you want a comparison start with this even at horrible compression [http://www.digitalstormonline.com/unlocked/images/articles/Nav/Articles/Watch-Dogs-Xbox-one-PC-1.jpg]. Also funny how you picked Watch Dogs, the game that is known to be intentionally gimped on PC to make it look worse.

How about we try shadow of mordor? PC [http://i2.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-pc.png] PS4 [http://i0.wp.com/gearnuke.com/wp-content/uploads/2014/10/shadow-mordor-2-ps4.png]

Yes, PS4 and Xbox Done use an APU, and that is their downfall - the APU they use does not allow them to meet even the minimum requirements for modern games. Its not brilliant, its an underclocked tablet GPU. its off-the-shelf hardware from AMD that they actually put in other products (tablets) too. Well, maybe except the Xbox One's memory channel that MS invested a billion on, and that seems to be useless in the real world.

Oh, we know console hardware and we know just how awful it is. using APU for gaming is the worst thing you can do, and they went on and did it.

there is no industry wide game of the year award. there are many different sites giving their awards. there are no "oscars" for videogames.
You don't know what their custom APU does. [http://youtu.be/0ruo84asvQo]

Digital Foundry's written analysis of the three platforms [http://www.eurogamer.net/articles/digitalfoundry-2014-shadow-of-mordor-face-off] on Shadow of Mordor. Their 760 2GB managed stable High textures (like the PS4) at a locked 1080p30; they needed a $400+ 780 Ti to max the game out.
Hmm... a $400 GPU to max out, plus $60 for Shadow of Mordor, which isn't really saying a lot beyond double the frame rate and High vs Ultra textures.

I'm never convinced when PC users say that it's cheaper in the long run: sure it is, if you're willing to hold off another 6 months on the games you've been excited about for two years.
And look at all the games coming out next year: MGSV: The Phantom Pain, The Order: 1886, Uncharted 4, The Witcher 3, Bloodborne, Final Fantasy XV, The Division, Until Dawn, Dying Light, No Man's Sky, etc...
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
igor2201 said:
What people often forget is that Sucker Punch flat out stated they had basically hit the roof of what the ps4 was capable of doing even with those tricks.
Cite this please. Sucker Punch IS Sony. I could see non-Sony entities stating that they'd hit a wall on Sony hardware but I don't think any Sony employees are that keen on losing their jobs.

I will say this, though: inFamous Second Son looked absolutely beautiful. I've got no regrets if that's the pinnacle. Then again, I can always just wander over to my PC for any graphiophilic needs that crop up. But from what I've seen there's still plenty of work to be done to take advantage of the GDDR5 and some of the other non-traditional hardware. We won't see the kind of optimizations we saw in past generations with proprietary hardware, but there should be plenty of meat left on the bones compared to a similar PC, thanks to the standardized hardware, the API, and the particularities they decided to go with.