General Gaming News.

BrawlMan

Lover of beat'em ups.
Legacy
Mar 10, 2016
29,400
12,232
118
Detroit, Michigan
Country
United States of America
Gender
Male
  • Like
Reactions: XsjadoBlayde

FakeSympathy

Elite Member
Legacy
Jun 8, 2015
3,489
3,222
118
Seattle, WA
Country
US

XsjadoBlayde

~it ends here~
Apr 29, 2020
3,372
3,499
118
I don't think it's a coincidence that Digital Foundry is basically the only major outlet that didn't receive review copies. They knew what they were shipping.
Tested it out on the PS5 last night and the performance mode (the intended 60fps mode) is shockingly shoddy; it's basically just choppier 30fps with shittier resolution: it is just worse on every level. So it's now merely a waiting-for-patch game for the shelf meanwhile, cos that ain't pleasant to play. Am a bit concerned that no reviews seemed to mention that either, sticking to brief summaries of insignificant visual bugs instead. If it were a small indie dev it'd be somewhat more understandable, but this is EA with all the money - and Respawn, who develop the super smooth Titanfall universe games - also Star Wars...the supermassive black money hole of IPs? It didn't need to be rushed out the door; unless there's some weird esoteric tax loophole they're trying to take advantage of, I'm not sure what other excuses are left.
 

XsjadoBlayde

~it ends here~
Apr 29, 2020
3,372
3,499
118
Talk of the devil...


Star Wars Jedi: Survivor launched yesterday on PC and at this point in time, it's a technical failure on just about every level. This is somewhat hard to believe as all of the issues Fallen Order possessed return for the sequel, despite their staggeringly obvious nature. Shader compilation stutter and traversal stutter are back, while the in-game options offer no recourse in addressing the game's key issues. In terms of polish, performance and accessibility, this is actually far worse than Fallen Order, perhaps taking the cake as the worst triple-A PC port of 2023, despite some remarkably strong competition.

The game's initial handshake with the player fails completely in improving over its predecessor. The options are barebones UE4 standards, right down to their naming. There's zero context given to the user as to what the performance and visual ramifications are for each setting, leaving the player with no hints as to how to improve the experience for their specific hardware. There's also a litany of issues that get in the way of you figuring out whether options improve performance at all: for example, toggling ray tracing from on to off and then back on again can make the game run worse than it initially did with RT on. The only way to fix it is a full game restart, after which performance returns to the level it was at earlier.




There are issues with other options as well. For example, you only have access to TAA and FSR2 – and FSR2 is just not a good option for acceptable image quality. Essentially, whenever an object or the camera moves, FSR2 causes a blurred pixelisation effect at all quality levels. This applies to the entire image and all aspects of it, giving the game a motion blurred, ghost-like look - even if you turn motion blur off. Unreal Engine 4 has excellent DLSS and XeSS plug-in support. The developers have chosen to ignore them in a world where each provides improved quality for Nvidia and Intel GPU owners respectively - and that's unacceptable.

All of this is borderline insignificant, however, compared to the PC port's other issues. Shader compilation stutter is back - despite the game kicking off with a precompilation pass. The infamous traversal stutter in Fallen Order that everyone noticed and complained about? It's back again in Survivor. We've raised enough awareness around #StutterStruggle at this point that it's borderline inconceivable that a major triple-A title should still release with this problem, but there it is. Traversal stutter continues to mar the Star Wars experience: everywhere you go in Star Wars Jedi: Survivor sees frame-time spikes over a number of frames or even over a few seconds as you cross invisible boundaries in the game world.

This happens everywhere, no matter what area of the game you are in and no matter what CPU or GPU combination you have. Even areas that look innocuous and seem like they should be easy to load and render induce traversal load stutter. The frame-time spikes of these stutters can be smaller, but they can also be large, and they can compound with any shader compilation stutter that might potentially occur at the same time. No matter how many times you play the game or run across an area, you will always see the same stutter - and nothing can prevent this.
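The repeatable, location-locked stutters described here show up in a frame-time capture as frames that take several times longer than their neighbours. A minimal sketch of such a detector (the function name and thresholds are my own illustrative choices, not DF's actual methodology):

```python
def find_stutters(frame_times_ms, window=30, spike_factor=2.5, floor_ms=25.0):
    """Flag frames whose time far exceeds the recent rolling average.

    frame_times_ms: per-frame render times in milliseconds.
    Returns a list of (frame_index, frame_time_ms) for suspected stutters.
    """
    stutters = []
    for i, ft in enumerate(frame_times_ms):
        recent = frame_times_ms[max(0, i - window):i]
        baseline = sum(recent) / len(recent) if recent else ft
        # A stutter is a frame both much slower than its neighbourhood
        # and slow in absolute terms (a 60fps frame is ~16.7ms).
        if ft > baseline * spike_factor and ft > floor_ms:
            stutters.append((i, ft))
    return stutters
```

Fed a capture of steady 16.7ms frames with a single 100ms hitch, this flags only the hitch; the 100ms figure DF measured on a 12900K would be flagged on any sensible threshold.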


Shader compilation stutter returns (left) and the traversal stutter issue from Fallen Order also raises its ugly head in Survivor. 100ms stutters on a powerful processor like a Core i9 12900K, backed with 6400MT/s DDR5, are excessively poor. The results will be poorer still on mainstream CPUs.

That is not an exaggeration. Even playing the game at the lowest settings possible you will still see traversal and shader compilation stutter, even on the most powerful CPUs and GPUs on the market. The latest 3D cache processors from AMD deliver the 'best' experience, but even these are subject to noticeable stutter.

So far, so Fallen Order, but Survivor is also blighted by astonishingly low CPU core utilisation. A Core i9 12900K paired with fast 6400MT/s DDR5 and an RTX 4090 will not maintain 60 frames per second. This in itself is not exactly something to cry about from a rational perspective – hardware never guarantees that you should be able to run the game at X setting and X frame-rate. But here, the performance at the absolute lowest settings is completely indefensible and just flat-out bad. Barely any of the processor is touched, with performance held back by around two threads, which are always pegged at 70 percent utilisation or higher.

In short, Star Wars Jedi: Survivor is essentially ignoring the fact that CPUs have entered the many-core era. With higher settings it is even more disastrous – with ray tracing active, more of the smaller cores are tasked with maintaining RT's BVH structures, but ultimately, performance drops still further to the point where I've observed CPU-limited scenes on a 12900K that just about exceed 30fps. On a mid-range CPU like the Ryzen 5 3600, for example, it is even more catastrophic.
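The pattern described here, a couple of pegged threads capping the whole frame, is the classic Amdahl's law limit: if most of the per-frame CPU work stays serial, extra cores barely help. A quick illustrative calculation (the textbook formula, not a model of Survivor's actual engine):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the per-frame CPU
    work can be spread across cores (Amdahl's law).

    parallel_fraction: share of frame time that scales with cores (0..1).
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If half the frame's CPU work is serial, even 16 cores cap out
# below a 2x speedup over a single core.
```

This is why a 12900K with many idle cores still drops to ~30fps: the serial fraction on the pegged threads sets the frame-rate ceiling regardless of core count.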


In the image on the left, the game running on an RTX 4090 at the lowest settings barely touches GPU resources and still can't maintain 60fps. On the right, fully maxed with RT at 4K FSR2, this scene sees only 36% GPU utilisation - meaning that the CPU bottleneck is bringing performance down to 36fps.
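The caption's arithmetic generalises: when a game is CPU-bound, dividing observed fps by GPU utilisation gives a rough ceiling on what the GPU alone could deliver. A back-of-envelope sketch (my own simplification; real utilisation rarely scales this linearly):

```python
def gpu_bound_fps_estimate(observed_fps, gpu_utilisation):
    """Rough frame-rate ceiling if the CPU bottleneck were removed.

    At 36fps with the GPU only 36% busy, the GPU spends
    0.36 * (1/36)s rendering each frame, so uncapped it could in
    principle deliver 36 / 0.36 = 100fps. Assumes GPU cost per frame
    is constant and ignores VRAM/bandwidth limits.
    """
    return observed_fps / gpu_utilisation
```

For DF's measured scene this suggests the 4090 had roughly 100fps of headroom going unused.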

Settings recommendations are therefore impossible, really, although you can disable ray tracing to claw back some performance. Basically, Star Wars Jedi: Survivor is not utilising the hardware presented to it at all in a meaningful way. I still can't get my head around the fact that one of the best games of 2019 launched with profound issues on PC that were never fixed - and then we see them again in the 2023 sequel, on top of a host of other issues.

Yes, there are patches on the way, while EA has also offered up a hollow mea culpa that actually seems to be trying to shift the blame to the user's hardware, but none of this addresses the fact that the PC version of a very handsome game has launched in a totally unacceptable state.

We were towards the back of the queue to receive Jedi Survivor code. The PC version arrived on Thursday with consoles on Friday, so further coverage is still in development - but what we should stress is that while the PC version is terrible, this is definitely a game with strong current-gen credentials - and from a visual perspective, there's much to commend it (and we have a full tech review in the works that covers this and more). It's early days, but the console versions do seem to have far fewer technical issues, even though performance still requires a lot of work. We'll be reporting back on that front as soon as we can.
 

Chimpzy

Simian Abomination
Legacy
Escapist +
Apr 3, 2020
12,840
9,273
118

CriticalGaming

Elite Member
Legacy
Dec 28, 2017
11,227
5,682
118
You know it's really bad when DF leads with words like "terrible" and "worst" instead of their usual more diplomatic approach.
The state of pc ports these past couple of years has been really sad. I wonder if this latest console generation is the culprit, because I don't remember having too many bad ports during the ps4 era. I wonder if there is something about console optimization that also makes porting to pc a pain in the ass.
 
  • Like
Reactions: Zykon TheLich

Chimpzy

Simian Abomination
Legacy
Escapist +
Apr 3, 2020
12,840
9,273
118
The state of pc ports these past couple of years has been really sad. I wonder if this latest console generation is the culprit, because I don't remember having too many bad ports during the ps4 era. I wonder if there is something about console optimization that also makes porting to pc a pain in the ass.
There were plenty of bad ports last gen too, but it wasn't as noticeable because by about the mid-point of the generation, the gap in raw horsepower between pc hardware and the base consoles was so large even a mid-range pc could just brute force past poor optimisation for the vast majority of bad ports. Nowadays the gap between higher end pc and console is still big, but not that big, and developers need to learn to actually utilise all of a pc's capabilities.

For example, one of Jedi Survivor's problems is that it only seems to utilise a couple of cpu cores, and even then not fully, even though 6-8 core cpus are now very common. Which actually was a problem last gen too, but that reliance on only a couple of threads wasn't as big a problem because pc cpus had gotten so much more performant than the consoles' weedy Jaguar cpu. We've entered the era of truly multicore pc gaming, but a lot of developers don't seem to have gotten the message and are still designing renderers as if it were last gen, when cpus could just power through.

Another is shader compilation. That happens because modern games use a lot of custom shaders. On console, that's easy: there's only one hardware config to consider, so you can ship the compiled shaders along with the game. But on pc that doesn't fly, of course. So they either need to compile them during gameplay, which puts more stress on those same couple of cpu threads, which is why shader stutter happens. Or they need to pre-compile them all upon booting the game, which can take a good while depending on your cpu. I often see a single core go up to 100% utilisation at max clock speeds, while the others are not really doing anything. Again, the solution is better scheduling, and properly using all of a pc's cores.
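The single-core precompilation pass described above is fundamentally a scheduling problem: the compile jobs are independent, so they can be fanned out across a worker pool instead of pegging one core. A hedged sketch of the idea (compile_shader is a stand-in that just hashes the source, not a real shader compiler; a real engine would use native worker threads rather than Python's GIL-bound ones):

```python
import concurrent.futures
import hashlib

def compile_shader(source):
    # Stand-in for a real compile: hash the source to a fake binary id.
    # A real pipeline would invoke the driver or an offline compiler here.
    return hashlib.sha256(source.encode()).hexdigest()

def precompile_serial(sources):
    # The "one core at 100%, everything else idle" approach.
    return [compile_shader(s) for s in sources]

def precompile_parallel(sources, workers=8):
    # Spread independent compile jobs over a pool; map() keeps the
    # results in submission order, so output matches the serial path.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compile_shader, sources))
```

Both paths produce identical results; only the scheduling differs, which is exactly the point being made about boot-time precompilation.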

Traversal stutter happens because every time a new area needs to be loaded, the cpu needs to fetch that data from storage and feed it to the gpu, while still doing all the other stuff. The PS5, for example, doesn't have this problem because its APU can have data loaded directly into memory through an IO component in the APU which can decompress data in real-time. Think of it as accessing the contents of a zip file without needing to unzip it. Neat bit of tech, and pc has no real equivalent to this. But it does have DirectStorage, which allows the gpu to access storage directly, bypassing the cpu, meaning the gpu can work more efficiently since it doesn't need to wait for the cpu to feed it, and the cpu is free to direct those cycles towards something else. Unfortunately, very few games are using it properly.
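Absent proper DirectStorage support, the usual pc-side mitigation for traversal stutter is asynchronous streaming: prefetch the next zone on a background thread so the render loop never blocks on IO and decompression. An illustrative sketch (ZoneStreamer and load_zone are invented names; the sleep stands in for disk latency):

```python
import threading
import time
import zlib

def load_zone(zone_id):
    # Stand-in for reading and decompressing zone data from disk.
    payload = zlib.compress((f"zone-{zone_id}" * 100).encode("ascii"))
    time.sleep(0.01)  # simulated IO latency
    return zlib.decompress(payload)

class ZoneStreamer:
    """Prefetch zones on worker threads so the render loop never blocks."""

    def __init__(self):
        self.cache = {}
        self.lock = threading.Lock()

    def prefetch(self, zone_id):
        # Kick off the load in the background; the caller keeps rendering.
        def work():
            data = load_zone(zone_id)
            with self.lock:
                self.cache[zone_id] = data
        t = threading.Thread(target=work, daemon=True)
        t.start()
        return t

    def get(self, zone_id):
        # Non-blocking lookup: returns None if the zone isn't ready yet.
        with self.lock:
            return self.cache.get(zone_id)
```

The key design point is that `get` never waits: if the engine prefetches early enough as the player approaches a boundary, the data is already resident and no frame has to stall.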

TLDR: lots of developers still doing pc ports as if it were still last gen
 

FakeSympathy

Elite Member
Legacy
Jun 8, 2015
3,489
3,222
118
Seattle, WA
Country
US

CriticalGaming

Elite Member
Legacy
Dec 28, 2017
11,227
5,682
118
JFC, I don't think I've ever seen DF go off like this since CP2077. I thought EA finally had their shit together, but BF2042 and JS are making me reconsider.
EA heard our cries for games that were single player and didn't have a bunch of extra Monetization bullshit. EA heard us but said, "Fine but we're not making it run good."
 

hanselthecaretaker

My flask is half full
Legacy
Nov 18, 2010
8,738
5,910
118
If you are aware, THEN WHY THE FUCK DID YOU RELEASE IT!?
Isn’t the excuse always something like, “don’t know until it’s out”? But really, any PC dev should have a general idea of how a sampling of spec configurations will perform just based on establishing a game’s system requirements in the first place too.
 
  • Like
Reactions: BrawlMan

hanselthecaretaker

My flask is half full
Legacy
Nov 18, 2010
8,738
5,910
118
There were plenty of bad ports last gen too, but it wasn't as noticeable because by about the mid-point of the generation, the gap in raw horsepower between pc hardware and the base consoles was so large even a mid-range pc could just brute force past poor optimisation for the vast majority of bad ports. Nowadays the gap between higher end pc and console is still big, but not that big, and developers need to learn to actually utilise all of a pc's capabilities.

For example, one of Jedi Survivor's problems is that it only seems to utilise a couple of cpu cores, and even then not fully, even though 6-8 core cpus are now very common. Which actually was a problem last gen too, but that reliance on only a couple of threads wasn't as big a problem because pc cpus had gotten so much more performant than the consoles' weedy Jaguar cpu. We've entered the era of truly multicore pc gaming, but a lot of developers don't seem to have gotten the message and are still designing renderers as if it were last gen, when cpus could just power through.

Another is shader compilation. That happens because modern games use a lot of custom shaders. On console, that's easy: there's only one hardware config to consider, so you can ship the compiled shaders along with the game. But on pc that doesn't fly, of course. So they either need to compile them during gameplay, which puts more stress on those same couple of cpu threads, which is why shader stutter happens. Or they need to pre-compile them all upon booting the game, which can take a good while depending on your cpu. I often see a single core go up to 100% utilisation at max clock speeds, while the others are not really doing anything. Again, the solution is better scheduling, and properly using all of a pc's cores.

Traversal stutter happens because every time a new area needs to be loaded, the cpu needs to fetch that data from storage and feed it to the gpu, while still doing all the other stuff. The PS5, for example, doesn't have this problem because its APU can have data loaded directly into memory through an IO component in the APU which can decompress data in real-time. Think of it as accessing the contents of a zip file without needing to unzip it. Neat bit of tech, and pc has no real equivalent to this. But it does have DirectStorage, which allows the gpu to access storage directly, bypassing the cpu, meaning the gpu can work more efficiently since it doesn't need to wait for the cpu to feed it, and the cpu is free to direct those cycles towards something else. Unfortunately, very few games are using it properly.

TLDR: lots of developers still doing pc ports as if it were still last gen

So it’s almost like the chickens have finally come home to roost? I mean, for the longest time now PC elitists would scoff at new consoles and say how outdated they’d be after the next GPU/CPU cycle hits PC, but here we are, where a huge chunk of PC players still probably aren’t even running an SSD. Even those that are still have the hurdle of more APU overhead than XSX/PS5 ever would. That presents a significant disadvantage next to the fairly privileged standards the console install bases have been enjoying this time round.

It used to be consoles didn’t have enough memory for big worlds, GPU power for good textures, etc. but the law of diminishing returns seems to have its foot firmly in the door now.
 
Last edited:
  • Like
Reactions: BrawlMan

CriticalGaming

Elite Member
Legacy
Dec 28, 2017
11,227
5,682
118
Isn’t the excuse always something like, “don’t know until it’s out”? But really, any PC dev should have a general idea of how a sampling of spec configurations will perform just based on establishing a game’s system requirements in the first place too.
It depends; not all dev teams know about PC specs, and it's perfectly possible to have a team that's experienced with console development without being as versed in making a game run on PC. Though in this case, I don't even think they knew what the consoles needed either.

Honestly, I think the team ran out of time during optimization. EA is notorious for "release it when we say and not a day later", so it seems like they simply couldn't finish it in time.
 

hanselthecaretaker

My flask is half full
Legacy
Nov 18, 2010
8,738
5,910
118
It depends; not all dev teams know about PC specs, and it's perfectly possible to have a team that's experienced with console development without being as versed in making a game run on PC. Though in this case, I don't even think they knew what the consoles needed either.

Honestly, I think the team ran out of time during optimization. EA is notorious for "release it when we say and not a day later", so it seems like they simply couldn't finish it in time.
*Waits for a “Will you ever fucking learn, EA?!?!?!” meme to surface.*
 

Chimpzy

Simian Abomination
Legacy
Escapist +
Apr 3, 2020
12,840
9,273
118
So it’s almost like the chickens have finally come home to roost? I mean, for the longest time now PC elitists would scoff at new consoles and say how outdated they’d be after the next GPU/CPU cycle hits PC, but here we are, where a huge chunk of PC players still probably aren’t even running an SSD. Even those that are still have the hurdle of more APU overhead than XSX/PS5 ever would. That presents a significant disadvantage next to the fairly privileged standards the console install bases have been enjoying this time round.

It used to be consoles didn’t have enough memory for big worlds, GPU power for good textures, etc. but the law of diminishing returns seems to have its foot firmly in the door now.
Kind of. I simplified things quite a lot; there's a lot more to it than just an I/O overhead advantage. It would be more accurate to say modern games tech demands a lot, which both consoles and pc are struggling with in their own specific ways, but it manifests in more noticeable ways on pc. Solutions for these problems already exist or will be coming relatively soon on pc, but it will take some time for developers to shift to these new paradigms, and for gamers to move to hardware that can properly use them, and in the meantime we're all kind of stuck in this awkward transition stage.
 

XsjadoBlayde

~it ends here~
Apr 29, 2020
3,372
3,499
118
Lol.


(Apparently involves spoilers so am avoiding copy/paste of contents in this post cause I care about you all so much 😇)

(Wait, there's a couple of sentences before the spoiler warning that looks safe to add)

A Star Wars Jedi: Survivor player is warning others of a "major game-breaking bug" that can prevent progress.

If you pass a certain point early in the game – usually within the first two to three hours, although this depends upon how quickly you're trying to get through the story, of course – and die before you save, you risk respawning into a trapped location, effectively blocking your progress.
 

FakeSympathy

Elite Member
Legacy
Jun 8, 2015
3,489
3,222
118
Seattle, WA
Country
US
Lol.


(Apparently involves spoilers so am avoiding copy/paste of contents in this post cause I care about you all so much 😇)

(Wait, there's a couple of sentences before the spoiler warning that looks safe to add)
Holy crap, this is almost Bethesda-level shit. I remember both Oblivion and Skyrim had bugs where certain questlines became impossible to finish. At least here the bug seems to happen early. Imagine what that would've been like if it happened near the end
 
Last edited: