Yes, we're still talking Star Wars Outlaws - it's been a very popular release after all (even though the standard version doesn't actually arrive until Friday), and interestingly it comes with three different graphics settings on Xbox Series X.
Those are 30FPS, 40FPS and 60FPS, with the middle of the three targeted at players whose displays support Variable Refresh Rate (VRR). According to the wizards at Digital Foundry, the 30FPS quality mode reaches a base resolution of up to 1620p, whereas the 60FPS performance mode tops out at a native 1080p.

But which is the best? Well, if you can take advantage of VRR, 40FPS is apparently the ideal way to go:
"The 60fps modes on PS5 and Series X do sacrifice a lot in image quality, and ideally need a VRR display to look their smoothest. As a broader recommendation, the 40fps mode is my pick as it provides a better balance between visual settings, image quality and frame-rate stability, while the 30fps mode is a good backup for those without 120Hz displays."
The good news is that the game is described as a "well-optimised console effort" even if you include the Series S version, which is locked to a "surprisingly robust" 30FPS. Perhaps 40FPS could be a stretch goal for the future on the S?
Whether you're waiting for the standard release this Friday or maybe playing the Ultimate Edition as part of Ubisoft+ on Xbox, you should get a good experience here - in fact, we thought it was a "Great" one in our Pure Xbox review!
Which mode have you been playing Star Wars Outlaws on so far? Let us know in the comments below.
Comments
I was expecting to use the 40FPS mode, so it's good that it is actually a nice middle ground.
It would be real nice if next-gen we didn't need these options.
I never understand any of this, @FraserG. It is not my forte, to say the least! I have an LG TV that supports 120Hz and VRR, does that mean that I can play it at 60, or do I need to be playing it at 40? Tis all very confusing...
@Fiendish-Beaver It means you can play on whichever mode you like and you'll get the smoothest experience possible.
Someone who plays at 40FPS or 60FPS on a non-VRR display will see more juddering than you do.
In this DF analysis, they're saying that due to the lower image quality in the 60FPS mode, they'd suggest going for 40FPS as a nice balance between frame rate and visuals.
Avatar has a 40 fps mode on the S. I'm assuming that'll be patched in. I have an X and S, along with a PS5, and I am fortunate to have 120Hz displays with all of them. The 40 fps options are really nice. Hellblade 2 should at least get that treatment.
@abe_hikura I was hoping they'd go the way of Atomic Heart and not even give an option for a "quality mode." I understand a 120 fps mode, but 30 seems just lazy. Especially since a lot of games get patches to bring them up to par for a 60 fps option. So in other words, developers should just take a bit more time.
If it works as well as Hogwarts' did, that bodes well: I found the 40fps mode ideal even on a 60Hz display where 30fps always has HORRENDOUS judder and 60 is smooth. 40 felt like what some other people must see for 30. People, and companies, really underestimate just how different modern displays are from each other, how that affects what you see, and how some frame rates are unusable on certain displays. Some people have a display where 30 looks fine. Some, like me, have a display where 30 looks like a broken rotoscope with a bent axle.
@JustinSane The problem with modern gaming is this: in the old days of PC gaming, if your game ran janky and choppy or low res on your GPU, the companies basically just told you "too bad, you should have bought a top of the line GPU instead." Today, when your console game runs janky and choppy or low res on your console, the companies basically just tell you "too bad, you should have bought a top of the line GPU instead."
Unfortunately my TV doesn’t support the 40fps option 🙁 I’ve been playing in performance mode and haven’t really had any issues. I’m sure Ubisoft will be further optimising the game anyway. Another interesting point in the DF video was regarding the widescreen (21:9) picture option. If you use it in normal gameplay (it defaults to it anyway for cut-scenes) it seems you actually lose some of the picture. Rather than optimising the picture for widescreen, it seems Ubisoft have just slapped some black borders on the normal (16:9) image. The 21:9 option may look more “cinematic”, but I’m sticking with the 16:9 (fill screen) option.
@Fiendish-Beaver The rule is that your panel's refresh rate (Hz) needs to divide evenly by the frame rate you're playing at.
If you have a 120Hz panel you can play at 120fps, 60fps or 40fps (this works because 40 x 3 = 120).
But if you have a 60Hz panel it is not recommended to play at 40fps, because although it is possible, the pacing is uneven: mathematically you cannot spread 40 frames evenly across a screen that refreshes 60 times every second.
Another case is panels with VRR; there is no problem here, as they adapt their Hz dynamically to the frames delivered by the GPU.
With your TV being a 120Hz panel with VRR, you won't have any problem; you will be able to try all the modes and choose the one you like.
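As a rough illustration of that rule (a quick hypothetical Python sketch, not anything from the game or from Digital Foundry), a mode paces evenly only when the panel's refresh rate is an exact multiple of the frame rate:

```python
# Illustrative only: which frame rates pace evenly on a given panel (vsync, no VRR).
def clean_modes(panel_hz: int, frame_rates=(30, 40, 60, 120)) -> list[int]:
    """Return the frame rates that divide evenly into the panel's refresh rate."""
    return [fps for fps in frame_rates if fps <= panel_hz and panel_hz % fps == 0]

print(clean_modes(60))   # [30, 60]           -> 40fps doesn't pace evenly on a 60Hz panel
print(clean_modes(120))  # [30, 40, 60, 120]  -> all of the game's modes pace cleanly
```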
@Fiendish-Beaver @Pabpictu There are "rules" for how it's supposed to work, but meanwhile I have a 4K60 fast IPS LG display on which 40 looks wonderfully solid and 30 is a juddering, migraine-inducing mess. The rules say 40 shouldn't work and 30 should be fine. The panel says 40 is fine and 30 will make you go blind. I don't have VRR (well, not quite true: it's a FreeSync display, which PC and Xbox support, but my Xbox goes through HDMI switches that claim to support it and end up showing a blank screen if I try, so I'm not using it).
1080p for 60fps. And that also dips below 1080p.
At least it is there, thank you, but what a half generation this is.
Or it’s the development process.
yawn
Yet another 30-60 fps article. It does not make any game better or worse. Especially since all of us were playing Super Mario Brothers and Mario Kart back in the day at maybe 30 fps. Tops.
This hipster level argument, solely for article attention seeking clicks, really just needs to go away.
No one's head is going to explode due to 30 fps. No one's eyeballs will melt out their skull due to 30 fps. Unless you are a higher end product from Skynet that can detect a millisecond of a frame rate, just play the game.
All of us played 30 fps before, and surely all of us can still play 30 fps. Let it go. Just let it go.
Zelda is a Game of the Year nominee all the time. It's 30 fps at best. Sometimes even below 30 fps. Let. It. Go. Play the game.
Thank you, @FraserG. 40 sounds good then...
Thank you for the explanation, @Pabpictu. I think I might understand... maybe... 😂
@NEStalgia
Just read the whole report. I have not played the game, but it sounds similar to Avatar, which even with the settings at their best was a blurred mess and looked bad when panning around.
This generation is so frustrating. To me it seems the earlier games this generation, even though they were cross-generation, were more stable and looked a lot cleaner in panning and movement.
Playing AC Valhalla on Series X at 60fps looks clean and smooth compared to these brand new modern games.
HFW is another that looks clean and smooth at 60fps.
Both with a lot higher resolutions at 60fps than SW Outlaws.
Help, what is going on? My rough guess is they are adding things like ray tracing, more effects and level of detail, slowing the consoles right down resolution-wise at 60fps, and pushing it too much, creating blurred and messy panning etc.
@OldGamer999 I'm not sure what's going on with game development this gen. "Something" happened at the development level. Maybe Nvidia is just lead platform for almost everything with console as an afterthought? It's every publisher, every engine, EVERYTHING has the same problems. Maybe they just got rid of all the experienced talent and staffed up on interns for "good enough" for cheap? Throwing too many shaders and lighting and particle tricks at it beyond what hardware can really handle? Just assuming RTX is lead platform? IDK.
A lot of it has to be shaders/materials/lighting, particle effects going on. Why are resolutions AND performance tanking everywhere? It's not RT, most of these games don't even use RT by default. RT is a whole other mess.
Then we have all the CPU bound games, but why are they CPU bound suddenly when they don't seem to do much more than old gen games? Starfield was one of the only ones with an excuse with the amount of object tracking going on, but then that DID get a 60fps patch. But it's the res that takes a hit.
IDK, something changed in the dev of everything in the past few years, even if the end result doesn't justify the losses. The games all seem like they're being made for future theoretical hardware, or just built around 4080s and just tuned down till it runs on anything else.
@NEStalgia
Well I think you have summed it up well there.
All they had to do was add a few effects and features to, say, a game at AC Valhalla's level, making it more of this generation, and we would have easily kept 60fps mainly at native 4K.
And probably got the games quicker.
Instead they seem to fanny around the houses, making a damn mess of this generation.
I mean 60fps running at 720p to a max of 1080p, games not running smoothly and looking blurred when panning around, etc.
That is one crap generation for these consoles.
Maybe it's my age as well, but for me this has been the worst and most disillusioning generation I've ever been in, and I have been in them all.
@OldGamer999 One problem is of course that the hardware simply sucks. They designed it at a point where the hardware wasn't really different from the old Pro models. It's not a new generation. But the dev issues apply to PC too.
It's also too many cooks spoiling the broth. Look at the credits on these games. They run for an hour. Top-budget Hollywood films' credits last 10 minutes. How many different teams from different companies on different continents are working on the pieces of these games? How COULD it gel?
Trying to do too many advanced visual things on hardware that doesn't have room to spare with teams that don't work well together and often aren't even working for the same company.
This gen is so bad it pushed me back to PC. It had to be REALLY bad to do that. I swore it off for life. THAT is how bad it is...
@NEStalgia
I'm not sure what direction to go in, being honest.
I’m just lounging around in the grey mist of the current generation right now.
@NEStalgia
I know it's only one game but I do enjoy a great 3D platformer.
So I'm hoping Astro Bot will put a little bit of faith in this generation back in me.
I'm sure it will be flashy and well optimised; I believe it runs at 60fps at just under 4K, and it will look as smooth as silk and play very well.
@OldGamer999 That's ok, Phil's wandering with you. You might bump into each other.
Yeah, Asobi hasn't let anyone down yet, so I'm expecting quality from any remnant of Japan Studio. Shame Sony disbanded every other part of their best (but not most profitable) team...
@NEStalgia
Don't worry, I shall look for Phil, Sony's AAA studios and the Switch 2.
If I find any of them, I will let you know 🤣
@OldGamer999 If you need me, I'll be in the leather aisle with Jensen. 😂
@GuyinPA75 It's fine if you don't want 60fps but there is definitely a factual difference in response time and smoothness. Why do you think most fighters target 60fps or higher? It's not a "hipster" argument. Most games were 60fps until the early 3D era anyway.
@Fiendish-Beaver at the end of the day there is no 'right' way to play it, it's up to you which mode your eyes and brain prefer. But as you have a 120Hz VRR TV you get the most choice to play at any setting.
Although 40fps seems like it's nearer 30fps than 60fps, it's actually EXACTLY in the middle, because frame times are 1/30, 1/40 and 1/60 of a second: roughly 33.3ms, 25ms and 16.7ms, and 25ms sits exactly halfway between the other two.
That means 40fps feels quite a lot smoother than 30fps and a fair bit sharper than 60fps, a good balance imho, but everyone’s eyes/brain are different.
40fps can also sometimes have a small advantage, as devs know that nearly all 120Hz TVs also support VRR as part of the HDMI 2.1 feature set. This means they can push the visual settings a little more aggressively, risking uneven frame rates, which VRR will then smooth out. So visually the 40fps mode is sometimes a little nearer the 30fps high-fidelity mode's settings than the 60fps mode's.
At the end of the day play whichever looks and feels best to you.
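To put rough numbers on that frame-time point (a quick illustrative calculation, not something from the article or DF's analysis):

```python
# Frame times for the three modes, in milliseconds.
for fps in (30, 40, 60):
    print(f"{fps}fps -> {1000 / fps:.2f} ms per frame")

# 40fps (25ms) sits exactly halfway between 30fps (~33.33ms) and 60fps (~16.67ms).
midpoint = (1000 / 30 + 1000 / 60) / 2
print(f"Halfway between the 30fps and 60fps frame times: {midpoint:.2f} ms")
```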
@abe_hikura choice is good. Devs will always want to push graphical fidelity but also give a higher frame rate option for those that prefer it. This is unlikely to change next gen.
@NEStalgia Maybe I misunderstood your post, but you shouldn't be able to get a 40fps mode on a 60Hz TV, as it would have horribly noticeable uneven frame timing and would appear jerky in motion.
The frame timing needs to be synced to the display's refresh rate, so for a 60Hz TV that is usually 30fps or 60fps, delivering a new frame every screen refresh or every other one.
40fps is only possible on a 120Hz TV because it delivers a new frame every 3 screen refreshes (120/40 = 3); that won't work on a 60Hz TV.
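As a rough sketch of that pacing argument (hypothetical numbers, assuming vsync, no VRR and a perfectly steady frame rate), you can count how many refreshes each frame stays on screen:

```python
import math

# Illustrative only: how many screen refreshes each frame is held for under vsync.
def hold_pattern(panel_hz: int, fps: int, frames: int = 8) -> list[int]:
    """Refresh counts for which each of the first `frames` frames stays on screen."""
    shown_at = [math.ceil(i * panel_hz / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(shown_at, shown_at[1:])]

print(hold_pattern(120, 40))  # [3, 3, 3, 3, 3, 3, 3, 3] -> even cadence, smooth
print(hold_pattern(60, 40))   # [2, 1, 2, 1, 2, 1, 2, 1] -> uneven cadence, judder
print(hold_pattern(60, 30))   # [2, 2, 2, 2, 2, 2, 2, 2] -> even cadence
```

The alternating holds in the 60Hz/40fps row are the uneven frame timing described above; a VRR display avoids this by adjusting the refresh interval to match each frame instead.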
Thank you for the detailed and easy to follow explanation, @themightyant. That was really helpful... 👍
@themightyant No you didn't misread. I have a 60Hz IPS panel, LG, not some weird Chinese unknown. Granted it's technically a monitor instead of a TV, but that shouldn't matter in this context. I just double checked the LG specs now to make SURE I didn't have something way more expensive than I thought with 120Hz. But nope, listed as 60Hz.
However 30fps has horribly noticeably uneven frame timing that appears jerky in motion. (I.E. "30fps is almost unplayable!" as I always say.) 40-45 runs perfectly fine. And 60 is ideal.
If I had to GUESS, the only explanation is that the actual panel is a 120Hz panel and something else on the display is limited, or that business reasons for selling the same panel for more money in a 120Hz version mean they restricted it to 60Hz max, even though it's really a native 120Hz panel. Or something? 40 is playable, 30 is a juddering mess.
Given the amount of people with "30fps is unplayable" comments these days, and the dominance of LG displays in general, I would not doubt at all that this is common for them. Many "60Hz" panels may really be 120Hz panels artificially limited to 60 (cheaper to produce one panel and limit it for cheaper SKUs?), and thus 30fps really is nearly unplayable on many displays.
I've always thought the panel market has a lot to do with the unending fps arguments.
@Mustoe "Now it's just a production line"
Yeah, ain't that the truth....that's so much of the problem, absolutely.
EDIT: It's an IPS panel, I typed TN originally for some reason.
@GuyinPA75
Don't click on the article then. It's like you want it to make you angry. Unless you are trying to be spicy for attention, in which case, I am happy to oblige.
Now move along, the adults are talking.