Bethesda's recent Starfield Direct was a huge, all-encompassing presentation on what to expect from the upcoming open world RPG, and now Digital Foundry has another 45 minutes of coverage for us all to watch through!
Eurogamer's tech team has gone into a bit more detail on why Starfield is going for a 30FPS frame rate on console, and why it can't really be compared to other current-gen 30FPS releases like Redfall.
Some of this chat goes way over our heads, we must admit, but it's an interesting watch nonetheless, and the team over there seems pretty impressed by what they saw at the Xbox Showcase - regardless of the targeted frame rate. Here's a snippet from Eurogamer's own roundup of the Digital Foundry breakdown:

"John [from Digital Foundry] puts forward the argument that a frame-rate target is a design decision..."
"The question is what frame-rate allows you to hit your design goals for the game? In the case of Starfield, it's clear that this is an open-world or even "open universe" game, with item permanence, AI, simulations, and a lot of interlocking systems to create an immersive experience."
"With all of the CPU demands that this sort of a game entails then, dropping the resolution might not have much of an effect on the overall frame-rate. As a developer, you may as well crank up the visuals and embrace that 30fps output for an overall more consistent output..."
The team also mentions the sheer visual fidelity on show here. Bethesda clearly wants to sell the experience of a space-exploration fantasy with Starfield — as mentioned when Todd Howard talked about the game's frame rate — and DF thinks they're doing a pretty good job based on the recent presentation.
"Beyond the frame-rate, John calls out Starfield as being the first truly beautiful game that Bethesda has ever made - and a lot of that comes down to improvements to rendering techniques, particularly the use of real-time global illumination, reflections and motion blur."
Ultimately, we'll see how the experience all comes together in just a few short months, and we're becoming increasingly eager to try this game out for ourselves on Xbox Series X|S. Starfield hits Microsoft's current-gen consoles and PC, including through Xbox Game Pass, on September 6, 2023.
Have you had a chance to watch any of this breakdown? Thoughts on their Starfield discussion? Drop 'em down below.
[source eurogamer.net]
Comments 45
I actually would have gone for 24fps to give it a more cinematic feel but 30 fps might come close to that feeling so I'm fine with it
I mean, yeah, cool sentiment, however it's people like DF who have stirred up this whole crap show about frames, literally providing blank ammo to fanboys who try to make a sport out of tech instead of just enjoying the games. Sorry, it's just I have a slight loathing for Digital Foundry; it's weird, and almost obsessive, and sucks the joy out of games.
That's just my personal feelings toward it.
I suppose it's good they're defending Starfield, but they shouldn't need to. And I know, it's not just DF's fault; Xbox and Sony both bragged about frame rates and that was stupid. If it was never mentioned, nor made a huge deal of, it wouldn't be the big drama it is today when a game releases at 30FPS.
I think a game should be played, simple as that. If you like it, great; if you don't, move on to another game. Yet there is this inane need for people to rant, rave, and negate everything about a game they don't like, and then other nasty trolls and fanboys gravitate to it because it resonates with their warped reality and helps them live in the fantasy world they've created, which provides the original whiner with enough likes, which provides the dopamine they're lacking from real life, to such an extent that they actively go out of their way to crap on everything their demographic doesn't like, breeding a black hole of negativity and causing huge online arguments that get blown out of scale.
The whole frame rate discussion is tedious and only serves a few who enjoy making a sport out of details, but in reality, if a game is good it's good, and from now on I would love next gen to have 30fps as a norm, so that it can no longer be weaponised by either side.
If you want real powerhouse performance, get a PC. But let's face it, consoles are safer, comfier, and have always been mostly 30fps, so it's not like you're really missing out.
Sorry, might have gone off there a bit. My wife says I need to learn to tone things down haha. My brain is wired weird
I never expected this game to be 60fps due to the scope of it. For those who do, I personally think they have unrealistic expectations. I, for one, can’t wait for this game.
@Zoidpilot4, holy rant Batman. Lol
I don't disagree with the framerate argument. I always choose the make it look pretty mode over the performance mode anyway. I know I'm in the minority, but as long as the framerate is consistent, it looks the same to me. You only really notice the framerate when it drops.
@Kraven, I was always going to play it, thank you Game Pass, but I wasn't in the camp of I can't wait for this game. After the deep dive, I ended up preordering the Constellation Edition, and am now bummed that I'm going to be out of town until after its official launch on the 6th.
I was always expecting to play this at 30fps anyway as I've only got a Series S. I'm just hoping it'll be a stable 30fps on the S.
As long as the game is well polished in other areas then 30fps won't matter.
I can't wait for this. Been gaming since the 80s, so 30fps doesn't bother me one bit given the title's complexity. As a dev, I can't wait to see what they've squeezed out of the systems.
Digital Foundry predicts a lot of the massive games will choose to go the 30fps route this gen.
A Plague Tale: Requiem at the time of my playing it was 30 FPS and it is my personal GOTY. I have many concerns about Starfield, but framerate is not one of them.
DF does good stuff but it's weird for them to jump out and theorize on what the game is doing to defend it without knowing. He may be right, but that's a lot of assuming as to what's taking that frame cost up, and assuming that it should. Seems odd for a place that's known for analyzing, not speculating.
I'd like DF or someone to go into the monitor tech issues with 30 and the rate of turns in video games though. People don't understand that. I traded my non-HDR 1080p screen that was slow and smoothed graphics for a 4K HDR one that's fast and makes high contrast edges rapidly flash back and forth over an inch when in motion, mostly turning, at 30fps. Makes 30 virtually unplayable unless you make sure to never look at, say, a building outline or a mountain while turning. It's very visible on, say, Forza Horizon at 30, where, if a car goes by, you actually don't see it move from left to right; you see it like a flipbook, actually moving slightly back and forth, flashing rapidly, as it advances. I thought it was ONLY the screen but TOTK also does it on the Switch OLED, just on a much smaller screen, so obviously it's harder to see. How are engines making the display rapidly oscillate? Choppiness is one thing, but it shouldn't make an edge move back and forth, and it has to be the engine or something about frame buffering. OLED users have these problems, but it's not just OLED; mine's an IPS LCD and it has it as well, because it's a fast one.
If they're talking about 30fps being the future, basically, they need to talk about how it's basically incompatible with a huge amount of displays, too.
@Zoidpilot4 nor should you defend Starfield or anyone besides Microsoft; asking for a 60fps mode shouldn't warrant any fanboy feeling personally attacked, the ***** is a creative decision, I get motion f***ing sickness
No one cares if their sandwich isn't in the exact same position they left it. Give us 60fps and shut up about it.
@NEStalgia I've actually done a lot of digging on this since you have been bringing it to my attention. A lot of people in the forums I've looked at have basically said that because some of these newer displays have near-instant response times, they're essentially making the transition between frames harsher, whereas on older, slower displays the frames were sort of smearing together. Some people claim they notice it, others don't. It's definitely a bizarre phenomenon, but you aren't just rambling like a drunk madman; there's apparently something to it. I guess I'm glad I opted for a QLED as my main gaming display rather than an OLED.
I’d much prefer 60 but I will be playing Starfield at 30. At least I can appreciate the scope and reasons why this is 30, as opposed to a Redfall, where the game looks last gen and should be running at 60 as standard.
60 is far superior and I don’t understand anyone who says they can’t notice a difference. I say to anyone fire up Control on PS5 or series X, try 30 and move the camera then try 60 and do the same. It’s night and day. Preferring 60 doesn’t make me some techno fanboy, a good game at 30 would be even better at 60. There’s no argument.
@Fenbops I'm truly an advocate of Masahiro Sakurai's stance. 30 is adequate and 60 is optimal. Of course, not everyone will agree with that (nor are they required to), and as our friend NES points out, there really does seem to be some issues around really snappy displays and lower framerates. But even still, not all 30fps is made the same. Trying to play one of Gust's JRPGs on PC at 30 is basically just a slideshow. I'm assuming it's the rubbish engine and the game's complete lack of post-processing effects to try and smooth it out. It looks utterly awful. Then you get some of Nintendo's 30fps offerings that by comparison are just excellent. It's truly a weird issue I guess because software is different and not every 30fps is as unpleasant as another.
I hate the patronizing tone of these things. "Oh you just don't understand why it has to be 30 because of their design choices." No, I totally get it and think they should have made different design choices. 60 fps > Persistent sandwiches, 4k, never seeing pop in. I don't begrudge them for making the game they wanted to make but it's annoying to be told that my disappointment is somehow an unfair misunderstanding.
Yep, they suspect the same thing I said on Sunday, this isn't a GPU issue. There are so many uninformed people screaming to "just lower the resolution" that just don't understand in an open world, systems driven RPG like this, you will obliterate a CPU at times. Lower it to 480p and you'd have the same issues with framerate in the same places. Because the game is so CPU taxing, they can optimize for 30fps, and then push the graphics to what we've seen.
https://www.purexbox.com/news/2023/06/starfield-will-run-at-4k-30fps-on-xbox-series-x-1440p-30fps-on-series-s#comment7644140
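The CPU-versus-GPU argument in this comment can be sketched with a toy frame-time model. Every number below is invented purely for illustration (nothing is measured from Starfield or any real game): a frame is finished only when the slower of the CPU and GPU work is done, so once the CPU is the bottleneck, lowering resolution no longer raises the frame rate.

```python
# Toy model: a frame is ready only when both CPU (simulation/AI) and GPU
# (rendering) work for that frame is finished. All numbers are invented
# for illustration, not measured from any real game.

def frame_time_ms(cpu_ms, gpu_ms_at_full_res, resolution_scale):
    """GPU cost roughly scales with pixel count; CPU cost does not."""
    gpu_ms = gpu_ms_at_full_res * resolution_scale
    return max(cpu_ms, gpu_ms)

cpu_ms = 28.0               # hypothetical per-frame simulation cost
gpu_ms_at_full_res = 30.0   # hypothetical render cost at full resolution

for scale in (1.0, 0.5, 0.25):
    t = frame_time_ms(cpu_ms, gpu_ms_at_full_res, scale)
    print(f"{int(scale * 100):3d}% pixels -> {t:.1f} ms/frame ({1000 / t:.0f} fps)")
```

Past a certain point the frame time stops improving: the hypothetical 28ms CPU floor caps the frame rate regardless of resolution, which is the commenter's "lower it to 480p and you'd have the same issues" point.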
Just bought myself a Series X for this game so I hope it doesn't disappoint.
I mean it was pretty obvious and a few of us have been saying this, with these very reasons, for a few months now at least
I own a 65-inch LG C2 OLED and 30fps games like Zelda are fine on it. I also own an older 55-inch LG B7 and 30fps is fine on that as well.
It was always going to be 30fps on a £500 console. Come on.
@Zoidpilot4 I see where you are coming from: the comments section of their videos is a toxic waste dump. People use it for console war fuel.
But I do find the analysis interesting from a technological perspective - especially "impossible" Switch ports.
Personally, I just like to see the differences in how hardware is handled and the strengths/weaknesses of each platform. It's just out of sole curiosity. I mean, I'm still going to buy the Xbox version of a game because I prefer the console overall - unless the Xbox version is a completely broken mess.
They do benefit the industry though as they have called out performance/visual bugs that need to be addressed. And they do report findings to developers.
For example: reflection bugs in Callisto Protocol for Xbox or the really poor visual quality of Dead Space on PS5.
Sure, you could say "just enjoy the game" but a bug is a bug and I think that kind of public exposure puts pressure on developers to fix broken things in the game.
They also mention that it’s not actually 4K. The scene they counted pixels on was 1296p and then upscaled. So not only is it not 60fps, it’s not even 4K/30 which is a bummer.
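For scale on the 1296p figure mentioned above, here is the raw pixel arithmetic, assuming a 16:9 internal resolution (the width is our assumption for illustration, not a number DF stated):

```python
# Pixel-count comparison: 1296p internal vs native 4K (2160p), 16:9 assumed.
w_4k, h_4k = 3840, 2160
h_internal = 1296
w_internal = h_internal * 16 // 9   # 2304 at a 16:9 aspect ratio

ratio = (w_internal * h_internal) / (w_4k * h_4k)
print(f"{w_internal}x{h_internal} is {ratio:.0%} of native 4K pixels")  # 36%
```

In other words, under these assumptions the upscaler is reconstructing roughly two-thirds of the pixels in each frame.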
@GamingFan4Lyf I agree with your points, thank you. They gave me a somewhat different view on it; though I still dislike DF for the reasons I gave, I can understand the pros of it too now.
@Zoidpilot4
Your comment is spot on.
The reality though is that fanboys are just a bunch of idiots. No matter whether they are on my side (XB is my primary console) or not. They probably don't even play games; they just go on the internet trying to justify their console purchase to the rest of the world.
I have all three major consoles, PC, and game on my phone. There are pluses and minuses to every device. But I like a lot of different games and certain devices are better at certain games. I wouldn't play Candy Crush on a console or PC, but it is nice when I am out and about on my phone. You can't beat a PC for RTS and complex simulator games. Halo and Gears are my favourite campaign shooters, Sony makes the best action-adventure games, and I love Nintendo's Mario and Zelda franchises.
The main reason I consider XB my primary console is because I like the achievement system, the Live network has always been stable, GamePass is the best gaming subscription, and I believe in the play anywhere on any device goal.
I am someone that much prefers 60, or even 120fps.
Though based on this video it looks like Starfield is getting the "Nintendo 30" treatment like we've seen with BOTW and TOTK.
Even though they're 30fps, they have near-perfect frame pacing, which is what gives them their smoothness. And with the correct motion blur, it even looks pleasant.
If Starfield pulls this off, i'll be more than happy playing at 4K30fps.
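The frame-pacing point above is easy to illustrate with invented numbers: two frame-time sequences can both average 30fps, but one holds a steady ~33.3ms cadence while the other alternates fast and slow frames, and it's the alternation that reads as judder.

```python
# Frame-pacing sketch: both sequences average roughly 30 fps, but only the
# first delivers frames at an even cadence. Timings are illustrative only.

def pacing_stats(frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst_dev = max(abs(t - avg) for t in frame_times_ms)
    return avg, worst_dev

even   = [33.3] * 6        # steady "Nintendo 30" cadence
uneven = [16.7, 50.0] * 3  # same average, juddery delivery

for name, times in (("even", even), ("uneven", uneven)):
    avg, dev = pacing_stats(times)
    print(f"{name:6s}: avg {avg:.1f} ms/frame, worst deviation {dev:.1f} ms")
```

Both sequences report the same average frame time, but the second swings nearly 17ms either side of it, which is why a "stable 30" can feel fine while an unstable one doesn't.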
I am uncomfortably hype for this game. Can't wait to get my hands on it.
I pretty much expected it to run at 30 FPS. It's a massive game, by the sounds of it. Hitting 60 is probably very difficult.
@FatalBubbles To be fair, unless you play on an expensive gaming rig, even PC players use reconstruction to 4K.
Heck, I have been able to play Quake II RTX on my Nvidia 1060 thanks to FSR! Sure, the native resolution is like 640x480 or something...but the output image is pretty decent for a laptop screen.
Native resolution is moot now that DLSS, FSR, and even UE5's reconstruction are providing convincing "4K" image quality.
They should lower the graphical quality to hit 60fps. Frame rate is so much more important than graphical fidelity.
@NEStalgia the truth will come out very quickly after the game comes out. All someone needs to do is run the game on a PC that matches (not exceeds) Xbox Series X specs and try to replicate the Xbox settings. Once that’s done, we can see what is being hit harder, CPU or GPU. That alone hints as to whether the issue is indeed all the stuff tracking or simply an over-focus on graphics settings.
I tend to like DF, but they have proven in the past that they will rush to defend things they are biased in favor of, while draconianly destroying stuff they don’t care for over a few frames dropped. John in particular tends to let his emotions paint his opinions.
We've had item permanence in games for decades now...
Hard to talk without the full picture but if I had to guess, it's either an old engine (the creation engine 2 apparently) or an old console (the Series S of course) getting in their way. Probably both.
I'm pretty confident people on PCs will be able to play the game at 60fps without too many sacrifices to the visuals and with lower specs than the Series X.
Good to see a video with people who understand the technology and its limitations in a realistic way.
What I usually see online are trolls who know little to nothing on the subject yet act like they are an expert on game engine design and development.
Todd Howard made a creative choice on how to use resources and you can either get on board with it or you can't, but it is what it is and it's not going to change.
I do sympathise with those that have medical conditions where they require higher frames and/or cannot afford a PC that can run the game at higher frame rates.
I am excited to play this game on my Series S, or Series X if I decide it is worth the upgrade (I don't have a 4K TV at the moment).
@Tharsman yeah, and beyond even whether it's CPU or GPU bound, it still doesn't say whether it's a design decision or a flaw. If it's hammering a single core it's a technical problem, not part of anyone's vision; the load could be halved. And even if not, it won't explain if it's efficiently using the CPU or running 100 dragon scripts on another planet. Bethesda can tell us (or lie about) whether that's the cause or not, but DF can merely guess the same as us, unless Bethesda shares the engine code with them or testing reveals specific systems that repeatedly cause load spikes.
DF standing in the PR crossfire to defend a game's technical performance 3 months before it releases, based on what "might" be the cause, based on "artistic vision", on a game using an engine that's been known to be less than optimally performing for 25 years, is really.... I've never been a big DF acolyte, but it makes everything they do kind of suspect to me. They really need to stick to testing and reporting what is, not speculating about whether it's "vision" or not, and certainly not defending the technical limitations of unreleased games like they're the publisher. It really undermines what they do do well.
I said it above but it has the same vibe as the critics and fans that decide a hyped game is goty based on trailers and directs and "confirm" the expectations with playthroughs.
I don't know why some people even thought, for a second, that this beautiful and open world could ever run at 60 fps on a current machine. It's really not a resolution question.
Something of that scale was always supposed to run at 30, and that's perfectly fine. I did A Plague Tale at 30, Jedi Survivor at 30, even Miles Morales on PS5 at 30, etc., and it was fine. Really, it was (I'm personally among the few people who always choose fidelity mode over the performance one).
Digital Foundry themselves explained it some time ago: we have to get used to that, and it's not a question of power. Because when the new consoles launch, developers will push even further their game's graphics, and the "problem" (which is not a problem, in my opinion) will remain the same: 30fps fidelity or 60 with very reduced assets and/or resolution.
But let's be honest, here : of all the genres that could be affected by that phenomenon, RPG is probably the one for which it is the most ok.
It's not a problem, for a RPG, where the main goal is to explore and to wonder, to discover beautiful landscapes, universes and to immerse yourself, well, in other worlds.
Did I mention I just cannot wait for this game?
It was obviously a CPU-intensive game - the fact that you can pick up any item, put it 'anywhere', return anytime, and it will still be there - like the woman who liked to collect sandwiches.
Digital Foundry have done LOTS of PC breakdowns that go into CPU utilisation and why games can't maintain a '60fps' at 540p on an RTX4090 (the 'best' GPU on the market that blows a Series X GPU away) because of the CPU utilisation.
They also compared a game like Star Citizen which has 'barren' moons that may hit '60fps' but go into a City and you'll drop to 'below' 20fps on a set-up 'similar' in spec to a Series X.
As for Series S, it has almost the same CPU spec as the Series X - a bit 'slower' but close enough that, if the game is CPU limited as expected, it should cope with the CPU workload.
@Bobobiwan "developers will push even further their game's graphics, and the 'problem' will remain the same"
That's the problem though. They shouldn't. We started with 60fps in the CRT days, then reversed to 30 in the early 3D era as a seemingly temporary limitation. As long as devs keep approaching games from a starting point of maximizing visuals or even features at the cost of performance, rather than a starting point of building the best game they can around a standardized performance baseline, we're always going to have a starting point that's behind what we had in 1987. There needs to be a point in time where we have an industry cutoff that says "ok, from now on, our starting point is to design our game to be the best it can be starting at 60fps", just the same as VR developers have to do at 90fps (or 60fps doubled to 120), because otherwise they have to ship with a barf bag. One generation needs to be the one "sacrificed", where performance is the focus over breadth, to reset the baseline forward from then on. This one seemed like it was going to be the one.
Developer "vision" is kind of the problem. We need to get to a point where no developer would consider targeting 30 from the time they start their game. We're at a point where consoles and displays are promoting the significance of 120fps and beyond, but devs are still starting games with the idea that 30 is enough, and then they declare it "cinematic" without locking their games to cinema-styled panning rates, which would mean taking several seconds to turn around and shoot someone behind you, like PS1-era tank controls. Film pans slowly because of the fps. Film also has the option of "cuts" from one camera angle to another to quickly transition. Games can't do that. Our solution is higher frame rates.
Now we also have technology limitations of various screens that make the problem much more pointed than during the earlier transition from CRT to LCD where the limitations of the hardware meant that games got a sort of "free" post-processing effect to smooth the performance courtesy of refresh sustain on displays. The hardware acted like a filter applied to games to add a motion blur to hide the frame gaps. That's becoming much rarer so the original problem is more exposed.
There simply shouldn't be a world where a video game is running sub-60fps at this point, and it's developers that need to put their foot down and say this is the acceptable performance baseline in video games in 2023, not making excuses as to how their "vision" just doesn't actually work at acceptable baselines on currently mass available hardware rather than tailoring their "vision" to the medium.
Again this is Bethesda so you kind of expect that, but it still needs to be called out until it's fixed. TES6 should be designed now to fit 60fps targets on expected hardware of the era, not designing the game around what they hope could be possible and then making it run janky on whatever hardware people actually have at the time.
I understand, just sad they didn't choose to target 40fps. It would be way easier than 60 but a great improvement for 120Hz screens, and others would have a stable 30.
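The reason 40fps only makes sense on 120Hz screens is divisibility: each frame must be held on screen for a whole number of refresh cycles or the pacing becomes uneven. A quick sketch (display arithmetic only, nothing game-specific):

```python
# Each frame should occupy a whole number of refresh cycles; otherwise the
# display alternates hold times between frames and pacing judders.

def refreshes_per_frame(display_hz, target_fps):
    """Whole refresh cycles per frame, or None if fps doesn't divide evenly."""
    if display_hz % target_fps != 0:
        return None
    return display_hz // target_fps

for hz in (60, 120):
    for fps in (30, 40, 60):
        r = refreshes_per_frame(hz, fps)
        note = f"{r} refreshes/frame" if r else "uneven pacing"
        print(f"{hz:3d} Hz @ {fps} fps: {note}")
```

At 120Hz, 40fps lands on an even 3 refreshes per frame (25ms each), while on a 60Hz screen it would have to alternate between 1 and 2 refreshes, which is why 60Hz owners would fall back to a stable 30 instead.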
@Zoidpilot4
My guy, passion is a fantastic way to express yourself.
But you are spewing a lot of the same vitriol you are so upset about, on more than just this topic.
What you are doing is not a healthy way to express yourself.
As a mental health provider giving out advice in a gaming thread to a person I don't actually know: take a step back and refrain from posting, or have your wife read your comments out loud to you before slapping post.
@Zoidpilot4 superb assessment mate, agree with every word of your post.
@FatalBubbles Depending on the upscale technique being used, it can look as good as if not better than native 4k. FSR gets really close to native and DLSS can even surpass native, rendering a 4k image that's even less pixelated in the really detailed areas. I believe Control was one of the first games where DLSS achieved this.
Suffice it to say, Starfield is one game where I'll be playing on my gaming PC rig rather than my Series X. I might still install it on the X to show friends the eye candy.
Not a dig at the consoles in any way, but it all boils down to the laptop CPU being used in the Series and PS5 consoles. It's okay, and it has always been this way for consoles; compromises have to be made for an all-in-one box that costs 500.
Removing global illumination would help and maybe will be patched in the future, but for me, I just completed Zelda TOTK at 30 and it was one of the best games I have ever played.
Wanna go after a company about frame rate issues? Start with From Software.
@Zoidpilot4 Digital Foundry are impartial. They just report what they observe, so I don't understand your frustration with them. They did once admit that they struggle to relax and enjoy games, even in their private lives, given their job is to analyse them with a fine-tooth comb.
@NEStalgia, well said and I agree 100%. Make games for the gen they're released in. 60fps should be the standard built from, with other frame rate options if needed, perhaps for compatibility with different monitor specs.
My take on Starfield technically so far is: yikes/yuck, no 60fps, and lots of blur and image softness from depth of field use, global illumination volumetric techniques, and haze and god rays. Along with that, TH used the words motion blur.
The worst has to be TH using the most unwanted word, to me, about a video game: cinematic. It's a video game, not a movie. Totally 100% not the same thing. Want cinematic? Fine, make a movie.
All that said Starfield may have a great story with fun game play, but will need mods like Fallout 4 has to make it look better along with a wait to next gen or mid gen to play at 60fps.
@Thumper I think my point was clear enough, if you don't understand it I can't help you, but don't worry about it, it's just an opinion
@Zoidpilot4 Why the sarcastic, sneering response? God I hate the internet.
@Thumper it wasn't meant to be sarcastic or sneering. Sorry, I'm autistic; I have been told I can come across cold, blunt, and yes, sarcastic, but that's not what I was feeling nor trying to convey. I did say don't worry about it as I thought my opinion might have affected you....
And yeah, the Internet is a nasty, toxic world. I hate it too. Imagine having a brain wired like mine; it's a minefield.
Reading back, I think what confused me was you said you didn't understand my frustration with Digital Foundry, but I thought I had explained that clearly and could not understand how you didn't understand, and with my limited scope for understanding other humans I thought it best to say I couldn't help you understand, as literally I couldn't. My wife pointed out this sounds like I'm saying it in a way where I'm being arrogant, and could imply I think you're stupid, but I don't think that at all. I don't know. I imagine you're clever; most gamers are in my experience. Sorry if I upset you or made you feel annoyed. It's just my brain is wired differently, but I am working on it.