Elden Ring has been out in the wild for a few days now, and we've seen plenty of discussion on how each version stacks up. Initially, the Xbox Series X version seemed a little disappointing, although Digital Foundry was quick to interject. Now, the DF team has delivered its full verdict, and it's a win for VRR on Xbox.
For those unaware, VRR is variable refresh rate, a display setting that effectively matches up the game's internal frame rate to your display's own refresh rate. It works to provide a smoother experience for titles with unstable frame rates on Xbox Series X and Xbox Series S, which Elden Ring certainly is. Keep in mind that VRR typically requires a display that supports HDMI 2.1 and VRR specifically, although there are some exceptions to this rule.
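As a toy illustration of why this helps (a simplified model, not how any console actually presents frames): with fixed 60Hz vsync, a frame that misses the ~16.7ms budget has to wait for the next refresh tick, so an 18ms frame ends up on screen for ~33.3ms, which reads as a stutter. With VRR, the display simply refreshes when the frame is ready.

```python
import math

def presented_ms(render_ms, refresh_hz=60.0, vrr=False):
    """Toy model of how long a frame stays on screen.

    Fixed vsync: the frame waits for the next refresh tick.
    VRR: the panel refreshes as soon as the frame is ready
    (assuming the frame time is inside the display's VRR window).
    """
    interval = 1000.0 / refresh_hz
    if vrr:
        return render_ms
    # Round up to the next whole refresh interval.
    return math.ceil(render_ms / interval) * interval

print(presented_ms(18.0))            # ~33.3ms: a visible hitch
print(presented_ms(18.0, vrr=True))  # 18.0ms: smoothed out
```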
In Digital Foundry's full analysis, the team covers all technical aspects of Elden Ring, across all next-gen consoles. There are some losses on Xbox, namely overall frame rate and load times, but VRR appears to save the day for Xbox players.
"Microsoft's system-level integration of variable refresh rate (VRR) support at least resolves this [frame rate] issue - if it's an option for you, we recommend setting your screen to 60Hz (even if it supports 120Hz) and using VRR here to smooth out the performance drops. All told, this is our preferred way to play the game across any available system - though in non-VRR mode, PlayStation 5 does actually run better."
So, yeah, due to Elden Ring's general instability across all platforms, it looks like VRR is the way to go, where supported. The game does get things right outside of performance though, and the reception for FromSoftware's latest has been incredibly positive around launch.
Do you use VRR on Xbox? How much of a difference does it make? Let us know your thoughts below.
[source eurogamer.net]
Comments 71
VRR also works on monitors with FreeSync, if you don't want to dish out big bucks on HDMI 2.1 TVs.
It's not just HDMI 2.1 TVs you need. The TV needs VRR as a feature too. My TV is 4K, has two HDMI 2.1 ports and supports 120Hz, but doesn't have VRR; it's greyed out in the settings. Even my XSX shows this.
@Savage_Joe
Yeah, I also got VRR working on a Series X with an Iiyama monitor (FreeSync Premium Pro, 165Hz), which is an HDMI 2.0 monitor.
Hopefully a few patches will improve the general performance for the other 95% of users who don't have cutting-edge TVs.
@ymo1965 The whole situation's a mess. I have an HDMI 2.0+ TV, but it thankfully does support 4K/120 and VRR on one port. It's rare, but you don't always need HDMI 2.1 to get some of these features. It's worth checking a few things:
1) You have latest TV updates
2) Check other HDMI ports. Sometimes only one port is enabled for VRR or 4K/120.
3) Check the cable is HDMI 2.1 compliant (the one that came with the XSX is).
Updated the article to avoid any confusion 👍
My monitor isn't HDMI 2.1, only 2.0, but it supports VRR and it works perfectly.
It surprises me that there hasn't been an in depth investigation by DF or anyone else about what exactly is going on with Series X performance in multiplat games. Almost without exception, Series X seems to exhibit worse performance, despite on paper offering more power. Vague talk regarding the state of the development tools aside, there really hasn't been an explanation that specifically gets to the heart of what is going on. I would love to be able to pick the brains of some developers to find out exactly where the bottlenecks are, and why it's seemingly so difficult to get the most out of the machine.
@themightyant I'm a little disappointed my LG TV didn't have VRR, but having said that, I was happy it had 120Hz. The TV only cost about £750 from Currys (Oct 2020). I reckon LG could enable that VRR option; a software update could do it if they wanted to. But I think they want people paying another few hundred for that. Not worth it really. It's updated already. Already using the 2.1 cable the XSX came with. The other ports are all 2.0 standard.
@Spaceman-Spiff
Dying Light 2 performs better on Series X and has a higher resolution in quality mode.
I think it's down to the developers.
One of my concerns is that if new AAA third-party game sales are low on Series consoles compared to PS5, for whatever reason, then developers may stop making the effort to optimise for the power of the Series X.
Time costs money, and if they are not making the money back on Series consoles due to low sales, then why bother?
The whole world is geared by money and return on investment, especially with price increases escalating around the world.
I have a VRR TV at 120Hz and it syncs perfectly with my Series X: no screen tear, no pacing issues. It just seems smooth and does the job it's meant to do.
@Spaceman-Spiff The theory is that the PS5's components were designed to work well together and sit in very close proximity, reducing bandwidth bottlenecks and input latency. You can thank Mark Cerny for that; he's a freaking genius. The XSX feels like powerful PC components Frankensteined together.
@ymo1965 Shame. It was worth checking!
Come on LG fix a man up!
Why does it run better without VRR on PS5? Hoping for an answer here so I don’t have to watch their whole video.
@Medic_Alert Shame you can't move back and forth between PS4/PS5 saves. You can only do this once.
I’ve been playing with VRR and I’ve been quite happy with performance considering all the negative articles about the frame rate. It’s pretty smooth. So yeah, it really does make a difference.
Good for the 5% who have it, but we need decent frame rates for everyone else!
@Savage_Joe "Genius" 🤣🤣🤣🤣
@Savage_Joe But whose theory is this? It sounds like the sort of conjecture formed in Reddit threads rather than based on actual expert analysis. As far as I remember the Series X hardware and architecture was widely praised upon release. I just think it would make an interesting article for an established games journalist to take on - someone with contacts in the development community who could be used to shed light on what precisely is going on.
@Spaceman-Spiff No conjecture there. Just watch the March 2020 video of Cerny explaining the PS5's architecture.
What I love about DF analysis is that they finally acknowledged not everybody has a VRR capable display (I believe Thomas pointed that out). Kudos to them for that.
Performance should be fixed at the game level instead of relying on technology that still isn't widespread globally.
@Spaceman-Spiff I'm not sure I buy the premise that "almost without exception, Series X seems to exhibit worse performance". It's very rare to see any major differences so far this generation. From what I've seen, XSX and PS5 are pretty evenly matched in almost all games, trading very marginal advantages between them.
However, if I did buy into the premise, I'd suspect it's because PS5, and PS4 before it, is likely the primary target during development because it's the market leader. Not saying Xbox is an afterthought, but one platform has to come first for developers, right?
They also don't have to worry about two profiles on PS5, whereas Xbox has both XSX and XSS.
@Medic_Alert BTW just following up our discussion about Death's Door the other day. Finally finished it, in fact finished it twice as I wanted to get 1000GS by doing an Umbrella only run.
It's now my top contender for GOTY this year so far (not played Elden Ring or finished Horizon) but more tellingly would have been my GOTY last year had I played it.
Brilliant game in so many ways. Loved it.
@themightyant I think you are right about where the focus goes for many devs, and I'm sure for Elden Ring From Software developed with the PS5 in mind much more than the Xbox. But I'd still be interested to know more details! Especially as both PlayStation and Xbox are very similar this generation from a hardware perspective. For me it's like having two PCs with largely similar spec, with GPUs from the same manufacturer, but the PC with slightly higher specs runs with a performance disadvantage. Wouldn't that be questioned?
@Savage_Joe But saying that the PS5 architecture and Mark Cerny are both awesome because Mark Cerny says so isn't what I would call definitive analysis. I would just like to hear some independent thought on the issue from those with no vested interest.
@ymo1965 I'm guessing the X90H? That TV is great right up until you want to use the 2.1 features it was advertised for. VRR disables local dimming, so effectively the TV doesn't have VRR, because nobody is going to turn the feature on if it disables local dimming. It's also ridiculous that the TV can't do Dolby Vision and 120Hz at the same time.
@themightyant There are two big problems that sometimes result in sketchier performance on Series X compared to PS5, even though it should be the more powerful console.
First, and most obvious: two very different targets with feature parity promised up front (writing this as a Series S owner). The S is going to put limitations and constraints on how the X is utilised; it's just a practical reality.
Second, as can be deduced from some other DF analysis videos, the Series consoles seem to unnecessarily target a higher resolution than PS5, especially in their DRS profiles. The result is marginally better image quality when inspected in a lab, but lower perceived performance (it generally lands a game in the not-too-flattering territory of a frame rate hovering between 52-60fps).
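A dynamic resolution scaling (DRS) profile like the one described above is essentially a feedback loop. The sketch below is a hypothetical controller with made-up tuning numbers, not any console's actual implementation: the render scale drops sharply when a frame runs over budget and creeps back up when there is headroom, so a profile with a higher resolution ceiling spends more frames over budget before the scale catches up.

```python
def drs_step(scale, frame_ms, budget_ms=16.7, floor=0.6, ceiling=1.0):
    """One tick of a toy DRS controller (hypothetical tuning values).

    Drop the render scale quickly after an over-budget frame,
    raise it slowly while frames are comfortably under budget.
    """
    if frame_ms > budget_ms:
        return max(floor, scale - 0.05)
    return min(ceiling, scale + 0.01)

# Two heavy frames pull the scale down fast; recovery is gradual.
scale = 1.0
for frame_ms in [20.0, 20.0, 15.0, 15.0, 15.0]:
    scale = drs_step(scale, frame_ms)
print(round(scale, 2))  # 0.93
```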
@Spaceman-Spiff I'm telling it as I see it. But there's gotta be something the PS5 is nailing that makes multiplat games run generally better than on XSX, and it's not about hardware power or dev tools. Power-wise, the Series X should push and improve upon everything the PS5 can do, except SSD speed, but that's not happening...
@Spaceman-Spiff I think that's recency bias at play, as several notable recent games run better on PS5, like Cyberpunk 1.5 and Elden Ring.
The truth is, in one of the one-year retrospective videos they found the Series X on average had about a 15% advantage over the PS5, i.e. close to its 20% on-paper advantage.
As for Elden Ring and Cyberpunk? I can't explain that. I would have thought it was as simple as the SeriesX having higher resolution targets to take advantage of the extra horsepower, but that doesn't appear to be the case. It must be API related, as the architecture is similar enough that you'd think optimisations would work both ways.
@Medic_Alert
Are you playing the PS4 version off an NVMe? If so, how are the load times? I have the Series X version at the moment, and load times are very far off my brother's PS5. I also tried the PS4 Pro version on his PS5, but it was installed on an external SSD. Honestly, load times still seemed the same or faster than my XSX.
@Savage_Joe The SeriesX is no more "PC parts just bundled together" than the PS5.
In fact the SeriesX and PS5 are very, very similar architecture wise. Both have an AMD CPU resembling a Ryzen 7 3700x, and both have an AMD RDNA2 GPU - with the key difference being the SeriesX having 45% more cores, and the PS5 having a clockspeed advantage.
Similar architecture, yet the SeriesX has a solid 20% TFLOP advantage, and not only that, but a 25% memory bandwidth advantage. Definitely makes it strange that the PS5 ever does perform better, tbh, unless they have clearly different settings/resolution for a given game.
Guys, lately I've been wondering: aren't we becoming too obsessed with the technical side of our hobby? Yesterday I was watching some retro gameplay on YouTube with my dad (I'm 36 and he's 60, by the way) and we had fun remembering some good old Atari 2600/NES classics... What I mean is: maybe we (myself included, of course) could have more fun if we didn't dwell so long on the tech side of gaming. I know I've been trying to.
@Raffles
I don't remember them saying anything about a 15% advantage for XSX in that video. Maybe I'll watch it again. Edit: unless you are referring to Control in photo mode? That is the only example I can think of with XSX taking a big lead.
Have had XSX and PS5 since release and always watch DF videos; it seems like PS5 usually comes out ahead in frame rate. I still buy most multiplats on XSX as I prefer the controller and it has VRR.
@OldgamerDave
Most NES games ran at a solid 60fps. Maybe compare to N64 if you want to argue why performance matters
@Raffles I'm not talking about CPU and GPU architecture. We both know they are the same. I'm talking about how they interact within the SoC and the motherboard. Bandwidth, RAM allocation, what the CPU executes first, etc. are things decided by the system architect of each console (i.e. Cerny and Ronald respectively). And that's where I see the reason why the PS5 has the edge over the more powerful Series X. Cerny probably made the PS5 execute tasks more efficiently while using less power.
@Medic_Alert
Cool, thanks! Haha, feeling like I should have gone with the PS4 version now. Load times aren't bad, but after playing all these fast-loading games it is frustrating, especially in a Souls game.
@Savage_Joe
Probably has a lot to do with higher clock rates on the PS5. Don't need to optimize for certain hardware advantages with that.
@Moto5 The Series X has a higher clock speed than the PS5 on the CPU (locked 3.8GHz vs variable 3.4GHz). On the GPU the PS5 does have a higher clock speed, but significantly fewer cores than the Series X.
@Spaceman-Spiff Agreed, it IS interesting. However, I'd also question the premise that the Xbox is "on paper offering more power". While I agree it seems that way at first look, I think it's more nuanced.
They seem VERY evenly matched, and where they differ they mostly cancel each other out. People like to focus on the teraflop advantage or some other metric, but I see these more like race cars: sometimes a 900hp car can match or beat a 1000hp one. It's more about all the other little bits and how they work in synchronicity to get around the track.
Or perhaps we’ll have a more PS3 like situation, albeit less extreme, where the XSX starts to shine later in the generation as devs learn how to eke out every last bit of performance. But we’re past the point of diminishing returns so less likely. You need a bigger gap to see differences.
@Cikajovazmaj Agreed, it seems weird that so many devs push a higher max DRS resolution at the expense of a reliable FPS on Xbox. Perhaps VRR is working against Xbox here, with devs factoring it in, which they shouldn't IMO.
@Medic_Alert Agree Death’s Door is an almost perfect game.
The second Umbrella-only run was possibly more fun than the first. Had to completely change strategy and effectively become a mage; it was brilliant, and short. Also loved that feeling of progression from novice to intermediate in the first run, and intermediate to adept in the second. Honestly, I was almost tempted by a third run to try and flirt with mastery.
Story while light and humorous on the surface had some deep thoughts bubbling underneath. Plus a banging soundtrack.
You say "just another indie", but I make a GOTY list every year and for the last 6-7 years half of the entries have been indies.
@Moto5 Good point! But I was just mentioning it in the name of a simpler time: we just had fun (or not, due to those pesky difficulties!) and that was it. Another example I can think of is Castlevania Symphony of the Night, on PS1 - it had lots and lots of frame-rate drops, but we thoroughly enjoyed it. Let me rephrase that: I know it is important to demand quality for a product you've paid for, but, at the same time, maybe we should relax for a bit and have fun.
I've been playing Elden Ring on my XSS and I've been having a blast - with issues and all!
@OldgamerDave Completely agree. I was making this point the other day when people were asking how Elden Ring can get 9 or 10 out of 10 with technical problems. Bloodborne STILL has uneven frame pacing but is, to many, one of the best games ever made. Ocarina of Time ran at 17fps in the UK/PAL (20fps elsewhere), and four-player GoldenEye was often in the single digits, yet it was the best thing since sliced bread.
My view is that as long as it doesn't affect your enjoyment of the game, this stuff shouldn't matter. That is subjective, and the bar is different for everyone. But we definitely hyper-focus on it too much nowadays and make mountains out of molehills.
@OldgamerDave Dwelling on the tech side of gaming is one of the things I enjoy most about it! Hence why I am a big fan of DF. Not because of any console war thing just because I'm interested in how it all works and why some games look/run better than others even though they are using the same hardware, or vice versa. But yes I agree that it only goes so far and it of course shouldn't get in the way of enjoying excellent gameplay.
@themightyant Yeah, and also, some of us have such a limited schedule for gaming that it just seems counter productive to keep so immersed in tech stuff.
@Spaceman-Spiff Really? You have fun with that? That's a new perspective to me. Then again, once I watched a guy saying that spending hours upon hours configuring all possible aspects of pc games was simply his special thing!
Well, more power to you then, friend. Keep enjoying your games and technical analysis!
No, it is not a 'life saver' and they should NOT be relying on it, as using it means buying a whole new TV. If you've got a nice big high-end 4K OLED from before 2021, that is not a cheap 'upgrade' to make; in fact, it's the price of a high-end gaming PC. At least with VRS they can do it without needing HDMI 2.1, I hope?
@OldgamerDave
Some people are more sensitive to frame rate. For example, GoldenEye on N64 always made me feel sick after a while, as did other N64 games. I didn't realise it was frame-rate related until the Dreamcast released with way more 60fps games.
Some games are also more sensitive to frame rate gameplay-wise. Trying to play SMB3 at 40-60fps would make it almost unplayable, whereas Ocarina of Time doesn't require much precision, so the low FPS is manageable. I also quit playing Bloodborne on PS4 Pro in hopes of a Pro optimisation; still waiting... It was hard to put down but made me sick. I'm honestly more excited for a 60fps version of that than ER; the world and gameplay systems are incredible.
@S1ayeR74
Agreed, VRR should not be relied on to smooth out a game. I think a VRR-specific graphics option in games would be fine: push up the resolution or effects and let the frame rate waver between 50-60fps. But there should be a more solid 60fps option for those who don't have VRR.
The other problem with relying on VRR comes when games target 30fps, like the Matrix demo. VRR is only effective above 40fps.
@Moto5 I'll try and find the video. I'm pretty sure it exists and that I didn't imagine it.
Another recent one where the Series X had a big lead was, um, the Tales game maybe?
@iplaygamesnstuff It's the LG 65UN85006LA
@Savage_Joe Well, we know for a fact the SeriesX has a faster memory bus/bandwidth, and a faster CPU. So what are the advantages you think the PS5 has? As far as I see it, it's a GPU clockspeed advantage, and that's really it. Which obviously wouldn't come close to overriding the huge core advantage, plus memory bandwidth, CPU etc.
That's why it's strange, and can maybe be explained by a more efficient API, or some optimisations that for some reason benefit the PS5 more than XSX.
I know all things being equal in Control for example, the SeriesX had about a 20% advantage, so tallying exactly with the on paper specs.
@Raffles
Found it: it was Control, with a 16% difference in XSX's favour. I ended up getting that game on PS5 as well, as the stutters and hitches on XSX drove me crazy, though that is apparently fixed now. The performance advantage here makes sense for XSX, as it should have stronger RT performance.
CPU clock and memory bandwidth won't play into performance advantage as much right now as those won't be the primary bottlenecks. GPU seems to be though, and clockrate does have advantages on GPU for many applications. Another thing that probably gets in the way of XSX performance on multiplat games is the dev time to take advantage of its unique hardware. Will probably see more advantage closer to the end of the generation.
@Moto5 Sorry about it, man! I honestly didn't know that frame-rates could induce physical discomfort. I really hope you can enjoy more games without that worry. I hope that more developers are able to lock those frames at 30 or 60 fps, instead of the "all over the place" thing, so everybody can enjoy their games.
@OldgamerDave
I'm not too worried about it honestly; there are so many awesome games to play that perform great, and ways to play old games at higher FPS! VRR on XSX has been awesome; PS5 really needs to get on board.
I had found workarounds to make 30fps games more comfortable once I realised what was happening, like slowing down the camera speed in third-person games. One thing I have noticed is 30fps being more jarring on my LG CX compared to my old LCD panel. Apparently the higher image persistence of LCD can smooth it over.
@Moto5 Glad to hear that, friend. The slight possibility of feeling sick is what's kept me away from VR experiences; I can only imagine your situation. But you're absolutely right: there are so many cool games that what we actually lack is the free time for them! On that note, I've been hearing good things about the newly released Shadow Warrior 3: solid pacing and a simple gameplay loop. If you're a shooter fan, maybe you could give it a try sometime.
@Moto5 CPU I agree, none of these games are going to be pushing their very good CPUs to their limit, especially at 60fps.
Memory bandwidth on the other hand? It should be significant. Take the 2080ti for example, it's still a beast, able to match or often exceed the performance of the 3070, despite the 3070 having a whopping 50% TFLOP advantage. I think this is in large part due to the 2080ti's own big VRAM bandwidth advantage, almost 40%.
Regarding clockspeed and the XSX having unique hardware, not sure there's any truth to that tbh. I mean, the processing power of GPUs is simply cores * clockspeed * 2. So the SeriesX more than makes up for the clockspeed differential by having way more cores. And it's not like developers need to specifically program for extra cores, like with CPUs. Graphics APIs do that.
For example, my 3070 actually has a slower clock speed than one of my older GPUs, the 1660 Super, yet it is obviously much, much faster due to having literally four times as many cores. And clearly older games benefit from those extra cores; it's not just newer games that specifically take advantage of the newer hardware.
That's why it's so surprising that the PS5 often has the edge. I wonder if it's API related?
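As a quick sanity check on that cores × clock × 2 formula, plugging in the commonly quoted specs (Series X: 3,328 shader cores at 1.825GHz; PS5: 2,304 shader cores at up to 2.23GHz; these figures come from public spec sheets, not from this thread) reproduces the on-paper gap being discussed:

```python
def peak_tflops(cores, clock_ghz):
    # Peak FP32 throughput: cores * clock * 2 ops/cycle (fused multiply-add).
    return cores * clock_ghz * 2 / 1000.0

xsx = peak_tflops(3328, 1.825)  # 52 CUs * 64 shaders per CU
ps5 = peak_tflops(2304, 2.23)   # 36 CUs * 64 shaders, at peak clock

print(round(xsx, 2), round(ps5, 2))  # 12.15 10.28
print(round(100 * (xsx / ps5 - 1)))  # 18 (% on-paper advantage)
```

Note this is exactly the "TFLOPS don't tell the whole story" caveat from the thread: peak throughput says nothing about whether the workload can keep all those cores fed.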
@OldgamerDave
Surprisingly, I can handle most VR alright. If a game pushes the limits or handles movement poorly I get sick, but the worst part is the "hangover" I get afterwards. I actually find the discomfort of the headset worse than any of that. Will check out Shadow Warrior 3!
@Raffles I'll give you a good example: Apple chips vs Qualcomm chips. Qualcomm chips have significantly higher clock speeds and core counts than Apple's, and can handle huge amounts of RAM to boot. Yet the iPhone always wins the benchmarks and runs miles ahead of even the strongest Android phone. Both are ARM chips, so why does the iPhone keep winning? Simple: Apple's engineers made the OS and its tasks so efficient that they don't need that much raw power from their components. That's what I believe Cerny did with the PS5.
@Raffles
API could be a factor. Apparently the PS5 development environment is very similar to the PS4's, while the Series consoles' is quite different to the Xbox One's. A lot of developers probably also have more experience with the PlayStation dev environment due to the PS4's popularity. But stuff like this will probably always be case by case for third parties: Xbox ahead in some cases, PS in others. For example, Ubisoft games have far better input response times on Series consoles than on PS, but in other cases, like Call of Duty and Doom, PS is more responsive.
It's probably also hard to compare things like bandwidth to PC, as a lot of the architecture is very different. As far as compute units go, there are definitely diminishing returns on higher counts right now; work is not spread evenly. Until developers start utilising compute more, some of the cores are sadly dead weight.
Crazy that you now need to buy more hardware to get around games being released poorly optimised. Even crazier that we're praising VRR fixing poorly optimised games as a 'win' when they're still taking your money for those games.
@Bleachedsmiles
These are kind of separate things. VRR is a big win so far for XSX. How third parties optimise their games is not up to Microsoft (as far as I know), but Microsoft has given us a feature that helps these games dramatically if you have a VRR display. Admittedly, most people do not have one right now, but they will become more common in the future.
VRR is a positive for Xbox, not FromSoft
@Savage_Joe Yes it is interesting quite what a huge advantage iOS has over Android, despite fewer cores etc. I mean now Apple CPUs do have 6 cores and generally higher clockspeeds, but still, on paper they shouldn't have this massive advantage.
By system architecture I thought you meant hardware, but in this case you're more talking about kernel efficiency, in which case yeah the iOS kernel is clearly much faster than Android.
So yes it is possible the PS5 has a more efficient kernel, but that would affect CPU performance more than anything - so primarily games that are very physics or draw call heavy, or even streaming assets.
Which might be the case I guess, leading to slightly more consistent performance in some games, yet in some games that come down to GPU power like Control and Hitman 3, the SeriesX comes close to the advantage it has on paper.
@Moto5 Yeah cores do have diminishing returns, but at the same time they are the core horsepower of a card, and don't need to be specifically programmed for like CPU cores do.
The 3090 is a prime example: it has a slower core clock than both the Series X and PS5, yet its mammoth 10,000+ cores lead it to be somewhere close to twice as fast as both of them.
@Raffles
I think you may be confusing CUDA cores with compute units. A compute unit can have different values; for example, one CU on RDNA 2 could be 'worth' 1.5 on GCN, and the cluster size of CUs can vary.
It is also difficult to compare these directly to PC components; they do many things differently.
Your example of the 3090 is a good showing of how TFLOP figures don't always equate to real-world performance. On paper the 3090 would be almost 2x the FPS of a 3070 in most games, but it's not; in some cases it's only 20% faster.
I'm not confusing them
A compute unit is just a cluster of individual cores (each essentially an FPU). NVIDIA just happen to quote cores rather than CUs, as it's a bigger number.
Even though as you say different generation architectures will be able to do more per core per cycle (just like generations of CPUs), all I'm saying is cores/CUs are the heart of a GPU and are responsible for all the number crunching per vertex and per pixel.
Which is why, despite the clear law of diminishing returns (even within identical architecture, such as 3070 vs 3090), the 3090 still beats everything thanks to its massive core count.
But yeah I guess it's true the 3090 is a good example of how you can't expect TFLOPS to tell the whole story with performance, it seems the higher you go the more the law of diminishing returns applies. On paper, the 3090 should be almost twice as fast as the 3070, but in reality it's what, about 33% faster at 4k?
If anything it makes it surprising that in some games like Control and Hitman 3, the SeriesX does seem to come close to the 20% TFLOP advantage it has.
I find the difference between XSX and PS5 quite big here, much more dramatic than in Cyberpunk for example. Sony have begun rolling out their VRR update on their TVs, so the PS5 itself shouldn't be far behind, but I still think that falling back on VRR for a "steady" frame rate just excuses poor optimisation. FromSoftware games are always all over the place in terms of frame rate when they come out (not counting remasters and remakes here) and they don't really seem to care about fixing it. They are master artists and game designers, but they have a ***** engine to work with, it seems.
@Raffles
Yeah, that is how I understand CUDA cores and CUs as well. It is possible PS5 has more cores per CU than XSX to make a performance difference, but I'm not sure that is the case.
I think Hitman was more stable FPS-wise on PS5, but at 1800p; less stable on XSX, but at 2160p.
I really don't think we will see any big differences in third-party games on these consoles. Often XSX has the better resolution and PS5 the better FPS, but even that is not always the case. We will probably never be able to compare them with games that really take advantage of each machine's hardware strengths, as those will be first-party or exclusive games.
Honestly, it sucks seeing things like this happen regularly. With Elden Ring specifically, I had three friends ditch their Xboxes (two S and one X) and replace them with a PS5, since that's the only way to get a stable 60fps frame rate.
@Spaceman-Spiff
Elden Ring is a cross-gen game, and almost all cross-gen games perform better on the PS5. The reason: "old" software architecture is more easily boosted with higher clock speeds (PS5); it's brute-forcing the FPS up. If developers want to take advantage of more cores (XSX), they would have to change a lot of the code base. It's like open-heart surgery.
A current-gen game is more likely to be built from the ground up with the specifics of each new console in mind. It's pretty simple, really: if a game is not tailored to the available tech, it will run like crap. See PCs.
So it runs better on PS5 even tho PS5 doesn’t have VRR yet???
@sjbsixpack
Performance mode on Series X with VRR is the best way to play the current-gen version. You can still feel and see frame-rate drops if you are sensitive to that, but it does not stutter like it does on PS5 without VRR. It is a pretty good experience.
If PS5 gets VRR, that version would be better, but not by a huge amount. The most fluid way to play the game currently is the PS4 Pro version on PS5.
@Moto5 Yeah, I heard that about the PS4 version. I pre-ordered Elden Ring on Series X simply because I had Horizon and Gran Turismo 7 coming on PS5, share the memory. I think Elden Ring looks great on both machines; I don't see a noticeable difference myself.
@sjbsixpack
Yeah, XSX and PS5 look the same settings-wise. I have Elden Ring on my XSX, but I'm going to wait a while before playing it, hopefully until the bugs and performance get fixed up.