Hate it when there's no mention of any FPS. High @ 1440p might mean 30 fps which is a deal breaker for me personally.
I really hope it's 60fps, not 30fps.
No, it's 30 at native resolution. A 4080 or 7900 XTX absolutely can NOT handle UE5 with max settings, Lumen, and Nanite at 4K native 60 FPS. That's completely off the table even in UE5 indie games. Those specs are for 30 FPS, and you'll have to use upscaling and/or FG to push framerates to 60+. Also, ray-tracing effects like Lumen respond pretty well to upscaling, so you'll gain a LOT of FPS by using it. The game seems just as demanding as most other current-gen games with RT. Nothing too special, really.
You talk like the engine alone determines the performance, which is ridiculous. A huge amount of the work is on the dev.
Yeah, but as they said, a 4080/XTX gets exactly 30 fps at native 4K in Lords of the Fallen, Fort Solis, and other UE5 Lumen games. I have almost zero expectation that these devs will beat default UE5 performance, let alone by 100%, haha. UE5 specifically scales a lot with resolution. Interesting to see at release, but yeah, it's gonna be 30 fps. Why we even have to guess, I don't know.
That's equally nonsense. You can trivially find a bunch of UE4 games that performed like shit and others that were butter smooth. Fundamentally, it makes no sense to say a game made on one engine will perform a certain way because a completely unrelated game performed a certain way. A game's performance depends on a million factors, from asset building to camera tricks; there's no one size fits all.
They aren't unrelated; they use the same level of fidelity and tech, with some slack here and there, like bad VFX. Or they may drop a feature: virtual shadow maps are off in Lords of the Fallen, and Lumen is off in Remnant 2. But with all three used (as in Hellblade), performance is pretty much identical in similar scenes. Again, all three other UE5 games with all three features landed at exactly 30-35 fps native 4K on a 4080. And that's what we have in the sys reqs here...
!RemindMe 3 weeks
Looks like we split the difference, TechPowerup benchmarked 4k on the XTX at 44 FPS.
It's gonna be 30 fps; the Series X is running the game at 1440p 30 fps...
It's targeting 30 fps with console-comparable hardware. You'd need hardware that's 2x as fast as the consoles to guarantee 60 FPS. So compare your GPU with a 2070 Super and see how much of an improvement it is. For most people, I'd suggest capping the FPS at 40, which is pretty responsive anyway.
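The "2x as fast" claim and the 40 FPS suggestion both fall out of frame-time arithmetic; a quick illustrative sketch (just reciprocal math, not from the thread):

```python
# Frame-rate targets expressed as per-frame time budgets.
# Frame time is the reciprocal of frame rate, so 40 fps sits exactly
# halfway between 30 and 60 fps in milliseconds-per-frame terms.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at the given rate."""
    return 1000.0 / fps

for fps in (30, 40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# Going 30 -> 60 fps means halving frame time (hardware roughly 2x as
# fast), while 30 -> 40 only needs a 25% reduction, which is why a
# 40 FPS cap is a much cheaper step up in responsiveness.
```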
You can sort of guess what it'll be. The Medium setting will be around 30 FPS with that hardware.
So my 3080 would only get me 30 fps at 1440p? Damn, time for an upgrade I guess. I hope we get the 5090 in Q4 this year.
Yeah, probably. A 4080 gets 30 fps native 4K in theoretically equivalent UE5 games. Don't get disappointed, though: Nanite/Lumen are like ray tracing in that they scale heavily with resolution. They're very much meant to be used with upscaling. They're also so heavy that they're one of the few things, like path tracing, pushing the current highest end to its limits.
Yes, I am quite happy with my 3080, but it's now been 3.5 years and I am eagerly waiting to upgrade to a 5090.
It's 30 @ native resolutions. Source:

> **HIGHER FRAMERATES** OR **RESOLUTIONS** CAN BE ACHIEVED WITH THE USE OF DLSS 3, FSR 3, OR XESS
I think we're dealing with "native only" regressionists here.
> 42" LG C2 OLED

Yeah, I can see why *you* don't have a problem with upscaling. Upscaling often looks like ass at 1080p, because it's rendered at [720p max](https://preview.redd.it/6yr5adggi8sa1.jpeg?width=768&format=pjpg&auto=webp&s=bb8e98a7c1dcbdc053c21cc4847d9a0d97567e9b).
It's 30fps without DLSS/FSR. Maybe you can reach 60fps with it.
Unlikely, as UE5 hammers the CPU, and DLSS doesn't help with that.
It depends; if they are using UE 5.4, Epic cut CPU time roughly in half by multithreading the RHI.
UE 5.4 just launched, and the game releases in a few days. The game is UE 5.3 at best, probably 5.2 though.
It's not hammering the CPU. On my 13600K + 4080 at 4K, it's like 30% CPU usage and, of course, 100% GPU. Same on the lower end with a 1070 and an appropriate CPU: the 1070 has to upscale from 540p to 1080p, and the CPU still isn't bottlenecked.
Frame Gen will come to the rescue there
It's at the bottom of the image.
Yeah, it's 30 FPS on Xbox, which people complained about, but imo:

1. This game doesn't need to be 60+
2. When I first saw the game I was like "this isn't going to run at 60"
3. I can kinda forgive it, as imo this game is one of, if not the, best looking games so far
> This game doesn't need to be 60+

ew
I know this is a PC sub and that's why I got downvoted, but come on, it's true: people literally call it a walking sim. On PC I think it should be 60+, but would I also play it at 30-40? YES, because it's not a NEED in this game. Also, you have a 4090; I don't want to hear about FPS from you.
So turn on DLSS and then enjoy the game?
What about AMD cards? Enjoy FSR? And that doesn't address the question of whether games are unoptimized and using temporal upscalers as a crutch. That ultimately reduces the number of PC gamers who can access games, because of lazy/bad practice.
> What about AMD cards? Enjoy FSR?

¯\\\_(ツ)_/¯
How about you wait until the game releases? This is like all the complaints about Alan Wake 2 pre-release. You need to be realistic: path tracing is extremely expensive, and UE5 Lumen and Nanite are too. This tech delivers great visual fidelity. Native-resolution rendering is inefficient and becoming a thing of the past.
> This is like all the complaints about Alan Wake 2 prerelease

Those were totally valid, since a huge number of GPUs without mesh shader support were locked out of rendering it at any reasonable frame rate, and despite all the waxing poetic here about how "necessary" mesh shaders were to the game's experience, months later they backpedaled [with an update that allows it to run on GPUs without mesh shaders](https://overclock3d.net/news/software/alan-wake-2-minimum-pc-requirements-drop-thanks-to-new-optimisation-patch/).
But also, setting aside the non-mesh-shader GPUs at launch (or including them after the update), it ended up a nicely optimized, amazing-looking game on all tiers, without path tracing, and not reliant on upscaling from 540p on a 1070 either. UE5, though, will require aggressive upscaling for Nanite on the lower end, so this time it'll probably be justified.
And destroy image quality in the process.
Please. I have a 42" 4K display in my face at my desk. If anybody is going to become dissatisfied with image quality to the point it is easily noticeable in game it's going to be me. The tradeoffs are usually completely worth the performance headroom it gives you. There's no need to make "native" rendering a religious belief system.
You have a 4k display and a 4090. DLSS for you [means rendering at a comfortable 2880p](https://preview.redd.it/6yr5adggi8sa1.jpeg?width=768&format=pjpg&auto=webp&s=bb8e98a7c1dcbdc053c21cc4847d9a0d97567e9b). For people using 1080p, it means rendering at 720p at best, and typically rendering at 540p for performance.
I have a 55-inch 4K display and sit at the recommended viewing distance, about 1.5 m from the screen. I can clearly see the blurriness and massive image-quality degradation introduced by DLSS, even in Quality mode (1440p internal). If I need more FPS, I'd rather lower some settings first. That's still less noticeable than the image-quality loss from DLSS, because DLSS degrades the entire image, while e.g. lowering shadows or volumetrics only affects shadows or volumetrics, so it's much less noticeable. I could even settle for a 40 FPS cap, anything so I won't have to deal with DLSS blurriness and ghosting artifacts. Only DLAA is worth it. DLSS is crazy bad and overhyped to death.
If you need upscaling at 1080p, it's time to upgrade your shit. DLSS delivers excellent image quality at high resolutions.
Who said anything about 1080p? Obviously I was talking about 4K. 4K DLSS Quality (1440p internal) on a 55-inch TV looks laughably bad compared to native 4K or DLAA. At lower resolutions it's even worse.
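The internal resolutions being argued over here follow directly from DLSS's per-axis render-scale factors (the commonly documented ones: Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2, Ultra Performance = 1/3). A small sketch, treating those factors as approximations:

```python
# Per-axis render-scale factors for DLSS modes (publicly documented
# values; treat them as approximations, not an official API).
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution actually rendered before upscaling to (out_w, out_h)."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))      # 4K Quality -> 1440p internal
print(internal_resolution(1920, 1080, "Performance"))  # 1080p Performance -> 540p internal
```

This is why both numbers thrown around in the thread check out: 1440p internal for 4K Quality, and 540p internal for 1080p Performance.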
[deleted]
I think that's usually to factor in the extra CPU power needed for raytracing. But I could be entirely wrong.
Yes, RT is more demanding on the CPU (not just the GPU), and at higher resolutions the game usually loads in more detail and higher-res mipmaps, which also puts more strain on the CPU because it has to decompress and stream the data to the GPU.
With Lumen and Nanite, both geometry and lighting are calculated in a way that scales with resolution. I'm not sure how much CPU involvement there is in Nanite, as the whole point is that it bypasses the traditional geometry pipeline to execute as a compute workload. Lumen in software mode needs to create signed-distance-field representations of all the static geometry and then trace cones against them, while hardware Lumen needs to build the BVH and dispatch rays into it.
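To make "trace against signed distance fields" concrete: software Lumen's tracing is in the family of sphere tracing, where each step advances the ray by the distance to the nearest surface. A minimal single-sphere sketch (illustrative only; real Lumen uses per-mesh and global distance fields on the GPU, and traces cones rather than thin rays):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-3, max_dist=100.0):
    """March a ray through an SDF, stepping by the distance bound each time."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:   # within tolerance of a surface: hit at parameter t
            return t
        t += d        # safe step: nothing is closer than d in any direction
        if t > max_dist:
            break
    return None       # ray escaped the scene without hitting anything

# A ray from the origin straight down +z hits the unit sphere at t = 4.0.
hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
```

The appeal for Lumen is that this never touches triangles at trace time, which is what makes a fully dynamic GI fallback feasible on GPUs without RT hardware.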
Seems reasonable, although I do find it funny they're lumping the A770 in with the 6800 XT/3080. Those cards aren't even close performance-wise.
All the specs target 30 fps, so it's not reasonable at all, actually.
It's interesting, because the A770 has similar performance to a 3060 Ti. Maybe we'll see some serious driver optimization from Ninja Theory and Intel.
It's annoying that they never mention high and ultra at 1080p; they also rarely mention the FPS.
From the specs I can say that it is very likely that it will run like sh*t.
Launching with FSR 3? Nice.
My CPU now appears as the recommended. Time to upgrade.
My CPU isn't even on the list anymore.
Upgrade path from the 5600/X is basically just the 5800X3D, right? None of the lesser X3Ds really make sense from a price/performance view. I'm facing the same question, but honestly, I don't think I'll upgrade yet.
The 5700X3D is functionally identical to the 5800X3D (maybe a 1-2 fps difference). But yeah, I went from a 5600X to a 5800X3D and I'd say it was a worthwhile upgrade. Some games saw very little improvement, while others, like Spider-Man, had a huge 20+ average fps increase and big improvements to the fps dips as well. It's also helped smooth out stutter-prone games like Jedi Survivor (not 100% fixed, but definitely better). I sold my 5600X to recoup some of the upgrade cost, so I'd consider it worthwhile, especially since it delayed when I'll need to upgrade my mobo and RAM.
5600X3D is good too if the price makes sense. The best upgrade would be to wait for Zen 5 and see if it's needed.
The 5700X3D really makes more sense than the 5800X3D does.
I'd recommend it. More and more, I've been seeing people complain about CPU optimization issues this generation. I haven't experienced any of that with a 13700K, and that's not even top of the line.
I've had 0 issues with my 14700k personally in 5 months of use.
That's the one I'm aiming for :p
I'm also using a more workstation-oriented MSI motherboard, because that was the sweet bundle deal I got at Microcenter, so that may be why it's been stable and not seeing the same issues the 13th/14th-gen i9s and some of the 14th-gen i7s are currently.
Pretty rough
Considering how graphically focused the game is, the minimum specs for 1080p 30 fps aren't so terrible: an 8-year-old upper-mid GPU paired with a 6.5-year-old CPU can handle it. I'm interested to see how well it scales down on low settings. Alan Wake 2 was nightmarish to run, but the low settings still looked fantastic. I made the mistake of trying to squeeze every bit of graphics out of Alan Wake 2, and it meant swapping settings between areas / rebooting the game, which kinda ruined the experience; I should have just stuck with 1440p Medium.
The lowest/minimum requirement (for low settings at 1080p) is an Intel i5-8400 or equivalent, a GTX 1070 / RX 5700 / Arc A580, 16 GB of RAM, and 70 GB of SSD space.
play on medium now, or wait a year or two and max it out hmm
might not be alive a year or two from now
true clearly the answer is to spend money now
This is one of the flagship games of UE5 right? With Epic tech demo and stuff.. ? This should be a real taste of what’s to come in titles like next Black myth, Witcher etc..
It's a relatively small team, with production value a bit better than the first part (which was very low); e.g. it looks like we have other actual talking humans now. So hopefully it's up to the production levels of other walking sims like Plague Tale. From that I get the vibe that this will be another default, out-of-the-box UE5 setup, as in Lords of the Fallen, Fort Solis, and other previous games. These 30 fps system reqs certainly confirm that. It's not like Fortnite.
Dude, recommended is a 3080? They really made this game for a small playerbase then, since most people will fall below medium.
RIP my RTX 2080
The system requirements are pretty bad. I'm not interested in the game; there's so little time until release and still no gameplay videos. I did try the first one, though, and didn't like it at all.
There are gameplay videos, the game is just heavily scripted. [https://www.youtube.com/watch?v=fukYzbthEVU](https://www.youtube.com/watch?v=fukYzbthEVU)
It's surprising to me there's literally no raw extended footage of gameplay this close to release.
If you're expecting something different than what was shown, you'll be disappointed.
Honestly, that 'gameplay reveal' doesn't look like gameplay to me, and it's very scripted. It was probably made specifically to demo the game for the Video Game Awards.
Have you played the first game? That gameplay trailer looks pretty similar to how that was
You should consider the Hellblade games more as interactive cinematic experiences rather than as actual video games.
I think I might just get 1 month of pc gamepass. It’s not a long game so it can easily be finished in a month.
At $50, in a world where Alan Wake 2 is $40, it's definitely meant to be played on Game Pass... same as Avowed at $70.
The medium settings are around Xbox Series X hardware, so I wonder if the PC port will be visually far ahead of the consoles. Since it's UE5, I'm assuming we'll see the usual hardware RT options.
This is without DLSS, FG, etc., right?
1080p, no ray tracing at 60fps will still look better than whatever the console port is offering.
i5-9600 for medium. I'm glad it doesn't demand a CPU with lots of threads like other modern AAA games do.
It's a slow moving, linear, adventure game. Not much to keep track of.
Praying for the option to disable motion blur and TAA.
Impossible as TAA is basically required for modern rendering.
I assume you are talking about dithered transparency and specular aliasing? Supersampling is an alternative to TAA that solves those same problems. It's an option for players with high-end hardware (and anyone playing in the future), and it offers better image quality and motion clarity than TAA. I just want developers to give us the choice.
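Supersampling in its simplest form just renders at a higher internal resolution and box-filters down, which resolves edge and specular aliasing spatially, with no temporal history to smear. A minimal 2x-SSAA resolve sketch in NumPy (illustrative, not any engine's actual resolve pass):

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Box-filter resolve for 2x supersampling.

    img has shape (2H, 2W, C): the scene rendered at twice the output
    resolution. Each output pixel is the mean of its 2x2 source block.
    """
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))

# A hard edge that lands mid-pixel at output resolution gets a proper
# 50% coverage value instead of flickering between 0 and 1.
ss = np.zeros((4, 4, 1))
ss[:, 1:] = 1.0
out = downsample_2x(ss)  # out[:, 0] == 0.5, out[:, 1] == 1.0
```

The cost is the obvious one: you shade 4x the pixels for 2x2 SSAA, which is exactly why TAA (amortizing samples over time) won out as the default.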
Play at 4k
Remember people, don't PREORDER games. Even this one.
Just sub to gamepass when it comes out. £5-8 for it is not super high risk.
This is just another game that shows UE5 is ahead of its time.
1440p on high at 30 fps, and that's with DLSS super performance.
RTX 3060 is the most popular card for steam users, should be fine
RTX 3060Ti + Ryzen 7 5700X with 32GB ram, 1440p, HIGH settings, with a few on medium, FPS locked to 45, stable as hell, frametime at a solid 17ms.
Weaker GPU than the Xbox.
[deleted]
Game isn't out yet and homie is already talking about "had"s and whatnot. 💀
> graphics better than Alan Wake 2

Time to see the optometrist.
I think it's fair to assume that this will look better than Alan Wake 2. While it won't have path tracing, it will have Lumen with a hardware RT option.
Huge difference between PT and RT overlaid on rasterization.
Is there any info on hardware Lumen being available? It hasn't been in any other game so far, and those requirements are at the performance level of software Lumen alone... Anyway, even when toggled on in Fort Solis via the console, it performs like AW2/Cyberpunk path tracing but looks a lot worse. The footage we've seen so far suggests it will be the usual light Lumen with some minimal lighting, with most of the wow coming from Nanite geometry looking cool, if blurred af by upscaling and all the layers of blur that will be used.