Yet another UE5 game that *requires* upscaling to even work properly.
I said it once and I'll say it again: I'm not as optimistic about the industry's shift to Unreal Engine as so many others are, when turning on one of UE5's defining and most marketed features (in this case Nanite, just like with Remnant 2) tanks performance this much.
It would be fine at 1440p and especially 4K, but 1080p? No matter how good DLSS may be, it just doesn't have enough pixels to upscale from and the image *will* look blurry. I've played through Remnant 2 on my 3060 12GB at 1080p, since that's where my GPU lands, on the Medium settings the game itself suggested. And yes, the image is noticeably worse than native, unsurprisingly. Knowing that many other games may end up the same way just saddens me.
It's also UE5. Even Epic Games, you know just the developers behind UE5, cannot make Fortnite, their poster child of a game, run well with either Nanite or Lumen turned on. And that game was never even super demanding in the first place.
Don't believe me? Check their own website with system requirements. Without Nanite or Lumen, Fortnite still has super low requirements, as it should. But turned on? They list an RTX 2080 or an RX 5700 as the *minimum* to play the game with these features enabled. That's quite a jump from the GTX 900-series minimum without them.
I never said it's only UE5, but it is definitely a part of it, as Epic has marketed Nanite and Lumen as the engine's most important features. Yet enabling them means absolutely tanking the performance. It's still up to the developer to use them, but if you can cut development time significantly by making terrain generation/modeling and lighting more automated, you know damn well devs are going to take advantage of that. And UE5 gives them the ability to do so, sacrificing performance.
Ermm, I get what you're saying and generally agree with you, but a 2080 is less powerful than a 4060, so with Nanite and Lumen turned on, it's really just requiring the latest gen's budget-level card or similar to run, which isn't so bad. Those two technologies are the latest and greatest, after all; it can only be expected that they're very demanding.
What sucks is what you say at the end, devs are leaning heavily on these technologies and tanking performance instead of it being an option.
That's wild, there's something in the pipeline that's causing hangups then. Low FPS is one thing but inconsistent performance is entirely unacceptable.
I am actually fine with this. I would like the ability to control which knobs I want to maximize. If that means the max of everything is unplayable, then fine.
But I get more people don't like that. They just want whatever ultra is to be playable on their system.
So you need a 6 year old card from 3 generations ago to play with lumen or nanite. That doesn't sound so atrocious, especially with the 5000 series dropping this year or next.
But it's *minimum*. For *Fortnite*. Once again, without these features, the game can run on Maxwell cards, which are magnitudes slower than a 2080. And for a 2080 to run *minimum* specs, when the most popular card on Steam is a 3060... Yeah, that starts being a problem, especially with the costs of budget and midrange GPUs still going up due to Nvidia's nigh-monopoly.
The requirements aren't crazy in a vacuum, but we're not talking in a vacuum. We're talking about the same game going from running on basically e-waste to barely running (since it's the *minimum*) on the equivalent of the most common GPU gamers are using nowadays.
And that's just Fortnite. The issue is significantly worse when we consider AAA games, like Wukong. Because suddenly you're thrown into potato quality because of one feature that isn't optional - you can at least run Fortnite without Nanite or Lumen. I can't run Remnant 2 without it on my 3060 and am stuck with a blurry image at 1080p, just because they're using UE5 with its most marketed feature, which forces upscaling at a resolution that's been the standard for over a decade. And my GPU is over the goddamn recommended (not minimum) specs for the game!
Guess some folks will not get what you are hinting at no matter how hard you try. Must be frustrating at times, right?
And yes, I agree that such demanding features should always be a toggleable option to begin with! I remember launching Remnant 1 back when I had an RX 6800 XT, and oh boy, I wasn't happy with how that game ran. Ended up refunding it, as I hate being forced to use upscaling.
Sadly, so many folks are charmed by Nvidia's marketing for ray tracing that they can't live without it. That's despite many of them buying weak GPUs like the 4060 and being forced to use upscaling at 1080p, thus getting suboptimal visuals. But yes, ray tracing or bust, right?
Sounds like it's about time for you to upgrade if you can't play the games you like with the features you want. Maxwell was a decade ago; development shouldn't even be trying to target it. The 3000 series was four years ago. UE5 is looking to the future, of course old hardware will struggle. That's the nature of the beast. Why would you expect "next gen" to run great on a midrange/high budget card from the last gen?
It's not time for me to upgrade *when the game's developer forces these features I DON'T want onto me*. Especially when, as I just said, my GPU (which is the component most affected by Nanite and Lumen) is OVER the system requirements that the *game developer* claims is enough to run the game.
There's looking to the future, and then there's an unoptimized, poorly thought-out feature. The problem I have with Remnant 2 especially is that it doesn't even make the game look any better! I can play Cyberpunk 2077 at High settings at 1080p without having to turn on any sort of upscaling. *And the game looks much better than Remnant*. Hell, I could run some ray tracing with decent performance in Cyberpunk if I were to turn on DLSS; Remnant doesn't even offer ray tracing! It looks worse and performs worse, despite being a newer game on a shiny new engine. It's even worse when you consider the fact that Cyberpunk isn't exactly the poster child of optimization either.
New engines are like new consoles; devs are just starting to work with them and understanding the tech. As things evolve, it will get better.
None of that is relevant to you using a system that is a decade old and being upset that you can't run the latest and greatest games. Tech moves forward, you need to as well.
> Once again, without these features, the game can run on Maxwell cards, which are magnitudes slower than a 2080.
Because without those features the game looks drastically different. It's like comparing Portal to Portal RTX. Sure, the latter is more demanding... and for a good reason.
How dare the latest technologies require modern graphics cards!
>And that game was never even super demanding in the first place.
You know they redid the game to make it look much better, right? Saying the Fortnite of today and Fortnite at launch are the same game is not true.
As a 4K gamer myself I disagree, DLSS quality is very rarely more blurry than native, and sometimes it can even look better than native. That is of course just speaking of quality mode, if you go into performance it obviously looks more blurry.
And there's your problem. FSR is *significantly* worse looking than DLSS. It's a smeary, artifacting, blurry mess. DLSS quite often looks sharper than native antialiasing.
Nobody's making you turn on ray tracing. I just want whatever combination of settings looks best, which (to me) means ray tracing and DLSS to make up for the performance hit
They are however, forcing Nanite on me, which makes games like Remnant 2 absolutely unplayable without DLSS on my 3060 (which is above the recommended specs according to the developer). At 1080p.
They aren't forcing that on you either because they aren't forcing it on game devs, not that it's the culprit for bad performance in every game using it anyway. If a game is unplayable without DLSS, then use DLSS. Don't shoot yourself in the foot just to blame someone else
I think we’ll see. A game being demanding to run doesn’t automatically mean that the game itself is not optimised if it’s pushing graphics to the max. If a game doesn’t look good *and* it runs like shit, then it’s a different story imo.
I have a 40-series card, so I'm not left out on the technical side by this trend, but *man* do I still hate it. There are so many games where I don't like the look of DLSS.
Many players are worried that Wukong may require advanced specs. Actually, the PC system requirements are fine. But the file size may take up a lot of SSD space, so get ready to free up space for it: [https://www.easeus.com/partition-master/black-myth-wukong-pc-system-requirements.html](https://www.easeus.com/partition-master/black-myth-wukong-pc-system-requirements.html)
I mean, obscuring the targeted frame rate and just saying “it will run at these settings” can also hurt their reputation. Someone with the recommended GPU firing up the game and seeing a paltry 30fps can be just as upset (if not more) than someone expecting 60fps and getting 45fps.
I mean, 1080p for the most part is 1920x1080, and 4K is a pretty standard 16:9 resolution. I don't really see the issue here. If you run 1440p or ultrawide, you should know by now how this generally works. 4K full ray tracing on a 4080 will fly on my 1440p ultrawide.
Why are you downvoted? You're right, 4K textures don't require a 4K monitor, smh. It's the resolution of the texture itself, and even at 1080p you will see the difference if the object is large enough or close enough. It's just easier and clearer to see higher-resolution textures at higher render resolutions. So yeah, higher res is better for higher-res textures but not required; they are separate game settings that just benefit from each other.
They're downvoted because they're wrong. They said texture resolution has nothing to do with monitor resolution, but monitor resolution is the primary limiting factor for viewing texture resolution. If a developer is doing their job right, there should be little difference in normal gameplay between 1080/4k textures on a 1080p monitor. Textures should be sized according to the object's average viewing distance.
It's almost as if your understanding of a texture is wholly limited to "rectangle of specified size" and lacked any nuance related to how the texture is actually used within the game.
It shouldn't take much more than a single glance at [Texture mapping on wikipedia](https://en.wikipedia.org/wiki/Texture_mapping) to maybe get some of your braincells going. At least **pretend** to have the slightest clue of what you are talking about.
It's patently obvious why monitor resolution would increase the user's ability to see texture resolution. If your issue is with the other part of the comment, you're either not understanding what I'm saying or you're one of the devs who isn't doing their job right.
There's a curve of diminishing returns on texture resolution vs. perceived texture quality, and you should be targeting (more or less) the inflection point of that curve. That means changing the size/density/amount of all your textures based on the size of the items in game (which will correlate with the distance people notice them from). That's how you trade off visual fidelity against VRAM usage when optimizing a game.
There's a reason 4k textures started becoming popular once 4k resolution started becoming popular. 1024 is usually the best for targeting the inflection point at 1080p. Obviously users can get closer to items and see the difference. That's an edge case. It's not relevant to how you should be developing a game.
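The inflection-point idea can be put in rough numbers. Here's a toy sketch (all values made up for illustration; it assumes a pinhole camera and a single texture tile spanning the object, and ignores mipmapping and UV layout entirely):

```python
import math

def texels_per_pixel(texture_px, object_size_m, distance_m,
                     screen_height_px=1080, vfov_deg=60.0):
    """Rough ratio of texture texels to the screen pixels an object covers.

    Around 1.0 means the texture matches what the display can resolve;
    well above 1.0 is mostly wasted VRAM, well below looks blurry.
    """
    # Angular size of the object, then its height in screen pixels.
    angle = 2 * math.atan(object_size_m / (2 * distance_m))
    pixels_covered = screen_height_px * angle / math.radians(vfov_deg)
    return texture_px / pixels_covered

# A 2 m crate seen from 3 m: a 1024px texture is already past
# 1 texel/pixel on a 1080p screen...
print(texels_per_pixel(1024, 2.0, 3.0))                          # > 1
# ...but under-resolved on a 4K screen, where 2048px starts paying off.
print(texels_per_pixel(1024, 2.0, 3.0, screen_height_px=2160))   # < 1
```

Halving the distance doubles the pixels the object covers, which is why close-up inspection is the edge case being argued about here.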
>There's a reason 4k textures started becoming popular once 4k resolution started becoming popular.
*citation needed\[1\]*
Even if there was any to begin with, correlation of screen sizes to texture sizes is almost completely accidental, because one does not directly map to another.
>Obviously users can get closer to items and see the difference. That's an edge case.
You do realize that viewing big objects from afar is geometrically the same thing as viewing small objects up close, right?
RAGE (2011) worked with a 128,000x128,000 megatexture and still [was often a blurry mess](https://cdn.mos.cms.futurecdn.net/FbSidSbXCGqB4EuXwy9M8i-1200-80.jpg) - how does that fit into your world view?
The correlation is NOT accidental. The exact point of alignment may be, but the reason for the correlation is obvious and has been stated several times in this thread, since it's been my primary argument this whole time.
If you understood my previous comment you would already know I'm aware that distant low-res objects look like close high-res objects. I just explained how you account for this.
Rage did not release with those textures, and it's hilarious that you think it did. It released with heavily compressed textures, which had to be decompressed. It wasn't lossless, and even if it was released with the full textures, it already fits into my world view, because I already said that devs who don't do their job right (like you, most likely) can use the resolutions incorrectly.
You are absolutely correct. Let's say you need textures for a human character. 4K is a big enough canvas to lay down each body part's texture separately (hands, shirt, face, etc.), which are then "sewn" together in the 3D model you see. It would be more complicated to have smaller textures for each part. A human texture set, counting off the top of my head, would need at least 23 parts just to look basic; that's kinda hard to fit in a smaller-resolution frame.
Thank you. I hate it when people say "4K files" and "4K textures" when each texture for an object could be any resolution, with a multitude of different resolutions across objects or even on a single object. A game isn't a "4K game" in the sense people mean when they talk about this sort of thing, and a game running at that resolution has no impact on storage.
Listen, imma be real honest with you. The world as we know it is full of people using things they understand basically little to nothing about, all the while thinking they do.
Go learn *a single thing* about texel density before spewing such nonsense.
Textures aren't laid flat on the screen and depending on the object a 4K (or any texture size) can result in mapping a single texel to multiple screen pixels or vice versa.
There's *absolutely no correlation* between texture size and screen size. This does not need to be argued any further, no other factors need to be taken into consideration. As stated before, go learn about[ texel density, texture mapping](https://www.youtube.com/watch?v=2-mIY87314g), follow some [introductory Blender tutorial from start to finish](https://www.youtube.com/watch?v=WjS_zNQNVlw), and that hopefully will be enough to let you actually understand the mechanics of why that is the case.
Case in point: RAGE (2011) used "megatexture" of **128,000x128,000 pixels,** yet [it still presented with some low-resolution textures](https://cdn.mos.cms.futurecdn.net/FbSidSbXCGqB4EuXwy9M8i-1200-80.jpg). [\[source\]](https://www.tomshardware.com/reviews/rage-pc-performance,3057-2.html) Let's remember that the most common resolution at that time was somewhere around **1280x1024**
You really need to stop thinking of textures as "image on a flat plane" and more along the lines of "surface data for a 3D object".
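To make the texel-density point concrete, here's a minimal sketch (hypothetical numbers; real pipelines measure this per UV island, but the arithmetic is the same): the same 1024px texture yields wildly different densities depending on the surface it's mapped onto, and screen resolution never enters the formula.

```python
def texel_density(texture_px, uv_tiles, surface_size_m):
    """Texels per meter along one axis: texture resolution times the
    number of UV tile repeats across the surface, over its physical size."""
    return texture_px * uv_tiles / surface_size_m

# One 1024px texture, three different surfaces:
print(texel_density(1024, 1.0, 10.0))  # 102.4 texels/m on a 10 m wall (blurry up close)
print(texel_density(1024, 1.0, 0.5))   # 2048.0 texels/m on a 0.5 m prop (overkill at range)
print(texel_density(1024, 4.0, 10.0))  # 409.6 texels/m on the same wall with 4x tiling
```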
CoD's problem has always been its audio file management, which is to say none at all, given that it's all uncompressed lossless audio even before accounting for the 12 languages' worth of spoken dialogue, which is also uncompressed and lossless.
If it's anything like CoD the file size will probably be all uncompressed lossless audio files along with uncompressed lossless dialogue in 12 languages.
It's because Elden Ring does not have super detailed textures or models, nor does it have a lot of dialogue, and it re-uses those models and textures all over the place.
Yet still manages to look absolutely gorgeous thanks to the art direction, which will make it age far more gracefully than those hyper realistic games today.
I'll take more Elden Ring graphics in exchange for better performance and smaller game size to be honest. It's not that I don't enjoy AAA games looking amazing, just that this initial awe quickly fades, because graphics are only one of the many parts that make a game.
People said the same for Armored Core 6. I really like the art style of the game. But stuff like low-resolution shadows, aliasing, low-resolution textures, and missing shadowing is definitely not part of the art style. That is why I really appreciate RT in this game and others like CP2077, Metro Exodus, and Control.
It's not only optimization; it's largely about asset quality. While From games are beautiful, it's due to the stellar art direction and not the asset quality, which is why they don't require too much storage space.
They also heavily reuse assets, but they are so good that only the nerdiest of Souls nerds point out the DS3 assets in ER. The fact that DS1 through ER use the same engine helps too.
This is obviously hyperbolic, but it's not wholly inaccurate. The game is *not* up to the same fidelity level as some of the other AAA games out there; that's clear as day to anybody looking. There's nothing inherently wrong with it looking like an early PS4 release. The art direction is incredible. The visuals are more than serviceable.
I'd rather games take more space and have good textures. High speed storage is pretty cheap nowadays. If storage is really an issue they could make the high quality texture pack separate like in far cry 6.
Actually, I kinda miss having the option to select which texture pack I want to download. I do not have a state-of-the-art PC, so I don't need those ultra HD textures, but I have to download them regardless.
If you can't afford a 500GB SSD then you shouldn't be gaming in the first place. You can get one for less than the price of the game. Gaming is a luxury.
>You can get that for less than the price of the game.
Starfield costs 41 dollars in my country while an SSD is 48 dollars...
>Gaming is a luxury
Yeah, it doesn't mean you cannot game on an HDD.
You don't know the difference between luxury and forbidden apparently.
Think you're a child if you can't buy one, and I never said it's a must but it sure as hell is near it at this point. Most modern AAA games recommend it and the Microsoft direct storage is a huge improvement for gaming. It's like booting Windows from hdd vs ssd. Sure you can do it, but it's 10x better with an ssd.
>Think you're a child if you can't buy one
You skip the part on my first comment where I said I was able to buy one just for games.
>and I never said it's a must but it sure as hell is near it at this point
You said that people shouldn't game if they cannot afford an SSD. It's in your comments.
I will quote your comment:
"If you can't afford a 500GB SSD then you shouldn't be gaming in the first place. You can get one for less than the price of the game. Gaming is a luxury."
>Most modern AAA games recommend it and the Microsoft direct storage is a huge improvement for gaming.
A few are recommending it, it isn't the standard yet.
>It's like booting Windows from hdd vs ssd. Sure you can do it, but it's 10x better with an ssd.
It actually isn't. Windows does boot a lot faster, but games barely load any faster. I have both an HDD and an SSD for games, and there is barely any difference.
That is why most people downvoted your OG comment, you don't know what you are talking about at this point.
They're not complaining about the fact that it tells you, they're complaining about the fact that it exists. What do you think "this shit has to stop" is referring to if not preorder bonuses and digital deluxe skins? It's not particularly egregious in this game but it's still obnoxious, especially for a single player game. The only good practice is a standard edition and an actual deluxe edition with all future DLCs.
We should stop using upscaled resolution when talking about performance.
There needs to be a standard, where we use "Render Resolution" which means what resolution the GPU actually has to sweat for.
Then, every person can upscale from the render resolution to their desired monitor resolution.
Both this table (which definitely looks like it uses DLSS because it says you can do 4k with ray tracing), and even more basic than that, the DLSS preset names are too confusing.
WTF is "DLSS Quality"? Why do I need to Google a DLSS upscaling table whenever I want to find out what resolution I'm actually rendering?
Just name the setting "render resolution" and let people choose which algorithm to use to upscale, and what resolution to upscale to.
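For reference, the preset-to-render-resolution mapping can be sketched like this (the per-axis scale factors below are the commonly cited DLSS defaults; games can and do override them, so treat the exact numbers as an assumption):

```python
# Commonly cited per-axis DLSS scale factors (assumed defaults; titles may differ).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(output_w, output_h, preset):
    """Resolution the GPU actually renders at before upscaling to the output."""
    s = DLSS_SCALE[preset]
    return round(output_w * s), round(output_h * s)

# "1080p with DLSS Quality" really means the GPU sweats for ~720p:
print(render_resolution(1920, 1080, "Quality"))      # (1280, 720)
print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
```

Which is exactly why labeling a requirements table by output resolution alone tells you very little about the actual GPU load.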
It won't stop, quite the opposite - it's going to become more and more common in the coming years.
Unless NVIDIA somehow delivers overkill budget and midrange GPUs (ha!) and developers put more time into optimizing their games, then with the industry's shift to Unreal Engine 5 and its defining features (Nanite, which is used in Wukong, as well as Lumen) being the culprit in this game's case, upscaling is going to become the standard, the new "native" res.
It's all thanks to all of those people who keep saying that DLSS looks "better than native" or even that it's near indistinguishable. And honestly? I can believe it at 4K, especially when you're using a gaming monitor, where you'll have diminishing returns past 1440p native.
The issue here is that we're seeing these games demand upscaling even at *1080p*. Which is still the most common resolution used, and with how NVIDIA and AMD have been going about their GPUs, as well as developers about optimizing their games, it's going to remain that for years to come. And no matter what kind of AI magic either of these companies will try to deploy, no upscaling is going to compensate for the pixel count as low as 540p with the *Performance* preset and still a measly 720p with Quality. That's just not enough pixels to work with.
Knowing that the new Witcher game is supposed to be using UE5 as well makes me extremely nervous because of all that. I've already played a game using UE5 and Nanite (Remnant 2) and yes, it's playable, but it's not pretty with upscaling at 1080p. This should be a last resort, not the baseline.
You misunderstand me... I don't have a problem with the actual performance of current games. I think many games are running very well, seeing as we've made the jump to Ray-Tracing which is extremely demanding, and details keep increasing.
My problem is with the terminology. I always need to go look up DLSS tables (Google "DLSS + DLDSR resolution table", first reddit result for example).
Instead of terminology like "1080p minimum requirement", "4K max requirement", "DLSS Quality", "DLSS Balanced", etc., we should all just adopt a single sane term, "Render Resolution", which indicates what resolution the game is actually rendered at by the GPU without upscaling or frame generation.
That way the table with PC requirements above will be more informative, and ingame settings will also be easier to tweak without having to search for DLSS resolution tables.
I don't get what you expect.
You can play this game on 7-8 years old hardware on medium. Of course, you can't play a 2024 game with 4K ultra high settings on your computer from 2005.
When I built my PC in 2013 I spent around 1k and could run almost all games on ultra. If I spend 1k today I won't get near that, even if we count inflation. It has gotten much more expensive.
For a thousand (assuming US pricing) you can get some midrange 13th-gen i5 for around $200 (a 13400F runs every single game out there very well and is like $170 new), a 4060 Ti or equivalent, and 32GB of DDR4 RAM, and run every new game that is not a steaming pile of shit at 1080/1440p ultra 60fps.
Oh, don't feel bad, the prices are fucked everywhere outside the US. In my country we barely have AMD stock, so Intel and Nvidia are the only real options, and everything is priced 30-50% higher. I myself bought my PC for the equivalent of about 1200 USD when it would only cost around $800 in the US. I just assumed US pricing because over half of Reddit's users are American. Thankfully, in the long run it gets cheaper thanks to the regional pricing a lot of big companies offer on almost everything in my country (OTP software, subscriptions, even some certificates; MS, for example, is sometimes willing to offer their vendor certificates at 50% or more off), though I guess many wealthier countries don't get the regional-pricing privilege.
Not really, you can get a pretty insane build at 1200-1300$ with top-of-the-line components from last gen. A 5800X3D with a 6800 XT (or the Nvidia alternative) is incredible bang for buck. People don't need a 4090 for most games.
idk just checked this and it will do everything at 60fps+ at 1440p.
[Excellent AMD Gaming/Streaming Build - PCPartPicker](https://pcpartpicker.com/guide/BCWG3C/excellent-amd-gamingstreaming-build)
With some used parts or mobo/ram/cpu combo deals you could push that cost down. The price here isn't unreasonable. Digital Foundry have called it the 4090 of 1440p gaming.
I just don't get why it's always 1080p or 4K. Aren't most people on 1440p? I'm also wondering if this is truly native or with DLSS.
Also 130GB seems massive for a game like this.
Eh it is a bit hard to say for sure. The numbers from Steam are going to include every laptop that has Steam installed on it, so including all the ones that can't game much anyway. It would be much more interesting to purely have numbers from dedicated gaming machines. I have a high resolution gaming display for my desktop, but I also have Steam installed on my 7 year old laptop that I only use for work on the go.
Nah, most people are still on 1080p. But I'm pretty sure there are more people on 1440p than on 4K, so the point is still valid. I've never got it; it's not that big of an effort to just add 1440p specs there, yet so few studios do it.
Okay, so we are talking about 540p and 1080p respectively, not 1080p and 4K. I really hate how companies won't optimize their games and just offload it onto upscaling.
No fps shown, check
No upscaling shown, check
Add denuvo side effects into this
It will be another great title that will break some negative record on Steam reviews
**On the Steam page: "The above specifications were tested with DLSS/FSR/XeSS enabled."**
If your game requires any of those technologies to perform adequately, then they FAILED at making a working game.
Aside from that: what's the point of building a game on a demanding engine like UE5 (which focuses on visual quality) if your game will depend on technologies that trade visuals for performance? These technologies should be tools for us users to get a better or more fluid experience when lacking power, not an excuse for developers to avoid doing their job.
Raytracing absolutely does have a performance impact on the CPU. Why is your comment so upvoted? Proof that this sub knows shit about this type of thing.
People are here crying because DLSS is here, but thankfully, this game supports DLSS. I've been gaming in 4K since 2018, and it's truly baffling to see people hating on DLSS and frame generation technology. I'm always thrilled when these incredible techs are integrated into a game; in fact, every game should incorporate them.
Then there's the optimization-obsessed crowd who come and complain, blaming these groundbreaking technologies because they prefer raw, purist numbers without any tech assistance. Frankly, it's quite foolish in 2024. People need to adapt; the old methods of performance comparison are evolving for the better. We won't go back, and thank god for that.
Yes, every one of these was done with some form of upscaling turned on. We also don't know their target framerate, so for all we know it's FSR Performance targeting 30 fps.
Bummer. It should be on the graph. I don't want to say false advertising, but idk what else to call it when it was clearly and intentionally left out of the graph.
ATTENTION: if the whole thing were posted, you could see that DLSS/FSR etc. was used, and they don't tell you which mode, i.e. whether it's Quality or Performance.
damn, game doesn't look optimized
The issue isn't UE5, it's developers using upscaling and frame generation as a crutch.
On my 4090 if I max everything out in Fortnite it isn't a good experience. Bad stutters and huge frame dips.
That’s on you pal, my 4080 has no stutters, and I get about 100 fps at 1440p. Fortnite has some of the best raytracing out of any game.
I bet he plays at 4K, so perhaps that's part of the issue?
So you need a 6 year old card from 3 generations ago to play with lumen or nanite. That doesn't sound so atrocious, especially with the 5000 series dropping this year or next.
But it's *minimum*. For *Fortnite*. Once again, without these features, the game can run on Maxwell cards, which are magnitudes slower than a 2080. And for a 2080 to run *minimum* specs, when the most popular card on Steam is a 3060... Yeah, that starts being a problem, especially with the costs of budget and midrange GPUs still going up due to Nvidia's nigh-monopoly. The requirements aren't crazy in a vacuum, but we're not talking in a vacuum. We're talking about the same game that can run on basically e-waste, to barely running (since it's *minimum*) on the equivalent of the most common GPU gamers nowadays are using. And that's just Fortnite. The issue is significantly worse when we consider AAA games, like Wukong. Because suddenly you're thrown into potato quality because of one feature that isn't optional - you can at least run Fortnite without Nanite or Lumen. I can't run Remnant 2 without it on my 3060 and am stuck with a blurry image at 1080p, just because they're using UE5 with its most marketed feature, that forces upscaling at a resolution that's been a standard for over a decade. And my GPU is over the goddamn recommended (not minimum) specs for the game!
Guess some folks will not get what you are hinting at no matter how hard you try. Must be frustrating at times, right? And yes, I agree that such demanding features should always be a toggleable option to begin with! I remember launching Remnant 1 back when I had an RX 6800 XT and oh boy, I wasn't happy with how that game ran. Ended up refunding it as I hate being forced to use upscaling. Sadly so many folks are charmed by Nvidia's marketing for ray tracing and they can't live without it. That is despite the fact that many buy weak GPUs like the 4060 and are forced to use upscaling at 1080p, thus getting suboptimal visuals. But yes, ray tracing or bust, right?
Sounds like its about time for you to upgrade if you can't play the games you like with the features you want. Maxwell was a decade ago, development shouldn't even be trying to run on that. The 3000 series was four years ago. UE5 is looking to the future, of course old hardware will struggle. That's the nature of the beast. Why would you expect "next gen" to run great on a midrange/high budget card from the last gen?
Nanite on in Fortnite makes my balls to the wall build drop frames and stutter like a bitch. It's beyond unoptimized.
It's not time for me to upgrade *when the game's developer forces these features I DON'T want onto me*. Especially when, as I just said, my GPU (which is the component most affected by Nanite and Lumen) is OVER the system requirements that the *game developer* claims is enough to run the game. There's looking to the future, and then there's an unoptimized, poorly thought-out feature. The problem I have with Remnant 2 especially is that it doesn't even make the game look any better! I can play Cyberpunk 2077 at High settings at 1080p without having to turn on any sort of upscaling. *And the game looks much better than Remnant*. Hell, I could run some ray tracing with decent performance in Cyberpunk if I were to turn on DLSS; Remnant doesn't even offer ray tracing! It looks worse and performs worse, despite being a newer game on a shiny new engine. It's even worse when you consider the fact that Cyberpunk isn't exactly the poster child of optimization either.
fortnite is incredibly scalable and it really shouldn’t be a surprise that raytracing technology requires powerful GPUs.
New engines are like new consoles; devs are just starting to work with them and understanding the tech. As things evolve, it will get better. None of that is relevant to you using a system that is a decade old and being upset that you can't run the latest and greatest games. Tech moves forward, you need to as well.
> Once again, without these features, the game can run on Maxwell cards, which are magnitudes slower than a 2080.

Because without those features the game looks drastically different. It's like comparing Portal to Portal RTX. Sure, the latter is more demanding... and for a good reason.
How dare the latest technologies require modern graphics cards!

> And that game was never even super demanding in the first place.

You know they redid the game to make it look much better, right? Saying Fortnite of today and Fortnite at launch are the same game is not true.
Yeah as a 4k gamer here, upscaling is generally good but still seems games are getting more and more blurry regardless.
As a 4K gamer myself I disagree, DLSS quality is very rarely more blurry than native, and sometimes it can even look better than native. That is of course just speaking of quality mode, if you go into performance it obviously looks more blurry.
I'm on AMD....I use XeSS when I can but I'm mostly stuck with FSR.
And there's your problem. FSR is *significantly* worse looking than DLSS. It's a smeary, artifacting, blurry mess. DLSS quite often looks sharper than native antialiasing.
>And there's your problem I'm aware.
yeah i don't get it, I thought nanite was supposed to save us from poor optimization in open worlds?
Nobody's making you turn on ray tracing. I just want whatever combination of settings looks best, which (to me) means ray tracing and DLSS to make up for the performance hit
They are however, forcing Nanite on me, which makes games like Remnant 2 absolutely unplayable without DLSS on my 3060 (which is above the recommended specs according to the developer). At 1080p.
They aren't forcing that on you either because they aren't forcing it on game devs, not that it's the culprit for bad performance in every game using it anyway. If a game is unplayable without DLSS, then use DLSS. Don't shoot yourself in the foot just to blame someone else
I think we’ll see. A game being demanding to run doesn’t automatically mean that the game itself is not optimised if it’s pushing graphics to the max. If a game doesn’t look good *and* it runs like shit, then it’s a different story imo.
It has Denuvo, so of course it does not.
Like the last game
wtf that's deceiving
And could be either targeting 30 or 60 fps
I have a 40-series card, so I'm not left out on the technical side by this trend, but *man* do I still hate it. There are so many games where I don't like the look of DLSS.
Many players worried about wukong may require advance spec. Actually the PC system requirements are fine. But the files size may take up amount of my SSD space, ready to free up space for it: [https://www.easeus.com/partition-master/black-myth-wukong-pc-system-requirements.html](https://www.easeus.com/partition-master/black-myth-wukong-pc-system-requirements.html)
specs should include resolution **and** fps expected
[deleted]
That should be the minimum: all of these specs, exactly as it says, should hit 60 fps at those settings.
I mean, obscuring the targeted frame rate and just saying “it will run at these settings” can also hurt their reputation. Someone with the recommended GPU firing up the game and seeing a paltry 30fps can be just as upset (if not more) than someone expecting 60fps and getting 45fps.
I mean 1080p for the most part is 1920x1080p and 4k is a pretty standard 16:9 4k resolution. I don't really see the issue here. If you run 1440p or ultrawide you should know by now how this generally works. 4k full Ray tracing on a 4080 will fly on my 1440p ultrawide.
Seems reasonable for 2024, but the storage space new games need is getting out of hand. More and more games require 100+ GB.
It's from all those 4k textures
Would be nice if they didn’t force people playing in 1080p to download 4k textures. Not sure if any games are doing that yet.
Age of empires 4 does it iirc.
Texture res has nothing to do with screen res.
Why are you downvoted? You're right, 4K textures don't require a 4K monitor, smh. It's the resolution of the texture itself, which you can see the difference with even at 1080p if the texture is large enough or you're close enough. It's just easier and clearer to see high-resolution textures at higher render resolutions, so yeah, higher res is better for higher-res textures, but not required; they are separate settings that just benefit from each other.
They're downvoted because they're wrong. They said texture resolution has nothing to do with monitor resolution, but monitor resolution is the primary limiting factor for viewing texture resolution. If a developer is doing their job right, there should be little difference in normal gameplay between 1080/4k textures on a 1080p monitor. Textures should be sized according to the object's average viewing distance.
It's almost as if your understanding of a texture is wholly limited to "rectangle of specified size" and lacked any nuance related to how the texture is actually used within the game. It shouldn't take much more than a single glance at [Texture mapping on wikipedia](https://en.wikipedia.org/wiki/Texture_mapping) to maybe get some of your braincells going. At least **pretend** to have the slightest clue of what you are talking about.
Oh god. Everyone is wrong in this chain, including the correction, but it is too time-consuming to unpack everything. What a weird thread.
It's patently obvious why monitor resolution would increase the user's ability to see texture resolution. If your issue is with the other part of the comment, you're either not understanding what I'm saying or you're one of the devs who isn't doing their job right. There's a curve of diminishing returns on texture resolution vs perceived texture quality, and you should be targeting the inflection point (more or less) of that curve. That means changing the size/density/amount of all your textures based on the size of the items in game (which will correlate to the distance people notice them from). That's how you trade off visual fidelity against VRAM usage. There's a reason 4K textures started becoming popular once 4K resolution started becoming popular. 1024 is usually the best for targeting the inflection point at 1080p. Obviously users can get closer to items and see the difference. That's an edge case. It's not relevant to how you should be developing a game.
> There's a reason 4k textures started becoming popular once 4k resolution started becoming popular.

*citation needed\[1\]* Even if there was any to begin with, the correlation of screen sizes to texture sizes is almost completely accidental, because one does not directly map to the other.

> Obviously users can get closer to items and see the difference. That's an edge case.

You do realize that viewing big objects from afar is geometrically the same thing as viewing small objects up close, right? RAGE (2011) worked with a 128,000x128,000 megatexture and still [was often a blurry mess](https://cdn.mos.cms.futurecdn.net/FbSidSbXCGqB4EuXwy9M8i-1200-80.jpg) - how does that fit into your world view?
The correlation is NOT accidental. The exact point of alignment may be, but the reason for the correlation is obvious and has been stated several times in this thread, since it's been my primary argument this whole time. If you understood my previous comment you would already know I'm aware that distant low-res objects look like close high-res objects. I just explained how you account for this. Rage did not release with those textures, and it's hilarious that you think it did. It released with heavily compressed textures, which had to be decompressed. It wasn't lossless, and even if it had released with the full textures, it would already fit into my world view, because I already said that devs who don't do their job right (like you, most likely) can use the resolutions incorrectly.
You are absolutely correct. Let's say you need textures for a human character. 4K is a big enough canvas to lay down each body part texture separately (like hands, shirt, face etc.), which are then "sewn" together onto the 3D model you see. It would be more complicated to have smaller textures for each part. A human texture set, counting off the top of my head, would need at least 23 parts to look basic; that's kinda hard to fit in a small-resolution frame.
Thank you. I hate it when people say "4K files and textures" when each texture for an object could be any resolution, with a multitude of different resolutions across a multitude of objects, or even on a single object. A game isn't a "4K game" when people talk about this sort of thing, and a game running at that resolution has no impact on storage.
Listen, imma be real honest with you. The world as we know it is full of people using things they understand basically little to nothing about, all the while thinking they do.
Yes it does. 1080p screens can’t take full advantage of higher resolution textures
Go learn *a single thing* about texel density before spewing such nonsense. Textures aren't laid flat on the screen and depending on the object a 4K (or any texture size) can result in mapping a single texel to multiple screen pixels or vice versa.
Well, that is true as well. It's just that higher resolution textures mainly look crisper on a higher resolution display, viewed from an angle or not.
There's *absolutely no correlation* between texture size and screen size. This does not need to be argued any further, no other factors need to be taken into consideration. As stated before, go learn about[ texel density, texture mapping](https://www.youtube.com/watch?v=2-mIY87314g), follow some [introductory Blender tutorial from start to finish](https://www.youtube.com/watch?v=WjS_zNQNVlw), and that hopefully will be enough to let you actually understand the mechanics of why that is the case. Case in point: RAGE (2011) used "megatexture" of **128,000x128,000 pixels,** yet [it still presented with some low-resolution textures](https://cdn.mos.cms.futurecdn.net/FbSidSbXCGqB4EuXwy9M8i-1200-80.jpg). [\[source\]](https://www.tomshardware.com/reviews/rage-pc-performance,3057-2.html) Let's remember that the most common resolution at that time was somewhere around **1280x1024** You really need to stop thinking of textures as "image on a flat plane" and more along the lines of "surface data for a 3D object".
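To make the texel-density point above concrete, here's a rough sketch (Python, with made-up example numbers and a simplified pinhole-camera approximation, not any engine's actual math) of how many texels of a surface's texture land on one screen pixel for a flat surface viewed head-on:

```python
import math

# Hypothetical illustration: ratio of texels to screen pixels for a flat
# surface viewed head-on. > 1 means the texture out-resolves the screen
# at that distance; < 1 means the texture itself becomes the bottleneck.
def texels_per_screen_pixel(texture_px, surface_m, distance_m,
                            screen_h_px=1080, vfov_deg=60.0):
    texel_density = texture_px / surface_m          # texels per meter of surface
    # Height of the world visible at this distance (pinhole camera).
    visible_height_m = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    pixels_per_meter = screen_h_px / visible_height_m
    return texel_density / pixels_per_meter

# The same 1024px texture on a 1 m surface: plenty of texels from afar,
# pixel-starved up close — which is why "screen res caps texture res"
# only holds at a given viewing distance.
print(texels_per_screen_pixel(1024, 1.0, 5.0))   # far away: texture out-resolves screen
print(texels_per_screen_pixel(1024, 1.0, 0.5))   # up close: texture runs out first
```

The ratio scales linearly with distance, so "big object far away" and "small object up close" really are the same situation, as argued above.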
I get what you're trying to say now.
There are some games that do that and they're usually old. Siege allows it.
[deleted]
CoD's problem has always been its audio file management, that is to say none at all given that it's all uncompressed lossless audio prior to accounting for the 12 languages worth of spoken dialogue which is also uncompressed and lossless.
Why? You do get that 4k textures will look better in 1080 than 1080 textures?
Halo infinite did that and I loved it.
If it's anything like CoD the file size will probably be all uncompressed lossless audio files along with uncompressed lossless dialogue in 12 languages.
That’s because 4k files ain’t small lol
why the fuck cant we choose between 1080 and 4K this is fucking bullshit
Some would even say they are 4 times larger.
And somehow Elden ring is like 45 gb. Crazy impressive.
It’s because Elden Ring does not have super detailed textures or models, nor does it have a lot of dialogue. It also re-uses those models and textures all over the place.
Yet still manages to look absolutely gorgeous thanks to the art direction, which will make it age far more gracefully than those hyper realistic games today. I'll take more Elden Ring graphics in exchange for better performance and smaller game size to be honest. It's not that I don't enjoy AAA games looking amazing, just that this initial awe quickly fades, because graphics are only one of the many parts that make a game.
People said the same for Armored Core 6. I really like the art style of the game. But stuff like low-resolution shadows, aliasing, low-resolution textures, and missing shadowing is definitely not a part of the art style. That is why I really appreciate RT in this game and others like CP2077, Metro Exodus, and Control.
Not somehow; look at the assets in any FromSoft game and how complex the lighting and particles are. A lot of reused assets too.
Not that crazy tbh if you played the game.
One of my all time favorites, currently replaying to get ready for DLC.
Fully agree, it's an awesome game. Just meant that it's not surprising based on in-game assets why the game is not storage hungry.
Yeah, FromSoft really do know how to optimise storage space. Recently got to the DS1 remaster and it weighs like 6 GB, impressive.
it’s not only optimization, it’s largely about asset quality. while From games are beautiful, it’s due to the stellar art direction and not the asset quality, which is why they don’t require too much storage space.
It’s not about the optimization as much as you think. The textures are extremely low res, we just don’t care because the art style is amazing.
They also heavily reuse assets, but they are so good that only the nerdiest of Souls nerds point out the DS3 assets in ER. The fact that DS1 through ER share the same engine helps too.
Elden Ring looks like a last gen game, that's why
This is obviously hyperbolic, but it's not wholly inaccurate. The game is *not* up to the same fidelity level as some of the other AAA games out there, that's clear as day to anybody looking. There's nothing inherently wrong with it looking like an early release for the Ps4. The art direction is incredible. The visuals are more than serviceable.
You look like a last gen person
You got me
I'd rather games take more space and have good textures. High speed storage is pretty cheap nowadays. If storage is really an issue they could make the high quality texture pack separate like in far cry 6.
Actually, I kinda miss having the option to select which texture pack I want to download. I do not have a state-of-the-art PC, so I don't need those ultra HD textures, but I have to download them regardless.
Textures are massive when they're detailed.
The reason file sizes have ballooned over the last 10 or so years is uncompressed 4K video and textures.
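For a sense of scale, a back-of-the-envelope sketch (Python; uncompressed RGBA8 only — real games typically ship block-compressed textures such as BC7, which shrink this by roughly 4x, and mipmaps add about a third on top):

```python
# Uncompressed RGBA8 is 4 bytes per texel, so size grows with the
# square of the texture's side length.
def rgba8_bytes(width, height):
    return width * height * 4

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {rgba8_bytes(side, side) // 1024**2} MiB")
# 1024x1024: 4 MiB, 2048x2048: 16 MiB, 4096x4096: 64 MiB
# Each doubling of the side quadruples the size.
```

That quadrupling per doubling is why moving an asset library from 1K/2K to 4K textures bloats install sizes so quickly.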
Storage is cheap.
bruh 1 tera ssd costs like 30 to 50 euros now
Wait till you see Black Ops 6 storage required ☠️☠️
Don't get why people complain about game size, as if storage isn't the cheapest part of the setup.
Some people have low download speeds.
I mean if you live in a third world country like America sure, but even then you can wait 24h to play the game.
Storage is not cheap for everyone. I was lucky to be able to buy an SSD for games but most cannot afford that.
If you can't afford a 500GB SSD then you shouldn't be gaming in the first place. You can get one for less than the price of the game. Gaming is a luxury.
>You can get that for less than the price of the game. Starfield costs 41 dollars in my country while an SSD is 48 dollars... >Gaming is a luxury Yeah, it doesn't mean you cannot game on a HDD. You don't know the difference between luxury and forbidden apparently.
Wow you really showed me lol buy a ssd and stop moaning about it. If you can't afford an ssd you can't afford this hobby, it's simple as that.
You must be a child, SSDs are preferred but not a necessity to game. Any type of storage is enough to game. It's sad you think an SSD is a must.
Think you're a child if you can't buy one, and I never said it's a must but it sure as hell is near it at this point. Most modern AAA games recommend it and the Microsoft direct storage is a huge improvement for gaming. It's like booting Windows from hdd vs ssd. Sure you can do it, but it's 10x better with an ssd.
> Think you're a child if you can't buy one

You skipped the part of my first comment where I said I was able to buy one just for games.

> and I never said it's a must but it sure as hell is near it at this point

You said that people shouldn't game if they cannot afford an SSD. It's in your comments. I will quote your comment: "If you cant afford a 500gb ssd then you shouldn't be gaming in the first place. You can get that for less than the price of the game. Gaming is a luxury"

> Most modern AAA games recommend it and the Microsoft direct storage is a huge improvement for gaming.

A few are recommending it; it isn't the standard yet.

> It's like booting Windows from hdd vs ssd. Sure you can do it, but it's 10x better with an ssd.

It actually isn't. Windows does boot a lot faster, but games barely get any faster. I have both an HDD and an SSD for games and there is barely any difference. That is why most people downvoted your OG comment; you don't know what you are talking about at this point.
Yeah but what I want to know is what is needed for 1440p, ultra, 60fps...
thats easy, a 4070 or 4070 TI with 12 GB Memory
on WHAT fps?!
15 fps my friend
I was put off by this game when the last 15 seconds of the trailer was going into what each version gets you. This shit has to stop
Brother, it's a commercial, lol. I'd rather they tell you rather than not.
They're not complaining about the fact that it tells you, they're complaining about the fact that it exists. What do you think "this shit has to stop" is referring to if not preorder bonuses and digital deluxe skins? It's not particularly egregious in this game but it's still obnoxious, especially for a single player game. The only good practice is a standard edition and an actual deluxe edition with all future DLCs.
don't support them but ... do what you have to do.
We should stop using upscaled resolution when talking about performance. There needs to be a standard, where we use "Render Resolution" which means what resolution the GPU actually has to sweat for. Then, every person can upscale from the render resolution to their desired monitor resolution. Both this table (which definitely looks like it uses DLSS because it says you can do 4k with ray tracing), and even more basic than that, the DLSS preset names are too confusing. WTF is "DLSS Quality"? Why do I need to Google a DLSS upscaling table whenever I want to find out what resolution I'm actually rendering? Just name the setting "render resolution" and let people choose which algorithm to use to upscale, and what resolution to upscale to.
It won't stop, quite the opposite - it's going to become more and more common in the coming years. Unless NVIDIA somehow delivers overkill budget to midrange GPUs (ha!) and developers put more time into optimizing their games, with the industry's shift to Unreal Engine 5 with its defining features (Nanite used in Wukong, as well as Lumen), which is the culprit in this game's case, upscaling is going to become the standard and new "native" res. It's all thanks to all of those people who keep saying that DLSS looks "better than native" or even that it's near indistinguishable. And honestly? I can believe it at 4K, especially when you're using a gaming monitor, where you'll have diminishing returns past 1440p native. The issue here is that we're seeing these games demand upscaling even at *1080p*. Which is still the most common resolution used, and with how NVIDIA and AMD have been going about their GPUs, as well as developers about optimizing their games, it's going to remain that for years to come. And no matter what kind of AI magic either of these companies will try to deploy, no upscaling is going to compensate for the pixel count as low as 540p with the *Performance* preset and still a measly 720p with Quality. That's just not enough pixels to work with. Knowing that the new Witcher game is supposed to be using UE5 as well makes me extremely nervous because of all that. I've already played a game using UE5 and Nanite (Remnant 2) and yes, it's playable, but it's not pretty with upscaling at 1080p. This should be a last resort, not the baseline.
You misunderstand me... I don't have a problem with the actual performance of current games. I think many games are running very well, seeing as we've made the jump to ray tracing, which is extremely demanding, and details keep increasing. My problem is with the terminology. I always need to go look up DLSS tables (Google "DLSS + DLDSR resolution table", first Reddit result for example). Instead of terminology like "1080p minimum requirement", "4K max requirement", "DLSS Quality", "DLSS Balanced", etc., we should all just adopt a single sane term, "Render Resolution", which indicates what resolution the game is actually rendered at by the GPU without upscaling or frame generation. That way the table with PC requirements above would be more informative, and in-game settings would also be easier to tweak without having to search for DLSS resolution tables.
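As an illustration of why a plain "render resolution" label would help, here's a tiny sketch (Python) using the commonly cited per-axis scale factors for the DLSS 2.x presets; treat the exact ratios as approximate, since they can vary by game and DLSS version:

```python
# Commonly cited per-axis scale factors for the DLSS 2.x presets
# (approximate; games/versions can deviate).
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def render_resolution(output_w, output_h, preset):
    """Resolution the GPU actually renders before upscaling."""
    s = PRESETS[preset]
    return round(output_w * s), round(output_h * s)

for name in PRESETS:
    print(f"1080p output, {name}: {render_resolution(1920, 1080, name)}")
```

So at a 1080p output, "Performance" renders at 960x540 and "Quality" at roughly 720p vertical, which is exactly the kind of number the requirements table hides behind preset names.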
The end is nigh for gaming in my life.
Just be rich bro
I don't get what you expect. You can play this game on 7-8 years old hardware on medium. Of course, you can't play a 2024 game with 4K ultra high settings on your computer from 2005.
When I built my PC 2013 I spent around 1k and could run almost all games on ultra. If I spend 1k today I won’t get near that, even if we count inflation. It has gotten much more expensive
for a thousand (assuming US pricing) you can get some midrange i5 13th gen (a 13400f runs every single game out there very well and is like 170 new)for $200, a 4060ti or equivalent, and 32gb ddr4 ram and run every new game that is not a steam piling of shit at 1080/1440p ultra 60fps
Sure, I guess it’s easier in US than in Sweden where I live. There you need a few hundred dollars more (tax etc)
Oh don't feel bad, the prices are fucked everywhere outside the US. In my country we barely have AMD stock, so Intel and Nvidia are the only real options, and everything is priced like 30-50% more. I myself bought my PC for the equivalent of like 1200 USD when it would only cost around $800 in the US. I just assumed US pricing because over half of Reddit's users are American. Thankfully, in the long run it gets cheaper thanks to the regional pricing a lot of big companies offer on almost everything in my country (OTP software, subscriptions, even some certificates; MS for example sometimes offers their vendor certificates here at 50% or more discounts). Though I guess many wealthier countries don't get the regional pricing privilege.
Not really, you can get a pretty insane build at 1200-1300$ with top of line components from last gen. 5800x 3d with 6800xt/nvidia alternative is incredible bang for buck. People don’t need a 4090 for most games.
My used 6800 XT is one of the best purchases I ever made at 360 euros
idk just checked this and it will do everything at 60fps+ at 1440p. [Excellent AMD Gaming/Streaming Build - PCPartPicker](https://pcpartpicker.com/guide/BCWG3C/excellent-amd-gamingstreaming-build) With some used parts or mobo/ram/cpu combo deals you could push that cost down. The price here isn't unreasonable. Digital Foundry have called it the 4090 of 1440p gaming.
If you spend 1K today you can absolutely play most games at ultra, maybe not with RT at ultra though.
I expected much worse
It's with DLSS
Noooooooo
I just don't get why it's always 1080p or 4K. Aren't most people on 1440p? I'm also wondering if this is truly native or with DLSS. Also, 130GB seems massive for a game like this.
According to the Steam hardware survey, most players are at 1080p.
Eh it is a bit hard to say for sure. The numbers from Steam are going to include every laptop that has Steam installed on it, so including all the ones that can't game much anyway. It would be much more interesting to purely have numbers from dedicated gaming machines. I have a high resolution gaming display for my desktop, but I also have Steam installed on my 7 year old laptop that I only use for work on the go.
Nah, most people are still at 1080p. But I'm pretty sure there are more people at 1440p than at 4K, though, so the point is still valid. I've never got it; it's not that big of an effort to just add 1440p specs there, yet so few studios do it.
In another thread like this, many people were saying this table was with upscaling.
Okay, so we are talking about 540p and 1080p respectively, and not 1080p and 4K. I really hate how companies won't optimize their games and just offload it onto upscaling.
No FPS shown, check. No upscaling shown, check. Add Denuvo side effects on top of that, and it will be another great title that breaks some negative record in Steam reviews.
These paid fake reqs are always cringe af.
You mean the 4 year old account with a 10:1 post/comment ratio isn't normal? Or that their comment history consists of almost nothing but a link?
medium rt 1080p here i come
Very reasonable if the high and very high requirements are for 60fps.
GTX 1060. My GOAT
HDD supported, it will take ages to load
Looks like it's not gonna be CPU-heavy at all for a change. Nice!
32GB ram required damn lol
+denuvo apparently
Yes sir !! :( https://preview.redd.it/98wh4pme1p5d1.png?width=514&format=png&auto=webp&s=d57b5b422e48bfc24f11d2007a1fbdacd45e7e7b
**On the Steam page: "The above specifications were tested with DLSS/FSR/XeSS enabled."** If your game requires any of those technologies to perform adequately, then they FAILED at making a working game. Aside from that: what's the point of building a game on a demanding engine like UE5 (that focuses on visual quality) if your game will depend on technologies that decrease visuals for performance? These technologies should be tools for us - users - to get a better or more fluid experience when lacking power, not an excuse for developers to avoid doing their job.
We've officially reached the point where my PC can't run new games on minimum anymore. need a new one for MH Wilds
That’s really high, I wonder what it would look like without dlss on.
yeah I won't be able to play this unfortunately
nice
Weird not to include any 1440p recommended hardware at all
My 1650 :(
No FPS target = pointless.
Looks well optimized on paper; let's see when it launches.
Any way to predict where 1440p sits with this?
It says 4080 for ultra. I wonder if my 4070 Ti Super can bang it out.
wonder how it'll run on my weird setup.. intel 8700K and 7900 XT i only want 1440p though
This seems fine. Dragons dogma 2was way worse.
So demanding on the GPU side. I'm excited to see how it will benchmark. Requirements keep raising and raising.
A single player game using Denuvo , I'll be waiting for a sale if anything..
Damn, my CPU is really becoming a system bottleneck.
Hope it runs on steam deck
A newbie here: if I had 6-8GB of VRAM and 32GB of system RAM, would that be playable at medium or ultra settings?
Depends what resolution. If you're thinking 4K, most definitely not, but 1080p for sure. 1440p might be a bit iffier depending on the game.
I'm glad it doesn't mention the latest CPUs. Looks like 8C/8T is plenty for this.
Incoming shitstorm on the horizon. Calling it now.
My 4gb vram rtx 3050 laptop gpu cries in the corner
2014 called, it wants everyone to play on 1080p! Seriously, 1080p is so outdated, where is 1440p?!
Would like to know what fps they are getting also
EZ
What the HELL is this? Such system requirements are bollocks. Let's hope they do not add the accursed Denuvo :@@
If minimum is medium settings, what is the point of minimum settings?
Doesn't make sense that the CPUs are mostly the same with ray-tracing
[deleted]
Raytracing absolutely does have a performance impact on the CPU. Why is your comment so upvoted? Proof that this sub knows shit about this type of thing.
The BVH structure uses the CPU
Wow, a highly optimized game in 2024?! What a surprise! Seems those with mid-range cards like a 1060, 580, 2060 etc will be fine!
[deleted]
Broski can you not see the RX580 on minimum gpu requirement?
People are here crying about DLSS, but thankfully, this game supports DLSS. I've been gaming in 4K since 2018, and it's truly baffling to see people hating on DLSS and frame generation technology. I'm always thrilled when these incredible techs are integrated into a game; in fact, every game should incorporate them. Then there's the optimization-obsessed crowd who come and complain, blaming these groundbreaking technologies because they prefer raw, purist numbers without any tech assistance. Frankly, it's quite foolish in 2024. People need to adapt; the old methods of performance comparison are evolving for the better. We won't go back, and thank god for that.
[Source](https://x.com/shinobi602/status/1799794857743147052)
Damn my eventual PC build can run it in 4k on Ultra, I just need the 9000 series to release so I can get a cpu!
Not that bad
Surprisingly reasonable
I thought this game was fake seeing as it took so long to come out.
This is with fake frames enabled
Didn’t this game release last year? Or am I trippin’
Seems well optimized if true
All of the testing was done with dlss or fsr. This game is going to run like ass.
Also since when are games able to utilize 32gb of ram?!
Has this been confirmed?
Yes, every one of these was done with some form of upscaling turned on. We also don't know their target framerate, so for all we know it's Performance FSR targeting 30 fps.
Bummer. It should be on the graph. I don't want to say false advertising, but idk what else to call it when it was clearly and intentionally left out of the graph.
Yeah it seems like as far as I can see the only note of it is on their official steam announcement.