ClassicGamer592: This YouTube channel explains what's wrong with Unreal Engine 5 and modern graphics.

And many games from the late 2000s still look good today, like Killzone 2 running at native 720p with 2x MSAA on PS3. We didn't need blurry TAA, upscaling, or frame generation to get amazing graphics back then. Now we have games running below 720p on PS5, or worse, below 480p on Xbox Series S in some Unreal Engine 5 games!
This video is really amazing and rich in detail.
It's the kind that people don't like to watch, and so they fall back on the generalizations that caused this mess to begin with.

Generalized solutions don't work when you need something as precise as this, and his explanation goes really deep.
Gotta be honest, I couldn't understand all of it.

But here's what I noticed (and I don't really think this is the indie/AA devs' fault):

I saw a pattern of companies creating huge amounts of propaganda to get people to normalize and use specific products; products apparently made by the same people who made the propaganda in the first place, so that said products get normalized again in the market, for customers who supposedly have no access to better information.

As you explained, about how 2000s games can look good today with enough upscaling and still run well, I think it's all because back then most developers had to create their own engines and optimize them for the hardware of the time.

Nowadays I always hear people saying: "Yeah, we could optimize that, but now we don't need to because we have the technology to run it!" Yes, sure! Except only 5% of humanity will have the money to enjoy it. The others will, well, as always, complain that the product was not well made.

Dropping the "rant-like commentary", it's a difficult problem to solve:
While we could argue that devs should develop their own engines like in the past, this is by no means feasible for 8th-gen demands. For small games with simple mechanics and not many complex assets, maybe, but what about mid-sized companies producing AA games?

Those engines "empowered" (I've started hating this word because of how it's used nowadays, but anyway... :P) developers to create AAA-like products, and that's amazing.

My mind rests on the idea that a middle-ground solution will be found where, like others have said in this thread, developers will learn to use those awesome tools correctly: rejecting the one-size-fits-all features that leave games unoptimized, while embracing and developing, together with the community, technologies that produce better-optimized games for last-gen and current-gen hardware.

There are other problems "below the surface" though.
But those would derail the talk to the point of needing another thread, or even another forum.

Thanks for sharing his channel and this amazing video!

idbeholdME: This is the core of the issue, yes. Time pressure, and/or wanting to cheap out by not hiring high-level developers. It's much easier to hire a person and tell them "check these 2 boxes here and then do this" than to hire someone who can modify the underlying engine code as needed and cook up a customized solution for the project in question. And most importantly, it's much easier and cheaper to find somebody able to work in a widely available engine tailored for large-scale projects than to initiate someone completely from scratch into your in-house engine. When a developer can start working on the project immediately, rather than going through a month-long initiation into the inner workings of your custom engine first, it saves a lot of money. That is also a major part of why a lot of studios are dropping their engines and switching to Unreal: there is pretty much guaranteed to be a steady supply of hirable, ready-to-go devs. Maintaining and updating/developing a custom, studio-specific engine is extremely expensive.

But it's true that UE focuses a lot on ease of use and may be pushing its still basically experimental features as the go-to/definitive solution a little too fast. Nanite and Lumen don't have to be used and may not always be the best solution for a specific scenario. But that's a choice by the developer, not really the fault of the engine. The features were first introduced with the launch of UE5, which was a mere 2.5 years ago. Look at ray tracing, for example, and how much time passed before it slowly started becoming the norm.

Lastly, here are some highlights of the latest 5.5 version of the engine.
https://www.youtube.com/watch?v=BcmUZpdChhA
About this, I completely agree.
Here's an example of that from the comments on the video linked by @ClassicGamer592:

@unrealcreation07 - 3 months ago

10-year Unreal developer here. I've never really been into deep low-level rendering stuff, but I just discovered your channel and I'm glad your videos answered some long-standing questions I had, like "why is this always ugly and/or blurry, no matter the options I select?? (or I drop to 20 fps)".

With time, I have developed a kind of 6th sense about which checkbox will visually destroy my game, or which option I should uncheck to "fix" some horrible glitches... but I still don't know why most of the time. In many cases, I just resign myself and have to choose which scenario I prefer being ugly, as I can never get a nice result in all situations. And it's kind of frustrating to have to constantly choose between very imperfect solutions or workarounds. I really hope you'll make standards change!
We can't say this happens most of the time, but it's definitely happening.
Not even the developer knows what he is doing, and it kind of breaks my heart.
He's just following generalized criteria based on his own senses, or on his project superiors.
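
To make those "checkboxes" a bit less abstract: most of the features being discussed here (TAA, Lumen, TSR upscaling) map to Unreal console variables that a project can override. As a rough sketch only - these are the UE5 console variables as I understand them, and names and default values can shift between engine versions, so double-check against your own engine - a project's DefaultEngine.ini can set, for example:

    [/Script/Engine.RendererSettings]
    ; Anti-aliasing: 0 = none, 1 = FXAA, 2 = TAA, 3 = MSAA, 4 = TSR
    r.AntiAliasingMethod=0
    ; Global illumination: 0 = none, 1 = Lumen, 2 = screen-space GI
    r.DynamicGlobalIlluminationMethod=0
    ; Reflections: 0 = none, 1 = Lumen, 2 = screen-space reflections
    r.ReflectionMethod=2

Nanite, as far as I know, is mainly a per-mesh flag plus the r.Nanite runtime console variable for quick A/B performance tests. The point is simply that none of this is mandatory; it's a set of defaults a team can consciously opt out of, which is exactly the kind of informed choice being asked for here.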

---

Edits:
Wording, corrections, added info.
Post edited Yesterday by .Keys
botan9386: Either that, or developers are not as good as they used to be at optimisation. Some of my favourites still look and run amazing and have tiny file sizes compared to modern titles.
Both. UE5's appallingly bad optimisation (and developers' complete lack of competence / laziness) is disproportionately worse than before. E.g., wind back the clock a decade to early-2010s games - examples like Alien Isolation, Bioshock Infinite and Dishonored (the latter two on UE3) actually ran very well at 1080p on High settings on low-end GPUs of the same era (e.g., GTX 750 Ti, HD 7790) that cost £100-£150, and only dipped below 60fps on Ultra. Back then, turning settings down from Ultra to Low also often doubled the frame-rate, because developers put in the play-testing effort to "disable the settings with the largest performance hit but smallest image quality impact first".

Today you can blow £300-£400 on a "budget" GPU and still get 40-50 fps at 1080p, then turn down the settings only for the game to look worse rather than run faster, only to be met with "upscaling is the new norm!" gaslighting. Compare how a shiny new UE5 2024 AAA game looks on Low using 9GB of VRAM at 1080p next to a 2014 indie game on High using 1.5GB of VRAM at 1440p, and the conclusion is clear: we are very definitely on a "post-peak developer competence" downslope at the moment...
Post edited 20 hours ago by AB2012
P. Zimerickus: I never managed to get past a total system load of 550W myself, and that's with a 450W GPU.
I also rarely use more than 170 watts when gaming (GPU only, according to NVIDIA...).

Power consumption is rising noticeably, though. I can't run Cities: Skylines 2 under 200W with medium-low settings and the Performance DLSS option... 30 fps at the moment... I had actually gotten used to an average of 140W in previous years, but as you just read, that figure has increased by almost 30W.
Palestine: P. Zimerickus, are these wattage figures obtained via software, or from the wall outlet (using something akin to a 'Kill A Watt')?
The latter is far more accurate.
It is difficult to give an accurate figure, I guess... I mean, these are only measurements taken from my card... but from my experience with NVIDIA, I'd assume that for a reasonable 60 fps at 2K the numbers probably won't differ too much whether you're running a 3090 Ti, a 4090, or the coming 5090... That always manages to confuse me a lot. Why does a card that is sold as four times more powerful end up with similar behavior in FPS-locked or power-locked situations??? You tell me!
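
On the software-vs-wall question: on-card readings like these come from the driver's own power sensor (the same number nvidia-smi reports), so CPU, RAM, fans and PSU losses never show up in them; only a wall meter catches total system draw. As a minimal sketch of how such readings are polled - assuming the nvidia-ml-py (pynvml) Python package, which is my own example and not something anyone here said they use:

    # Poll the driver-reported board power of GPU 0 via NVML.
    # This is the same figure nvidia-smi shows, NOT wall power.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    try:
        for _ in range(10):
            milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
            print(f"GPU board power: {milliwatts / 1000:.1f} W")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()

So a 170W "GPU only" figure and a 550W wall figure can both be right at the same time.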
Developers are coding like improv groups; you need to see these monkeys code.
I'm 42!! I am still coming up with new and faster ways to do math and draw visuals.
We have all this hardware that optimizes data flow and has advanced methods.

Nearly every system has an NPU... how many games have used it so far?
Every NVIDIA GPU has PhysX, and no one uses it. "Maybe because NVIDIA didn't make it less annoying to use..."
Games could run at twice their speed... but that does not sell hardware... so it's crime for crime..

The days of developers brute-forcing their code to waste no cycle... are dead..
Post edited 20 hours ago by XeonicDevil
The most impressive engine overall is currently probably Decima, the one used in Horizon Zero Dawn and the other titles in that series. Apart from Kojima Productions, only Guerrilla Games is using this engine.

UE5 is... simply a new "industry standard", able to provide next-gen graphics and certainly high flexibility... but it is a big performance hog. This is the true downside of this engine, which may require some improvements, or else it will result in a "hardware war"; GPUs above lower-midrange (basically above Intel's Arc series) have been "luxury hardware" for a good while already.

Gamers can check out the price of a PS5 Pro, which is one of the most expensive consoles ever made... yet that is what it takes to have sufficient hardware for really demanding engines. The price will probably not come down... so I guess it is safe to say this will be the new price tag for "capable" hardware in the future.

Sure, the new Intel Arc is a "fresh piece of affordable GPU", but let's be honest... it is not a GPU you can run 4K and/or nearly maxed-out graphics with. The minimum would be the "category" I have been using for more than two years already, which sits around the 3090 Ti / 7900 XTX and 4070 Ti and up.

Sure, even the Decima engine pushes the PS5 Pro like crazy in the newest Horizon Zero Dawn remaster, but much of what I was able to see there on the PS5 Pro was really impressive... in many cases it looked better than, for example, "Stalker 2", which is a UE5 game.
Post edited 19 hours ago by Xeshra
Palestine: In my opinion, the price is not even a concern; it is the energy consumption of new GPUs. It is beyond ridiculous that it is now regarded as the norm for dedicated graphics cards to have a TDP of over 200 watts... some as high as 450 watts (uncertain of the true consumption during full utilization). A few years prior, I had made the decision to limit myself to only purchasing relatively efficient APUs (Accelerated Processing Units), which, fortunately, are now able to compete with (or, at the very least, match the performance of) some decent dedicated graphics cards.
Since I play only on (gaming) laptops, my solution to that problem is to play AAA games many years after their release, when mobile GPUs have caught up and are able to run those games satisfactorily. Plus, of course, the games themselves might have received some performance optimizations in the meantime as well.

This has worked so far; unfortunately, now they are saying Moore's law (and Moore himself) is dead, so maybe we will need that 450W GPU to play The Witcher 4 even in 2050 (provided it is released before then)?

Maybe we should at least leave x86 behind and start using more energy-efficient ARM-based CPUs, like Snapdragon? That's a start. I'm not sure whether NVIDIA GPUs have similar old backwards-compatibility baggage dragging them down as well, or whether they are already about as optimized and power-efficient as GPUs can be.
Post edited 17 hours ago by timppu
I just had another run (only 18 minutes) in Darkblade Ascent, a rogue-like dungeon runner developed by two brothers from Europe. These guys developed the entire game in UE5, with a DLSS option. I played on high settings at 60 fps with the Quality DLSS setting and saw a 160W average power draw.

When you look at the attached image you can see that, in terms of model detail, the game severely lags behind its higher-budgeted counterparts. Still, the overall visual quality wildly "outperforms" any pre-2020 game no matter its budget. I much prefer this result over any other (Unity).
Attachments:
Xeshra: Sure, the new Intel Arc is a "fresh piece of affordable GPU", but let's be honest... it is not a GPU you can run 4K and/or nearly maxed-out graphics with. The minimum would be the "category" I have been using for more than two years already, which sits around the 3090 Ti / 7900 XTX and 4070 Ti and up.
4K still is, and will be for a long time, first and foremost a massive resource hog mostly for bragging rights. If anything, the next goal of the industry should be 1440p at 120 FPS. It's less than half as demanding as 2160p (about 44% of the pixel count) while still offering a noticeable upgrade over 1080p. And any surplus performance going into extra FPS, instead of always clinging to 60 because of consoles, is much better spent. Though of course, then we come to the problem of most TVs being incapable of more than 60 FPS, and we come back full circle to consoles steering the entire industry in a suboptimal direction.
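
For reference, the pixel math behind that comparison, as a quick back-of-the-envelope Python check (frame cost doesn't scale purely with pixel count, so treat it as a ballpark only):

    pixels_1080p = 1920 * 1080   # 2,073,600
    pixels_1440p = 2560 * 1440   # 3,686,400
    pixels_2160p = 3840 * 2160   # 8,294,400

    print(pixels_1440p / pixels_2160p)  # ~0.44 -> 1440p is ~44% of the 4K pixel load
    print(pixels_1440p / pixels_1080p)  # ~1.78 -> still a big step up from 1080p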
Post edited 13 hours ago by idbeholdME
.Keys: We can't say this happens most of the time, but it's definitely happening.
Not even the developer knows what he is doing, and it kind of breaks my heart.
He's just following generalized criteria based on his own senses, or on his project superiors.
Reading that quote from a 10-year developer is pretty disappointing. They were never expected to understand how their tools work, and because of that they could only choose the least-bad option when they ran into issues. If developers can't troubleshoot, then who will?

AB2012: Compare how a shiny new UE5 2024 AAA game looks on Low using 9GB of VRAM at 1080p next to a 2014 indie game on High using 1.5GB of VRAM at 1440p, and the conclusion is clear: we are very definitely on a "post-peak developer competence" downslope at the moment...
I feel this even with non-UE5 games like Cyberpunk and Monster Hunter. VRAM aside, there is no justification for why so many recent games are such CPU hogs. Does everyone remember the high-density crowds in games like Hitman and Assassin's Creed? What happened... As impressive as the X3D chips are, they should not be a necessity.

Xeshra: Sure, the new Intel Arc is a "fresh piece of affordable GPU", but let's be honest... it is not a GPU you can run 4K and/or nearly maxed-out graphics with. The minimum would be the "category" I have been using for more than two years already, which sits around the 3090 Ti / 7900 XTX and 4070 Ti and up.
I have my eyes set on the 7900 XTX, as it sometimes drops to £700 on discount (or used). I expect it to go up when the next generation is announced, since NVIDIA will be overpriced and AMD won't compete with them. Unfortunately, it is as you say: budget GPUs, even at good prices, are being beaten down by modern games.

P. Zimerickus: When you look at the attached image you can see that, in terms of model detail, the game severely lags behind its higher-budgeted counterparts. Still, the overall visual quality wildly "outperforms" any pre-2020 game no matter its budget. I much prefer this result over any other (Unity).
I don't know... I wouldn't really mind if this game looked like Dishonored 2.
Post edited 12 hours ago by botan9386
Well, when a legion of clowns seems to believe that photorealism is more important than mechanics, we end up with lazy game design that looks pretty.

"If you want to live a happy life, never make a pretty woman your wife."

I look at Stalker mods and how sleek and slim they are; they play rather well on a potato. Then I see an over-100GB pile of pretty crap that is still buggy as F in the latest official game.

If the game devs were not wasting effort on graphical garbage, we would have had what fans wanted years ago. It's not exactly hidden from the official devs what they need to do.

So viewing this behavior, one can come to the conclusion that they just don't care. I mean, across most modern game design, it's like the developers are willfully ignorant of past projects across many platforms.

If so many older people were not part of the teams making these mistakes, I would go full old geezer, shaking my cane and saying "You damn whippersnappers today...." and "In my time we.....". I kind of do that anyway. But it's odd seeing so much malarkey! Especially when some devs have everything on display for them.
IMO it's a bad engine overall. It can be easy to work with, since you don't have to worry about stuff like getting really nice lighting, but that also means you need a mid-to-high-end PC to even run the game, because the devs just don't optimize shit and let the engine do everything.
Post edited 10 hours ago by Memecchi
Something has been going on with UE for some time, not only with UE5.

There are games on UE4 and UE5 that look like they are a generation older (naming Terminator: Resistance for UE4 and RoboCop for UE5 as examples) and that still cook my PC harder than Cyberpunk did.
Running Cyberpunk on high details and RoboCop on medium to low is some kind of bad joke.
To the point that my coolers spin up in not-so-crowded or sparsely filled places for no reason you can see (most likely bad geometry in the background).
And let's not forget the close-up cutscenes in both, which did the same thing while presenting their lifeless, doll-like faces...

It is lazy devs for sure, but to some degree it is also the auto "make it better" features of the engine, and people in engine development who think everyone opted in to "AI" upscaling and whatever modern graphics cards can do afterwards to upgrade a low-res picture full of artifacts.
It simply doesn't look like UE5 is actually built to run games at their native resolution without heavy post-processing of the rendered frame to get rid of artifacts. More like the opposite.

Personally, I would prefer a higher resolution with no need for any kind of AA because of it. No shitty filters and whatever, but an engine that is able to simply NOT render stuff the player can't see from his position. Because very often it seems that this is not happening properly, and it burns power for nothing.
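
For what it's worth, UE (like every modern engine) does already do frustum and occlusion culling; how aggressively and how correctly is the real question. Just to illustrate the basic idea for anyone unfamiliar - a toy Python sketch with made-up names that tests a single point against a symmetric view frustum, whereas a real engine tests bounding volumes hierarchically and adds occlusion queries on top:

    import math

    def in_view_frustum(point, cam_pos, forward, up, right,
                        vfov_deg=60.0, aspect=16/9, near=0.1, far=1000.0):
        """True if a world-space point lies inside a symmetric view frustum.
        forward/up/right must be an orthonormal camera basis."""
        # Offset from camera to the point, then project onto the camera axes.
        d = [p - c for p, c in zip(point, cam_pos)]
        z = sum(a * b for a, b in zip(d, forward))  # depth along the view direction
        y = sum(a * b for a, b in zip(d, up))
        x = sum(a * b for a, b in zip(d, right))
        if z < near or z > far:        # behind the near plane or past the far plane
            return False
        half_h = z * math.tan(math.radians(vfov_deg) / 2)  # frustum half-height at depth z
        half_w = half_h * aspect                           # half-width from the aspect ratio
        return abs(y) <= half_h and abs(x) <= half_w

    # Example: a point straight ahead of a camera at the origin looking down +X.
    print(in_view_frustum((10, 0, 0), (0, 0, 0), (1, 0, 0), (0, 0, 1), (0, 1, 0)))  # True

Anything that fails a test like this never needs to be drawn at all, which is exactly the "don't render what I can't see" behavior being asked for; occlusion culling then handles things hidden behind other geometry.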
Post edited 11 hours ago by randomuser.833
botan9386: The last batch of UE5 games have all received criticism for poor performance, especially those with large worlds. UE5 is now pretty much labelled a bad engine by the gaming community, but developers seem to welcome it, including CDPR, who will develop The Witcher IV with UE5 and likely Cyberpunk 2 as well.

But when your target audience needs a new $1000 system for a good experience, then maybe it's worth considering an alternative. I guess you could also argue that PCs should be cheaper than they currently are, so that a midrange PC is more accessible.

Either that, or developers are not as good as they used to be at optimisation. Some of my favourites still look and run amazing and have tiny file sizes compared to modern titles.
Both can be true. UE5 is bad, and developers are crap at their jobs for reasons I won't bother getting into here.