While my paltry Intel UHD 620 onboard GPU handles some very good games quite well, there are others that make it struggle (such as The Witcher 2, a game that is already 10 years old!).

Making games that are unplayable on systems without discrete GPUs, even at the lowest settings, is counterproductive for game companies.

First of all, GPU production has experienced chronic shortages and availability problems ever since the global cryptocurrency explosion.

Secondly, there is an absurdly large market of PCs with onboard GPUs.

Excluding such a computer base from your latest game simply imposes an unreasonable limit on your potential audience.

Games can have Crysis-like, ray-tracing levels of GPU consumption on their highest tiers; I can accept that. But studios should also ensure that their games are playable on the lowest settings with onboard GPUs such as Intel UHD chipsets.

What do you think?
Post edited March 08, 2021 by thegreyshadow
I think you're just looking at this from the wrong angle. I recently bought myself a Radeon Pro WX 3100 for somewhere under 200 USD. It's a completely boring workstation card with absolutely no flair. Which is great, because I stuck it in a boring workstation with absolutely no flair.

Now I realize that, as you live in South America, things have already become a little complicated.
Darvond: I think you're just looking at this from the wrong angle. I recently bought myself a Radeon Pro WX 3100 for somewhere under 200 USD. It's a completely boring workstation card with absolutely no flair. Which is great, because I stuck it in a boring workstation with absolutely no flair.

Now I realize that, as you live in South America, things have already become a little complicated.
I understand your point, but you need to see mine. Laptops are replacing desktops everywhere. My potato Intel UHD 620 system is a Dell laptop. Even if I had the cash to fork out for a discrete GPU upgrade right now, where would I put it in my laptop? And there are millions of such systems that gaming studios are ignoring to their own detriment.

It simply doesn't make business sense.

I repeat: on the highest tiers, go all the way you want; studios could even require three parallel jumbo GPUs to render everything in the highest detail; but it makes sense to have a "lowest / potato" tier that makes the game accessible to GPUs such as Intel onboard chips.
thegreyshadow: What do you think?
Not a viable idea at all, because it would require either:

a) all games have vastly obsolete graphics and/or

b) all devs spend an absurd amount of time & resources to make the game run on "ultra crap" settings...which are resources that are desperately needed for better & much more important aspects of the game.

And if they did re-direct all those resources, then the resulting games would suck most of the time, because not enough attention would have been paid to other things.
thegreyshadow: It simply doesn't make business sense.

I repeat: on the highest tiers, go all the way you want; studios could even require three parallel jumbo GPUs to render everything in the highest detail; but it makes sense to have a "lowest / potato" tier that makes the game accessible to GPUs such as Intel onboard chips.
Have you forgotten the Cyberpunk 2077 outcry?

CDPR explicitly supported PS4 and XBox One consoles, despite the platforms not being strong enough. Could you describe this blur as anything other than potato mode?

For releasing Cyberpunk 2077: Potato Edition, CDPR was pilloried in the press. Sony pulled the game from their online store and offered refunds to anyone that wanted it.

That hardly seems like business sense.
low rated
I think you are lame.
Guess why people buy those expensive GPUs? Because they want to play games with better and faster graphics.
And that's why gaming companies make those games.
thegreyshadow: What do you think?
Ancient-Red-Dragon: Not a viable idea at all, because it would require either:

a) all games have vastly obsolete graphics and/or

b) all devs spend an absurd amount of time & resources to make the game run on "ultra crap" settings...which are resources that are desperately needed for better & much more important aspects of the game.

And if they did re-direct all those resources, then the resulting games would suck most of the time, because not enough attention would have been paid to other things.
Time spent is not absurd if you can increase your potential install base by several orders of magnitude. Otherwise, it's a valid point.
thegreyshadow: It simply doesn't make business sense.

I repeat: on the highest tiers, go all the way you want; studios could even require three parallel jumbo GPUs to render everything in the highest detail; but it makes sense to have a "lowest / potato" tier that makes the game accessible to GPUs such as Intel onboard chips.
Mortius1: Have you forgotten the Cyberpunk 2077 outcry?

CDPR explicitly supported PS4 and XBox One consoles, despite the platforms not being strong enough. Could you describe this blur as anything other than potato mode?

For releasing Cyberpunk 2077: Potato Edition, CDPR was pilloried in the press. Sony pulled the game from their online store and offered refunds to anyone that wanted it.

That hardly seems like business sense.
I think this is the result of bad technical decisions rather than of optimizing for onboard GPUs.

I can run Far Cry 2 and F.E.A.R. 2 with much better graphical results on my "potato" GPU.

So your argument doesn't really apply here. CDPR screwed up Cyberpunk 2077 in many ways. The game was released unfinished, both for potatoes and for high-end systems.
Orkhepaj: I think you are lame.
Guess why people buy those expensive GPUs? Because they want to play games with better and faster graphics.
And that's why gaming companies make those games.
I'm not lame.

Gaming companies may require a rig cooled with liquid nitrogen for the highest tier if they want to, and this might well drive GPU sales.

My point is that the entry level should not require discrete GPUs; you're talking about higher levels.
Post edited March 08, 2021 by thegreyshadow
low rated
Ancient-Red-Dragon: Not a viable idea at all, because it would require either:

a) all games have vastly obsolete graphics and/or

b) all devs spend an absurd amount of time & resources to make the game run on "ultra crap" settings...which are resources that are desperately needed for better & much more important aspects of the game.

And if they did re-direct all those resources, then the resulting games would suck most of the time, because not enough attention would have been paid to other things.
thegreyshadow: Time spent is not absurd if you can increase your potential install base by several orders of magnitude. Otherwise, it's a valid point.
Mortius1: Have you forgotten the Cyberpunk 2077 outcry?

CDPR explicitly supported PS4 and XBox One consoles, despite the platforms not being strong enough. Could you describe this blur as anything other than potato mode?

For releasing Cyberpunk 2077: Potato Edition, CDPR was pilloried in the press. Sony pulled the game from their online store and offered refunds to anyone that wanted it.

That hardly seems like business sense.
thegreyshadow: I think this is the result of bad technical decisions rather than of optimizing for onboard GPUs.

I can run Far Cry 2 and F.E.A.R. 2 with much better graphical results on my "potato" GPU.

So your argument doesn't really apply here. CDPR screwed up Cyberpunk 2077 in many ways. The game was released unfinished, both for potatoes and for high-end systems.
Orkhepaj: I think you are lame.
Guess why people buy those expensive GPUs? Because they want to play games with better and faster graphics.
And that's why gaming companies make those games.
thegreyshadow: I'm not lame.

Gaming companies may require a rig cooled with liquid nitrogen for the highest tier if they want to, and this might well drive GPU sales.

My point is that the entry level should not require discrete GPUs; you're talking about higher levels.
What entry level? Should they set the entry level so that you can launch the game and load the menu, and then when you start playing it instantly crashes to desktop and shows a "buy a GPU" error? That would make no sense at all.
Orkhepaj: What entry level? Should they set the entry level so that you can launch the game and load the menu, and then when you start playing it instantly crashes to desktop and shows a "buy a GPU" error? That would make no sense at all.
The scenario you propose would, of course, not make any sense at all. But that's not what I'm proposing.

Entry level: Intel GPU or similar.
The game should be playable with an acceptable level of detail.
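
To make the idea concrete, here is a minimal sketch of what such a fallback could look like. The preset names and GPU strings below are made up for illustration only; this is not any real engine's API.

```cpp
// Minimal sketch of a "potato tier" fallback, assuming a hypothetical
// pickTier() helper and made-up GPU name strings (not any real engine's API).
#include <iostream>
#include <string>
#include <vector>

enum class GraphicsTier { Potato, Medium, Ultra };

GraphicsTier pickTier(const std::string& gpuName) {
    // Integrated chips get the lowest playable preset instead of a refusal to run.
    const std::vector<std::string> integratedHints = {"Intel UHD", "Intel HD", "Vega 3"};
    for (const auto& hint : integratedHints) {
        if (gpuName.find(hint) != std::string::npos) {
            return GraphicsTier::Potato; // e.g. 720p, no ray tracing, short draw distance
        }
    }
    // Everything else starts at Medium; Ultra stays opt-in for high-end rigs.
    return GraphicsTier::Medium;
}

int main() {
    std::cout << static_cast<int>(pickTier("Intel UHD Graphics 620")) << '\n'; // 0 = Potato
    std::cout << static_cast<int>(pickTier("GeForce RTX 3080")) << '\n';       // 1 = Medium
}
```

The only point is that detection falls back to a playable preset instead of refusing to run on integrated hardware.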
low rated
Orkhepaj: What entry level? Should they set the entry level so that you can launch the game and load the menu, and then when you start playing it instantly crashes to desktop and shows a "buy a GPU" error? That would make no sense at all.
thegreyshadow: The scenario you propose would, of course, not make any sense at all. But that's not what I'm proposing.

Entry level: Intel GPU or similar.
The game should be playable with an acceptable level of detail.
But many games are not playable at all on that hardware.
And it would make no sense to limit games to that level.
thegreyshadow: Time spent is not absurd if you can increase your potential install base by several orders of magnitude.
Determining your potential market for different products, and thus deciding on a target product profile (TPP), is one of the jobs that a company's business strategy division will be doing, and you can be sure that the larger game developers/publishers have such divisions. The fact that these companies are releasing games that target mid to high-range hardware should tell you the conclusions they came to about their potential markets. While this may not be the conclusion you desired, if you find yourself running a game development company you are welcome to make a different decision.
You have a really valid point about the industry focusing on low-end GPUs for profit.
This is also good for the indie industry, because many indie games don't require high-end GPUs.

The problem is: if the industry focuses more on low-end GPUs, would the monopoly of NVidia, Ryzen and high-end 9th/10th-generation Intel chips lose money? I don't have enough data to make up my mind on this topic. -tl;dr-

But as you said, Ancient-Red-Dragon has a valid point too. It's interesting. Most recent games can't run on onboard GPUs, not because the Intel UHDs lack memory, but because the games are badly optimized.

When do graphical improvements need to be the focus during development, rather than general optimization?
People would be surprised at how many games have junk code inside slowing everything down.
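
As a purely hypothetical illustration of the kind of junk code that can drag performance down regardless of the GPU (a made-up example, not taken from any real game):

```cpp
// Hypothetical example of per-frame waste; not from any real game.
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

struct Entity { int id; std::string name; };

// Wasteful: linear search through every entity, every frame, for every lookup.
const Entity* findSlow(const std::vector<Entity>& entities, int id) {
    for (const auto& e : entities) {
        if (e.id == id) return &e;
    }
    return nullptr;
}

// Cheaper: build an index once, then reuse it each frame.
// (The index holds pointers, so the entity list must stay alive and unmodified.)
class EntityIndex {
public:
    explicit EntityIndex(const std::vector<Entity>& entities) {
        for (const auto& e : entities) byId_[e.id] = &e;
    }
    const Entity* find(int id) const {
        auto it = byId_.find(id);
        return it == byId_.end() ? nullptr : it->second;
    }
private:
    std::unordered_map<int, const Entity*> byId_;
};

int main() {
    std::vector<Entity> entities = {{1, "door"}, {2, "npc"}, {3, "crate"}};
    EntityIndex index(entities);
    std::cout << findSlow(entities, 2)->name << ' ' << index.find(2)->name << '\n'; // npc npc
}
```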
Post edited March 08, 2021 by .Keys
Wishful thinking. Integrated GPUs are 99% oriented towards business; they are in the majority of PCs, rather than discrete GPUs, simply because the vast majority of PCs are used for business, not gaming. It follows that game developers simply can't afford to accommodate integrated graphics, because there's simply no market for games on integrated graphics; some individuals like the OP notwithstanding, they are the exception to the rule and hardly worth consideration. Game developers will always (again, maybe with some exceptions) target consoles first, simply because consoles are sold primarily for gaming, unlike PCs with integrated GPUs, which are sold primarily for business; PCs with dedicated GPUs come second, and there's simply no room left for integrated GPUs once all the money has been spent, not to mention no time left, with publishers forcing developers to push out games as early as possible (hence the Cyberpunk 2077 fiasco).

Bottom line: the OP's problem is that he's trying to use a business notebook as a gaming notebook. The results are obvious, and no amount of crying will help, as the equipment used is simply not intended for gaming, period.
Post edited March 08, 2021 by anzial
high rated
Onboard GPUs are not meant for gaming. They are cheap, low-energy solutions meant to be used in situations where even a mid-level graphics card would make little sense. In other words, they are meant for office use. The AAA games industry is not going to bother with them, because they can lack many of the features that dedicated cards, even low-end ones, support.

It may be possible that PCs eventually evolve to a point where an onboard GPU becomes the only thing you need, but as long as there is a thousand-mile gap between their performance and that of dedicated cards, it won't happen.
Orkhepaj: But many games are not playable at all on that hardware.
My point is that they should be, if game companies want to increase their potential install base.

Orkhepaj: And it would make no sense to limit games to that level.
Read my post and replies again.
I never said anything about limiting anything. Game companies can set the highest tier as high as they want.
If anything, my point is that game companies are limiting their own profits and market opportunities by setting their hardware requirements too high.
They should relax those requirements, not tighten them.

anzial: Wishful thinking. Integrated GPUs are 99% oriented towards business; they are in the majority of PCs, rather than discrete GPUs, simply because the vast majority of PCs are used for business, not gaming. It follows that game developers simply can't afford to accommodate integrated graphics, because there's simply no market for games on integrated graphics; some individuals like the OP notwithstanding.
You said it. They are in the majority of PCs.
There's untapped potential.
Post edited March 08, 2021 by thegreyshadow