Cavalary: Just a quick search for some games that shouldn't need more than the power of integrated graphics, many of them old, yet simply don't support them, as clearly stated in the requirements:

Kao the Kangaroo (first one, 2000)
Medal of Honor: Allied Assault War Chest (2004, 2002 for the base game)
Heretic Kingdoms: The Inquisition (2004)
SWAT 4 (2005)
MX vs. ATV Unleashed (2006)
Warhammer: Mark of Chaos (2006)
Broken Sword 4 (2007, 2006 according to other sources)
(probably) Fallout 3 (2009, 2008 for the base game)
Lethis: Path of Progress (2015, but simple graphics)
Teamfight Manager (2021, but pixel graphics)

Stopped at 10, and didn't include any of those saying that integrated graphics may work but are not officially supported; it should still be a fair sample. In these cases it's not a matter of optimization, but quite obviously of using instructions specifically catering to dedicated cards...
Good post.

Two good games which I was able to play very well with integrated graphics were the F.E.A.R. series (even F.E.A.R. 2) and Far Cry 2. They are not perfect, but they look reasonably good and such graphic detail could very well be the lowest tier of graphical detail for games.

In my experience, mostly anything 2010 and below can be expected to run reasonably well with Intel HD onboard graphics.
toxicTom: The problem nowadays - apart from the current insane hardware prices - is that optimisation isn't happening any more; devs rely on frameworks and third-party tools. That allows them to create passable results quickly and manage the complexities of development better, but those frameworks - especially if they target multiple platforms - are big resource hogs by themselves. Optimisation is sacrificed for convenience, and yes, creativity. Nowadays you don't need to be a coding guru like John Carmack to create a game, which allows many people to implement their ideas (with varying results, obviously). But this ease and convenience comes at the price of performance issues; there's no way around it.
Nothing beats hand-optimised assembly code in terms of performance, but only a handful of people can manage that. Next best thing is C - still a very technical thing. So layers upon layers of abstraction were added, until we arrived at things like Unity or Unreal engines, which let you focus on the creative side of actual game design. But each of these layers costs performance.

The performance issues are basically a price we pay to have more, and more variety of, games. There are not enough coders out there who can make a game shine even on low-end machines. And it's a lot of work, even for those - on every single platform.
Very good points raised here, but the point stands. If the current lack of optimization is due to rapid development on third-party frameworks, then the piece of the puzzle that should be optimized first and foremost is the frameworks themselves.
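
To make toxicTom's point about abstraction layers concrete, here's a minimal C++ sketch (purely illustrative, nothing to do with how Unity or Unreal are actually implemented): the same trivial per-element update written over flat data versus routed through a virtual interface with heap-allocated objects. The work is identical; the second version just pays extra per element for indirection and scattered memory, and real frameworks stack many such layers.

#include <memory>
#include <vector>

// Direct approach: flat data, updated in a tight loop the compiler can vectorise.
struct Particle { float x = 0, vx = 1; };

void update_direct(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) p.x += p.vx * dt;
}

// "Framework-style" approach: each object lives behind a virtual interface,
// so every update is an indirect call on a heap-allocated, scattered object.
struct IEntity {
    virtual void update(float dt) = 0;
    virtual ~IEntity() = default;
};
struct ParticleEntity : IEntity {
    float x = 0, vx = 1;
    void update(float dt) override { x += vx * dt; }
};

void update_abstracted(std::vector<std::unique_ptr<IEntity>>& es, float dt) {
    for (auto& e : es) e->update(dt);   // same work, more overhead per element
}

int main() {
    std::vector<Particle> direct(100000);
    std::vector<std::unique_ptr<IEntity>> abstracted;
    for (int i = 0; i < 100000; ++i) abstracted.push_back(std::make_unique<ParticleEntity>());
    update_direct(direct, 0.016f);
    update_abstracted(abstracted, 0.016f);
}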
Cavalary: Just a quick search for some games that shouldn't need more than the power of integrated graphics, many of them old, yet simply don't support them, as clearly stated in the requirements:

Kao the Kangaroo (first one, 2000)
Medal of Honor: Allied Assault War Chest (2004, 2002 for the base game)
Heretic Kingdoms: The Inquisition (2004)
SWAT 4 (2005)
MX vs. ATV Unleashed (2006)
Warhammer: Mark of Chaos (2006)
Broken Sword 4 (2007, 2006 according to other sources)
(probably) Fallout 3 (2009, 2008 for the base game)
Lethis: Path of Progress (2015, but simple graphics)
Teamfight Manager (2021, but pixel graphics)

Stopped at 10, and didn't include any of those saying that integrated graphics may work but are not officially supported; it should still be a fair sample. In these cases it's not a matter of optimization, but quite obviously of using instructions specifically catering to dedicated cards...
By that era, a lot more games were aiming for NVidia or AMD based cards and all of their feature sets and whatnot. Those cards were in a whole other galaxy, putting to shame what Intel's integrated stuff was doing at the time.

Intel's integrated chips didn't really support T&L and other features, or later the shaders of the GeForce 2 and 3 lines (non-MX) back when those were all the rage - features devs wanted, so devs & pubs often flat-out skipped integrated graphics, or barely supported it.

Intel back then was too busy making a killing off everyone from businesses to gamers, mainly with their CPUs. Businesses, which did NOT need GPUs at all and still don't really need them, were and still are a HUGE part of Intel's business.

We're also now in a space where console gaming is again setting the bare minimum for games - it's gonna set the baseline for RT gaming; and you're going to see more DLSS & FSR support too... especially more FSR support, as consoles are AMD-based.

DLSS is here (on NVidia) & FSR is going to be here soon from AMD (which supposedly might also work on NVidia hardware; AMD says it's up to NVidia to optimize for FSR).

Source on AMD on FSR working on NVidia stuff, since FSR is open - https://www.pcgamer.com/amd-says-its-up-to-nvidia-to-optimize-for-fidelityfx-super-resolution/

And these two techniques are performance boosters and are absolutely necessary to handle all of the mess & bulk of something like RT, which is still in its infancy.

iGPUs just likely ain't going to keep up with this RT stuff anytime soon, as now we have things like NVidia's AI-based Tensor cores for DLSS (hence the DL part - the Deep Learning part of the AI) keeping image quality close to native even when upscaling. This is the beauty and magic of DLSS and AI-based stuff - it makes games, even when upscaled, look much more like the higher, native resolution you're aiming for (namely, once you go over 1080p).
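
As a rough illustration of what these upscalers buy you (the 0.67 scale factor below is just an assumed "quality"-style preset for the example; actual factors vary by mode and vendor): the GPU only shades the smaller internal resolution and the upscaler reconstructs the rest, so less than half the output pixels are actually rendered.

#include <cstdio>

int main() {
    const int out_w = 2560, out_h = 1440;  // target output resolution
    const double scale = 0.67;             // assumed per-axis internal render scale
    const int in_w = static_cast<int>(out_w * scale);
    const int in_h = static_cast<int>(out_h * scale);
    // Shaded pixel count drops roughly with the square of the per-axis scale.
    std::printf("internal %dx%d = %.0f%% of the output pixels\n", in_w, in_h,
                100.0 * in_w * in_h / (double(out_w) * out_h));
}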

FSR doesn't do the Deep Learning part, so it ain't AI-based. It's some other, cheaper form of upscaling, which looks more like DLSS Version 1.0 (ewwwwww) and not whatever kind of magic NVidia has in DLSS 2.0.

Source on FSR not on DLSS's level - https://wccftech.com/no-amds-fsr-fidelityfx-super-resolution-is-not-a-dlss-alternative-and-here-is-why-you-should-care/

Given where consoles are going with RT & FSR on XSS/XSX/PS5, and where PC gaming is going with both DLSS & FSR - gamers keeping up with this stuff on PC should really be aiming for an RTX 2060 at minimum, or an AMD 6000 series card.
thegreyshadow: Two good games which I was able to play very well with integrated graphics were the F.E.A.R. series (even F.E.A.R. 2) and Far Cry 2. They are not perfect, but they look reasonably good and such graphic detail could very well be the lowest tier of graphical detail for games.

In my experience, mostly anything 2010 and below can be expected to run reasonably well with Intel HD onboard graphics.
Post-2010 ones I played on this computer (which has poor graphics even for integrated ones, just being a Pentium G3440):
Regions of Ruin (2018, but pixel graphics)
The Mull Littoral (2017, but visual novel)
Ember (2016)
Jotun (2015)
Her Story (2015, but simple graphics)
Grim Fandango Remastered (2015, but that's just for the remaster)
Bound by Flame (2014, though temps go up more than with anything before)
Lords of Xulima (2014)
Driftmoon (original edition, 2013)
Gone Home (2013, but FPS really didn't matter, was often probably getting around 10)
Tropico 4 + DLCs (2011-2013)

Granted, the fact that I have a 1280x1024 monitor aids this; playing at native resolution on a full HD monitor (the chip won't even support higher) would be out of the question.
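
Just to put numbers on that (plain arithmetic, nothing specific to any game): full HD has roughly 58% more pixels than 1280x1024, so the chip would have that much more work per frame.

#include <cstdio>

int main() {
    const long sxga   = 1280L * 1024;  // 1,310,720 pixels
    const long fullhd = 1920L * 1080;  // 2,073,600 pixels
    std::printf("full HD has %.2fx the pixels of 1280x1024\n",
                double(fullhd) / double(sxga));  // ~1.58x more work per frame
}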
MysterD:
Question being, is all of that necessary? And if included, may it not be disabled?
Post edited June 11, 2021 by Cavalary
It's better to get AMD APUs if you want to game competently using onboard graphics.
Cavalary: Post-2010 ones I played on this computer (which has poor graphics even for integrated ones, just being a Pentium G3440):
I still have a Pentium G3258 around and it's probably my favorite CPU ever. It's slow by today's standards but has an unlocked multiplier, and some board manufacturers enabled overclocking on low-end boards. Which means if you have something like a 45€ MSI H81-P33, an overclock to 4.1/4.2GHz is doable on the stock cooler and 4.4/4.5GHz on a better cooler. It's still one of the fastest dual-core CPUs and perfect for a retro XP machine.
I would love to find a cheap board to put the CPU to some use.

That said, it has the "Haswell graphics" like your G3440, which is a bit weaker than the Haswell 4400/4600 graphics. I remember having some trouble playing some games on the integrated graphics, while the i3/i5 CPUs run them fine. At this moment some i3 and i5 Haswell CPUs are cheap as chips on the used market, going from 10 to 50 Euros, and would be a nice upgrade for your machine.
A simple i3 4330 is a worthwhile upgrade for some 10 to 20 Euros: better CPU, better graphics, and it will clear the stutter while opening modern web pages. The G3258, even while overclocked, does stutter a lot in games and regularly freezes the mouse on web pages.

Also, worth mentioning that using 2 sticks of RAM in dual channel gives 20-40% better GPU performance on those integrated graphics.
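
Back-of-the-envelope on why that helps (assuming DDR3-1600, typical for those Haswell boards): the iGPU has no VRAM of its own, so it shares whatever bandwidth the system RAM provides, and a second channel roughly doubles it.

#include <cstdio>

int main() {
    const double mt_per_s = 1600e6;          // DDR3-1600: 1600 million transfers per second
    const double bytes_per_transfer = 8.0;   // one 64-bit channel
    const double one_channel = mt_per_s * bytes_per_transfer / 1e9;  // ~12.8 GB/s
    std::printf("single channel: %.1f GB/s, dual channel: %.1f GB/s\n",
                one_channel, 2 * one_channel);
}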
Post edited June 11, 2021 by Dark_art_
thegreyshadow: Two good games which I was able to play very well with integrated graphics were the F.E.A.R. series (even F.E.A.R. 2) and Far Cry 2. They are not perfect, but they look reasonably good and such graphic detail could very well be the lowest tier of graphical detail for games.

In my experience, mostly anything 2010 and below can be expected to run reasonably well with Intel HD onboard graphics.
Cavalary: Post-2010 ones I played on this computer (which has poor graphics even for integrated ones, just being a Pentium G3440):
Regions of Ruin (2018, but pixel graphics)
The Mull Littoral (2017, but visual novel)
Ember (2016)
Jotun (2015)
Her Story (2015, but simple graphics)
Grim Fandango Remastered (2015, but that's just for the remaster)
Bound by Flame (2014, though temps go up more than with anything before)
Lords of Xulima (2014)
Driftmoon (original edition, 2013)
Gone Home (2013, but FPS really didn't matter, was often probably getting around 10)
Tropico 4 + DLCs (2011-2013)

Granted, the fact that I have a 1280x1024 monitor aids this, playing at native resolution on full HD (chip won't even support higher) would be out of the question.
MysterD:
Cavalary: Question being, is all of that necessary? And if included, may it not be disabled?
Depends on how developers build their games and what hardware, toolsets, features, etc. they build them around.

Many RT games right now are doing a mix of old-school baked lighting and/or RT. It's up to you to enable or disable what you want. Not everybody's got RT cards yet, due to prices and/or hardware shortages.

In cases like Metro Exodus Enhanced - new RT cards are flat-out REQUIRED. Period. RTX 2060 or RTX 3050 Ti for a minimum here; or AMD equivalent (i.e. 6000 series or newer).

Metro Exodus EE ripped out the old baked-lighting stuff and went with mostly RT-only lighting. Performance is similar to the old version probably because the old lighting and its numerous settings got tossed; they're not needed, as RT is doing all the work here, lighting scenes and reflecting every light source dynamically and realistically on the fly.
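
A toy sketch of the difference (made-up names and a stand-in lighting function, nothing to do with 4A's actual engine): with baking, the expensive lighting computation runs once offline and the game just looks the result up; with RT-style dynamic lighting, that expensive step is redone every frame, which is exactly the cost iGPUs can't keep up with.

#include <array>
#include <cmath>

constexpr int GRID = 64;
using Lightmap = std::array<float, GRID * GRID>;

// Stand-in for an expensive lighting computation; in a real renderer this
// would be tracing rays against the scene geometry.
float compute_lighting(int x, int y) {
    return std::fabs(std::sin(x * 0.1f) * std::cos(y * 0.1f));
}

// Baked: pay the cost once, offline, and ship the results with the game.
Lightmap bake() {
    Lightmap lm{};
    for (int y = 0; y < GRID; ++y)
        for (int x = 0; x < GRID; ++x)
            lm[y * GRID + x] = compute_lighting(x, y);
    return lm;
}

// At runtime, baked lighting is just a lookup -- cheap, but static.
float shade_baked(const Lightmap& lm, int x, int y) { return lm[y * GRID + x]; }

// RT-style dynamic lighting recomputes per sample, per frame -- it reacts to
// moving lights and geometry, but the cost is paid constantly.
float shade_dynamic(int x, int y) { return compute_lighting(x, y); }

int main() {
    const Lightmap lm = bake();
    return shade_baked(lm, 10, 20) == shade_dynamic(10, 20) ? 0 : 1;  // same result, very different cost
}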

No iGPU currently can handle what this game is doing and RT is extremely expensive & taxing on systems.

Also, DLSS (NVidia) and FSR (AMD) upscaling techniques will be necessary to boost performance... especially in RT-based games, which are very demanding.
Post edited June 11, 2021 by MysterD
thegreyshadow: While my paltry Intel UHD 620 onboard GPU can play some very good games with great results, there are others which make it struggle (such as The Witcher 2, a game already 10 years old!).

Making games which are unplayable (at least on the lowest end) on systems without discrete GPUs is counterproductive to game companies.

First of all, GPU production is experiencing (and has experienced since the global cryptocurrency explosion) chronic shortages and problems in availability.

Secondly, there is an absurdly large market of PCs with onboard GPUs.

Excluding such a computer base from your latest game simply imposes an unreasonable limit on your potential audience.

Games should have Crysis-like and ray-tracing GPU consumption levels on their highest tiers, I can get that. But they should also ensure that their games are playable on the lowest setting with onboard GPUs such as Intel UHD chipsets.

What do you think?
I think you'll like LowSpecGamer.

It's technically complicated. A game is designed to target some system, often a console. With a UHD 620 (which is 5 years old btw) you can generally play games that target the PS3 and Xbox 360. If I'm not mistaken (I could be), The Witcher 2 was famed for its graphics because it initially didn't target those consoles but just the PC instead (an Xbox 360 version was released later though).

Some graphical options can lower the system requirements: disable special FX, lower texture resolution, lower screen resolution, reduce draw distance, lower detail for models. But at some point that becomes unacceptable. If the game has a rich environment, you may have to lower the draw distance to run it on your UHD 620. But then, the next checkpoint could be beyond your draw distance. Or a character tells you to visit the blacksmith "over there" which you can't see because he's too far away.

And model detail (including the environment) can't be adjusted easily, I think. There are usually at least two models: a high-quality model and an LOD (Level of Detail) version. The game switches to the LOD version when something is further away. In some games you can hack something to always use the LOD version regardless of distance, a common feature of "potato mode".
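
Roughly how that switch works, as a minimal sketch (hypothetical names and thresholds, not how any particular engine does it): the model variant is picked from camera distance, and a "potato mode" hack just forces the low-detail variant no matter how close you are.

enum class ModelQuality { High, Low };

struct LodSettings {
    float lod_distance = 50.0f;  // beyond this, the engine draws the low-detail model
    bool  force_lod    = false;  // "potato mode": always draw the low-detail model
};

ModelQuality pick_model(float distance_to_camera, const LodSettings& s) {
    if (s.force_lod || distance_to_camera > s.lod_distance)
        return ModelQuality::Low;
    return ModelQuality::High;
}

int main() {
    LodSettings potato;
    potato.force_lod = true;
    return pick_model(10.0f, potato) == ModelQuality::Low ? 0 : 1;  // forced LOD even up close
}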

To make any game run on your UHD 620, developers would have to create lower quality models that sit in between LOD and the regular models (some games do this anyway), have options to disable special FX (and test if they don't break anything), create simplified versions of animations and take care not to include too many active elements in any given scene. That'll likely affect gameplay. And this isn't even all they would have to do.

They absolutely would do that if game consoles didn't exist. As it is, many potential customers already have a console or a dedicated GPU, so the extra work isn't worth it.

Some recommendations: Far Cry 3 takes some tinkering but can be made quite playable. Something similar for GTA IV and GTA V. Other titles with relatively good graphics that should run reasonably well: Skyrim, Risen, Driver: San Francisco, Mafia II, Test Drive Unlimited 2, Portal 2, Just Cause 2, Saints Row The Third, NFS: Most Wanted. And just check out LowSpecGamer.
Post edited June 11, 2021 by W3irdN3rd
thegreyshadow: While my paltry Intel UHD 620 onboard GPU can play some very good games with great results, there are others which make it struggle (such as The Witcher 2, a game already 10 years old!).

Making games which are unplayable (at least on the lowest end) on systems without discrete GPUs is counterproductive to game companies.

First of all, GPU production is experiencing (and has experienced since the global cryptocurrency explosion) chronic shortages and problems in availability.

Secondly, there is an absurdly large market of PCs with onboard GPUs.

Excluding such a computer base from your latest game simply imposes an unreasonable limit on your potential audience.

Games should have Crysis-like and ray-tracing GPU consumption levels on their highest tiers, I can get that. But they should also ensure that their games are playable on the lowest setting with onboard GPUs such as Intel UHD chipsets.

What do you think?
W3irdN3rd: I think you'll like LowSpecGamer.

It's technically complicated. A game is designed to target some system, often a console. With a UHD 620 (which is 5 years old btw) you can generally play games that target PS3 and Xbox 360. If I'm not mistaken (I could be), The Witcher 2 was famed for it's graphics because it initially didn't target those consoles but just the PC instead. (an Xbox 360 version was released later though)

Some graphical options can lower the system requirements: disable special FX, lower texture resolution, lower screen resolution, reduce draw distance, lower detail for models. But at some point that becomes unacceptable. If the game has a rich environment, you may have to lower the draw distance to run it on your UHD 620. But then, the next checkpoint could be beyond your draw distance. Or a character tells you to visit the blacksmith "over there" which you can't see because he's too far away.

And model detail (including the environment) can't be adjusted easily, I think. There are usually at least two models: a high-quality model and an LOD (Level of Detail) version. The game switches to the LOD version when something is further away. In some games you can hack something to always use the LOD version regardless of distance, a common feature of "potato mode".

To make any game run on your UHD 620, developers would have to create lower quality models that sit in between LOD and the regular models (some games do this anyway), have options to disable special FX (and test if they don't break anything), create simplified versions of animations and take care not to include too many active elements in any given scene. That'll likely affect gameplay. And this isn't even all they would have to do.

They absolutely would do that if game consoles didn't exist. As it is, many potential customers already have a console or a dedicated GPU, so the extra work isn't worth it.

Some recommendations: Far Cry 3 takes some tinkering but can be made quite playable. Something similar for GTA IV and GTA V. Other titles with relatively good graphics that should run reasonably well: Skyrim, Risen, Driver: San Francisco, Mafia II, Test Drive Unlimited 2, Portal 2, Just Cause 2, Saints Row The Third, NFS: Most Wanted. And just check out LowSpecGamer.
Exactly: most games are targeted at consoles. They're targeted at THOSE boxes because that's where many gamers actually play, and devs build around one (or two) fixed sets of hardware (as consoles often have just one box, or one better Pro Edition box - like the PS4 Pro or Xbox One X).

Often, games are built around those systems (in their original version) as the bare minimum. When devs do that, the console's power/specs (or very similar hardware on PC) often become the baseline and minimum requirements for the PC version; especially since the Xbox's architecture has usually been x86- or x64-based.

They want their games to look and play a certain way - so if they require certain systems, toolsets, or features, it's because they are trying to make their game look amazing (let's be real, great graphics sell, whether technically or artistically) and they have a certain vision in mind of how they want it to look.

Finally, since the PS4 is on that architecture too and Sony's now playing ball instead of using their own stuff like the Cell tech - it's probably tons easier to port this stuff to PC. Besides trying to keep up with what Microsoft's doing now (i.e. bringing their games to both PC and the Xbox platform), Sony has to compete, and that's probably also why we're finally getting PC ports of some recent PlayStation games like Heavy Rain, Beyond: Two Souls, Detroit: Become Human, Days Gone, possibly the Uncharted series, with more supposedly planned - even if they come 6 months to a few years later.

For PC - eh, they can just aim for the moon if they want. Often, they go somewhere in the middle in the AA and AAA space - and those become the requirements. Sure, they might not sell tons of copies if they push things to the moon on a Day 1 release like Crysis 1 did (as you'd really have to convince gamers to upgrade their stuff ASAP) - but likely, at some point, hardware and software will catch up.
Post edited June 11, 2021 by MysterD
Most normal games should run on onboard graphics, apart from modern big-studio games.
Hopefully this and the graphics card shortages will start sorting the wheat from the chaff of game devs and publishers.
Agreed. Make sure that as you purchase your games, you test them before your refund period is over.

I love games from devs that offer alternative methods of playing, like Retro City Rampage DX, the SNK DotEmu line and more, just for this reason. These games will run great on my main gaming rig, laptop or even my Raspberry Pi.