Dark_art_: From the ark.intel.com website, some CPUs I'm very familiar with:
HD 4400: 3200x2000 @ 60Hz
HD 620: 4096x2304 @ 60Hz
UHD 620: 4096x2304 @ 60Hz
As I've stated before, the UHD 620 and HD 620 are the same performance-wise, and even the supported video codecs are the same (don't quote me on this). They are very similar to the older HD 520, which is the last Intel GPU with Windows 7/8/8.1 support.
So, regarding which supersedes what, from older to newer:
HD 3000: used on Sandy Bridge 2nd gen CPUs. Underpowered, with only old codec support and DX10.
HD 4000: Ivy Bridge, 3rd gen. This marks the start of good Intel integrated graphics, with DX11, DirectCompute, modern codecs (not sure if it accelerates VP9 and H.265; the check sketched right after this list is one way to find out) and enough power to run many modern games (of course, not triple-A).
HD 4400/4600: Haswell/Broadwell, 4th and 5th gen. A slight update from the previous line: a small performance improvement but better overall support, including an improved QuickSync encoder and Vulkan on Linux (didn't test it myself).
HD 520: Skylake, 6th gen. Major update, with DX12, modern 4K decoders, Vulkan and better performance.
HD 620: Kaby Lake, 7th gen. Slight update to codecs, and the first GPU with Win10-only support.
UHD 620: 8th and 9th gen CPUs; as far as I know, the same as above.
UHD G1/G4/G7: used on 10th gen 10nm (Ice Lake) CPUs. Increased core count, better performance and overall support.
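If you want to check codec acceleration yourself instead of guessing from the marketing name, here's a minimal sketch. It assumes a Linux box with Python and libva-utils (which provides the vainfo tool) installed; vainfo prints one VAProfile line per codec profile the driver actually exposes (VLD = decode, EncSlice = encode):

import subprocess

# Run vainfo and grab its profile listing; each supported profile shows up
# as a line like "VAProfileHEVCMain : VAEntrypointVLD".
out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout

for codec in ("H264", "HEVC", "VP9", "AV1"):
    hits = [line.strip() for line in out.splitlines() if codec in line]
    print(f"{codec}: {'supported' if hits else 'no hardware support found'}")
    for line in hits:
        print("    " + line)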
I've only listed the more popular parts; the stuff found on Celerons and Atoms is all over the place. Iris graphics have dedicated memory/cache, more cores, higher power consumption and way better performance, and are usually only found in very expensive devices such as Apple's laptops. (I've posted a couple of NUCs featuring Iris graphics in this thread; it's the first time I've seen it on cheaper computers.)
Also, Intel GPUs themselves have generations: the HD 4000 is Gen7 while the HD 520 is Gen9. In the list above, what's mentioned is the CPU gen, for clarity and simplicity.
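To make that mapping concrete, here's a rough lookup table in Python; the graphics gen numbers are compiled from the list above plus what I remember from ARK, so double-check before quoting:

# Rough lookup: marketing name -> (CPU generation, Intel graphics architecture).
# Approximate, from memory and the list above; verify on ark.intel.com.
IGPU_GENS = {
    "HD 3000":  ("2nd gen Sandy Bridge", "Gen6"),
    "HD 4000":  ("3rd gen Ivy Bridge",   "Gen7"),
    "HD 4400":  ("4th gen Haswell",      "Gen7.5"),
    "HD 520":   ("6th gen Skylake",      "Gen9"),
    "HD 620":   ("7th gen Kaby Lake",    "Gen9.5"),
    "UHD 620":  ("8th/9th gen",          "Gen9.5"),
    "G1/G4/G7": ("10th gen Ice Lake",    "Gen11"),
}

name = "UHD 620"
cpu_gen, gfx_gen = IGPU_GENS[name]
print(f"{name}: {cpu_gen} CPUs, graphics architecture {gfx_gen}")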
Not trying to be a smart azz, but there seems to be a lot of confusion on this topic (no wonder, with all these naming schemes). Intel ARK and NotebookCheck are good sources to check this stuff.
HD 4400's 3200x2000 is NOT 4K / 2160p / 3840x2160.
Though it sounds like the HD 620 and UHD 620 do support 4K, since 4096x2304 goes a little bit over 4K aka 2160p aka 3840x2160.
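Quick pixel math to back that up (a rough sketch, using the standard 3840x2160 definition of 4K):

# Why 3200x2000 is NOT 4K, while 4096x2304 goes a bit beyond it.
modes = {
    "HD 4400 max (3200x2000)":    3200 * 2000,  # 6.40 MP
    "4K / 2160p (3840x2160)":     3840 * 2160,  # 8.29 MP
    "HD/UHD 620 max (4096x2304)": 4096 * 2304,  # 9.44 MP
}
for name, px in modes.items():
    print(f"{name}: {px / 1e6:.2f} megapixels")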
So, thanks for clearing that up quite a bit more.
Looks like the HD 620 to UHD 620 change was a very smart re-branding move to clear up and un-muddy some waters here, as the U (for Ultra) signifies 2160p... and actually a bit beyond, in this case.
Thanks for the links.
EDIT:
Regardless, gamers really serious about modern games should invest in a solid CPU/GPU pairing:
1. CPUs: Ryzen 3000 or newer for desktops, Ryzen 4000 or newer for laptops; and/or Intel 10th gen or newer i5s/i7s (laptops or desktops).
2. GPUs: go with AMD's RX 5000 or 6000 series (6000 is probably better for those who want ray tracing, and hopefully AMD's DLSS equivalent comes soon, as the 5000 series doesn't do RT); and/or Nvidia's GTX 1660 (no RT) or RTX 2000/3000 series (with RT).
They should probably at least aim for RT support, especially since the consoles (PS5 and Xbox Series S/X) support RT now.
Of course, patient gamers playing older stuff, not chasing RT, or waiting for that major performance/RT jump a few gens down would probably be more than fine with even a GTX 1660, for now.