HereForTheBeer: Not sure I get it, either. I tried it once, with Crysis, hooking up the laptop to the 4k 55" TV. Looked good. Dropped it down to 1080p on the TV. Looked just as good.
Maybe the difference is more discernible with other titles, not sure. Or maybe my eyes simply can't pick up the difference.
Whether you can discern the difference in resolution depends a lot on how much fine detail there is in the game in the first place.
Like I said, many old 3D games from the 90s allow quite high resolutions on modern systems, resolutions that probably weren't available to normal people back in the 90s (playing on their 3Dfx Voodoo 2 cards at 800x600 max). I recall many Unreal engine games being like this, for instance, and many old games have received widescreen mods and such that let them run at higher resolutions than before.
But even if you crank the resolution up to something like 1600x1200 or 1920x1080, it doesn't really look any better than running at 1024x768 or even lower. The reason is that increasing the resolution doesn't increase the low polygon counts of the game objects, or sharpen the fuzzy low-detail textures, etc. They look pretty much the same at 1600x1200 as at 800x600, the only real difference being that polygon edges are a bit less jagged (if the game doesn't use edge antialiasing).
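To put some rough numbers on that (purely illustrative, assuming a hypothetical 256x256 texture covering about a quarter of the screen), here's a quick Python sketch of how many screen pixels end up sampling each texel at different resolutions. Once you're well past one pixel per texel, the extra resolution is mostly just re-sampling the same fuzzy texture:

# Back-of-the-envelope sketch; all numbers are illustrative assumptions.
# How many screen pixels sample each texel of a low-res texture when
# the textured surface covers a fixed fraction of the screen?

def pixels_per_texel(screen_w, screen_h, coverage=0.25, texture_size=256):
    screen_pixels = screen_w * screen_h * coverage   # pixels the surface occupies
    texels = texture_size * texture_size             # texels the texture provides
    return screen_pixels / texels

for w, h in [(800, 600), (1024, 768), (1600, 1200), (1920, 1080)]:
    print(f"{w}x{h}: ~{pixels_per_texel(w, h):.1f} screen pixels per texel")

With those assumed numbers it works out to roughly 2 pixels per texel at 800x600 and roughly 8 at 1920x1080, so past a point every extra pixel is just another sample of the same blurry texel.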
So yeah, to really benefit from ultra-high resolutions in games, the game detail (textures, polygon counts, etc.) has to increase too. A simple black square looks just the same at 640x480 as it does at 12K.
And of course screen size matters too: the higher the resolution, the bigger the monitor needs to be to really see any difference. No reason to use a 12K resolution on a 14" laptop screen, I guess.
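For a rough sense of scale (a back-of-the-envelope sketch only; I'm assuming "12K" means something like 11520x6480 and using the common ~1 arcminute visual-acuity rule of thumb), a quick pixel-density calculation:

import math

# Back-of-the-envelope sketch: pixel density (PPI) and the viewing distance
# beyond which a single pixel subtends less than ~1 arcminute (a common
# rule of thumb for what the eye can still resolve). Illustrative only.

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

def resolvable_distance_inches(pixels_per_inch):
    one_arcmin = math.radians(1 / 60)
    pixel_size = 1 / pixels_per_inch        # inches
    return pixel_size / math.tan(one_arcmin)

screens = [
    ('14" laptop, 1920x1080', 1920, 1080, 14),
    ('14" laptop, "12K" 11520x6480', 11520, 6480, 14),  # assumed meaning of 12K
    ('55" TV, 3840x2160', 3840, 2160, 55),
]
for name, w, h, diag in screens:
    density = ppi(w, h, diag)
    print(f"{name}: {density:.0f} PPI, pixels blur together beyond ~{resolvable_distance_inches(density):.0f} inches")

Under those assumptions the 14" panel at "12K" comes out around 940 PPI, so you'd have to get within about four inches of the screen before individual pixels are even theoretically resolvable, while a 55" 4K TV is about 80 PPI and blends together from roughly 43 inches away, which fits the TV experience quoted above.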