randomuser.833: Don't mix up patch size and "what does it fix".
Today most games are built from a few very large single archive files that contain the assets (graphics, sound), which make up something like 90% of a game's size, if not more.
I would argue that model is broken then, and grotesquely inefficient. Sure, provide a game as archive files on the media. However, an installed game should consist of those large archive files having been unpacked, such that the smaller individual files are accessible.

Otherwise, what is actually being installed then? The installation simply consists of copying 3 or 4 large archive files to the user's PC? You talk about the time/cost of unpacking and re-packing. Are modern games doing that unpacking every time the game is launched?

May as well just play it directly from the media then ...
randomuser.833: Don't mix up patch size and "what does it fix".
Today most games are built from a few very large single archive files that contain the assets (graphics, sound), which make up something like 90% of a game's size, if not more.
Time4Tea: I would argue that model is broken then, and grotesquely inefficient. Sure, provide a game as archive files on the media. However, an installed game should consist of those large archive files having been unpacked, such that the smaller individual files are accessible.

Otherwise, what is actually being installed then? The installation simply consists of copying 3 or 4 large archive files to the user's PC? You talk about the time/cost of unpacking and re-packing. Are modern games doing that unpacking every time the game is launched?

May as well just play it directly from the media then ...
You are several decades too late to the party.
It started with id Tech 3 (Quake 3 Arena) and the Unreal Engine for Unreal Tournament...
If not earlier...
It could even have been id Tech 2 with Quake 2 already.

Why is it done?
In the first place, it saves space on the disk.
The packs are easier for the engine to handle and load. Loading a hell of a lot of small files simply takes longer than sucking in the whole thing at once (see the sketch just below).
That's because stuff is streamed from the archive into RAM by the engine.
And if you want, you can protect the packs so they can't be unpacked without the right key.
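A toy way to see that difference yourself (a hypothetical micro-benchmark sketch in Python; the "assets" folder and "assets.pak" file are made up, and the numbers depend heavily on OS, disk, and cache state):

```python
import time
from pathlib import Path

root = Path("assets")                       # hypothetical folder of loose asset files
files = sorted(root.rglob("*.dat"))

t0 = time.perf_counter()
loose = [f.read_bytes() for f in files]     # one open/seek/read/close per file
t1 = time.perf_counter()
packed = Path("assets.pak").read_bytes()    # one big sequential read of a single pack
t2 = time.perf_counter()

print(f"{len(files)} loose files: {t1 - t0:.3f}s  vs  one pack: {t2 - t1:.3f}s")
```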

Installing is basically unpacking an even larger pack that contains those "smaller" packs, and copying them. And I'm not only talking about GOG here, but "physical media" in general (where I would place the GOG offline installers).
How the Steam client or GOG Galaxy does it when installing from the web, I have no idea.



Btw, the newest Kingdom Come 2 patch is nearly 70 GB.
There is no way I will store that somewhere next to the installer files. I will just redownload the whole game and install it again.
That should be even faster in the end...
Post edited May 17, 2025 by randomuser.833
randomuser.833: ...
Ok, but I still have trouble believing it can be fast enough to unpack these archives on the fly when launching a game, yet so very time-consuming and memory-intensive to unpack/re-pack them to apply a patch. To the extent that it makes more sense for the user to have to re-download 50+ GB of files.

Launching a game typically takes less than a minute. Downloading 50 GB could take a typical user an hour or more (even with a decent connection). There is no way it would take an hour plus to unpack these game archives, apply a 500 MB patch, and then re-pack them.
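(As a rough sanity check: at 100 Mbit/s, roughly 12.5 MB/s, 50 GB works out to about 50,000 MB / 12.5 MB/s ≈ 4,000 seconds, i.e. just over an hour for the download alone.)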

I simply don't buy that the status quo is the optimal solution for the user. It's the optimal solution for lazy developers, who don't want to pull their finger out.
Post edited May 17, 2025 by Time4Tea
randomuser.833: ...
Time4Tea: Ok, but I still have trouble believing it can be fast enough to unpack these archives on the fly when launching a game, yet so very time-consuming and memory-intensive to unpack/re-pack them to apply a patch. To the extent that it makes more sense for the user to have to re-download 50+ GB of files.

Launching a game typically takes less than a minute. Downloading 50 GB could take a typical user an hour or more (even with a decent connection). There is no way it would take an hour plus to unpack these game archives, apply a 500 MB patch, and then re-pack them.

I simply don't buy that the status quo is the optimal solution for the user. It's the optimal solution for lazy developers, who don't want to pull their finger out.
Game engines handle the packs differently from your default packing program.
Game engines take some data from the packs and store it in your graphics card's memory or in system memory.

Your unpacker unpacks and writes to the disk.

And the patcher unpacks to the disk. The packs may contain packs too, so those have to be unpacked again; then the files are changed or reworked, and then everything is repacked and copied back.

A bit different....
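For illustration, here is a minimal sketch of that unpack-change-repack cycle, using an ordinary uncompressed ZIP as a stand-in for an engine-specific pack format (apply_file_patch and the temp-folder layout are hypothetical, not any real patcher):

```python
import shutil
import zipfile
from pathlib import Path

def apply_file_patch(pack_path: str, patched_files: dict[str, bytes]) -> None:
    """Unpack the whole archive, swap in the changed files, repack everything.
    Even a one-byte change forces a full rewrite of the pack, which is where
    the patching time and temporary disk space go."""
    workdir = Path("pack_tmp")
    with zipfile.ZipFile(pack_path) as pack:
        pack.extractall(workdir)                      # unpack everything to disk
    for rel_path, data in patched_files.items():
        (workdir / rel_path).write_bytes(data)        # swap out the changed files
    with zipfile.ZipFile(pack_path, "w", zipfile.ZIP_STORED) as pack:
        for f in sorted(workdir.rglob("*")):          # repack, uncompressed
            if f.is_file():
                pack.write(f, f.relative_to(workdir))
    shutil.rmtree(workdir)                            # drop the temporary copy
```

Note that the extracted temporary copy briefly doubles the disk footprint, on top of the rewrite time.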


As I said, there are some technical pros to using the big packs. But that is mostly on the side of the engine and how smoothly the game will load or run.
Don't ask me for more detail here; I had a talk about this a long time ago with people who are more into it. I remember that it is good for the game itself.
It doesn't matter anyway, because the last game I saw shipping everything as a lot of small files was Hearts of Iron 2. And with some extensive mods, that one takes muuuuuuuuuuuuch longer to load.
Even something like MechWarrior 5 (Unreal Engine) takes much longer to load with mods, because there is data outside the packs that has to be loaded too.

In the end, the creators of the various game engines decided it is that way, and devs have to deal with it. And the engine creators don't give a shit about your patching experience.
End of story.

The only point where devs get a say is whether the patch ships the whole pack (big patch, needs a lot of internet bandwidth) or not (long, locally resource-intensive patching).
They can pick their poison. And well, yours...

And a little reminder: people got mad when Cyberpunk needed 100 GB of free space for patching, because Cyberpunk patches used file patching and not pack patching.
For Kingdom Come 2, people are getting mad about the size of the patches, because it uses pack patching and not file patching.

Something that will not work is the combination of small patches, fast patching, little to no extra disk space needed, and a fast-working, fast-loading, high-class engine, all at once.
For the people talking about multi-disc games: Cyberpunk 2077 came out on two discs. It still doesn't work, hasn't been patched, and is unplayable, with the autosave feature going off every other minute.
NuffCatnip: For the PlayStation this is incorrect; most titles have all the data on the disc and are playable.
This might be true for the Xbox or Switch, but I don't have those consoles and can't tell whether that is the case.
The problem is that even with PlayStation games, most titles nowadays launch in an incomplete state, with many bugs or missing quality-of-life features, and receive patches later, post-release.

So even in this case, games would need to be updated post-launch.
We also have experience with DLCs being released years after launch.
So physical media end up being incomplete games, and as soon as the servers shut down, physical versions will lack many features that the updated, complete editions have.
randomuser.833: There are 2 ways to patch this.
1. swap out the whole archive file
2. unpack the archive, change the files that need changes or even swap out code in files and repack the whole thing.
Actually, there are 3 ways:

3. binary patching (the way of bdiff). This creates small patch files, and patching is fast unless the patched file is "too large" (because the file size usually changes, so the remainder of the file usually has to be rewritten entirely, multiple times even). So only small or medium files work well with this. However, it breaks the very instant there is even a minor difference between the file the patch was created against and the file that is installed, so you must always follow the entire patch chain in the proper order. But of course, you could still download the failed file if that happens. If there is encryption (= DRM) at work, the binary files will differ so vastly even for a single changed byte (that's half the point of encryption) that a binary diff may at worst be even larger than the original file. The same can happen with compression (but most game archives aren't compressed; the assets themselves, textures and audio, already are, and the rest is peanuts).
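For illustration, here is a naive fixed-size-block version of the idea (this is not bdiff's actual format, just the general shape of binary diffing; make_patch and apply_patch are hypothetical helpers):

```python
BLOCK = 4096  # naive fixed-size blocks; bsdiff-style tools match variable-length runs

def make_patch(old: bytes, new: bytes):
    """Record the new length plus only those blocks of `new` that differ from `old`."""
    blocks = [(off, new[off:off + BLOCK])
              for off in range(0, len(new), BLOCK)
              if new[off:off + BLOCK] != old[off:off + BLOCK]]
    return len(new), blocks

def apply_patch(old: bytes, new_len: int, blocks) -> bytes:
    """Rebuild the new file from the old one plus the differing blocks. If `old`
    is not byte-identical to what the patch was made against, the result is
    garbage, which is why patch chains must be applied in exact order."""
    out = bytearray(old.ljust(new_len, b"\0")[:new_len])
    for off, block in blocks:
        out[off:off + len(block)] = block
    return bytes(out)
```

A single byte inserted near the start of a file shifts every later block, so almost the whole file lands in the patch; that is the same fragility described above for encrypted or compressed data.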

From what I see, the game engines mount the game archives as virtual file systems (much like a CD-ROM emulator) and access the files that way. Accessing many small files carries a lot of overhead, and doing it in memory makes things faster, especially by skipping file attributes like access times that the OS might otherwise update. Also, file placement can be controlled inside the archive but not on the drive, so it makes some sense. However, given that "badly optimized" would even be an improvement over the current state of games (which easily qualify as "no plans to maybe consider thinking about the possibility of optimization"), I doubt this is being done.
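As a rough sketch of what mounting an archive as a virtual file system amounts to, assuming a made-up pack layout (an entry count, then name/offset/length records, then raw file data; the PackVFS class and the format are hypothetical):

```python
import io
import struct

class PackVFS:
    """Mount a hypothetical uncompressed pack: a little-endian header with an
    entry count, then (name_len, name, offset, length) records, then raw data.
    Files are then served from one open handle via an in-memory index."""

    def __init__(self, path: str):
        self.f = open(path, "rb")
        (count,) = struct.unpack("<I", self.f.read(4))
        self.index: dict[str, tuple[int, int]] = {}
        for _ in range(count):                       # build the seek table once
            (name_len,) = struct.unpack("<H", self.f.read(2))
            name = self.f.read(name_len).decode("utf-8")
            offset, length = struct.unpack("<QQ", self.f.read(16))
            self.index[name] = (offset, length)

    def open(self, name: str) -> io.BytesIO:
        offset, length = self.index[name]            # O(1) lookup, no OS stat calls
        self.f.seek(offset)
        return io.BytesIO(self.f.read(length))       # hand back an in-memory view
```

The point is that every lookup hits an in-memory table and a single open file handle instead of thousands of OS-level file operations.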

Anyway, this mounting is the reason why multiple mods / mods with many files slow things down: for every file in every mod, the respective file in the mounted archive needs to be redirected after checking for its existence, which requires either creating a list on the fly, checking each file upon access, or compiling such a list on every load. It cannot be replaced or permanently redirected, because mods must be able to be removed, altered, or disabled at any time. Plus, mod files usually reside on the OS filesystem, so access times and other things impact them.
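Under the same assumptions, the mod redirection could reduce to an overlay check in front of the pack index (a sketch reusing the hypothetical PackVFS from above):

```python
from pathlib import Path

def resolve(name: str, mod_dirs: list[Path], pack: PackVFS):
    """Loose mod files override pack entries. Every lookup (or the list built
    from these lookups on each load) pays an existence check per mod directory,
    hitting the real OS filesystem instead of the pack's in-memory index."""
    for mod_dir in mod_dirs:                 # highest-priority mod first
        candidate = mod_dir / name
        if candidate.is_file():              # slow path: stat() on the OS filesystem
            return candidate.open("rb")
    return pack.open(name)                   # fast path: one seek in the mounted pack
```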
Post edited May 17, 2025 by Dawnsinger
randomuser.833: There are 2 ways to patch this.
1. swap out the whole archive file
2. unpack the archive, change the files that need changes or even swap out code in files and repack the whole thing.
Dawnsinger: Actually, there are 3 ways:

3. binary patching (the way of bdiff).
But for binary patching of files inside archives, those archives are usually unpacked first and then repacked too.
I have never heard of binary patching of an archive itself.

And I did write that it can be that files are replaced, or that files are changed.
Dawnsinger: Actually, there are 3 ways:

3. binary patching (the way of bdiff).
Yes, this is what I was thinking of - binary patching of the installed game 'archive' files.


Dawnsinger: This may also happen with compression (but most game archives aren't compressed, the assets themselves already are (textures and audio), the rest is peanuts).
Right. It is highly unlikely that installed game archives would be compressed (since that would give a terrible performance overhead). Therefore, these installed 'archive' files are essentially just a chain of individual binary files, stitched together. So, it should be relatively trivial to snip out the portion of the archive that corresponds to a given file, patch it as needed, and then splice the result back together again (possibly modifying some header metadata).
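A sketch of that snip-and-splice idea, assuming a hypothetical uncompressed pack whose header stores an (offset, length) pair per entry (splice_entry and the format are made up; alignment and padding are ignored):

```python
def splice_entry(pack: bytes, index: dict[str, tuple[int, int]],
                 name: str, new_data: bytes):
    """Snip one entry out of an uncompressed pack, splice in the patched bytes,
    and shift every later entry's offset by the size difference."""
    offset, length = index[name]
    patched = pack[:offset] + new_data + pack[offset + length:]
    delta = len(new_data) - length
    new_index = {n: (off + delta if off > offset else off,   # fix header metadata
                     len(new_data) if n == name else ln)
                 for n, (off, ln) in index.items()}
    return patched, new_index
```

Even so, when an entry's size changes, everything after it still has to be rewritten on disk, which is part of the cost being argued over in this thread.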

I don't see anything difficult about that whatsoever. Again, bottom line --> lazy developers.
Post edited May 18, 2025 by Time4Tea
DO NOT buy Doom: The Dark Ages. There is
1. not the full data on the disc, or no disc at all
2. an always-online requirement, so no game without a connection

I will never buy it as long as no changes are made.

I had it on preorder, but the moment I heard "always online required"... I cancelled the order instantly.
Post edited 4 days ago by Xeshra
There is an amazing website called "DoesItPlay?" which tests whether physical games work offline and whether they require a download to be played (or enjoyed, in the case of a bug-riddled physical release). You should always check this website before buying a physical copy of a game.
While they still provide some games on disc, they don't really, as it is all about a web connection. They've gone digital, and discs are not much more than a courtesy or an initial promotional attraction ... just one of the ways they get you to spend your money on what they offer.

I feel the OP's pain, I really do, as I gave up on PC games for about 9 years because a similar thing happened to me when I bought SiN Episodes and the Orange Box, which had a Steam requirement in very small print that was kind of meaningless to me at that point. On a 56K modem connection, which was the norm around here back then (2008-2009), having to wait days for a game update to download before playing just turned me off buying any more games for the PC. I discovered, but couldn't really use, GOG back then; it was only in May 2017 that I rediscovered GOG, and of course I had a much better web connection by then and was well versed in online purchasing.
Time4Tea: I don't see anything difficult about that whatsoever. Again, bottom line --> lazy developers.
Are you a dev, or do you have any deeper technical knowledge, or is this just another example of the world's best (but never asked) coach in a stadium full of coaches, watching the game far down below?
Timboli: and on a 56K modem connection, which was the norm around here back then (2008-2009)
Jesus! I mean, I get the whole dial-up issue over here in rural areas with poor connection hardware (esp. old copper lines lacking enough functional wires for ADSL), but I found websites a huge pain to use on such a slow connection from around 2003 onwards (they're completely useless now, since sites will time out trying to load scripts at those speeds, lol). I had to bite the bullet and pay for a wireless broadband connection, which was/is rather pricey for its extremely tiny data cap.

Everyone else in this area basically had ADSL/ADSL+ at that point. Dial-up was not the norm here at all.

Geez, loading a website must have been like pulling teeth for you.
Post edited May 19, 2025 by Braggadar
randomuser.833: Are you a dev, or do you have any deeper technical knowledge, or is this just another example of the world's best (but never asked) coach in a stadium full of coaches, watching the game far down below?
I'm not a game developer, no (are you?). However, I've been gaming since the 80s and I have quite a bit of expertise in scientific computing.

Honestly, given the 'dumbing down' of game development in recent years (i.e. increasing levels of abstraction and loss of knowledge of low-level coding and optimization techniques), I probably have more technical computing knowledge than most people who call themselves 'game developers' these days.