timppu: Sometimes I really can't figure out how a game decides whether it uses vsync or not.
Yesterday I decided to try the GOG versions of Dungeon Siege 1 and 2 on my Windows 10 laptop (I have also added the missing expansion packs to the GOG versions, following the instructions posted here).
The laptop has a NVidia GPU, and in the NVidia control panel I have set:
- All games should prefer the NVidia discrete GPU over the Intel HD GPU.
- Vsync should always be enabled.
I also checked the Intel Command Center options in case a game tries to use the Intel HD GPU anyway, but I didn't find anything related to vsync there.
Also, in the Dungeon Siege 1 options I recall there was a "DirectX Options" screen or something like that, and there I also enabled vsync. In the game itself, I don't see any vsync or framerate options.
Either way, when I run Dungeon Siege 1, it appears to be running with vsync off. The internal FPS counter shows it running at around 70-200 frames per second, depending on what is on the screen. The laptop fan also seems to be running at full speed, which indicates the GPU (and/or CPU) is running hot too, obviously because they are trying to run the game as fast as they can, at ludicrous framerates I never asked for.
I am not even sure whether Dungeon Siege is selecting the NVidia or the Intel GPU; there is no apparent way to tell.
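For what it's worth, here's a minimal sketch (assuming a Windows build environment with the DXGI headers and dxgi.lib) that lists the adapters DirectX exposes. It can't show which GPU a running game actually picked, and DS1 itself predates DXGI, but adapter 0 is the system default that old games typically end up on:

```cpp
// List the GPUs that DXGI exposes to DirectX applications.
// Build with MSVC: cl enumgpus.cpp dxgi.lib
#include <dxgi.h>
#include <cwchar>

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // Adapter 0 is the system default; an old game that never picks an
        // adapter explicitly will typically end up on this one.
        wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```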
But when I run Dungeon Siege 2, it seems to be using vsync, because its framerate seems to be locked at a pretty constant 60 fps. And yes, the laptop fan is much quieter too when running DS2.
So, yeah, I just wish there was some damn way to make 10000000000% sure vsync is on, always. None of this bullshit "it works on this game, but not that one". The only logical explanation I can think of is that for some reason Dungeon Siege still selects the Intel GPU, which doesn't have vsync enabled (as I didn't find such an option in its drivers), while DS2 uses the NVidia GPU, where vsync is always forced. But then, the DS "DirectX Options" screen still has vsync enabled, so why doesn't that work?
I am unsure whether using that "Riva Tuner" that someone mentioned would achieve that: framerates always locked to a max of 60 fps, no questions asked. Then again, the Riva Tuner configuration page seemed to suggest you should only use it if your GPU can always run faster than your monitor's refresh rate. Will it cause more problems than vsync in such cases?
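If it helps, the core idea behind an external frame limiter is just to stop a new frame from starting before its 1/60 s slot is over. A minimal sketch of that idea (not how Riva Tuner actually implements it), which caps GPU load and heat but, unlike vsync, does nothing about tearing:

```cpp
// Minimal sketch of an external frame limiter: cap the loop at ~60 fps by
// sleeping out the remainder of each 16.67 ms slot.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~60 fps

    auto next_deadline = clock::now() + frame_budget;
    for (int frame = 0; frame < 600; ++frame) {   // stand-in for the game loop
        // render_frame();  // hypothetical: whatever work the game does

        // Sleep until this frame's slot is over, then schedule the next one.
        std::this_thread::sleep_until(next_deadline);
        next_deadline += frame_budget;
    }
    return 0;
}
```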
Yeah, I understand that vsync will drop the framerate to 30 fps if your GPU can't keep the framerate at 60 fps or over, which sucks of course. The worst scenario is when it constantly jumps between 30 and 60 fps.
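The arithmetic behind that, assuming strict double-buffered vsync on a 60 Hz monitor: a flip can only happen on a vblank, so each frame stays on screen for a whole number of ~16.67 ms refresh intervals. A small sketch:

```cpp
// Why double-buffered vsync quantizes framerate: a frame that misses one
// 16.67 ms vblank slot occupies two of them (30 fps), three slots gives 20 fps.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;              // 60 Hz monitor
    const double times[] = {10.0, 15.0, 18.0, 25.0, 35.0};
    for (double render_ms : times) {
        // The frame is shown for a whole number of refresh intervals.
        double shown_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        std::printf("render %5.1f ms -> displayed at %4.1f fps\n",
                    render_ms, 1000.0 / shown_ms);
    }
    return 0;
}
```

So an 18 ms frame (just barely too slow for 60 fps) already gets quantized down to 30 fps, which is why the jumping between 30 and 60 happens.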
Overall this isn't a big problem, as I have already finished the first Dungeon Siege + expansion (on a different computer; I don't recall if I was able to use vsync on it) and I'm not going to replay it, but I'd still like to figure this out.
Yeah, you're not the first to complain about that. Similar issues happen (with GPU selection) when you have two monitors, too. I can't remember whether you can disable the integrated one or not. I'm sure there's a way to enable vsync, though. A lot of times there's a separate program that controls your drivers which Windows doesn't install: for ATI this is AMD Catalyst; not sure for other cards.
timppu: So is vsync like: "If you don't use it, you lose it."?
I don't know what you mean. I understand you know some coding. If you want, I can send you my VESA driver, which has vsync in use. It's not all that complicated.
Orkhepaj: BTW, I don't get this: why would vsync drop to 30fps if your fps is below 60 on a 60Hz monitor?
IMHO this is false info. Could anyone link something showing whether this is true or not?
Frameskipping is the default scenario if you fail to finish drawing the buffer in time; that is precisely how it prevents tearing. Vsync more or less forces a double-buffer system to sync with the refreshing of the screen, so the front buffer is always drawn at refresh. If you don't finish a new frame before the next refresh (because the scene is too much work), it drops the frame and redraws the old front buffer, since you haven't flipped yet. Without vsync, the refresh will often catch you in the act of replacing the front buffer with the back buffer, which causes the tear.

This is why I talk about triple buffering for games: you can keep vsync enabled and the game will run like it can push as many frames as it wants, but the output is still vsynced at the monitor's refresh rate as long as the game can run faster than that. Your GPU will run hot, though.

It also helped me dramatically improve Hyperdimension Neptunia 1's framerate. Vsync forces wait states, and since my computer can't handle the game that well, the third buffer cut those wait states; the game properly double buffers already, so the third buffer lets my GPU run hotter and deliver a much higher framerate (at the risk of losing consistency at points).
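To illustrate the timing model being described (this is just the model, not any real API), assume a 60 Hz display and a GPU that needs 20 ms per frame:

```cpp
// Double buffering + vsync: the renderer stalls until the next vblank before
// it may flip, so a slow frame drags the whole pipeline down.
// Triple buffering + vsync: the renderer always has a spare buffer, so it
// never stalls, and each vblank shows the newest finished frame.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;   // vblank every ~16.67 ms
    const double render_ms  = 20.0;            // too slow for 60 fps

    // Double buffered: after finishing the back buffer, wait for the vblank.
    double t = 0.0;
    const int frames = 120;
    for (int f = 0; f < frames; ++f) {
        t += render_ms;                                // draw the frame
        t = refresh_ms * std::ceil(t / refresh_ms);    // stall until vblank
    }
    std::printf("double buffered: %.1f fps\n", frames * 1000.0 / t);   // 30.0

    // Triple buffered: no stalls, so the screen receives
    // min(render rate, refresh rate) new frames per second.
    double render_fps  = 1000.0 / render_ms;
    double refresh_fps = 1000.0 / refresh_ms;
    std::printf("triple buffered: %.1f fps\n",
                render_fps < refresh_fps ? render_fps : refresh_fps);  // 50.0
    return 0;
}
```

So at a 20 ms frame time, double buffering quantizes down to 30 fps while triple buffering still delivers 50 new frames a second to the screen.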
Dark_art_: I usually help a friend who suffers from motion sickness set up his games, since he knows next to nothing about it. He can play Portal 2 with V-sync turned off at 120+ fps, but cannot at 60fps with V-sync. First-person camera views seem to be the worst in this regard, and tweaking the field of view usually helps a lot.
My guess is he might be one of those people who can perceive the upper ranges. He'd likely benefit from FreeSync. If he gets such a monitor, find out what rate doesn't cause him trouble.
Regarding 30 vs 60 fps: lately I've been playing Into the Breach locked at 30 to save battery, and the animations are janky as hell. Any first-person game at 30fps is an instant no for me, as are even some games at 60fps V-synced.
I've mentioned this time and time again: some games really need the high fps, even on a 60Hz monitor. Richard Burns Rally needs to be played at 100+ fps. It's playable at 60fps, but good luck beating the game that way; you'll need it.
If 60fps is too slow on a 60Hz monitor, either vsync is broken or this is a placebo effect. You cannot get more frames onto the screen than the monitor's refresh rate, even with vsync turned off.