
kohlrak: It's definitely the placebo effect. You see a similar effect with people complaining that a game runs at 60fps vs 120fps. Most people won't notice the difference between 30 and 60 without being told, and while there is a difference between 60 and 120, odds are pretty much everyone would fail a genuine blind test.
I usually help a friend who suffers from motion sickness set up his games, since he knows next to nothing about it. He can play Portal 2 with V-sync turned off at 120+ fps, but can't at 60fps with V-sync. First-person camera views seem to be the worst in this regard, and tweaking the field of view usually helps a lot.

Regarding 30 vs 60 fps, lately I'm playing Into the Breach locked at 30 to save battery and the animations are janky as hell. Any first-person game at 30fps is an instant no for me, as are even some games at 60fps V-synced.
I've mentioned this time and time again: some games really need high fps, even on a 60Hz monitor. Richard Burns Rally needs to be played at 100+ fps; it's playable at 60fps, but good luck beating the game, you'll need it.

As a personal opinion, OpenGL games seem to feel less smooth overall than DirectX ones.
Orkhepaj: btw I don't get this: why would vsync drop to 30fps if your fps is below 60 on a 60Hz monitor?
Imho this is false info; could anyone post a link showing whether this is true or not?
Depends on the game and driver implementation (and settings used), I guess. For many games it's true; Shadow Tactics on Intel Graphics is an example where, if the frame rate drops even a little below 60, the game will V-Sync at 30.
For many other games it's not true, like Into the Breach mentioned above.
Post edited May 21, 2021 by Dark_art_
Mori_Yuki: In older games such as UT2004, which I still play from time to time, there is no way around activating it. Running at thousands of fps with it turned off, the game would be unplayable.
This is false. I run UT 2004 without V-Sync with no issues. At 240 FPS, the game behaves completely normally.

I can even run UT 99 at up to 200 FPS before the game speed starts getting affected.
Post edited May 21, 2021 by idbeholdME
timppu: Sometimes I really can't figure out how a game decides whether it uses vsync or not.

Yesterday I decided to try the GOG versions of Dungeon Siege 1 and 2 on my Windows 10 laptop (I have added the missing expansion packs to the GOG versions too, according to the instructions obtained here).

The laptop has a NVidia GPU, and in the NVidia control panel I have set:
- All games should prefer NVidia discrete GPU over Intel HD GPU.
- Vsync should always be enabled.

I also checked the Intel Command Center options, in case a game tries to use the Intel HD GPU anyway, but I didn't find anything related to vsync there.

Also in the Dungeon Siege 1 options, I recall there was a "DirectX Options" dialog or something like that, and there I also enabled vsync. In the game itself, I don't see any vsync or framerate options.

Either way, when I run Dungeon Siege 1, it appears to be running with vsync off. The internal FPS counter shows it running at around 70-200 frames per second, depending on what is on the screen. Also, the laptop fan seems to be running at full speed, which indicates that the GPU (and/or CPU) are running hot too, obviously because they are trying to run the game as fast as they can, at ludicrous framerates I never asked for.

I am not even sure if Dungeon Siege is selecting the NVidia or Intel GPU, there is no apparent way to tell that.

But when I run Dungeon Siege 2, it seems to be using vsync because its framerate seems to be locked at a pretty constant 60 fps. And yes, the laptop fan is much quieter too when running DS2.

So, yeah, I just wish there was some damn way to make 10000000000% sure vsync is on, always. None of this bullshit "it works on this game, but not that one". The only logical explanation I can think of is that for some reason Dungeon Siege still selects the Intel GPU, which doesn't have vsync enabled (as I didn't find such an option in its drivers), while DS2 uses the NVidia GPU, where vsync is always forced. But then, the DS "DirectX options" still have vsync enabled, so why doesn't it work?

I am unsure if using that "Riva Tuner" that someone mentioned would achieve that: framerates always locked to a maximum of 60 fps, no questions asked. Then again, the Riva Tuner configuration page seemed to suggest you should use it only if your GPU is able to always run higher than your monitor refresh rate. Will it cause more problems than vsync in such cases?

Yeah, I understand that vsync will drop the framerate to 30 fps if your GPU can't keep the framerate at 60 FPS or over, which sucks of course. The worst scenario is when it constantly jumps between 30 and 60 fps.

Overall this isn't a big problem as I have already finished the first Dungeon Siege + expansion (on a different computer; I don't recall if I was able to use vsync on it) and I'm not going to replay it, but still I'd like to figure this out.
Yeah, you're not the first to complain about that. Similar issues happen (with GPU selection) when you have two monitors, too. I can't remember if you can disable the integrated one or not. I'm sure there's a way to enable vsync, though. A lot of times there's another program that can control your drivers which Windows doesn't install: for ATI this is AMD Catalyst; not sure about other cards.

timppu: So is vsync like: "If you don't use it, you lose it."?
I don't know what you mean. I understand you know some coding. If you want, I can send you my VESA driver, which has vsync in use. It's not all that complicated.

Orkhepaj: btw I don't get this: why would vsync drop to 30fps if your fps is below 60 on a 60Hz monitor?
Imho this is false info; could anyone post a link showing whether this is true or not?
Frameskipping is the default scenario if you fail to finish drawing the buffer in time; that's precisely how it prevents tearing. Vsync more or less forces a double-buffer system to sync with the refreshing of the screen, so it always draws the front buffer at refresh. If you don't have a new frame ready before the screen refreshes again (because the work is too much for you), it drops the frame and redraws the old front buffer, because you haven't flipped yet. Without vsync, it'll often catch you in the act of replacing the front buffer with the back buffer, which causes the tear.

This is why I keep bringing up triple buffering for games: you can keep vsync enabled and the game will run as if it can push as many frames as it wants, but the display is still synced to the monitor's refresh rate as long as the game can run faster than that. Your GPU will run hot, though. It also helped me dramatically improve Hyperdimension Neptunia 1's framerate: vsync forces wait states, and since my computer can't handle the game that well, the third buffer cuts those wait states. The game already double-buffers properly, so the extra buffer lets my GPU run hotter and deliver a much higher framerate (at the risk of losing consistency at points).
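To put rough numbers on that, here's a minimal Python sketch (the 60Hz refresh and the per-frame render times are assumptions for illustration, not measurements from any game): with strict double buffering, a frame that misses the vblank costs a whole extra refresh interval, which is exactly the 60-to-30 drop being discussed, while a third buffer lets new frames keep arriving at whatever rate the GPU manages, capped at the refresh rate.

    import math

    REFRESH_HZ = 60
    FRAME_MS = 1000 / REFRESH_HZ              # ~16.7 ms between vblanks

    def double_buffered_fps(render_ms):
        # the renderer stalls until the flip, so each new frame costs a whole
        # number of refresh intervals: miss one vblank, wait for the next
        intervals = max(1, math.ceil(render_ms / FRAME_MS))
        return REFRESH_HZ / intervals

    def triple_buffered_fps(render_ms):
        # a spare back buffer means the GPU never waits on the flip, so new
        # frames arrive at the render rate, capped at the refresh rate
        return min(1000 / render_ms, REFRESH_HZ)

    for ms in (15, 17, 20, 34):               # assumed per-frame render times
        print(f"{ms} ms/frame -> double-buffered vsync: "
              f"{double_buffered_fps(ms):.0f} fps, "
              f"triple-buffered: {triple_buffered_fps(ms):.0f} fps")

The 17 ms row is the painful case: barely missing the deadline halves the displayed rate under plain double buffering, while triple buffering only loses a frame or two per second.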

Dark_art_: I usually help a friend who suffers from motion sickness set up his games, since he knows next to nothing about it. He can play Portal 2 with V-sync turned off at 120+ fps, but can't at 60fps with V-sync. First-person camera views seem to be the worst in this regard, and tweaking the field of view usually helps a lot.
My guess is he might be one of those that can see the upper ranges. He'd likely benefit from freesync. If he gets one, find out what rate doesn't cause him trouble.
Regarding 30 vs 60 fps, lately I'm playing Into the Breach locked at 30 to save battery and the animations are janky as hell. Any first-person game at 30fps is an instant no for me, as are even some games at 60fps V-synced.
I've mentioned this time and time again: some games really need high fps, even on a 60Hz monitor. Richard Burns Rally needs to be played at 100+ fps; it's playable at 60fps, but good luck beating the game, you'll need it.
If 60fps is too slow on a 60Hz monitor, either vsync is broken or this is the placebo effect. Even with vsync turned off, what reaches the screen cannot exceed the monitor's refresh rate.
higher render fps makes a game feel smoother even if it is over the monitor's refresh rate
Orkhepaj: higher render fps makes a game feel smoother even if it is over the monitor's refresh rate
That's impossible. All that does is present tearing. The refresh rate is specifically the rate at which the screen gets updated. It'll only update at its refresh rate (usually 60Hz), regardless of vsync. The whole point of vsync is to tell you when it's ready for you to swap buffers. Worst case scenario, you get oversped animations (because the game depends on running at a specific framerate) and tearing; best case scenario, it is no different from vsync. The average case, these days, is that the game still runs at the same speed as if it were vsynced, except there are occasionally torn frames.

Feel free to look at precisely how this works under the hood, as it's not that complicated. I could find videos on it if you need.
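As a rough illustration of what's under the hood (assumed numbers, and the swap timing is idealized): with vsync off, frames rendered faster than the panel can scan out don't produce extra refreshes, they produce tear lines part-way down the image, at whatever scanline the monitor had reached when the buffer was swapped. A small Python sketch:

    REFRESH_HZ = 60
    LINES = 1080                    # assumed vertical resolution
    SCANOUT_MS = 1000 / REFRESH_HZ  # time the monitor takes to draw one screen

    def tear_lines(render_fps):
        # with vsync off the buffer is swapped every 1/render_fps seconds, even
        # while the monitor is mid-scanout; each mid-scanout swap leaves a seam
        # at the scanline being drawn at that instant
        swap_ms = 1000 / render_fps
        lines = []
        t = swap_ms
        while t < SCANOUT_MS:
            lines.append(int(t / SCANOUT_MS * LINES))
            t += swap_ms
        return lines

    print("tear lines during one 60Hz refresh at 90 fps:", tear_lines(90))
    print("tear lines during one 60Hz refresh at 144 fps:", tear_lines(144))

The screen itself still completes exactly 60 scans per second; the extra rendered frames only change which slice of the picture is the newest.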
Here they bring up some good points on why it is good

And from my own experience, it is good to have more fps for shooters; they really feel more fluid.
Post edited May 21, 2021 by Orkhepaj
Orkhepaj: higher render fps makes a game feel smoother even if it is over the monitor's refresh rate
kohlrak: That's impossible. All that does is present tearing
He's right, games do feel smoother and more responsive above 60fps (on a 60Hz monitor), even without V-Sync.
A good example of this is CS:GO; people don't play at crazy high fps just because. I did a test on this back in the day, on a training map where you shoot bots at random spawn positions to train your aim, and I could consistently get much better scores with higher fps. Same with Richard Burns and plenty of other titles.
I'm not sure if it has to do with the game physics calculations being tied to the framerate, or the "tearing effect" presenting some parts of the image earlier, or even just less input/output lag.

This "perceived smoothness" is probably measurable and quantifiable, I'm positive there must some sort of quasi-cientific test from a random youtuber. Will try search some stuff later.
Depends on the game.

If there's performance loss, input lag and/or screen-tearing - nope, I'll turn V-Sync off. I do this A LOT for Fallout 3/NV/FO4 - often capping straight-up at 60fps tops (via either NVidia Inspector/NVidia Panel/MSI Afterburner) to avoid nasty issues of physics going bonkers and input lag.

I do have a G-Sync monitor on my GTX 1060 laptop (120Hz built into that 15.6'' 1080p screen) and on my desktop (an RTX 3070-based PC with a 240Hz 1080p monitor via DisplayPort) - so, if all's well, I can just use G-Sync there.

If G-Sync and/or V-Sync got issues - can always try Nvidia FastSync. I ain't tried this lately, since getting G-Sync and all - but some games work well w/ FastSync; especially titles that came out b/t V-Sync and G-Sync.
Post edited May 21, 2021 by MysterD
Orkhepaj: Here they bring up some good points on why it is good

And from my own experience, it is good to have more fps for shooters; they really feel more fluid.
kohlrak: That's impossible. All that does is present tearing
Dark_art_: He's right, games do feel smoother and more responsive above 60fps (on a 60Hz monitor), even without V-Sync.
A good example of this is CS:GO; people don't play at crazy high fps just because. I did a test on this back in the day, on a training map where you shoot bots at random spawn positions to train your aim, and I could consistently get much better scores with higher fps. Same with Richard Burns and plenty of other titles.
I'm not sure if it has to do with the game physics calculations being tied to the framerate, or the "tearing effect" presenting some parts of the image earlier, or even just less input/output lag.

This "perceived smoothness" is probably measurable and quantifiable, I'm positive there must some sort of quasi-cientific test from a random youtuber. Will try search some stuff later.
Game logic is a separate discussion by virtue of triple-buffering features. Animation itself is a visual phenomenon, so saying it looks smoother above the refresh rate is a matter of opposing physics.

Like I said, though, there's something to be said about broken vsync drivers, which are not even remotely uncommon. Even my vsync driver isn't that good: I have to rely on a setup that more or less wastes over 50% of CPU cycles, since the vsync IRQ never actually gets implemented despite being part of a standard they claim to implement. That's a dirty little secret that not a lot of gamers are aware of. Do a quick Google search for "vsync broken" and you'll see plenty of complaints of FPS drops (implying that it's going below the monitor's refresh rate, meaning it's not actually vsyncing but using some other timer). You'll also notice that the most recent posts suggest that G-Sync is the new industry standard and there's no interest in fixing vsync.
An interesting thing that I've noticed: forcing V-sync via the graphics driver options sometimes provides better results than using the in-game options. The lag is less noticeable for some reason, at least in my case. It's worth a try if someone wants to use V-sync but can't bear the input lag it causes.
Post edited May 21, 2021 by Sarafan
Sarafan: An interesting thing that I've noticed: forcing V-sync via the graphics driver options sometimes provides better results than using the in-game options. The lag is less noticeable for some reason, at least in my case. It's worth a try if someone wants to use V-sync but can't bear the input lag it causes.
The more I look into broken vsync, the more I'm starting to think it might be something broken in whatever they're using to create the wait states, so this would make sense. Letting them do their thing at an unspecified rate will probably work better. I still say triple buffer, though. Have you tried enabling that to cut down the "input lag"? I keep mentioning it in this thread, but I'm curiously not getting any feedback on it.

EDIT: I just read that NVidia cards appear to have a setting for a stack of pre-rendered frames for vsync, which would add an absolute fuck ton of "input lag" (output lag).
The way that it's implemented in most games will add a frame of latency.

I'm not even sure that you can get "true" double-buffered V-Sync with an NVIDIA GPU any more though, as I can't think of the last time I ever saw a 60 FPS game immediately drop from 60 to 30, even in Fullscreen Exclusive Mode - at least on Windows 10.

Reducing the maximum pre-rendered frames setting to 1 in the NVIDIA Control panel will reduce latency when V-Sync is enabled.

You can use RTSS to bring latency even lower than that, at the cost of introducing some minor stuttering: https://www.blurbusters.com/howto-low-lag-vsync-on/
Original post here.
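Rough numbers for that queue (assuming 60Hz, vsync on, and that every queued frame waits a full refresh before it can be scanned out; real drivers vary): each pre-rendered frame sitting ahead of yours adds one more refresh interval between your input and the screen. A short Python sketch:

    REFRESH_HZ = 60
    FRAME_MS = 1000 / REFRESH_HZ

    def queue_latency_ms(max_prerendered_frames):
        # one interval for the frame itself, plus one per frame queued ahead
        return FRAME_MS * (1 + max_prerendered_frames)

    for q in (1, 2, 3):
        print(f"max pre-rendered frames = {q}: ~{queue_latency_ms(q):.0f} ms "
              "of display-queue latency at 60 Hz")

Hence the advice above to set it to 1, and why the RTSS trick of capping just below the refresh rate can shave off even more by keeping that queue empty.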
Post edited May 22, 2021 by kohlrak
kohlrak: The more I look into broken vsync, the more I'm starting to think it might be something broken in whatever they're using to create the wait states, so this would make sense. Letting them do their thing at an unspecified rate will probably work better. I still say triple buffer, though. Have you tried enabling that to cut down the "input lag"? I keep mentioning it in this thread, but I'm curiously not getting any feedback on it.
You mean forcing triple buffer via the graphics card drivers options and leaving V-sync on in the game options? I'll try to experiment with that later on.
Post edited May 22, 2021 by Sarafan
kohlrak: The more I look into broken vsync, the more I'm starting to think it might be something broken in whatever they're using to create the wait states, so this would make sense. Letting them do their thing at an unspecified rate will probably work better. I still say triple buffer, though. Have you tried enabling that to cut down the "input lag"? I keep mentioning it in this thread, but I'm curiously not getting any feedback on it.
Sarafan: You mean forcing triple buffer via the graphics card drivers options and leaving V-sync on in the game options? I'll try to experiment with that later on.
How can I do that with AMD? :O
kohlrak: The more I look into broken vsync, the more I'm starting to think it might be something broken in whatever they're using to create the wait states, so this would make sense. Letting them do their thing at an unspecified rate will probably work better. I still say triple buffer, though. Have you tried enabling that to cut down the "input lag"? I keep mentioning it in this thread, but I'm curiously not getting any feedback on it.
Sarafan: You mean forcing triple buffer via the graphics card drivers options and leaving V-sync on in the game options? I'll try to experiment with that later on.
Yes.

Sarafan: You mean forcing triple buffer via the graphics card drivers options and leaving V-sync on in the game options? I'll try to experiment with that later on.
Orkhepaj: How can I do that with AMD? :O
See screenshot from my really outdated computer.
VRR is the way to go if you care about tear-free, low-input-lag gaming.