Last year there was
a thread about DOS games, where it was decided:
- If gameplay is tied to real world time and doesn't speed up or slow down depending on CPU speed, CPU speed for such a game can be maximized to reduce lag.
- If gameplay is tied to CPU speed, and the game slows down or speeds up along with CPU config, CPU speed must be set to whatever was common when the game was released.
That discussion only covered JPC-RR and its capabilities. There is one more aspect that can depend on CPU speed: the framerate of the video that comes from the emulator.
DOS games rarely use framerates much above 70, and JPC-RR doesn't output arbitrary framerates, so for DOS games, even if they send frames to the VGACard at 1000fps, it's not a problem when it comes to displaying the video and dumping it to a file.
For PC games, this may be problematic.
With VSync enabled, games render frames at whatever common framerate the monitor and the video card are configured for. The most common framerate is probably 60fps.
There are monitors that support up to 240fps. Video cards can theoretically output unlimited framerates; it just depends on the complexity of a given game and what processing it involves. At high resolutions and high settings in modern games, the framerates GPUs can output don't go above 300.
Now, with VSync disabled, a game may run at uncapped speed. If that only removes lag, we can handle it the same way we handle DOS games. If gameplay itself speeds up, we must limit the framerate to whatever the game is meant to be played at.
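For reference, the kind of limiter involved can be sketched in a few lines of Python. This is a minimal illustration, not how any particular emulator does it: `run_capped` and `render_frame` are invented names, and `render_frame` stands in for one iteration of a game's render loop.

```python
import time

def run_capped(render_frame, frames, target_fps=60.0):
    """Render `frames` frames, pacing the loop so it doesn't exceed
    target_fps. Returns the total elapsed wall-clock time."""
    frame_time = 1.0 / target_fps
    start = time.monotonic()
    next_deadline = start
    for _ in range(frames):
        render_frame()
        next_deadline += frame_time
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # idle until the next frame is due
    return time.monotonic() - start
```

The point is just that capping is cheap to do: whether gameplay logic is tied to the render loop is what decides if the cap changes game speed or only frame output.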
But what if the game has a fixed real-world speed for everything, but happens to render frames at an uncapped framerate? A 1000fps video is unwatchable due to heavy lag, and due to countless dropped frames as far as the monitor is concerned, let alone the human eye and what it can perceive.
The hard limit for PC games, at least on Linux, seems to be 1,000,000,000fps. So if you're running a game on a quantum computer, my first question is WTF are you smoking, and the second is, who needs that framerate in games?
Since encodes at insanely high framerates can't be watched, we have to limit the maximum framerate we let games run at. We can't simply drop frames from encodes on a regular basis just to make them watchable: unique events may be happening on the frames we're dropping, and there may be no way to just dedup them.
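To make that concrete, here's a toy sketch of why naive frame dropping loses information. It keeps every Nth frame of a dump and reports the dropped frames whose content never appears among the kept ones. The function name and the raw byte-buffer frames are invented for illustration; a real check would compare decoded video frames, not bytes.

```python
import hashlib

def unique_dropped_frames(frames, keep_every):
    """frames: list of raw frame buffers (bytes).
    Keep every `keep_every`-th frame, then return the indices of
    dropped frames whose content is not duplicated by any kept frame,
    i.e. the unique events the encode would lose."""
    kept = {hashlib.sha256(f).digest() for f in frames[::keep_every]}
    return [i for i, f in enumerate(frames)
            if hashlib.sha256(f).digest() not in kept]
```

If every dropped frame were a duplicate of a kept one, dedup would save us; the problem is that in practice there's no guarantee of that.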
Any artificial limit seems arbitrary. For any game that can render frames at an uncapped framerate, VSync sounds like a good limit to me. But even then, the computer can be configured to run at framerates above 60.
Where should we stop? 240fps? 60fps? Something else? And why there?