Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
keylie wrote:
To me this is not a problem. If the run is not entertaining because of the low framerate, then it will be either rejected or it will go to the Vault.
I might have given the impression that my concern is purely about entertainment. It's not. Sure, entertainment (or more precisely, the lack of it) would be an issue, because I doubt anybody would sit through a 1-hour run where each frame is shown for 10 seconds, after which the game jumps to a completely different location. But that's not the main reason the idea makes me uneasy. I think games should be speedrun on their own terms; affecting the game from the outside feels like quasi-cheating.

I don't think anybody would consider it ok, when running a DOS or Windows game, to run some custom program in the background that affects the game in some manner, allowing e.g. glitches that wouldn't otherwise be possible. Emulating cheating devices on older consoles, like the GameShark, isn't allowed because, once again, it changes the game in some manner from the outside. Moreover, and perhaps closer to this discussion, we would never allow overclocking a console (which would be quite easy to do in an emulator) in order to make the run go faster. The game should always be emulated in an environment that replicates the original console as faithfully as possible. Overclocking the console would be the cheapest and dirtiest possible trick for getting a faster run, and nobody in the universe would accept that as legit.

When we are talking about PC games, we are entering a much grayer area in both respects, given that there is no "standard" PC configuration for any given game, and the speed of PCs can vary by orders of magnitude. The area is grayer still because in most PC games that aren't old as dirt, the speed of the PC doesn't affect the playing speed of the game (at least usually not in any significant manner). However, I think the same principle ought to apply as above: if changing the speed of the PC somehow makes the game glitch, it just feels wrong. It's not playing the game on its own terms, but affecting it from the outside. In principle it's not all that different from running a custom background process that, for example, slows down the game. The only difference is that rather than doing the slowdown in software, we are doing it in hardware (or at least in the emulated hardware). The end result is the same. If we don't allow it done with software, why should we allow it done with hardware?

My own preference would be that, at least by default, PC games ought to be run in an environment that closely matches the recommended specs given by the developers of that particular game. (Exceptions could be made in individual cases, but there ought to be very good reasons for them. I don't think "running the game at 0.1fps makes it glitch" is a good-enough reason.)
InputEvelution
She/Her
Editor, Reviewer, Player (36)
Joined: 3/27/2018
Posts: 194
Location: Australia
PC games are inherently a different beast from console games. As has been noted many times in this thread, the platform has never had a consistent standard for what specifications the system should have, and this is very much reflected in how people tend to use PCs. They may have hardware far above or far below the recommended specs, simply depending on the price range they can afford. They don't necessarily care whether it's overkill or underkill for every single one of the games they own - just that it's good enough for their general purposes. It is not a common attitude that anyone whose system differs in any way from the minimum or recommended requirements is a cheater, because usually only a very small percentage of players have the exact specifications.

The open-ended nature of PC hardware also makes a standard very difficult to define, as even within a standard there is potential for wild inconsistency. Two CPUs may have very similar processing power, but depending on how that power is split between cores, the resulting speed in practice could differ greatly from game to game. If a game uses a vendor technology such as NVIDIA HairWorks, it may run much better on an NVIDIA card than on an equally capable AMD one. Load times may differ greatly depending on whether an HDD or an SSD is used. Even across "generally similar" hardware, expecting anything close to a consistent output is ludicrous. Expecting that any combination of that hardware will happen to run the game at a common framerate is also ludicrous, so a hardware standard wouldn't solve the "arbitrary framerate" problem either.

Perhaps, given all this, we should properly acknowledge that PCs are not like consoles, and that they hence demand somewhat different rules surrounding matters such as framerate (as long as the game doesn't dynamically speed up and slow down with the framerate, of course). My thoughts are this: either you can acknowledge the small differences in ruleset that PC games demand, or you might as well not accept them at all, given that a console-like standard is essentially impossible.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
The polar opposite of what I have been talking about also occurred to me: It might also happen with some particular game, especially an older one, that running it on a PC (real or emulated) that's too fast might make it glitch in some manner, or otherwise change its behavior in such a manner as to allow a faster completion.

There are many very old PC games from the 80's and very early 90's that simply don't run on a modern PC, and in many cases this is because the PC is just way too fast (several orders of magnitude faster than the PCs of the time), and the game wasn't coded with that kind of processing speed in mind. (Nowadays the only way to run such old PC games is in an emulated environment such as DOSBox, which deliberately emulates a slow PC of the time. If you tried to just install DOS on a modern PC (assuming it even works at all) and run the old game as-is, it might well crash on launch, hang, or otherwise misbehave.)

It's not completely out of the realm of possibility that running a very old PC game on a modern supercomputer of a PC does not immediately make it crash, and the game is actually completable, but its behavior changes in some way that allows a faster completion. Again, this toes the line between what counts as "affecting the game from the outside" in a manner that goes against the spirit of TASing, and what doesn't.
fmp
He/Him
Active player (279)
Joined: 9/1/2018
Posts: 82
Warp wrote:
It might also happen with some particular game, especially an older one, that running it on a PC (real or emulated) that's too fast might make it glitch in some manner, or otherwise change its behavior in such a manner as to allow a faster completion.
Due to the nature of old computers, I honestly think they should be considered consoles here. The way things developed and evolved in that era was just so different from PCs of the modern era. If you built a real machine that ran DOS but did it at the speed of a modern computer, I would consider that akin to building an overclocked SNES. In short, I agree with what you're saying, but I think this opinion should be its own topic and not affect the ruling here.
Site Admin, Skilled player (1254)
Joined: 4/17/2010
Posts: 11475
Location: Lake Char­gogg­a­gogg­man­chaugg­a­gogg­chau­bun­a­gung­a­maugg
After talking for quite a bit to Nach, here's the problem he has with not limiting framerates at all:
19:10:16 <Nach> making the game rely on OS function time, and then also screwing with that time to allow range max (resolution max?) is extremely questionable

19:35:52 <Nach> it's questionable because you're screwing with the functions the game uses, and using them in situations they weren't designed for. The entire topic also begins with a false premise, which I absolutely will not apologize for, nor will I even remotely try to operate in.
So yeah, I'm not a tech guru, and when brainstorming in order to create this thread, the internal technical aspects of this problem weren't known to me. Now that I've learned about them, here's the summary of our talk. For a very long time, the standard that games strive to work well under has been vsync. It's the primary mode to use during play, since it helps with a lot of issues, one of them being the inconsistency of hardware specs, which only gets worse over time.
18:23:00 <Nach> PC specs games are designed for are typically around a 6 year window surrounding the game design/release period

18:23:27 <Nach> shoving a PC game onto a PC with specs 20 years later, and running it natively that way is crazy

18:24:33 <Nach> even consider the PCs from 3x86 to Pentium era, they all had a "Turbo" button which you could turn off to make the CPU speed more like what you saw on an x86 or 2x86, so you could run old software as they were intended

18:26:50 <Nach> since everyone saw games from the x86 and 2x86 era were misdesigned based on actual clock count, forcing later x86 CPUs to have that turbo option, practically every game since started limiting time based on video and sound output speeds. Games in the last ~15 years even go for a combo due to various differences in hardware and to allow the most precise measurement of time. This is why vsync exists. If you're going to use 20+ year future CPUs and you want it to work right, use vsync (assuming vsync isn't nuts either). Turning off vsync and using 20+ year CPU? Come on...

19:01:26 <Nach> the only reason why games generally even allow turning off vsync is so you can play them on older hardware, which isn't capable of hitting 60 FPS or whatever, or for cases where you have some exotic hardware which doesn't report vsync properly, although in that latter case, there's no telling without vsync will work either
So for games that allow disabling vsync and still run at the same speed regardless, they rely on OS time functions, among other things, to work properly time-wise. When the game runs at a high framerate without vsync, the actual rate it runs at isn't consistent. Hourglass and libTAS replace time-related OS functions and can force the game to run at a certain framerate consistently without vsync. While consistency sounds like something good, we're not just modifying OS functionality to make the game TASable and then replaying the movie without the OS hacks; we're hacking OS functions to affect the way the game works. Even though some time functions report proper time, allowing the game to be aware of real-world time and run at a consistent speed gameplay-wise regardless, other time functions are replaced and report false time altogether. The suggested solution is limiting all this to vsync and to internal options for game speed, since neither involves hacking OS functions and both let the games run in the frameworks they target. I can add that this unifies the competition conditions too.
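To make the mechanism being discussed concrete, here is a minimal sketch of how a tool can interpose OS time functions on Linux via LD_PRELOAD. This is illustrative only, not libTAS's actual code; the exported hook name and the fixed 60fps step are assumptions:

```c
/* dtime.c -- toy LD_PRELOAD time interposer (illustrative, not libTAS code).
 * Build: gcc -shared -fPIC dtime.c -o dtime.so
 * Run:   LD_PRELOAD=./dtime.so ./game
 */
#include <time.h>
#include <stdint.h>

static uint64_t frames = 0;                                /* one per frame */
static const uint64_t NSEC_PER_FRAME = 1000000000ull / 60; /* assume 60 fps */

/* A real tool would call this from its hook around SwapBuffers/Present. */
void dtime_advance_frame(void) { frames++; }

/* Our definition shadows libc's, so the game sees only virtual time. */
int clock_gettime(clockid_t id, struct timespec *ts) {
    (void)id;                          /* every clock is virtual in this toy */
    uint64_t ns = frames * NSEC_PER_FRAME;
    ts->tv_sec  = (time_t)(ns / 1000000000ull);
    ts->tv_nsec = (long)(ns % 1000000000ull);
    return 0;      /* time now advances exactly 1/60 s per frame, no jitter */
}
```

Under such a preload, the game's reads of the clock advance in lockstep with rendered frames, which is exactly the consistency (and the "false time") described above.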
creaothceann
He/Him
Editor
Joined: 4/7/2005
Posts: 1874
Location: Germany
So, all games running vsynced at 60fps then?
Memory
She/Her
Site Admin, Skilled player (1556)
Joined: 3/20/2014
Posts: 1765
Location: Dumpster
creaothceann wrote:
So, all games running vsynced at 60fps then?
That is not what was said at all.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
creaothceann wrote:
So, all games running vsynced at 60fps then?
I for one wouldn't protest very loudly if this were the rule. Note, however, that some PC games, even some modern ones, have a hard cap of 30fps (usually because they were developed primarily for consoles and ported to PC as an afterthought, and it turned out that the game doesn't really work well at 60fps; in at least one example the physics get all wonky, having been designed for 30fps only). But, rather obviously, in these cases using 30fps is 100% justifiable.
Patashu
He/Him
Joined: 10/2/2005
Posts: 4043
I think this is the correct interpretation: I can TAS a PC game at any monitor refresh rate that the game can be emulated as vsynced to, provided a monitor with that refresh rate exists (60, 120, 144, etc.)? This would mean the 1000 FPS Towerfall Ascension TAS is rejected as invalid, even if an existing computer could run it that fast, unless a monitor that refreshes that fast can be proven to exist. Conversely, the 120 FPS Axiom Verge TAS (not submitted yet) is valid, since 120 FPS monitors do exist.
Player (26)
Joined: 8/29/2011
Posts: 1206
Location: Amsterdam
The reason that 1990-era DOS games do not run on modern PCs is absolutely not that these PCs are simply too fast. Rather, the reason is that system architecture has completely changed in the intervening years. For instance, 1990-era PCs use 16-bit architecture, whereas contemporary PCs use 64-bit architecture. It's not uncommon among console players to think that "computers" are all the same thing, but that is a misconception, and we really shouldn't be basing rules on it.
keylie
He/Him
Editor, Emulator Coder, Expert player (2841)
Joined: 3/17/2013
Posts: 392
19:01:26 <Nach> the only reason why games generally even allow turning off vsync is so you can play them on older hardware, which isn't capable of hitting 60 FPS or whatever, or for cases where you have some exotic hardware which doesn't report vsync properly, although in that latter case, there's no telling without vsync will work either
I don't know if we can generalize game dev intents, but this is definitely not the only reason why players disable vsync. Many players disable vsync to get smoother rendering. Also, in fighting games, players disable vsync to reduce input lag.
feos wrote:
While consistency sounds like something good, we're not just modifying OS functionality to make the game TASable and then replaying the movie without the OS hacks; we're hacking OS functions to affect the way the game works. Even though some time functions report proper time, allowing the game to be aware of real-world time and run at a consistent speed gameplay-wise regardless, other time functions are replaced and report false time altogether.
I'm confused about this part. Yes, we control the time that is given to the game, but why do you use the term "hacking"? Also, this is not specific to having vsync disabled. Running the game with vsync on does not guarantee that the game will run at a constant framerate; it only enforces a maximum framerate. The game can still have slowdowns, so there, too, we report "false time", as you put it. Edit: If vsync is on, the game still relies on OS functions; it doesn't use the monitor refresh rate for its physics. Edit:
18:23:27 <Nach> shoving a PC game onto a PC with specs 20 years later, and running it natively that way is crazy
We mentioned games that run on PCs from the same period at 1000+ fps with vsync off.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Radiant wrote:
The reason that 1990-era DOS games do not run on modern PCs is absolutely not that these PCs are simply too fast. Rather, the reason is that system architecture has completely changed in the intervening years. For instance, 1990-era PCs use 16-bit architecture, whereas contemporary PCs use 64-bit architecture.
That's incorrect. Intel's x86 architecture, and now the x86_64 architecture, retains backwards compatibility with its historic 32-bit and even 16-bit counterparts. You can still switch the CPU to the old 16-bit mode, and it will work exactly like a pre-80386 16-bit Intel CPU. In fact, when you start up your PC, the CPU actually starts in 16-bit mode, for legacy backwards-compatibility reasons. (Or at least the x86 CPUs did, i.e. up until the Pentium 4 or so. I haven't checked whether the x86_64 CPUs also start up in 16-bit mode, but I think they do as well.) The very first thing the bootloader of your modern OS does, i.e. the first thing the CPU starts executing from your disk, is switch the CPU from 16-bit mode to 64-bit mode (I think it has to switch to 32-bit mode in between, but I'm not sure). In theory a modern CPU is completely capable of booting directly into a 16-bit MS-DOS boot disk. Heck, even GPUs still support the old text mode and the VGA modes from the early 80's.

(In practice MS-DOS might work very poorly, if at all, on a modern PC. Not because of the CPU, but because it doesn't support the vast majority of the remaining hardware. Forget about anything USB, and it might have trouble with modern mass-storage media and a bunch of other things. Although you might be able, at least in theory, to boot from an MS-DOS floppy disk. Assuming your PC still has a floppy drive.)

All of which means that, in theory, a modern PC can still run old 16-bit MS-DOS programs natively, without any sort of emulation. The problem with many DOS games from the 80's is that they used things like busy-loops for timing. Famously (or infamously), many games from the 80's (perhaps even the very early 90's) would measure the speed of the PC when they started and adjust their busy-loops accordingly, but once PCs got too fast, the counter they used for this measurement would roll over, and thus their timing would be way off, or they could even crash. (Busy-loops were a sadly common way of adjusting the speed of games back then; there was poor support for any "sleep" functionality in the OS or the BIOS. In fact, MS-DOS used busy-loops itself. It had no support for idling the CPU, like all modern OSes do, which means that if you were to run MS-DOS on a modern PC, it would keep one core at 100% utilization all the time, assuming it ran at all.)
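To make that failure mode concrete, here is a toy model of the calibration pattern described above (illustrative C; not code from any particular game). The 18.2Hz tick rate matches the BIOS timer, while the loop rates are invented for the example:

```c
#include <stdio.h>
#include <stdint.h>

/* Toy model of the DOS-era calibration bug: at startup the game counts how
 * many busy-loop iterations fit into one ~18.2 Hz BIOS timer tick, stores
 * the count in a 16-bit variable, and later spins that many iterations per
 * frame as its "sleep". */

static uint16_t calibrate(uint32_t loops_per_second) {
    uint32_t true_count = loops_per_second / 18;  /* iterations in one tick */
    return (uint16_t)true_count;                  /* silently wraps mod 65536 */
}

int main(void) {
    /* A late-80's PC: the count fits comfortably in 16 bits. */
    printf("slow PC: %5u loops per tick\n", calibrate(300000));    /* 16666 */

    /* A PC ~300x faster: the true count (5242980) wraps to 100, so every
     * per-frame delay becomes tens of thousands of times too short and
     * the game runs absurdly fast, misbehaves, or crashes. */
    printf("fast PC: %5u loops per tick\n", calibrate(94373640));  /*   100 */
    return 0;
}
```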
It's not uncommon among console players to think that "computers" are all the same thing, but that is a misconception, and we really shouldn't be basing rules on it.
Intel CPUs have retained backwards compatibility with their legacy counterparts to this day, so it's not that far-fetched a concept. As said, in theory even a modern Intel CPU can run 16-bit executables natively, in 16-bit mode.
Joined: 9/6/2009
Posts: 24
Location: Renton, WA
feos wrote:
We're hacking OS functions to affect the way the game works. Even though some time functions report proper time, allowing the game to be aware of real-world time and run at a consistent speed gameplay-wise regardless, other time functions are replaced and report false time altogether.
I'm not sure what you're talking about here; libTAS reports "false time" from all time functions (or at least that's the goal), and that's an important part of making the game run reproducibly across different computers.

Also, the whole vsync issue feels like a red herring; from the point of view of a game running under libTAS, successive drawn frames are always exactly 1/fps seconds apart, so as far as the game can tell, vsync is always on (whether it asked for vsync or not). Also, games that support windowed mode need to be able to work with a wide range of frame rates, even with vsync on. (Technically, even games that only support exclusive fullscreen mode should work with a range of refresh rates, because it's not guaranteed that you can change the refresh rate to your desired value. I wouldn't be surprised if some games were buggy and assumed that refresh-rate changes always worked, though.)

Finally, a comment on the specific case of Towerfall Ascension. It's my understanding that if you run the game on a high-end computer from 2013 (when the game came out), go into the game's video settings menu and turn off vsync, it runs at over 1000Hz. Then if a superhuman player played the game on that computer (with a 1000Hz gaming keyboard), they could approximately reproduce keylie's run; to exactly reproduce the run, they would have to slow the game down to exactly 1000Hz. If all this is correct, then the effect of the 1000Hz frame rate in keylie's run is to slow the game down compared to our hypothetical superhuman player using real hardware. In that case, I think that frame rate should clearly be allowed. And so (going back to the question of an overall frame-rate limit that is the subject of this thread), whatever general rule TASVideos adopts should be such that keylie's run is allowed.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Running games at ridiculously high refresh rates bothers me less than running them at ridiculously low ones, because at least a high refresh rate doesn't detract from the enjoyability of watching the run. However, one has to wonder whether there nevertheless ought to be a sensible upper limit. If games are being run in an emulated environment, then there's no theoretical upper limit to the refresh rate (even if the host machine can't run the game in real time at that refresh rate, that doesn't matter; to the game it looks like it is running that fast, even if in actuality it's being emulated at a much lower speed). So what stops a game from being run at a million Hz? Or a billion Hz? At some point we reach a stage where the run simply cannot physically be reproduced on real hardware, not even in theory. TASes ought to be at least theoretically reproducible on the original hardware.

A related question: even though 1000Hz keyboards exist in real life, I still have to wonder whether it's physically possible to press keys, and have them register properly, at that frequency. For example, can you alternate between two keys at that frequency and have it work? It just feels a bit wonky if a TAS is theoretically reproducible... but doing so would require hardware that simply doesn't exist.
Joined: 9/6/2009
Posts: 24
Location: Renton, WA
Warp wrote:
However, one has to wonder whether there nevertheless ought to be a sensible upper limit. If games are being run in an emulated environment, then there's no theoretical upper limit to the refresh rate (even if the host machine can't run the game in real time at that refresh rate, that doesn't matter; to the game it looks like it is running that fast, even if in actuality it's being emulated at a much lower speed). So what stops a game from being run at a million Hz? Or a billion Hz?
I agree with the idea of an upper limit in theory, but I don't see how to define one in practice. For a game that explicitly supports disabling vsync, I think it should be allowed to run at least as fast as it would on a high-end computer from when the game was released; but defining that as the upper limit is impossible to enforce. (Would we require the TASer to go buy such a computer to run tests? And then the judge also has to buy one, to double-check what the TASer says?)

The problem is that libTAS is not an emulator. Our emulators have a model for how many clock cycles it takes to execute a chunk of code, how many clock cycles are available per frame, and how long various peripherals (like the GPU) take to perform various operations. Even if these models are not completely accurate (for example, I suspect some of our emulators model GPUs as infinitely fast), they are reproducible and thus a suitable basis for rules. In fact, with the sole exception (I think) of DOS emulation with JPC-RR, the emulators don't let you set things like clock rate, so there isn't even a need for a formal rule beyond the list of which emulators are acceptable. libTAS doesn't have that. It could measure how fast the game loop runs on the TASer's computer, but not in a way that's reproducible even from one run of the game to the next on the same computer, and certainly not in a way that's reproducible across different computer models and comparable to a hypothetical reference computer from a few years ago.

To boil it down, I feel like there are a few desirable principles the rule should follow:
1) Games should not be allowed to run significantly faster than they would on reasonable contemporary hardware.
2) Games should be allowed to run as fast as they would on reasonable contemporary hardware.
3) Rules should be clear and enforceable, with a predictable effect.
And I just don't see any way to satisfy all three principles. I personally prefer principle 2 to principle 1, but I'm glad I'm not responsible for making the final decision :)
Site Admin, Skilled player (1254)
Joined: 4/17/2010
Posts: 11475
Location: Lake Char­gogg­a­gogg­man­chaugg­a­gogg­chau­bun­a­gung­a­maugg
cwitty, do you happen to know how exactly libTAS is keeping this consistent no-vsync framerate? I asked here and got no response. Also, can any regular hardware or software simulate vsync by enforcing consistent framerate when the game isn't configured to keep it consistent? For example, I see an option to force vsync for 3D apps in the videocard settings, but it doesn't seem to be all that flexible in which framerates it allows.
cwitty wrote:
To boil it down, I feel like there are a few desirable principles the rule should follow:
1) Games should not be allowed to run significantly faster than they would on reasonable contemporary hardware.
2) Games should be allowed to run as fast as they would on reasonable contemporary hardware.
That's the whole point: contemporary hardware is relative! It's impossible to draw a clear borderline past which we consider hardware no longer contemporary to a given game. So there's a window of about 3-6 years around the game's release date, and whatever was common then was most likely what the devs targeted. Yet again, this is unmeasurable and uncertain, impossible to know or define.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
feos wrote:
Also, can any regular hardware or software simulate vsync by enforcing consistent framerate when the game isn't configured to keep it consistent? For example, I see an option to force vsync for 3D apps in the videocard settings, but it doesn't seem to be all that flexible in which framerates it allows.
It is my understanding that the display driver / graphics card can "force" a game to vsync even if the game isn't doing so itself, at least if the game is using DirectX, OpenGL or Vulkan. Essentially, if I understand correctly, the game tells the API to render something, and the display driver deliberately waits for vsync before displaying the end result and informing the game "ok, done, you can continue". (I'm assuming that, depending on the game and what kind of optimizations it has, this might not be the optimal way of running it, because a game that isn't aware of being "vsynced" can't do something else while waiting for vsync. But this is just speculation on my part.) I suppose that, in a sense, if the graphics card could otherwise render faster than the monitor can display, the driver is artificially "slowing down" the game to match the monitor's vertical sync; the game just idles while waiting.

This is actually how G-Sync and FreeSync work. If I understand correctly, to use either one the game ought to be configured not to vsync, and the display driver then decides what the rendering pace of the game will be. (In normal situations the driver tells the G-Sync/FreeSync-capable display to show the picture immediately, unless we have reached the maximum refresh rate the display is capable of, in which case the driver deliberately waits for the display to become available again, doing essentially a vsync that's external to the game, as above.)
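For reference on the game's side of this: in a typical OpenGL game, vsync is merely a per-context request that the driver (or an external tool, as described above) is free to override either way. A minimal sketch using SDL2; the SDL calls are real, while the skeletal loop is illustrative:

```c
#include <SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;
    SDL_Window *win = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* Request vsync: 1 = sync swaps to the display refresh, 0 = uncapped.
     * This is only a request; the driver control panel (or an external
     * limiter) may force it either way, exactly as discussed above. */
    if (SDL_GL_SetSwapInterval(1) != 0) {
        SDL_Log("vsync not honored; frame rate will be uncapped");
    }

    /* Skeletal main loop: with vsync honored, SwapWindow blocks until the
     * next retrace, which is what paces the whole game. */
    for (int frame = 0; frame < 3; frame++) {
        /* ... update and render ... */
        SDL_GL_SwapWindow(win);
    }

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```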
keylie
He/Him
Editor, Emulator Coder, Expert player (2841)
Joined: 3/17/2013
Posts: 392
feos wrote:
Also, can any regular hardware or software simulate vsync by enforcing consistent framerate when the game isn't configured to keep it consistent?
There is software like RivaTuner to limit the framerate of a game.
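Conceptually, such limiters wrap the game's present call and sleep until the next fixed frame deadline. A minimal sketch of the idea (illustrative C; not RivaTuner's actual implementation, and the 60fps cap is an assumed value):

```c
#define _POSIX_C_SOURCE 200809L
#include <time.h>

static const long NSEC_PER_FRAME = 1000000000L / 60;   /* cap at 60 fps */
static struct timespec deadline;

/* Call once at startup. */
void limiter_init(void) {
    clock_gettime(CLOCK_MONOTONIC, &deadline);
}

/* Call from a wrapper around the game's SwapBuffers/Present. */
void limiter_wait(void) {
    deadline.tv_nsec += NSEC_PER_FRAME;
    if (deadline.tv_nsec >= 1000000000L) {
        deadline.tv_sec  += 1;
        deadline.tv_nsec -= 1000000000L;
    }
    /* Sleep until the absolute deadline. The game itself is unchanged;
     * from its point of view, frames simply leave at most 60 per second. */
    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &deadline, NULL);
}
```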
Joined: 9/6/2009
Posts: 24
Location: Renton, WA
feos wrote:
cwitty, do you happen to know how exactly libTAS is keeping this consistent no-vsync framerate?
At least in the basic cases, sure. libTAS keeps a current "libTAS time", increments that time whenever the game displays a frame, and reports that time whenever the game asks what time it is. So if the game displays frames with glXSwapBuffers() and asks for the time with gettimeofday(), it might make these calls with the following results (assuming libTAS time starts at 0 and the frame rate is 50Hz, to make the numbers prettier):

gettimeofday() -> 0.00
glXSwapBuffers()
gettimeofday() -> 0.02
glXSwapBuffers()
gettimeofday() -> 0.04

This is the same sequence of function calls and results the game would see with vsync on (at 50Hz), or with vsync off and an external tool limiting the framerate to 50Hz, or with vsync off on a computer that just happens to be exactly fast enough that the game loop runs at 50Hz.
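That trace can be reproduced with a toy model (the fake_* helper names are hypothetical; the real interception happens inside libTAS, not in the game):

```c
#include <stdio.h>

/* Toy model of "libTAS time": a virtual clock that advances only when a
 * frame is presented. Prints the exact trace from the post above (50 Hz). */

static double libtas_time = 0.0;
static const double FRAME = 1.0 / 50.0;          /* 50 Hz => 0.02 s per frame */

static double fake_gettimeofday(void)  { return libtas_time; }
static void   fake_glXSwapBuffers(void) { libtas_time += FRAME; }

int main(void) {
    printf("gettimeofday() -> %.2f\n", fake_gettimeofday());
    fake_glXSwapBuffers();
    printf("gettimeofday() -> %.2f\n", fake_gettimeofday());
    fake_glXSwapBuffers();
    printf("gettimeofday() -> %.2f\n", fake_gettimeofday());
    return 0;   /* prints 0.00, 0.02, 0.04 -- the sequence described above */
}
```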
creaothceann
He/Him
Editor
Joined: 4/7/2005
Posts: 1874
Location: Germany
Warp wrote:
This is actually how G-Sync and FreeSync work. If I understand correctly, to use either one the game ought to be configured not to vsync, and the display driver then decides what the rendering pace of the game will be. (In normal situations the driver tells the G-Sync/FreeSync-capable display to show the picture immediately, unless we have reached the maximum refresh rate the display is capable of, in which case the driver deliberately waits for the display to become available again, doing essentially a vsync that's external to the game, as above.)
Afaik with G-Sync, when the game sends a frame, the driver and the graphics card immediately pass it along to the monitor. Then all three (driver, graphics card and monitor) go idle again. The only limiting factor is the hardware capabilities of the graphics card, the monitor, and the PC.
keylie wrote:
feos wrote:
Also, can any regular hardware or software simulate vsync by enforcing consistent framerate when the game isn't configured to keep it consistent?
There is software like RivaTuner to limit the framerate of a game.
There's also NVIDIA Inspector.
Site Admin, Skilled player (1254)
Joined: 4/17/2010
Posts: 11475
Location: Lake Char­gogg­a­gogg­man­chaugg­a­gogg­chau­bun­a­gung­a­maugg
I've been gathering info on the details, and here's the actual situation we're facing.

Authenticity of the game

We have always enforced this. We want the game image to be good, and we disallow hacking it to make the game easier (or to otherwise insignificantly change it). We disallow cheat codes. We disallow games that are poorly emulated, as well as in-game tricks and glitches resulting from poor emulation. When a game is not available at all in its original form, we allow the modified version that persisted, as long as it doesn't glitch out over unintended console settings (like region). If a game can't be TASed for various reasons, we may allow tweaking some of its files, as long as that doesn't affect gameplay.

Authenticity of the environment

We also require our emulators to stay true to the original consoles they emulate. This feels implied, because consoles have locked, known-once-and-for-all specs, with exceptions for cases where some quirk was undocumented or documented wrongly. We don't allow emulator settings that don't correspond to something the original consoles had. It's important to note that in cases where a console behaves non-deterministically, we don't require emulators to inherit this aspect; quite the opposite: we enforce determinism even if it didn't exist on the actual console! With that one exception, we don't allow emulating a hacked console, in either the hardware or the software it officially shipped with.

Authenticity of their communication

We don't allow NTSC games for CRT-TV-based consoles to be run on PAL consoles. We don't allow using a BIOS from the wrong region. The type of console the game version was released for is preserved while TASing. Exceptions are cases where a different console version results in identical gameplay or seemingly improves it, as GBC or SGB do for GB games.

Sometimes authenticity is not even remotely possible

This is about PC specs. When an architecture is open, anyone can release hardware and software for a machine, and anyone can build a machine that fits their taste and budget, it's impossible to demand authenticity, because there is no spherical original machine in a vacuum; everything is infinite variations. Consoles have locked and sealed specs that may vary a bit, but otherwise it's possible to take a game, run it on the original console, and compare that to how it's emulated. It's impossible to take a PC and check how well a PC game in general is being emulated, because there's no such thing as the PC; you can only emulate one of the countless combinations of components. Then, you may manually overclock or throttle your hardware as you like and can, to force games to do what you want. On top of that, some operating systems allow modifying games (even DOS could do that easily), and some games explicitly react to file modifications by enabling or disabling features, or ignore the modifications completely, loading levels in the wrong order or blindly using utterly irrelevant data as if it were legit.

So does that mean that for PC games we should allow anything?

Of course not. Probably by chance, but it so happened that over the years we've developed our rules to allow entertaining arbitrary decisions on one hand, and strictly defined, purely speed-oriented, boring content on the other. When it comes to something inherently arbitrary, we do our best to disallow it from fastest completion and full completion, i.e. from the Vault tier (because those categories are supposed to be strict and obviously legitimate to the vast majority of people). When it's hugely entertaining, we may allow non-standard approaches that look odd or questionable, but we still try to forbid blatant misuse. For example, we allow in-game glitches that actually exist in the original software, but we don't allow injecting unintended game images before or during play. We don't allow pre-setting startup memory to something not known to be possible on authentic hardware. We don't allow Game Genie, which goes between the game and the console, intercepting the game code. We don't allow an unintended console region.

Play can only feel superhuman if we enforce all the external limitations the original human player had, and overcome our internal human limitations through planning in advance, perfect reaction and absolute precision. Breaking integrity and authenticity breaks the challenge we've been having with the game, as well as the challenge we're having while TASing. It would also defeat the purpose of watching a TAS: who cares if I can kick my console and jump straight to the ending? But we do care how perfectly the game software can be played. We limit ourselves to working with the software in isolation; the hardware the software runs on we take as-is. All the conditions are authentic; only the human facing them is no longer a mere human.

So what do we do with PC games, given unlimited spec variations?

Just like with picking the intended console region or the intended BIOS, we can pick intended conditions the games were supposed to be played in. The easiest way to find them out is reading the official docs on recommended PC specs. If that's not enough or not available, we can use modes the games explicitly supply, for example several pre-defined speed modes. If even that is not enough, we may look at some less public resources like game code and deduce intended settings from that. Sticking to intended specs is required, because some games may completely glitch out on things they were never designed for, exactly like some NTSC games glitch out in PAL mode. Some games may skip levels on an unsupported CPU, others allow time-saving glitches if you overclock the CPU, and no one plays 3D games on GPUs that aren't meant to support them and that break the gameplay. Breaking authenticity just because it allows glitches is very, very shaky ground.

Come on already, should we ban that or not?

No one is convinced such stuff should be outright banned. After all, maybe someone can make an entertaining TAS with noclip. But the arbitrary, broken nature of an unsupported environment doesn't really sound like something that should be allowed for Vault. Also, when someone makes a TAS of Doom with an overclocked CPU resulting in just 1 minute of very entertaining gameplay, such a setting would compromise the legitimacy of the movie to some people, just as there are people who dislike major skip glitches, or glitches in general. So if we limit such movies to Moons, we will need to add some labels and classes to them that would prevent any confusion, just like we always mark movies that start from a save file and play new game+.

The range of possible scenarios such classes and labels would need to reflect may be huge, so just throw out crazy ideas, brainstorm if you feel like it, and we'll participate and review the suggestions. And of course, tell me what you think about this post.
Alyosha
He/Him
Editor, Emulator Coder, Expert player (3822)
Joined: 11/30/2014
Posts: 2832
Location: US
feos wrote:
I've been gathering info on the details, and here's the actual situation we're facing.

Authenticity of the environment

We also require our emulators to stay true to the original consoles they emulate. This feels implied, because consoles have locked, known-once-and-for-all specs, with exceptions for cases where some quirk was undocumented or documented wrongly. We don't allow emulator settings that don't correspond to something the original consoles had. It's important to note that in cases where a console behaves non-deterministically, we don't require emulators to inherit this aspect; quite the opposite: we enforce determinism even if it didn't exist on the actual console! With that one exception, we don't allow emulating a hacked console, in either the hardware or the software it officially shipped with.
I think part of the problem here is that we're dealing with different ideas of 'authenticity'. We don't really require emulators to be true to the hardware; 'passably plausible' would be a more accurate description of what we require. This usually comes down to metrics that are easy to identify and understand. We don't overclock a NES CPU, for example; that's easy to understand and easy to check. Beyond that, if it looks and sounds right, it's given a pass. I haven't ever heard anyone say 'Hey, that emulator is doing DMA instantaneously, that's not authentic!' I don't think this gives much of a platform to build 'authenticity' on for PC games, where the dynamic nature of the hardware is just part of the landscape. My two cents would be to leave it up to the TASer and just make a judgement call as cases come up, probably based in large part on audience enjoyment.
Skilled player (1672)
Joined: 7/1/2013
Posts: 448
feos wrote:
Sticking to intended specs is required, because some games may completely glitch out on things they were never designed for, exactly like some NTSC games glitch out in PAL mode.
Oof...RoboSlop.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
feos wrote:
Just like with picking the intended console region or the intended BIOS, we can pick intended conditions the games were supposed to be played in. The easiest way to find them out is reading the official docs on recommended PC specs. If that's not enough or not available, we can use modes the games explicitly supply, for example several pre-defined speed modes. If even that is not enough, we may look at some less public resources like game code and deduce intended settings from that.
If for some reason the game did not come with recommended specs (as might be the case with many older games from the early 90's, and perhaps even some from later), one approach would be to research what the most typical average gaming PC at the time of publication was, and take that as a baseline. Of course even in this case there can be a lot of ambiguity (especially given that in the late 90's and early 2000's there were some 4 or 5 major GPU manufacturers and at least 3 or 4 major CPU manufacturers, before only the current two in each industry were left), but some kind of baseline could probably still be reached. Of course there can still be other major problematic situations, even when using "recommended specs" or the equivalent. A game from the late 90's could, for example, work properly with a 3dfx Voodoo card but bug out with an ATI FireGL card. Should the run be allowed to emulate the latter to abuse glitches that don't happen with the former?
Site Admin, Skilled player (1254)
Joined: 4/17/2010
Posts: 11475
Location: Lake Char­gogg­a­gogg­man­chaugg­a­gogg­chau­bun­a­gung­a­maugg
Alyosha wrote:
I think part of the problem here is that we're dealing with different ideas of 'authenticity'. We don't really require emulators to be true to the hardware; 'passably plausible' would be a more accurate description of what we require. This usually comes down to metrics that are easy to identify and understand. We don't overclock a NES CPU, for example; that's easy to understand and easy to check. Beyond that, if it looks and sounds right, it's given a pass. I haven't ever heard anyone say 'Hey, that emulator is doing DMA instantaneously, that's not authentic!'
What you're describing was the situation in the past, mostly before major skip glitches became the norm. Lately, with insanely precise memory-corruption techniques that people are willing to verify on console, requesting emulator improvements accordingly, we do aim for accuracy in our emulators. While in this thread I was trying to find cons to obsoleting older optimal movies with newer, equally optimal but more accurate ones, we still agreed to highlight such console-verified resyncs and host them alongside the original movie files. And while accuracy can't be objectively measured, we depend on it being as high as possible; this is reality already.
Alyosha wrote:
I don't think this gives much of a platform to build 'authenticity' on for PC games, where the dynamic nature of the hardware is just part of the landscape. My two cents would be to leave it up to the TASer and just make a judgement call as cases come up, probably based in large part on audience enjoyment.
Look at a similar case, where the inherently arbitrary nature of external data was decided to be disallowed from Vault.
Warp wrote:
If for some reason the game did not come with recommended specs (as might be the case with many older games from the early 90's, and perhaps even some from later), one approach would be to research what the most typical average gaming PC at the time of publication was, and take that as a baseline. Of course even in this case there can be a lot of ambiguity (especially given that in the late 90's and early 2000's there were some 4 or 5 major GPU manufacturers and at least 3 or 4 major CPU manufacturers, before only the current two in each industry were left), but some kind of baseline could probably still be reached. Of course there can still be other major problematic situations, even when using "recommended specs" or the equivalent. A game from the late 90's could, for example, work properly with a 3dfx Voodoo card but bug out with an ATI FireGL card. Should the run be allowed to emulate the latter to abuse glitches that don't happen with the former?
The main decision we should make here is whether we agree that arbitrary PC architectures and setups should be limited to Moons or not.