The submission text below contains lots of spoilers that may take away from the experience.
I recommend watching it first and then coming back here to find out what the hell you just watched.
I've always wanted to explore arbitrary code execution (ACE),
but it's not easy to come up with something meaningful to do with it.
Since you have the potential to do literally anything, you are held to a high standard:
you don't want people to sit through the setup
for what feels like an unsatisfying effect, even if the run is technically excellent.
I was working on this run for multiple months already and started to lose interest in the concept,
when AGDQ 2017 came around, and I saw the amazing TASBot block
which showed a similar concept, but with a different focus, more suitable for a live event than a submission.
After a brief defeated I-was-beaten-to-the-punch feeling, I realized it was now or never: get this done.
So thanks to everyone involved in that TASBot block, without it I may have dragged this out for many more months.
The aim of this run is to show more aspects of what you can do with ACE.
I feel it is often misunderstood as "you can skip to the end" or "you can cause crazy effects in the game";
the concept of being able to do literally anything is hard to grasp, perpetuated by the fact that in most
applications what you want to do is fairly limited by the goal you have set within the game.
This is also an exploration of the limits of the Gameboy hardware.
I try to do things that have not been done before on a Gameboy, things that it was clearly not designed
for, and things that seem impossible at first glance.
I'm not only exploiting the game, but the very hardware it runs on.
This run began with a simple idea: to play another game within a game, using ACE exploits.
This has been done before with toy examples,
but I was aiming for a full-fledged existing game, in a way that is indistinguishable from the real thing.
Obviously, it should be two games from the same system, you don't want to write emulators in ACE,
plus running a powerful system (e.g. Wii) to emulate a much less powerful one (e.g. GB/NES) seems cheap.
Specifically, I looked at running Gen II Pokémon games inside a Gen I game.
The reasoning was that I already had lots of experience with the Gameboy system and the games in particular from my previous runs (or so I thought).
Gen I has easy and quick ACE setups, so you can get to the meat quickly,
instead of wasting the majority of the time just for the setup.
I quickly realized that Yellow would be my base game of choice,
because it is the only one with Gameboy Color support, which is essential for Gen II.
But there were more problems.
As a bit of background, GB game cartridges are not just ROM storage for the game, they also have their
own controller on them and additional hardware pieces that vary by game.
For example most games have writable storage built in to hold saved games or high scores.
Gen II Pokémon games also have a battery-backed real-time clock in their cartridge,
which is used to track real time in game.
Gen I games don't have that (they have an entirely different controller in fact),
which is a serious problem, meaning that Gen II code can't just be run on Gen I cartridges and work,
even if we had a way to get the code on there somehow.
But this is going to be a predefined input file anyway, so I don't need to run all of the actual
code, I just need to run equivalent code that produces the same audio-visual effect as the original.
At first I thought about streamlining the original code by cutting unneeded code paths and priming
it so that it produced the predetermined results I wanted (basically pre-computing the emulation and
only running the resulting instructions), but I came up with an even more radical idea: I realized that
all the instructions that really mattered are those that put tiles on the screen or played some sounds.
So all I need to do is emulate the actual audio-visual output of the game with the right timing,
without any internal game state.
This realization was the key to this run, as it opened many more possibilities: The source of the A/V
doesn't need to be another game. It could be a hack of a game. Or the mash-up of different games.
Or from a different system. Or literally arbitrary A/V.
This made the run become not about running a game in another game, but more about pushing the limits of the Gameboy hardware and seeing what is possible.
The base game is not all that important, since the main point of this run is to showcase ACE.
I chose Pokémon Yellow, because it has a very fast ACE setup and it has GBC capabilities,
but any other game with an ACE exploit would work just fine.
This run uses lsnes, because unlike the other preferred emulator BizHawk, it supports sub-frame inputs.
Games can poll the joypad inputs at arbitrary times, as frequently as they like.
However, most emulators arbitrarily limit your input capabilities to one input per frame,
meaning that every time the game polls the inputs in that frame, this same one input is used.
This is often seen as "good enough", since most game loops run only once per frame as well.
It's actually not good enough, since the concept of what a "frame" is is arbitrary too,
and the game loop frames and the input frames often don't align properly,
so you can miss out on inputs you could otherwise make on a real console.
It would need to be at least twice per "frame" to work reliably,
you can kind of see it as an unusual application of the Nyquist–Shannon sampling theorem
(with no actual connection to it),
with the expected maximum frequency of one poll per frame.
An even better solution, which is used in lsnes, is to allow a different input every time the game polls the joypad.
This way you ensure that you can definitely do any sequence of inputs that are possible on an actual console.
It's still kind of awkward though, because it has an arbitrary concept of "frame" baked into the input file,
so while you can define a different input for each poll of the game,
you still need to know which arbitrary frame this input occurs at,
instead of just having a list of inputs, once per poll, as they occur.
When doing many inputs per frame, this becomes a problem, because you need to
pretty much know exactly to the cycle where the input frame boundary will be,
in order to assign them to the correct input frame.
When running your own code, being able to do as many inputs per frame as you want can be exploited
to vastly increase the data throughput when injecting data using the joypad.
Since the joypad is the only source of data you have, this dramatically speeds up the setup times,
and allows for the real-time playback seen in this movie, with thousands of inputs each frame.
ACE setups are usually done in multiple stages, allowing for progressively more control.
You can think of it like the boot loaders of an operating system.
This run uses three bootstrapping stages to load a final payload stage that does everything.
So why not only use one and load the final payload right away?
You are very limited at first in what you can manipulate, and it can take a long time,
so you're often better off only creating a very simple program that is slightly more powerful than the current one,
but can be built quickly, and let it do the rest of the work at a much faster speed.
The first stage is 9 bytes long, and written using item manipulation which costs multiple seconds per byte.
The second stage is 13 bytes long, and written using the first stage at one byte per frame.
After that, the second stage can write many bytes each frame,
effectively making the rest of the setup instantaneous.
The initial ACE setup used in this run is very similar to FractalFusion's Pi movie,
with only minor improvements.
It spells out the same code, but uses menuing improvements to get there a bit faster (e.g. using swapping of 0-stacks as a faster way of throwing away items).
The code effectively allows writing one byte per frame starting at $d350, and after each byte, everything is immediately executed.
See FractalFusion's post
for detailed information on how it works.
Using the first stage, the second stage is written at $d350:
e2 ld [$ff00+c], a // enable joypad read at all
18 07 jr .start // jump over unfinished parts of the code
f2 ld a, [$ff00+c] // read half-byte (nibble) input.
cb 37 swap a // swap upper and lower nibble.
57 ld d, a // store in register d temporarily
f2 ld a, [$ff00+c] // read another nibble
aa xor d // combine the nibbles
22 ld [hli], a // write result in memory
ab xor e // xor with e = $5d
20 f6 jr nz, .loop // loop if result not zero
It heavily uses the existing values in the registers when reaching this point.
Register c is 0, so [$ff00+c] conveniently points to $ff00, where the joypad inputs are read from.
In a Gameboy, the inputs are not all read at once; you can only read half of the inputs at a time,
either the directional keys or the buttons, 4 bits each. The other half of the byte you receive is
static garbage data.
In order to read a full byte of data, the joypad is therefore polled twice,
and the results are combined using xor, which ensures that for each byte you want to produce there is
a combination of two inputs that does it.
The final "xor e" is only used for the exit condition.
Zero is an important byte to be able to write and therefore a bad exit condition,
and xoring with $5d makes it so that $5d is the exit condition instead, which happens to be an expendable value.
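The nibble-combination trick can be sketched in Python. This is a simplified model made up for illustration: the upper nibble returned by a read is treated as a fixed, known value (here arbitrarily $C), and the active-low encoding of the real joypad register is ignored.

```python
GARBAGE = 0xC  # assumed static upper nibble of a joypad read (illustrative)

def read(pressed):
    # model of `ld a, [$ff00+c]`: lower nibble carries the inputs
    return (GARBAGE << 4) | (pressed & 0xF)

def swap(a):
    # model of `swap a`: exchange upper and lower nibble
    return ((a << 4) | (a >> 4)) & 0xFF

def combine(n1, n2):
    # second stage: swap the first read, then xor with the second read
    return swap(read(n1)) ^ read(n2)

def inputs_for(target):
    # brute-force the pair of 4-bit inputs producing `target`
    for n1 in range(16):
        for n2 in range(16):
            if combine(n1, n2) == target:
                return n1, n2
    return None
```

Since the garbage nibble is static, it cancels out predictably, and under this model every byte value from $00 to $FF has exactly one input pair that produces it.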
The main advantage of this stage over the first one is that it is able to run many times each frame,
so it can potentially write more than 1000 bytes each frame, not just 1.
The third stage has no concern for its size anymore since the second stage can write it very quickly,
so it is focused on finishing the setup and putting the right bits into the right places for the payload to run.
cd 96 1e call $1e96 // call GBFadeOutToWhite, fades screen to white
f3 di // disable all interrupts
e2 ld [$ff00+c], a // re-enable joypad reads
e0 40 ld [rLCDC], a // Disable LCD
3c inc a
e0 4d ld [rKEY1], a
10 00 stop // Enable double-speed mode
21 00 c0 ld hl, $c000 // Write payload to $c000, similar to second stage
f2 ld a, [$ff00+c]
cb 37 swap a
57 ld d, a
f2 ld a, [$ff00+c]
aa xor d
22 ld [hli], a
ab xor e
20 f6 jr nz, .loop
c3 00 c0 jp $c000 // Jump to written code
It first calls GBFadeOutToWhite from Yellow's original code, which does a smooth screen transition to white.
This is not at all necessary for the exploit to work, but it helps provide a smooth transition between the game and the ACE-controlled scenes that follow.
After the transition it disables the screen (this is important to be able to access certain memory areas and be able to control the exact frame timing),
and puts the system into double-speed mode.
Double-speed mode is a feature introduced in the GBC that increases the clock speed from 4MHz to 8MHz,
effectively doubling the amount of computation you can do in the same amount of time (there are some caveats).
How Gameboy graphics work
This is only an overview of the relevant parts of how a Gameboy works;
a more in-depth description can be found in the Pan Docs
and the Gameboy CPU Manual,
which were instrumental in figuring all of this out.
All graphics are based on 8x8 pixel tiles with 2bpp depth (i.e. 4 colors).
These tiles can be rendered on the screen in three different ways: Background, Window and Sprites.
The Background is a 32x32 tile grid (actually there are two of them, of which you can choose one to use) that can be smoothly scrolled around, and is often used for background images.
The Window uses the same tile grids as the background, but is not scrollable and is rendered over the background.
It is often used for menus, dialogs, splash screens, etc.
Lastly, the Sprites are either single (8x8) or double (8x16) tiles that can be placed anywhere on the screen and can be semi-transparent.
They are used for anything that moves on the background.
In addition to tiles, there are color palettes, which define which of the 4 colors of a tile corresponds to which RGB color (15bit color depth).
Palettes are not bound to individual tiles, but to the place in the background, window or sprite where they are used,
so a single tile can be used with different palettes in different places.
The Gameboy renders its screen line by line, one at a time.
Each line is largely treated independently from the others.
The screen has 144 lines, with 160 pixels each.
The time spent on each line is constant, exactly 912 cycles each (All listed cycle counts assume double-speed mode, single speed cycles counts are halved).
These 912 cycles are split up into 3 phases, called Modes.
The first phase is Mode 2, in which the LCD controller searches through the sprites to render, and which lasts for 160 cycles.
It is followed by Mode 3, in which the data to render is sent to the LCD controller, and which can take anywhere from ~344 to ~592 cycles, depending on a lot of factors, like the number of sprites on that line.
The rest of the time is spent in Mode 0 (also called HBlank), in which the LCD is inactive.
After all 144 lines are rendered that way, 10 more sets of 912 cycles are spent with the LCD inactive in Mode 1 (also called VBlank).
That makes a total of 912*154 = 140448 cycles spent per frame, resulting in a frame rate of 8388608 Hz/140448 = ~59.72 fps.
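The timing arithmetic above can be double-checked with a few lines (a sketch; all figures are double-speed cycles, as in the text):

```python
CLOCK_HZ = 8_388_608        # GBC clock in double-speed mode
CYCLES_PER_LINE = 912
LINES = 144 + 10            # 144 visible lines plus 10 VBlank lines

cycles_per_frame = CYCLES_PER_LINE * LINES  # 140448
fps = CLOCK_HZ / cycles_per_frame           # ~59.7275
```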
While the LCD controller is accessing data, it is inaccessible for the CPU.
That means that tiles can only be written, and the background and window changed, during Modes 0-2,
and sprites can only be written in Modes 0 and 1.
Gameboy games usually handle this by using the time while the screen is rendered to execute its game logic,
and use the VBlank period to do all the graphics updates preparing for the next frame.
How the playback of Gameboy content is done
The main part of what this run does is provide a framework that allows the playback of arbitrary Gameboy footage in real-time.
To achieve this, it takes several processing steps:
The source footage is played and all relevant writes to memory are logged,
resulting in a log containing at which cycle which value was written to which address in memory.
From this log, you can determine the value of every address at every given cycle throughout the whole execution.
This is used to determine, for each line of each frame, which tiles were rendered on that line in the background, window and sprites.
Having gathered this information for all frames, you can work out which tiles and palettes are needed at which times, and when the background, window and sprite tiles need to be set to which value.
The end result is a collection of actions that need to be taken with a range of cycles when they need to happen, which when executed have the same effect as the original footage.
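The second step above, reconstructing the value of any address at any cycle from the write log, can be sketched with a per-address binary search (function names here are illustrative, not from the actual tools):

```python
from bisect import bisect_right
from collections import defaultdict

def index_log(log):
    """log: list of (cycle, addr, value) writes, in execution order."""
    per_addr = defaultdict(list)    # addr -> [(cycle, value), ...]
    for cycle, addr, value in log:
        per_addr[addr].append((cycle, value))
    return per_addr

def value_at(per_addr, addr, cycle):
    """Last value written to addr at or before `cycle` (None if never)."""
    writes = per_addr.get(addr, [])
    i = bisect_right(writes, (cycle, 0x100))  # 0x100 > any byte value
    return writes[i - 1][1] if i else None
```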
This results in a way of rendering a scene to look the same as the original footage,
but is generally more efficient, because it uses several optimizations that the original game doesn't use.
For one, it only renders a tile if it will actually be visible on the screen at some point during the scene,
whereas games often render tiles that happen to end up off screen or covered by other tiles.
Also, most games load and overwrite tiles and palettes in chunks ("tilesets"), even if only some of them end up actually getting used,
whereas the generated scene only loads a tile or palette if it ends up getting rendered, and tries to keep it loaded if it will be needed again later,
so that most tiles and palettes are only loaded exactly once throughout the entire movie, even across different games.
Additionally, tiles can be mirrored, allowing the same tile to be re-used when two tiles only differ by mirroring, so even fewer distinct tiles are needed.
Having full knowledge about the scene beforehand also means that you can load the necessary tiles and palettes spread out at convenient times, even long before they are actually needed.
In order to execute the actions to reproduce the scene, the list of actions needs to be serialized into a sequence of commands
that can be executed one after the other so that each action is executed at the right time.
This is a scheduling problem with lots of constraints, since each action not only has a different range of cycles it needs to be executed in,
but also takes a different amount of time based on the type of action (e.g. loading a tile takes longer than setting a tile on the background grid).
Also, different actions can only be executed at specific times when their memory regions are accessible (i.e. when they are not used by the LCD controller).
The used commands are hand-crafted assembly functions that are loaded as part of the ACE payload, and perform specific tasks (e.g. load a tile into memory),
reading all necessary information (e.g. pixels of the tile, location where it should be stored) from the joypad.
For each command, I know precisely how many cycles it takes, at which cycles it reads joypad inputs,
and at which cycles it writes its output.
This information is crucial for scheduling the commands properly:
at every moment you need to know exactly where in the rendering of the frame the Gameboy is,
to avoid the times when the required memory is inaccessible.
The whole execution is planned precisely down to the CPU cycle.
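The core scheduling constraint can be illustrated with a toy earliest-deadline-first scheduler. The real scheduler handles far more (memory accessibility windows, per-command input timings), so this is only a sketch:

```python
def schedule(actions):
    """actions: list of (earliest, latest_start, duration) in cycles.
    Returns [(start, action), ...] or None if a deadline is missed.
    Greedy: always run the pending action with the tightest deadline."""
    pending = sorted(actions, key=lambda a: a[1])  # by latest start
    now, out = 0, []
    for earliest, latest, dur in pending:
        start = max(now, earliest)     # wait until the window opens
        if start > latest:
            return None                # deadline missed, infeasible
        out.append((start, (earliest, latest, dur)))
        now = start + dur              # action occupies the CPU until here
    return out
```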
An example command used in this movie, which writes a single byte to HRAM:
WriteHByteDirect:: ; 88 cycles, 4 inputs at cycles (12,28,40,56), output at cycle 64
ld hl, $ff00 ; 12
ld a, [hl] ; 8
swap a ; 8
xor [hl] ; 8
ld c, a ; 4
ld a, [hl] ; 8
swap a ; 8
xor [hl] ; 8
ld [$ff00+c], a ; 8
ret ; 16
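The per-instruction annotations can be cross-checked mechanically against the total in the header comment:

```python
# cycle counts of WriteHByteDirect's instructions, in double-speed cycles
cycles = [12, 8, 8, 8, 4, 8, 8, 8, 8, 16]
total = sum(cycles)  # 88, matching the "; 88 cycles" header
```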
In order to define in which order the individual commands are executed,
one of the commands pushes function pointers of the commands that should be executed in order onto the stack (again, read from the joypad).
It is the first command to be executed after the payload has been loaded in the ACE initialization,
and the last function pointer put onto the command stack is always the function itself, so that after the commands have been executed,
we are ready to write a new command stack and keep going.
Writing the new command stacks is interspersed between the commands that do the actual playback in regular intervals, since the stack has only limited capacity.
Game audio is handled in a similar way to the graphics:
The log contains all memory writes to the sound subsystem, so by writing the same values we can recreate the same sound.
Audio is not bound to any video frame, and its memory is always accessible.
Writes are batched up when they happened in short succession in the original footage, and are replayed at approximately the same time (± some thousand cycles).
In the end they are actions that are sequenced into commands together with the graphics actions.
After the success of playing back GB game content using ACE, where the sound was merely a side aspect,
I wondered how capable the sound hardware is, and what you can do with it.
Sound in a Gameboy turns out to be very limited in its abilities.
It has 4 sound generating channels that can be connected to two output terminals.
The first two channels generate square waves of different frequencies and amplitudes, with limited control over frequency and amplitude over time, and the last channel produces static noise.
Only the third channel is interesting, as it allows arbitrary wave patterns to be played.
However, the RAM that holds the wave pattern only contains 32 samples that are repeated over and over, with only 4 bits per sample (i.e. 16 different possible values).
It was clearly not designed for complex sounds like voice, but rather as an alternative way to create waves with unusual shapes.
You can hear this clearly in the title screen of Pokémon Yellow, with the very crude sound they achieved by overlaying multiple waves: You can hear the words, but it's not pleasant.
However, you can use the third channel to play longer pieces of arbitrary audio,
by managing to update the wave RAM while the sound is playing.
This of course requires perfect precision in when to update them, to ensure each sample is played once and only once.
The sound can only be played at very specific frequencies of 2097152/x Hz, where x is an integer between 1 and 2048.
For this to line up nicely with the Gameboy's frames, only specific values of x work, namely multiples of 57.
All arbitrary sounds in this movie use x=114, which results in exactly 2 samples played every 912 cycles,
so it lines up perfectly with the line timings of the screen, resulting in a sample frequency of ~18396 Hz.
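The audio timing math can be verified quickly (a sketch using the figures from the text):

```python
CLOCK_HZ = 8_388_608        # double-speed clock
CYCLES_PER_LINE = 912

def sample_rate(x):
    # wave channel sample frequency for divider x (1 <= x <= 2048)
    return 2_097_152 / x

x = 114                     # the divider used for all arbitrary sounds
rate = sample_rate(x)                                   # ~18396 Hz
cycles_per_sample = CLOCK_HZ / rate                     # 456 cycles
samples_per_line = CYCLES_PER_LINE / cycles_per_sample  # exactly 2
```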
Still, the problem remains that there are only 4 bits available per sample, not nearly enough to produce acceptable-quality sound.
But there's one more audio control we can abuse: the volume control.
The volume control provides a linear scaling of the audio with 8 discrete levels.
By adjusting the volume for each sample, we can increase the number of distinct amplitudes that can be achieved
from 16 to ~100 (some sample/volume combinations result in the same effective amplitude).
These effectively possible amplitudes are not evenly distributed, though: there are more values available for the small amplitudes than for the large ones (which is actually exactly what you want).
So, what this movie does to produce high quality sounds (for a GB that is),
is writing the wave RAM at exactly 2 samples every 912 cycles to update the samples data, while also rapidly adjusting
the volume control at exactly the right times to tweak the resulting amplitudes.
These processes need to be time shifted by 32 samples,
meaning that the volume control affects the currently played sample, while the newly written sample is only played 32 samples into the future.
This requires a lot of precision and cycle counting, and is performed by a special assembly function that is loaded with the initial payload,
and fed the sound data using the joypad inputs as usual. In the idle times between two audio samples,
it updates the tiles on the screen to render the accompanying text and pictograms,
so it also needs to be synced up with the LCD operations to only write when the memory is accessible.
SpongeBob video sequence
For the ending, I wanted to go all-out, and see how good of an A/V experience you could produce on Gameboy hardware using only the joypad inputs.
Part of it was that I wanted to show off so-called HiColor graphics.
The Gameboy only has space to store 8 palettes that each background tile can choose from, with 4 colors each,
so the maximum number of colors you can use in each frame is usually 32, and each 8x8 tile area can only use 4 of them at a time (plus some extra colors for the sprites, which draw from different palettes, but they're not useful for this purpose).
The so-called HiColor technique allows you to use significantly more colors in an image,
by changing the palettes for each rendered line.
This way, each line could use its own colors, even within the same 8x8 tile.
This technique was not originally intended in the Gameboy's design,
but it was actually used in some commercial Gameboy games.
The problem with it is that you have only a very small time window to update the palettes before the next line is rendered.
It is impossible to update all 8 palettes each line, so most games only update some of them, usually 4, resulting in a total of 2304 possible colors each frame.
However, there are still a lot of limitations (e.g. while you change the colors of the palettes, all tiles still point to the same palette indices, so the configuration of which tile uses which palette is constant for each line of 8x8 tiles),
and it requires a lot of precision to do the palette change at exactly the right time, prohibiting the game from doing much else in the mean time.
Moreover, the whole palette-swapping procedure needs to be repeated each frame, even if the screen content isn't changing at all,
so it is a significant battery drain.
I did some calculations, to find out how much quality I can put into the sequence,
limited by the amount of data I could possibly push through in a given amount of time.
It came down to a balance of frame quality and frame rate:
If I try to refresh the whole 20*18 tile screen every video frame, that's 20*18*16 = 5760 bytes of data,
costing at least 5760*36 = 207360 cycles to read from the joypad (36 cycles is a lower bound, for just loading the byte, not actually doing anything with it).
Additionally, I'd need to load 144*4 palettes for each line of the image to produce the HiColor effect,
costing another 4608 bytes or 165888 cycles to load.
Meanwhile, each Gameboy frame I need to maintain the palette switching to keep the HiColor effect going,
costing around 62784 cycles, meaning there are only 77664 free cycles each Gameboy frame to do something useful
like loading the next video frame.
This would have meant I can only show a new video frame each ~6 Gameboy frames under ideal circumstances,
resulting in ~10fps video, which I deemed not good enough.
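The budget calculation can be reproduced as follows. Note that 36 cycles per byte is only a lower bound covering the joypad read itself, which is why this sketch yields ~4.8 frames while the practical result under real command overhead lands around the ~6 frames quoted above:

```python
CYCLES_PER_FRAME = 140_448   # 912 * 154 double-speed cycles
READ_COST = 36               # lower bound to read one byte via joypad

tile_bytes    = 20 * 18 * 16     # full 20x18-tile screen, 16 bytes/tile
palette_bytes = 144 * 4 * 8      # 4 palettes per line, 8 bytes each
frame_cost    = (tile_bytes + palette_bytes) * READ_COST

maintenance = 62_784             # per-frame HiColor palette switching
free_cycles = CYCLES_PER_FRAME - maintenance
frames_needed = frame_cost / free_cycles  # GB frames per video frame
```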
Instead, I chose to lower the quality a bit to achieve a higher frame rate.
The two compromises I made are to not update all the screen tiles, and to only update 2 of the
palettes each line instead of 4, cutting down on the maintenance costs.
This way I could push the frame rate up to a more acceptable 15fps, updating the video frame every 4 Gameboy frames,
while maintaining a HiColor image with 960 total colors and good quality audio.
Unlike the playback of other Gameboy content before, this is not assembled out of individual pieces,
but instead a single hand-crafted assembly function that coordinates everything, because of just how much
precision is necessary, down to every CPU cycle.
It basically uses double-buffering to show one video frame while building up the next one, switching them
using the two available background tile maps.
For each line that is rendered, it performs multiple operations:
It updates the music samples and volume (as described above),
it writes the next two palettes to update the HiColor image for the next line,
it loads 1/2 of a new tile to memory for the next video frame,
and it loads 3/8 of a palette to memory for the next video frame.
The awkward fractions are necessary in order to be able to squeeze everything into the 912 clock cycles that are available for each line.
The VBlank period is used to load the tile attributes (i.e. the mapping of tile to palette),
and to prepare the rendering of the new frame.
Preparing the source video to be in a format that is suitable to be rendered this way
while still looking acceptable was a challenge in itself.
Even though there are many more colors available in HiColor mode, they are not available where you want them.
Since I update only 2 palettes per line, that means the palette a specific 8x8 tile uses only updates every 4 lines,
so there are still effectively up to 4x8 blocks of pixels which use the same 4-color palette.
And since you only have 8 palettes available at a time for 20 tiles in each line,
some will need to share the same palette.
Determining which palettes are best and for which blocks to use them turns out to be a difficult problem
with many constraints.
I used some known algorithms to determine a good palette for each block (median cut, k-means clustering),
used some simplifying assumptions to distribute the palettes on the blocks,
and applied some dithering to smooth out the resulting image.
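As an illustration of the palette-selection step, here is a minimal median-cut sketch. This is illustrative only; the actual tools combine it with k-means and the per-block constraints described above:

```python
def median_cut(pixels, k):
    """Reduce a list of (r, g, b) pixels to a k-color palette."""
    boxes = [list(pixels)]
    while len(boxes) < k:
        # pick the box with the widest single-channel range
        box = max(boxes, key=lambda b: max(
            max(p[c] for p in b) - min(p[c] for p in b) for c in range(3)))
        # split it at the median of its widest channel
        chan = max(range(3), key=lambda c:
                   max(p[c] for p in box) - min(p[c] for p in box))
        box.sort(key=lambda p: p[chan])
        mid = len(box) // 2
        boxes.remove(box)
        boxes += [box[:mid], box[mid:]]
    # average each box into one palette entry
    return [tuple(sum(p[c] for p in b) // len(b) for c in range(3))
            for b in boxes]
```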
Moreover, the colors you see on the screen and the colors which a Gameboy Color produces are different,
meaning that the same RGB value will produce different results on a computer screen and on a Gameboy screen.
Luckily, a sneak peek into the source code of the emulator shows how it does the conversion,
and all I needed to do was apply the reverse transformation.
One matrix inversion later I got a working color transformation to convert the video colors into GB colors.
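The reverse transformation boils down to inverting a 3x3 channel-mixing matrix. The matrix below is a made-up example, not the emulator's actual coefficients:

```python
from fractions import Fraction

# Hypothetical linear channel mix, out = M @ [r, g, b]
# (NOT the emulator's real coefficients)
M = [[Fraction(13, 16), Fraction(2, 16),  Fraction(1, 16)],
     [Fraction(0),      Fraction(12, 16), Fraction(4, 16)],
     [Fraction(3, 16),  Fraction(2, 16),  Fraction(11, 16)]]

def invert3(m):
    """Invert a 3x3 matrix via its adjugate (exact, using Fractions)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[ (e*i - f*h), -(b*i - c*h),  (b*f - c*e)],
           [-(d*i - f*g),  (a*i - c*g), -(a*f - c*d)],
           [ (d*h - e*g), -(a*h - b*g),  (a*e - b*d)]]
    return [[adj[r][cc] / det for cc in range(3)] for r in range(3)]

def apply(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

M_inv = invert3(M)
```

Applying M_inv to the desired on-screen colors gives the RGB values to feed the Gameboy so that the emulator's conversion lands back on the intended result.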
After the ACE has finished, I give control back to the original game.
This is to demonstrate that even after all of this, the underlying game can still continue.
I chose to put it back into the ending credits sequence, because it plays without further inputs,
and it follows the usual convention of "beating the game", whatever that means at this point.
After the inputs have ended, you can take over with manual inputs to play the game normally,
despite the ACE that happened.
There are 6 different games featured in the scenes played in this movie:
Pokémon Yellow, Pokémon Gold, Pokémon Crystal, Tetris, The Legend of Zelda: Link's Awakening DX, and Super Mario Bros. DX.
Notes about the individual scenes:
These give some insights into the thought process behind each scene.
Scene 1: Pokémon Yellow - In which the protagonist encounters Oak in the Hall of Fame and doesn't find his princess.
This scene is meant as a tribute to the usual credits warp that the viewer might have seen multiple times before and came to expect,
but with a twist at the end using the infamous SMB quote, signaling that this is not an ordinary run,
and foreshadowing the upcoming Gen II game as that next castle.
Scene 2: Pokémon Gold - In which a saved game is loaded that makes no sense.
The intro sequence of Pokémon Gold is played out for quite some time,
to give the viewer a chance to realize what just happened, that this is a different game now,
and ease them into the idea that we're going to switch games in an instant.
The loading of a saved game is the transition to the next scene, but the stats of the save game don't match the next scene at all.
While it would have been easy to make this more believable,
I liked the idea of having small inconsistencies in the narrative for the viewer to discover.
Scene 3: Pokémon Crystal - In which the protagonist catches a shiny Celebi and talks to a kid with a Gameboy.
The location of this scene is chosen to be immediately familiar to most players who ever played a Gen II Pokémon game,
next to the Pension and right before Goldenrod City.
The initial walking around in the overworld, encountering and catching a Pokémon is selling the fact that we're actually playing Pokémon Crystal now.
The inconsistency between the title screen being Gold and the gameplay being Crystal is again something for the viewer to discover.
The caught Pokémon is a shiny Celebi, and the own Pokémon is a shiny Mew, as a reminder that this is no actual gameplay and I still have full control over what's happening,
and to poke fun at how superficial the associated concepts of rarity are.
The Gameboy kid is used as the transition to the next scene, framing it as if this is what the kid plays.
Scene 4: Tetris - In which a kid shows off their superhuman block stacking abilities.
Tetris was chosen because of its universal fame and recognizability, both in picture and sound, and its short length.
The shown footage is my Tetris TAS
I did two years ago.
The victory sequences have been sped up significantly, both as a showcase that this too is possible within this framework,
and as another small inconsistency for the viewer to find.
Scene 5: various - In which the protagonist finds his way home and plays the NES.
This scene jumps from game to game quickly, first Crystal, then Link's Awakening, back to Crystal, and finally Yellow.
The scene ends in the protagonist's house in Yellow, in front of the SNES (renamed to NES), where he started off,
both to close the circle and to use the NES for the next scene transition.
Scene 6: SMB DX - In which the protagonist plays SMB but every time Mario hits a block it gets faster.
SMB 1-1 is again chosen because of its immediate recognizability by most viewers,
and the fact that the GB version looks very similar to the NES version it represents here.
The actual gameplay is just me playing around in 1-1, completing it semi-fast
while not using any pipes to have a continuous scene start to finish.
Including the "every time X it gets faster" meme in this sequence was merely an afterthought.
Scene 7: Pokémon Yellow - In which Prof. Oak turns out to be GLaDOS in disguise.
Keeping the theme of using other games' ending sequences, the Portal credits were a natural choice
to show off the high-quality audio capabilities.
It also fit nicely with the text in the dialog mirroring the console text in the Portal credits.
I also debated how long I should keep it playing, so as not to overstay its welcome.
Scene 8 - In which SpongeBob performs a magic trick.
The choice for the video sequence was difficult.
It needed to be short, not because I couldn't play longer sequences,
but because it is really expensive on the inputs (it's basically uncompressed video pushed through a 4-bit input), blowing up the input file size.
But it also needed to be recognizable, and somewhat related to the rest of the scenes.
I ended up going with the "How does he do that?" scene from the SpongeBob SquarePants episode "Shanghaied",
because I felt it was a fitting ending, at a point where the viewer is probably thoroughly confused about what's going on and how this is possible,
and it was short enough that they may not even realize the sequence is part of the movie,
despite it being, in fact, the technically most impressive part of it.
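To put the input-cost point into perspective, here is a rough back-of-the-envelope sketch in Python. All parameters (screen size, bit depth, framerate, bits delivered per input poll) are illustrative assumptions for a Gameboy-like display, not figures taken from the actual run:

```python
# Rough estimate of why uncompressed video "blows up" the input file
# when pushed through a narrow (here: 4-bit) input channel.
# All numbers below are assumptions for illustration only.

def input_polls_per_second(width=160, height=144, bpp=2, fps=60, bits_per_poll=4):
    """Input polls needed per second to stream raw frames in real time."""
    bits_per_frame = width * height * bpp           # one uncompressed frame
    polls_per_frame = bits_per_frame // bits_per_poll
    return polls_per_frame * fps

# 160x144 at 2 bits per pixel is 5760 bytes per frame; streamed through a
# 4-bit channel at 60 fps, that is hundreds of thousands of input polls
# every second, so even a short clip dominates the input file size.
print(input_polls_per_second())  # → 691200
```

Under these assumptions, a single second of raw video costs almost 700,000 input polls, which makes clear why the clip had to be kept short.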
The source code of all the tools and programs used to make this run can be found on GitHub
The usual caveats apply: it was not designed to be easily usable.
I'm very satisfied with how this turned out, and I learned a lot along the way,
not just about the inner workings of a Gameboy, but also audio and video processing knowledge
I didn't anticipate needing when I started: dynamic range compression,
audio and video container formats, and many more.
It was a lot of fun creating this, and I hope you enjoyed it.
Uploading a recompressed version of the author's proper input file. Also, judging.
This was quite a run. Many aspects of it needed to be reviewed for judgement purposes.
This run plays back correctly on a reasonably accurate emulator. However, it is unlikely this run will sync on actual hardware. Nevertheless, what was done is deemed legitimate. With enough tweaking, a run with the same output should be possible on a real Gameboy Color. Since it does not actively exploit any emulator bugs, this is good enough for acceptance.
I will say though that playback on lsnes is a nightmare; processing the high rates of input and updating its counters will bring many CPUs to their knees. Further, this is the first run I found where wine is incapable of running the Windows version of lsnes, as it crashes when the input gets heavy. Also, most emulators in wine and VirtualBox normally run as quickly as a native emulator does on Linux (often faster than even running the same Windows binary natively in Windows), although not so for this run. Something about what's going on here even brought my VirtualBox to such a slow crawl that my ten-year-old laptop with a slow CPU could run it faster. None of this reflects on the legitimacy of the run though. An actual Gameboy Color has none of the overhead.
User feedback for this run was terrific. It received many positive reviews from viewers who were extremely entertained by the run. The feedback posted by viewers is good enough to qualify for the Stars Tier. The votes were good too, not that that matters.
The entertainment level in this run was very good. The individual pieces shown were fun and entertaining. The part where Link suddenly walked through the door, looking around wondering "What the triforce just happened? Where am I?", was so funny that I literally almost choked on what I was drinking when I first saw it.
This is an important consideration in terms of accepting these kinds of runs. Some viewers incorrectly think these kinds of runs are all about technical mastery. While technical mastery is indeed a compelling reason for liking a run, it has no bearing on publishing it on our site. Technical mastery alone is why I rejected #4947: dwangoAC, Ilari & p4plus2's SGB Pokemon Red "Pokemon Plays Twitch" in 08:11.42,
as our site has no place to publish such runs, nor criteria for doing so.
In order for a payload to be unique, it has to be tailored to the game in question and offer something original. Creating a payload which would work just as well slapped onto any other game is an immediate cause for disqualification. Our first payload
on this game had absolutely nothing to do with the game in question and was accepted, although in my opinion based on an incorrect foundation of technical mastery and unfamiliarity with the concept by the viewership at large. I accepted our second payload
for this game as an improvement to the original, cutting out downtime and replacing it with something concise, at least tied to the concept of a short, quick-to-exploit run, something not possible with many games. However, publishing it as an original run is questionable due to it having nothing to do with the game, certainly so now that the concept has started to become familiar.
Published payloads for other games have been tied in various ways to the games they were based on, and I'm pleased to see someone finally made such a payload for this game which can obsolete the earlier ones. The payload presented here for the most part provides a storyline which extends the game it is connected to. The different pieces are part of a story; they are not just free-floating concept demonstrations which have nothing to do with the game. This payload as a whole would not work on any other Gameboy Color game as an effective storyboard, and would seem alien.
Pokémon Gold - Pokémon Crystal
This segment is our first bit of payload, and upgrades the existing game to add on areas from a later game in the same series. It would have been possible to tie this more closely to Pokémon Yellow with a superhuman effort; however, there is no other Gameboy Color game I can think of where this payload would seem like somehow upgrading the original game. Therefore, this is a legitimate unique payload for Pokémon Yellow, as it wouldn't fit well elsewhere. It also continues to provide a framework where the continuing payload fits in.
This part of the payload could technically have been slapped onto any game. However, our author continued to make it part of the existing story. Pokémon games have people in them playing games on their handheld consoles, and showing us an example of someone doing exactly that sticks to the story. Choosing any other Gameboy (Color) game could have worked too, and that would be up to author preference; nevertheless, it was a fine choice, and did not break the flow of the overall story started with the first payload and the original game itself.
This part of the payload to me was by far the funniest. At first glance, this looks like it could have been attached to any game, and indeed, it is possible to make this work within a storyline for payloads for other games. However, it worked really well here, and did not break continuity with the overall payload.
Our player walks out of a building and ends up not just different from where they started, but in an entirely different game. Our author did a good job portraying the character's confusion as to what the heck just happened. The spot where they appeared is also a section known for somehow being a portal to a vast network of areas in Link's Awakening, and showing it's also somehow connected to a different game altogether really added to the humor. The player then walks into their residence in this game, and ends up back in their residence in Pokémon. In truth, this was executed flawlessly.
Super Mario Bros.
The original Pokémon has an SNES in the character's room, and the character plays what appears to be EarthBound on it (even though this is not explicitly stated or shown). Having the character play another SNES game and actually showing it would also fit in with the storyline of the payload, continuing what the game has, and works as being within the narrative. An NES game was chosen (via its Gameboy Color port), which is less plausible given the console icon, although we can postulate that the character owns a later model NES which was modeled after the SNES and therefore was only mistakenly thought to be an SNES. Changing the icon to make it look like the more recognizable original NES model would have been preferred, but all in all, this still continues a payload which is tailored to the original game.
The connection of this to the rest of the payload and game is the most tenuous of all the payloads. Choosing music from Portal sort of fits, as the storyline did involve our player using some portals between different games, but that's somewhat of a stretch in concept. The actual music having a reference to doing stuff just because we can ties in a bit with the fact the game is being exploited in this way. Other than that though, what was chosen really had no connection to the rest of it, nor do I think most viewers will make all the various associations. This payload also goes on longer than it needs to for the connections it makes, while not offering anything positive once the point has been established. If not for the strength of the rest of the payload, this segment would be enough of a reason to reject this run.
This short segment showing spectators in complete disbelief as to what just happened is fitting for a crazy payload. While it could also be tied onto other games, it fits in with the rest of the story here, and was short enough to not detract.
Being that this run uses content from elsewhere, a strong factor to consider is whether it is legal to publish this kind of run. I highly recommend that others who wish to make a run like this in the future carefully read this section and understand it before attempting to undertake what was done with this run.
Publishing material using content owned by others is illegal, unless it falls under fair use. I enumerate the fair use criteria and how they relate to what we do in general in my aforementioned link. Part of what makes what we do legal is the "effect of the use upon the potential market for, or value of, the copyrighted work":
namely that we are transforming the original work, providing only a small subset of it, and doing so in a way which advertises the original, potentially generating more profit for the copyright holder. However, what happens with this run doesn't quite fit into that mold.
Pokémon Gold - Pokémon Crystal - Tetris - Link's Awakening - Super Mario Bros.
In terms of the segments for these games in general, we only provided what can appear to be video clips from these games. thatguy
summarized it well, in that what we're doing here basically amounts to what we do when we publish any of our runs on YouTube. However, a key difference here is that our run being labeled Pokémon Yellow
does not directly advertise these games. Therefore, in order to adhere to fair use
as best we can, any place we publish videos for this, be it our pages, YouTube, Vimeo, Archive.org, or elsewhere, we should include some advertising words for these games. I will include a section at the end of this judgement as to what that should look like.
The legality of this segment gave me the most difficult time in dealing with this run. To be honest, it should not have been included. The author in this case has gotten lucky with the choice, as I'll explain in a moment, but something like this really should not be done. While game hacks often include some content from other games, the content is often minor and overlooked, often only recreated in miniature; it's certainly not a focal piece. Here, the content is pushed front and center, displayed with lyrics, and in a considerably long segment well beyond typical fair-use length.
The author has gotten lucky here in that the copyright holder Valve Corporation has published a reasonable and forgiving policy. There are two main points of consideration from this policy:
- We are fine with publishing these videos to your website or YouTube or similar video sharing services.
- We're not fine with taking assets from our games (e.g. voice, music, items) and distributing those separately.
With the first point, they are providing the intent that they approve of users advertising their content, as this is beneficial to them. With the second, they are providing the intent that they disapprove of using their content to distribute in a way which typically is not beneficial to them. What we're doing here clearly fits with their second point, however, if we advertise them appropriately, we can achieve their intent of the first point.
We know they want to have their music advertised, as various online music stores include minute-long samples of their various music tracks for free playback, which is longer than the roughly 45-second track included in this run. If we recommend users buy those tracks, then we can achieve the intent of the copyright holder.
All in all, the author has managed to luck out
with the choice here, and I recommend future authors avoid doing anything like this. I had to do considerable research to find a loophole here, which is unlikely to exist most of the time.
Typical fair use for video involves displaying a short segment, typically less than five seconds' worth, of something culturally relevant and on point. Being that SpongeBob SquarePants is culturally relevant to many viewers, the video is short, and it was used in a way which makes it part of something else, this falls under typical fair use. This kind of use does not even require any advertisement (although it doesn't hurt).
As a whole I find the payload legal, although just barely.
After reviewing all the different aspects of this run, I have found it acceptable, as long as it is published with the appropriate advertising clauses (below). I am accepting this to Stars, and as an improvement to the existing published run.
Publishers, please include the following clause in all places this run is published to (including Archive.org):
This video includes segments from some terrific games, including Pokémon Yellow, Crystal, and Gold, as well as from Super Mario Bros. Deluxe, Tetris DX, and The Legend of Zelda: Link's Awakening DX. Please check those games out.
This video includes music samples from The Orange Box soundtrack. If you've enjoyed what you've heard, you can purchase The Orange Box soundtrack from Amazon.
This video includes a clip from SpongeBob SquarePants (season two), which can be purchased in its entirety from Amazon.
TL;DR: It got accepted.