Posts for TheCoreyBurton


Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
xnamkcor wrote:
So, the "Native" one is somewhere near 320x240, and the other encode is just a full screen filter/upscale of that?
You're correct in regards to this game's native resolution (which is 320x240) but no, the other encodes are not upscales.
xnamkcor wrote:
No Encode at a higher resolution?
There are several available in my previous post at 480p, 720p and 1080p respectively. This is in addition to both the regular and brightened Youtube encodes available on the publication page. I apologize if my wording in the previous post was somewhat ambiguous. I tried to keep it straight to the point, but I'll go into a bit more detail in the hope it clarifies what to expect from each encode:

After resyncing (this was an easy project, as it only required a single savestate) I dumped this movie three separate times.

The first dump was at the game's internal resolution with graphical enhancements such as anti-aliasing and anisotropic filtering. I used this dump for the encodes on the publication page and for the standard lossless encodes.

The second dump was at 2880x2160 (4K, with a 4:3 aspect ratio), again with graphical enhancements. This was used for the 4K Youtube encodes and was downscaled for each of the additional resolution encodes.

The final dump was at the game's internal resolution without any graphical enhancements. This was used for the native lossless encodes and is intended for viewers who'd prefer to see the run looking closer to the original hardware (albeit not by much).

You can generally check the Youtube encode to see what kind of video to expect from the additional resolution videos. If it's clear and not upscaled, then the encoder was able to dump from the emulator at a higher resolution and will likely have used that footage for any high resolution encodes they may have made available. If it looks like it's been upscaled, then you'll likely not have any luck (this can be for any number of reasons, such as the run only syncing with certain settings, or the core only outputting at that particular resolution).

Whilst I have provided upscaled additional resolution encodes (only for 2D games) in the past, once I started offering lossless encodes they took the place of any upscaled additional encodes I might provide. If I've provided lossless encodes in a post, then all additional resolutions available in that same post are rendered either at, or above, the resolution they specify.
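As a footnote on the downscaling step mentioned above, there's nothing exotic about it. A rough AVISynth sketch (Lanczos is just the example filter here, the filename is a placeholder, and the 720p/480p versions would simply target 960x720 and 640x480 instead):

AVISource("dump-4k.avi")       # the 2880x2160 dump
LanczosResize(1440, 1080)      # 1080p at a 4:3 aspect ratio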
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
In addition to the encodes available on the publication page, here are some extra downloadable encodes for this run:

Additional resolutions:
  • 480p: Direct Link | Torrent
  • 720p: Direct Link | Torrent
  • 1080p: Direct Link | Torrent

Additional resolutions (brightened):
  • 480p: Direct Link | Torrent
  • 720p: Direct Link | Torrent
  • 1080p: Direct Link | Torrent

Lossless encodes:
  • Standard: Direct Link | Torrent
  • Native*: Direct Link | Torrent

Lossless encodes (brightened):
  • Standard: Direct Link | Torrent
  • Native*: Direct Link | Torrent

* Encodes marked with an asterisk are done at native resolution with no graphical enhancements (such as anti-aliasing or anisotropic filtering).

To download these files you may have to right click on the link and select "save link as" (this option may be named differently depending on your web browser).
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Nice improvement! I'll happily take this for publication if this gets accepted. On a personal note, thank you for using GLideN64.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
It's worth noting that a major problem with relying solely on an emulator movie is that getting it to sync for playback isn't always a simple task (I had some trouble with this specific movie, for instance). Regarding Dolphin, there are even several unfortunate cases of runs only syncing on specific computer hardware.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
In addition to the encodes available on the publication page, here are some extra downloadable encodes for this run:

Additional resolutions:
  • 720p: Direct Link | Torrent
  • 1080p: Direct Link | Torrent

Lossless encodes:
  • Standard: Direct Link | Torrent
  • Native*: Direct Link | Torrent

* Encodes marked with an asterisk are done at native resolution with no graphical enhancements (such as anti-aliasing or anisotropic filtering).

To download these files you may have to right click on the link and select "save link as" (this option may be named differently depending on your web browser).
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
In addition to the encodes available on the publication page, here is an extra downloadable encode for this run:

Lossless*: Direct Link | Torrent

* There are three different video tracks available in the file: 3D anaglyph, left screen only and right screen only.

To download these files you may have to right click on the link and select "save link as" (this option may be named differently depending on your web browser).
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
feos wrote:
It absolutely doesn't sync on any other Dolphin version that has no av desync? Did av desync for the author too? Is it certain that av desync is not caused by missing audio samples, for which Ilari made this tool? Are savestates of other Dolphin versions incompatible?
Runs usually sync a few revisions forward or back depending on the changes made, but it's not guaranteed; I've never had success across a gap of 50 or so revisions, let alone over 1000. The specific problem is that the fix for Dolphin's video dumping in 4.0-3595 is somewhat broken thanks to VFW. It works well for most Gamecube games, but for a lot of Wii titles, and for any Gamecube game that changes frame rate mid-run, it incorrectly assumes every frame lasts the same duration. This issue was exclusive to Windows, as Linux used ffmpeg for dumping (which didn't have the same problem). Fog fixed this in 4.0-8634 by changing the Windows dumping method over to ffmpeg as well.
feos wrote:
It requires upstream dolphin's avi dumper code to be merged into the dolphin version it was made on, otherwise av desyncs can NOT be fixed during dumping.
The only other alternative here might be for someone to dump it on Linux. It's possible on a VM, but depending on the software used it may be difficult to get proper hardware support and so a native installation would probably make the process a lot smoother. That said, it would be better to have the upstream fix available in the long term.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Aktan wrote:
Couldn't you just uncheck H.264 as decoding option in ffdshow settings?
Probably. I didn't think of this at the time, but it would have likely been the easier and better option.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
feos wrote:
with this installation of lav filters and without ffdshow, regular avisynth doesn't want to load an ffv1 clip for me
Does encoding with this command produce a file that loads correctly? It likely won't make a difference if you're using the Bizhawk command, but it's probably better to be safe and check everything.
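For a quick sanity check, something like this minimal script (the filename is a placeholder) is what I'd load into Virtualdub or pipe through avs2pipemod to confirm the FFV1 clip actually decodes:

AVISource("output-ffv1.avi")
Info()   # overlays resolution, colourspace and frame rate, so a bad load is obvious straight away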
feos wrote:
As for h264, we can ship lav filters with bizhawk prereqs.
So h264 still loads with ffdshow removed (and LAV Filters installed)? If so, this is great news. We'd probably also want to warn users somehow about the potential problem of having ffdshow installed at the same time. It's interesting, as I had the same versions of ffdshow and LAV on both this and the other PC. This one decodes fine, but the other one required removing ffdshow first. It's as if ffdshow took decoding priority over LAV on the other PC (and so had to be removed), but LAV took priority here. Maybe there's a way to force the priority or decoder if this is true (while still using AVISource).
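If forcing the priority through the system codecs turns out to be unreliable, one possible workaround (not what we do now, just an idea) would be to sidestep VFW/DirectShow entirely with FFMS2, which decodes through its own bundled libraries; the filename here is a placeholder:

FFVideoSource("dump.avi")   # FFMS2's source filter; decodes internally, so ffdshow/LAV priority doesn't apply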
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
I was worried about the "hack" nature of h264 in AVI due to some bad experiences, but surprisingly didn't have any issues at first (I assume because VFW was not involved).

I did two test encodes on my current PC today. In both cases I used AVISource to load the footage, but only used ExactDedup in the second script. The output of both encodes was the same as if I'd used a Lagarith source.

I did the same tests on my second PC (which recently had its HDD replaced and therefore has a clean installation of Windows) and got different results. The only things installed on that PC were the OS (Windows 7 x64), AVS+, ffdshow and all the appropriate prerequisites to get these and the encoding package running (such as MSVC++ 2010). As for the results, the non-dedupped encode looked the same, but the dedupped encode started with some garbage frames. The garbage frames are additional frames added at the start of the encode; they don't replace existing frames, so the audio sync is noticeably out when they're present.

Randomly seeking through the non-dedupped script showed no issues. Seeking to the first non-logo frame of the dedupped script showed the garbage frame at first, but after seeking elsewhere in the clip (and then seeking back), the frame was gone. I tried reopening the script and the problem was still resolved. I rebooted the PC and opened the script again and, as expected, the problem was back (and seeking again solved the issue).

The solution seems to have been to uninstall ffdshow (which is out of date anyway) and instead install LAV Filters.* I'd need someone with a clean system to try to encounter the error and solve it this way to be sure this is the correct solution, though on the test PC the script produces the expected output in every situation I've tried, including the above ones where the output was previously broken.

My main concern here is that correct decoding of h264 in AVI looks to vary depending on the installed codecs or system. Whilst this is true for most formats, there's usually some form of error message or some obvious sign that something's wrong; that isn't the case here. If you were to preview the footage in Virtualdub via AVS, the dump would look fine. If the problem did occur, seeking past it for a few frames and going back would fix it. If there were doubts and the script was reloaded to check, it would be fixed on reload. This could be problematic, as it has the potential to go unnoticed in some situations.

* I initially installed LAV Filters without removing ffdshow (as that is the codec setup on my main PC), but this didn't solve the issue, so the safest bet is to remove ffdshow altogether.

Note: There were a number of edits originally appended to the end of this post, which led to the point being worded in a confusing and unnecessary way. All of the edits have therefore been re-worded and folded into the original post.
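For anyone wanting to spot the symptom quickly, a hedged AVISynth sketch of the kind of side-by-side check I mean (filenames are placeholders, not my actual test scripts):

# stack the h264-in-AVI load above a known-good Lagarith load of the same dump
a = AVISource("dump-x264rgb.avi").ConvertToRGB32()
b = AVISource("dump-lagarith.avi").ConvertToRGB32()
StackVertical(a, b)   # scrub through the start: any garbage frames in the top clip stand out against the bottom one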
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
I was doing some work on a publication yesterday (dumped with FFV1 at 4K) and noticed the seeking was frustratingly slow. Given the resolution I expected as much, but it made me think about this post. I did some overnight tests and noted down a few things:

FFV1:
Command: ffmpeg -i "framedump0.avi" -c:v ffv1 -pix_fmt bgr0 -level 1 -g 1 -coder 1 -context 1 "output-ffv1.avi"
File size: 891 MB (934,546,882 bytes)
Approximate encoding time: 10 minutes
Approximate seek time: ~0.5 seconds per frame

Lagarith:
Command: None. Opened the FFV1 file in Virtualdub and saved with the Lagarith codec.
File size: 1.01 GB (1,088,373,084 bytes)
Approximate encoding time: 6 minutes
Approximate seek time: Instant

x264RGB "1":
Command: ffmpeg -i "framedump0.avi" -c:v libx264rgb -qp 0 -preset ultrafast -g 1 -pix_fmt rgb24 -context 1 "output-x264rgb-1.avi"
File size: 1.68 GB (1,813,903,120 bytes)
Approximate encoding time: 1 minute
Approximate seek time: Instant

x264RGB "30":
Command: ffmpeg -i "framedump0.avi" -c:v libx264rgb -qp 0 -preset ultrafast -g 30 -pix_fmt rgb24 -context 1 "output-x264rgb-30.avi"
File size: 1.22 GB (1,313,018,268 bytes)
Approximate encoding time: 2 minutes
Approximate seek time: ~1.5 seconds per frame

Before discussing the results, there are a few things worth mentioning:
  • The encoding times come from the file properties and are only accurate to the nearest minute.
  • The seek times are based on an external capture of the PC doing the seeking and could be slightly askew.
  • "-c:a pcm_s16le" isn't needed in these cases, as Dolphin dumps video and audio separately.
  • The game runs at 30fps, so every second frame is a duplicate.

Currently, I re-encode Dolphin's FFV1 dumps to FFV1 (with the settings above) to make them compatible with AVISource. I'm replacing that portion of my process with the x264RGB "1" command line, because it's better to deal with in terms of both encoding speed and seek time.

My main observation on the previous conclusion is that with the keyint that large, the file becomes downright horrendous to work with when seeking is required. FFV1's seek time was already a nuisance, and tripling it only makes things worse. For this particular dump the file is also larger when encoded that way, so saying Dolphin must switch to this preset might be a bit hasty. That said, I have a lot of HDD space to work with, so I can see how better compression would benefit other users more than it does me.

Perhaps the best option here is the ability to choose a custom command line? Combine that with a couple of presets (such as the three tested here, for instance) for users less familiar with ffmpeg, and the situation is user friendly and the possibilities are endless.
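To make the preset idea concrete, here's a rough batch-file sketch of what I mean (entirely hypothetical; the preset names, filenames and output path are made up):

@echo off
rem hypothetical wrapper around ffmpeg: pick a lossless preset by name
if "%1"=="ffv1"    set ARGS=-c:v ffv1 -pix_fmt bgr0 -level 1 -g 1 -coder 1 -context 1
if "%1"=="x264-1"  set ARGS=-c:v libx264rgb -qp 0 -preset ultrafast -g 1 -pix_fmt rgb24
if "%1"=="x264-30" set ARGS=-c:v libx264rgb -qp 0 -preset ultrafast -g 30 -pix_fmt rgb24
ffmpeg -i "framedump0.avi" %ARGS% "output.avi"

Usage would then be as simple as something like: lossless-dump.bat x264-1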
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
That is a very good point.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Assuming %FPS% represents the source's frame rate multiplied by 10, this is the lossless command I've been using for my tests (isolated from the batch system). I use AVS to load the video because it's the input format we'll likely be dealing with:
".\programs\x264-10_x64" --threads auto --qp 0 --keyint %FPS% --ref 16 --bframes 16 --b-adapt 2 --trellis 2 --direct auto --me tesa --merange 64 --subme 11 --partitions all --no-dct-decimate --no-fast-pskip --output-csp rgb --range pc --input-csp rgb24 --input-range pc --demuxer avs -o ".\temp\video.mkv" ".\source.avs"
Additionally, I'd probably say this should be coupled with FLAC for the resulting file, which can be encoded like this:
".\programs\avs2pipemod" -wav ".\source.avs" | ".\programs\flac_x64" - --best -o ".\temp\audio.flac"
And muxed like this:
".\programs\mkvmerge" -o ".\output\output.mkv" ".\temp\video.mkv" ".\temp\audio.flac"
Edit: Changed "x264_x64" to "x264-10_x64". The build I have handles the 10-bit encoding all in the main executable, but if you're using a version prior to that update you'll need to use the correct executable.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
I was curious about Youtube's current handling of AR flags, so I dumped some NES footage today and did two tests. The first was encoded losslessly with x264rgb at native resolution, muxed into MKV with a display aspect ratio of 16:9 and uploaded; the resulting video was encoded and played back in 16:9. The second test was point-resized to 823x720 and encoded with Lagarith, then muxed into MKV with a display aspect ratio of 4:3; its playback resolution was 960x720.

This could also have an interesting impact on the integer resizing suggestion I made, as a single scale value could be used for all resizing operations. If we divide the target height (2160, for example) by the initial height (224 in this case) and round up, we get a scale value that provides a resolution equal to or higher than the target. That value can then be applied to both width and height to keep the pixel sizes consistent, and the aspect ratio can be set in the container for the final output.

The significance of this is that the publication process would be faster, particularly with longer runs. We'd get smaller files (or alternatively higher quality encodes at the same size), which would help significantly in cases where encodes exceed the upload limit and have to be re-encoded at lower settings (sometimes several times). Smaller files also upload faster, and the encode process would take less time with integer point resizing than with non-integer resizing.

The problem with this idea is that it's very likely that Youtube's resizing is inferior to our own method. Whilst downscaling might be fine, in this case it would be used for upscaling the width to reach the destination, which would likely add unwanted horizontal blurring or distortion. Depending on the severity, and on how it compares to our current way of encoding, it could very well be better to keep doing things the way we currently are.

I didn't expect either of these tests to work correctly, so I hadn't prepared anything to properly follow up. I'd like to see a comparison of streams that have been encoded both ways and processed by Youtube before drawing any proper conclusions. More testing is required at this point, but I thought I'd post this and my thoughts for now.
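To put numbers on the single-scale-value idea for a 256x224 NES dump, here's a rough AVISynth sketch (the filename and output paths are just illustrative):

AVISource("dump.avi")                              # 256x224 source
factor = Ceil(2160.0 / Height())                   # 2160 / 224 = 9.64..., rounded up to 10
PointResize(Width() * factor, Height() * factor)   # 2560x2240, with every pixel scaled by the same amount

The 4:3 display aspect ratio would then be flagged at mux time, presumably something like:

".\programs\mkvmerge" -o ".\output\output.mkv" --aspect-ratio 0:4/3 ".\temp\video.mkv"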
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Aktan wrote:
I'm thinking if you set the flag to be full color range in the MP4, it will have the correct colors
Bizhawk should probably be set to write that metadata itself for cases like this. Do you know the ffmpeg command? (I think it's -color_range 2, but I'd like to be certain.)
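If -color_range 2 is indeed the right option, my assumption is it would be applied at encode time, something along these lines (entirely untested; the filenames and quality setting are made up):

ffmpeg -i "dump.avi" -c:v libx264 -crf 18 -pix_fmt yuv420p -color_range 2 "output.mp4"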
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
feos wrote:
Dedupped cscd at level 9 (guess how I got it!) for [3019] SNES Super Mario World "warps" by BrunoVisnadi, Amaraticando in 09:57.10 is 125 MB (132,049,263 bytes).
That's significantly smaller than I expected! Did you make a command line tool? Also for comparison, my tests of the same file ran for 15:07 and were not dedupped.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Ah, of course! But don't forget about hardware players and non-PC devices too; there's a chance they might not be ffmpeg-based and may be limited to specific codecs. CSCD still produces significantly larger files (although not as drastically as FFV1 or Lagarith) and, depending on the footage, it can be much slower to seek through. This can be fixed by adding keyframes every second or so, but that could bulk up the size considerably. In comparison, x264's main drawback here is speed, but we're already toggling on lots of slow options to shrink the size of our current downloadable encodes. If this is a theoretical new option, it would suit the site's existing procedures to provide the best-compressed file possible. The fact that all the downloadable encodes would share the same video format would be an extra bonus in some situations, for instance:
  • We would not need any additional software in the package to make the video portion of these encodes.
  • Users who can play our current Modern HQ encodes would likely be able to play our lossless encodes without the need of any additional software, codecs, or hardware upgrades.
  • Troubleshooting for user playback issues would remain somewhat universal.
Edit: x264's encoding framerate isn't always terrible. When test-encoding NES Super Mario Bros. and NES Castlevania at the same time, both encodes averaged around 60-75fps with the same settings as before.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Aktan wrote:
I believe what you saw was YouTube only re-encoding videos uploaded after a certain point, but not all videos.
You're probably right. I was basing my assumption that it was all videos on a combination of the information I mentioned above and the fact that Youtube at one point introduced a method to download your own uploaded source videos (which I once had to use). I tried this again now and was only able to download the video in 720p30 (despite it being available in 1080p60), which leads me to believe this feature has been replaced with a new method that just gives you the best quality combined stream Youtube has.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
Here's FFV1 (using the settings from over here) and Camstudio using LZO (as I was unable to get GZIP to function properly).

Lossless:
  • Camstudio LZO: 311 MB (326,661,150 bytes)
  • FFV1: 707 MB (742,030,710 bytes)

Compared to x264 they both encode a lot faster. I got over 200fps with FFV1, compared to 8fps with x264 (which, for reference, was the same rate I got when encoding the Modern HQ), but it looks like neither comes close to x264 RGB in terms of file size. Compatibility could also be a factor here: if someone can play the Modern HQ, then they can play x264 lossless without needing additional codecs.
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
feos wrote:
Ah, I thought the size would be comparable to 444.
I'll do a size test between our current Modern HQ method and the same method with RGB, and post the results shortly.

Edit, results:
  • Lagarith source file: 701 MB (735,481,802 bytes)
  • Modern HQ 444 (YUV BT.601): 33.0 MB (34,685,337 bytes)
  • Modern HQ RGB: 53.9 MB (56,564,916 bytes)
  • Lossless RGB: 81.5 MB (85,467,011 bytes)

All results were calculated using a dump of this run trimmed to the run's length, with no logo present or subtitles added, and no audio track present (because I wanted to do this quickly).

The results were more varied than I had expected and have altered my opinions slightly. The first thing worth noting is that in this case, the RGB encode is around 1.6x the size of the 4:4:4 one, which means my similar-size hypothesis was incorrect. The second thing worth noting is the size difference between the Lagarith source dump, the Lossless x264 RGB and our current Modern HQ 444, which helps deal with the assumption that lossless encodes are going to be astronomical: in this test, the lossless result is less than 3x the size of the Modern HQ 444 and roughly 8.6x smaller than the initial lossless dump.

Using these results as an indicator, I'd be more inclined to suggest adding Lossless RGB as an optional encode rather than replacing Modern HQ with the RGB variant. I don't see the point in bumping up the size by what appears to be a significant amount when a little more would give the ideal scenario.
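For reference, the only intended difference between the two Modern HQ variants is the colourspace handed to x264. Stripped down to the relevant options (the --crf value is a stand-in and the filenames are made up; the real Modern HQ command is longer), the pair of commands would look roughly like this:

rem 4:4:4 variant (current Modern HQ colourspace); other settings omitted
".\programs\x264-10_x64" --crf 17 --output-csp i444 --range pc --input-range pc --demuxer avs -o ".\temp\hq444.mkv" ".\source.avs"
rem RGB variant, identical apart from the output colourspace
".\programs\x264-10_x64" --crf 17 --output-csp rgb --range pc --input-range pc --demuxer avs -o ".\temp\hqrgb.mkv" ".\source.avs"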
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
We currently use avs2pipemod, but both ffmpeg and x264 support direct .avs script input. ffmpeg can pipe to x264 or encode on its own (allowing the use of other formats, such as VP9), and removing avs2pipemod from the chain gets us closer to having a package that can be entirely 64-bit. Here's an example of using ffmpeg to pipe a script to an external application:
".\programs\ffmpeg_x64" -i ".\%~n1.avs" -vn -f wav - | ".\programs\flac_x64" - --best -o ".\%~n1.flac"
I'm not as active as I once was, but I can be reached here if I should be needed.
Post subject: Modern HQ & RGB
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
For publications we currently offer two main types of downloadable encodes:
  • MP4 Compatibility uses YUV color encoding, limited levels and 4:2:0 chroma.
  • MKV Modern HQ uses YUV color encoding, full levels and 4:4:4 chroma.

With all the changes made to encoders and players alike over recent years, Modern HQ just doesn't feel very "modern" any more. x264 has supported RGB for a long time, so my question is: why are we using 4:4:4 as our HQ option when we could be offering RGB? It would further separate the two encode types, and we could still keep all the other (non-colorspace) settings identical, so the file size should theoretically be similar. The only difference is that we'd be removing a needless RGB>YUV conversion from the process.

To be safe, I did a number of tests to see how x264 handled lossless content in RGB. The resulting files (when decoded to RGB24) were identical to the same encodes produced with the Lagarith codec in RGB. Interestingly, I used this run as my test and the resulting video stream (with no audio data) was 30.4 MB. Combining that with FLAC audio leads to a final file size of 64.5 MB (although this had no logo or subtitles), which gave me a second thought: Lossless RGB.

I'd love to see a third (optional, see below) encode option on publications for Lossless RGB. If I were not making encodes myself, it's the one I'd download. Internet connections are getting faster and hard drive capacities larger. Digital retailers are selling music in lossless (and sometimes 24-bit) formats. Everything seems to be evolving in a very quality-driven manner, and I think we should too. Excluding detail changes (such as branch name) or errors, having Lossless RGB should prevent the need for any future encode updates and give people the best downloadable watching experience possible.

Regarding it being optional: I'd love to see this introduced as a standard for every publication on the site, but it's a huge task. Not every game is going to be able to get a lossless encode right away (3D games, for instance, would be harder), and not everyone is going to be able to upload large files (Metroid was small, but longer encodes and more complex games could easily get very large very fast). From an encoding perspective, it's another thing that's going to take time, although the good news is that the first pass (frame decimation) data generated for the Modern HQ can be re-used for the lossless encode to save time.

Having said that, it would be nice to see some publications slowly pop up with "Lossless MKV" (or some similar term) as a third download option. Even if they're few in number to begin with, it has to start somewhere. I feel like, at the very least, we should have an RGB encode, whether it's a new adjustment to the Modern HQ encode or a new lossless option.
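For what it's worth, the "decoded identically" check doesn't have to be done by eye. One way (a hedged sketch with made-up filenames, not necessarily how I verified it) is to compare per-frame hashes using ffmpeg's framemd5 muxer:

ffmpeg -i "lossless-x264rgb.mkv" -pix_fmt rgb24 -f framemd5 "x264rgb.md5"
ffmpeg -i "lossless-lagarith.avi" -pix_fmt rgb24 -f framemd5 "lagarith.md5"
fc "x264rgb.md5" "lagarith.md5"

Identical hash lists mean the two encodes decode to exactly the same RGB24 frames.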
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
I've been thinking about how to potentially improve this for a while, and it occurred to me: when Youtube moved to 60fps, some of the older videos that were uploaded at 60fps (but which displayed at the previous limit of 30fps) were re-encoded by Youtube and presented at 60fps. A similar thing happened when Youtube switched to the DASH formats, with videos being re-encoded to meet their new standards. There are plenty of other examples exhibiting this same pattern, showing that when Youtube update or change their own formats, they make new copies from the uploaded source video. This made me think about how we handle Youtube encodes, specifically our resizing methods and our frame limiting.

Regarding resizing methods:

We currently employ a direct resize to 2160p. This has the advantage of being sharper than the alternatives and allows us to directly control how the video will be presented at that specific resolution. It's important to note that all the other resolutions are going to be derived from this same source clip, although the differences are probably insignificant at those resolutions. The pixel sizes are also inconsistent when using this method.

The alternative of point-resizing up by integers and then lanczos-resizing down to the destination resolution of 2160p has also been mentioned. It fixes the issue of pixel sizes, but at a compromise in sharpness. The encoding times and file sizes are also significantly worse.

My thought at the moment is: why are we not just integer-resizing to beyond the target resolution? To be clear, I'm talking about taking a 256x224 clip and point-resizing it by factors of 14 and 12, to 3584x2688 (see the sketch at the end of this post). The first consideration was the old case of Youtube not generating 60fps encodes for non-standard resolutions. I did a test upload to check, and 2000x1220 generated 1080p60, so this issue appears to have been resolved. The other consideration is quality, as we will have less control over how the final video looks. Having said that, we currently let Youtube decide how resolutions of 1440p and below are going to look anyway. Additionally, going back to my first point, when Youtube change their formats they make new copies from the uploaded source. This makes me think that we should be focusing on giving them the best quality input video we can, rather than trying to manipulate the output (we already give them QP 5 encodes for this reason). If we give them an integer point-resized encode, it will be sharp, have correct pixel sizes and encode faster than lanczos-resizing down to 2160p afterwards would, and if they ever decide to allow exact-resolution playback (or some other new situation, who knows) we won't have to re-upload anything.

Before moving on, it's worth noting that there will be some uncommon resolutions requiring ridiculous integers to achieve the correct aspect ratio. In those cases, point-resizing up and lanczos-resizing down would be impractical as well.

Regarding frame limiting:

Before I start, this isn't nearly as relevant or important as the resolution topic above; it's just something to round off that discussion and leave it more complete. Currently we perform two Youtube frame rate limitations: a 60fps cap, and half-fps for games that lag every other frame. The 60fps limit is done in AVISynth via ChangeFPS. As far as I know, this just discards however many frames are needed to hit the specified rate. I don't see Youtube doing anything vastly different from this, and we're already losing frames in the process.

Is there any harm in uploading directly at ~70.086fps for DOS? Youtube will just throw away frames the same way ChangeFPS does. The advantage here would be that our input files would now be at the game's correct rate (which, combined with QP 0 and integer point resizing, would mean an entirely lossless input file would be ready for re-encoding should something change). We'd have less control over it again, but do we need control over this? That said, I have more hope of Youtube supporting specific resolutions at some point than unique frame rates, so a change here probably isn't necessary.

That's probably enough typing for now. There's probably a few typos and a dozen things I've overlooked, but right now I think resizing by integers is the strongest of the options. I'm interested in what everyone else thinks!
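The sketch referenced above, for the 256x224 case (AVISynth; the filename is hypothetical):

AVISource("dump.avi")               # 256x224, intended to display at 4:3
PointResize(256 * 14, 224 * 12)     # 3584x2688: each source pixel becomes a clean 14x12 block and the frame is exactly 4:3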
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
I've just taken this for publication. I'm a little busy, but I'll be doing my best to resync this with GLideN64. Hopefully it won't take too long!
I'm not as active as I once was, but I can be reached here if I should be needed.
Experienced Forum User
Joined: 10/14/2013
Posts: 335
Location: Australia
It would depend on the question posed, but as polls are usually subjective in nature, my guess is no. As soon as you ask someone "which of these options do you prefer?", they're forced to think at least somewhat subjectively, even if they base their decision on some sort of clearly-defined personal rule-set. The obvious exception is if there's a universal rule-set that all the voters must abide by, but then why have a poll at all? The answer's already technically there. For a poll to be truly objective, the question becomes less "will people be entirely objective?" and more "will people answer this honestly?". So in any case, probably not.
I'm not as active as I once was, but I can be reached here if I should be needed.