If extended juggles are difficult and risky, I feel like that's all the more reason to be performing them. This run isn't fully optimized for speed or for entertainment, so I don't feel it should go through.
The most significant problem with this is that all current players will play the video back without any warning; people will just see graphical corruption with zero indication of what is wrong. This is not the same situation as previous codec changes.
In addition, as I mentioned extensively, I think it would be wholly unfair to judge compatibility only on a few desktop players now. Tablets, netbooks, and smartphones are common nowadays, and hardware decoding on desktops keeps them quiet and cool. These weren't problems back in the XviD days. They are problems now, and they should not simply be swept under the rug.
Addressing concepts like "quality improvements" is difficult, since everyone can have a screaming match at each other over subjective things. I'm not interested in some quality flame war; that is counterproductive. What I can do, however, is address it from a purely factual point of view, based on how the technology actually works.
The most significant quality improvements of 10-bit color come into play with gradients, dithering, and color swapping. It can store more color information per channel, so dithering and color swapping are less necessary to avoid color banding.
This is, however, where it gets more complicated. The most popular type of monitor is the LCD, and the most popular panel type is the TN panel. TN panels are not capable of even true 8-bit color (they are natively 6-bit); they use a color swapping and dithering system to approximate an 8-bit range. The result is that most of the color improvement 10-bit would provide is lost when displayed. There are some cases where this is not true (especially with darker values), but it applies to the majority.
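To make the banding point concrete, here is a minimal sketch (plain Python with numpy, purely illustrative; the 1920-pixel width is just an assumption for one row of a 1080p frame). It quantizes an ideal smooth gradient at 6, 8, and 10 bits per channel and counts how many distinct levels survive:

import numpy as np

width = 1920                             # one row of a 1080p frame (assumed)
gradient = np.linspace(0.0, 1.0, width)  # ideal smooth ramp from black to white

for bits in (6, 8, 10):                  # TN-native, standard, and 10-bit depths
    levels = 2 ** bits
    # snap each pixel to the nearest representable value at this bit depth
    quantized = np.round(gradient * (levels - 1)) / (levels - 1)
    steps = len(np.unique(quantized))
    print(f"{bits}-bit: {steps} levels, ~{width / steps:.1f} px per band")

This prints roughly 64, 256, and 1024 distinct levels (about 30, 7.5, and 1.9 pixels per band). The point: on a native 6-bit panel, the 1024 levels a 10-bit encode can carry collapse back toward the 64 the hardware can physically show, and dithering only simulates the rest.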
And then the next problem comes up. The most common place to find these color bands and dithering is across gradients, and true gradients are one of the few things you rarely see in old console games. The color range of those old systems (especially the NES/SMS) is incredibly narrow and can typically be reproduced perfectly even in 8-bit color.
That is why the quality differences aren't anything to write home about, from a technical standpoint.
This isn't really old technology; this is mainstream technology. 10-bit support has been extremely slow to catch on for a reason: it has minimal benefits compared to h.265. Back in the pre-h.264 days, hardware decoding was rare, and most devices weren't even capable of such playback. The world has changed since then. I don't think it would be responsible to use the same decision guidelines that were used 2, 3, 4+ years ago (whenever the transition actually happened).
I'm not sure I agree with the "greatly improved" sentiment.
Let me add this:
Even IF the quality improvements were significant, I think there are far too many drawbacks to make it worthwhile, at least for the time being. Fansubbers, among whom 10-bit really started to be pushed, are even switching back because of these problems.
I actually registered (with the encouragement of a few others) just to post in this topic. I am writing this in response to the recent news post about this soon becoming the primary codec.
I am asking that you strongly reconsider this.
First let me state plainly that I don't mind whatsoever if there are secondary encodes in 10-bit, or what have you. Options are always nice.
However, I can't agree with making 10-bit the primary encode. Here is my list of reasons, in no particular order:
1. Codec support is mostly experimental; VLC is one of the only players I can think of that ships it in a stable release.
2. It requires viewers to obtain completely new software or codecs in order to view the content properly. Worse, on incompatible players it will look abnormal, leading viewers to believe the video is corrupt or otherwise broken.
3. It completely removes hardware acceleration support across the board. I don't think there is a single consumer device out there that has 10-bit support with hardware decoding.
4. As a corollary to #3, the removal of hardware acceleration significantly impacts playability on any mobile device. Smartphones, tablets, and netbooks all rely on hardware acceleration (especially for resolutions above 640x480). These devices would have to rely on YouTube encodes instead, which is hardly ideal. Furthermore, the chances of getting compatible decoders/players onto these devices are incredibly slim.
5. The future of 10-bit playback is not at all certain. With the ratification of h.265 coming in the near future (a codec that will offer much greater performance overall), 10-bit is becoming little more than a stop-gap.
6. As a corollary to #5, it's a stop-gap measure that doesn't serve much purpose. The quality improvements are minimal at best, and the file size reductions are also minimal, on the order of single-digit megabytes per 100MB (see the rough arithmetic after this list).
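To put that in perspective with some quick, purely hypothetical arithmetic: at, say, 7 MB saved per 100 MB, a 300 MB encode shrinks by only about 21 MB, roughly 7% of the file, in exchange for losing hardware decoding everywhere.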
Overall, I think this would have a significant impact on the compatibility of the site's videos, with very little in return. In addition, it will cause needless complaints from people who don't understand the issues.
As I said at the beginning, offering 10-bit as a secondary option is fine; it introduces none of the above problems. But the intent to enforce this as the primary codec in the near future is very brash. If TASVideos is desperate to implement 10-bit for some reason that I am not aware of, I would at least strongly encourage the staff to consider delaying any such decision for at least a year, to see if hardware actually catches up with the software. I believe hardware will pass it over entirely, but only time can prove that. In the meantime, there's no need to rush towards implementation.