Time out!
Will someone tell me already why people in this discussion are getting radically different results from Hi10p playback, with banding ranging from barely visible to OVER9000, in screenshots of the same scene from the same file?
It's also a good time to figure out what part of that is attributable to the decoder, what part to settings, what part to GPU/processing capability, and what part to filters.
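One way to even start untangling that is to take the GPU, the renderer and every filter out of the picture and grab the frame straight from a software decoder. A minimal sketch, assuming ffmpeg (which decodes Hi10p in software) is on your PATH; the filename and timestamp are made up, substitute the contested scene:

```python
import subprocess

SOURCE = "episode_hi10p.mkv"   # hypothetical filename: the file being argued about
TIMESTAMP = "00:05:23"         # hypothetical timestamp: the contested scene

# Dump one frame straight from the decoder to a PNG. Nothing here touches
# the GPU, the renderer or any playback filters, so whatever shows up in
# the image came from the encode and the decoder alone.
subprocess.run([
    "ffmpeg", "-ss", TIMESTAMP, "-i", SOURCE,
    "-frames:v", "1",
    "-pix_fmt", "rgb48be",     # keep more than 8 bits of precision in the PNG, if your build supports it
    "-y", "decoder_only.png",
], check=True)
```

Run that on both machines and both releases: if the dumps already differ, it's the file or the decoder; if they don't, the differences everyone is screenshotting live somewhere in the renderer/filter chain.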
Because while it's pretty easy to blame my screenshot on me "doing it wrong", isn't it just as easy to suspect the better results of being "done TOO right" (i.e. introducing the effects of various technologies that vastly improve visuals and misrepresenting them as results of the 8bit >> 10bit switch, when in fact they might have shit-all to do with colour bit depth or the encoding)?
...ARE WE EVEN SURE THAT ANY OF THE DIFFERENCE "SHOWN" ANYWHERE AT ALL CAN BE ATTRIBUTED TO ANYTHING BUT THE GPU AND MADVR? Or to any other difference that comes from owners of fine-tuned high-end systems posting their top-of-the-line 10-bit screenshots while basic users with average systems and no fine-tuning provide the 8-bit ones for comparison?
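To be clear why I keep harping on the renderer: dithering alone can hide banding no matter what the source bit depth is. A toy illustration (plain numpy, not what madVR actually does internally, just the general idea): quantise a smooth gradient to 8 bits with and without sub-LSB noise, then check how well the column averages track the original ramp.

```python
import numpy as np

# A smooth, dark ramp like a night-sky gradient, kept at fractional
# precision (stand-in for a higher-than-8-bit source).
width, height = 1920, 256
ramp = np.linspace(16.0, 32.0, width)
frame = np.tile(ramp, (height, 1))

# Straight 8-bit quantisation: hard steps, i.e. visible banding.
banded = np.round(frame).astype(np.uint8)

# Renderer-style dithering: add sub-LSB noise before rounding. The output
# is still only 8-bit, but the steps get broken up into fine noise.
rng = np.random.default_rng(0)
dithered = np.round(frame + rng.uniform(-0.5, 0.5, frame.shape)).astype(np.uint8)

# How well does each version preserve the original tones once the eye
# averages over an area? (column means vs. the true ramp)
err_banded = np.abs(banded.mean(axis=0) - ramp).mean()
err_dither = np.abs(dithered.mean(axis=0) - ramp).mean()
print(f"mean tonal error without dithering: {err_banded:.3f} code values")
print(f"mean tonal error with dithering:    {err_dither:.3f} code values")
```

The dithered output is still 8-bit, but once your eye averages the noise the banding is gone, and that is exactly the kind of improvement that can get misread as "10-bit magic".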
Also, could somebody who got "pretty" results (screenshots) from a 10-bit release do comparison shots with the same group's similar-filesize 8-bit version of the same anime, running the same player and the same presets? (Yes, I know that guy from UTW did that, but he was selecting screenshots of his own encode to illustrate his own point, which is hardly an unbiased setup; a clean experiment would have to be done by someone without a prior conclusion and without the encoder's knowledge. And in any case, both of his encodes were badly bitrate-starved, yet both looked pretty good and not radically different from each other, either.)
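And if anyone actually does that comparison, please take the shots the boring way (decoder dump as above, same frame, no renderer) and post a difference map too. A rough sketch, assuming numpy + Pillow, the hypothetical filenames from the dump step, and both frames at the same resolution:

```python
import numpy as np
from PIL import Image

# Same frame, dumped from the 8-bit and the 10-bit encode with the same
# decoder and no filters; compared at 8-bit RGB, which is what ends up
# posted on the forum anyway.
a = np.asarray(Image.open("frame_8bit.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("frame_10bit.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("max per-channel difference: ", diff.max())
print("mean per-channel difference:", diff.mean())

# Amplified difference map: anything visible here came from the encodes
# themselves, not from anyone's GPU, renderer or settings.
Image.fromarray(np.clip(diff * 16, 0, 255).astype(np.uint8)).save("diff_x16.png")
```

If the amplified diff is basically flat noise, the two encodes are the same for viewing purposes and the screenshot wars are about renderers; if the 8-bit one shows steps marching through the gradients, then there's a real encode difference to talk about.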