Hi10P and 8-bit encodes
bobthedog:
I agree with everything you say, Bob2004. (Must be something wise about us Bobs, eh? ;D )
And if you look at the poll carefully, it's easy to see that the lower three choices all reflect the majority's strong desire (as it stands at this moment, and as it has stood throughout the poll) to keep 8-bit encodes for the foreseeable future, so that the huge installed base of people with less-than-cutting-edge hardware, or who simply want to watch their anime on a television, won't be left out in the cold.
Thank you for your maturity, and have a happy New Year!
Aadieu:
--- Quote from: Bob2004 on December 31, 2011, 10:14:48 PM ---@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.
--- End quote ---
Difference? YES. Better? Not so sure - the banding on the TV in #1 is major in both cases, it just looks sort of different: 8-bit has a hybrid of banding and blocking, while 10-bit has more banding and zero blocking (just looking at the corner of the television in the pic). By the way, both are only really that noticeable because that's what changes on mouseover, which draws attention to it. And the TV banding might in part be intentional, since it depicts rays of sunshine falling on a surface.
Somebody else mentioned Daiz's participation in prominent groups, OK then... but still, comparisons made by the encoder himself to prove his own point should be taken with a grain of salt, no? And his file sizes were pretty bitrate-starved, which created the very differences he then singled out as the most significant. And even those weren't that extreme.
nstgc:
I voted for the second option even though my system can play multiple Hi10P-encoded videos concurrently without a hiccup, even at 1080p. I remember the switch from XviD to H.264, and how my computer, although decent and relatively new, couldn't play the encodes at the very top end. Until there is decent GPU decoding for Hi10P, I vote that we keep slot A as 8-bit encodes.
Daiz:
--- Quote from: Aadieu on December 31, 2011, 11:26:56 PM ---Difference? YES. Better? Not so sure - banding on the TV in #1 is major in both cases
--- End quote ---
Getting rid of the banding on the TV would require very heavy debanding that would end up killing detail. In other words, the banding on the TV is a source issue. That doesn't change the fact that the 10-bit encode preserves gradients much better all over the place.

Also, while the bitrate might be starved for 8-bit (as is evident from the notable banding), it works quite fine for 10-bit. Denpa Onna is not a particularly hard source to compress. And being somewhat bitrate-starved wouldn't even hurt the comparison, really - it would just make the difference(s) easier to spot.

Also, you've completely ignored the other comparison I posted. Compare this to this for example - the huge gradient is very obviously better preserved with 10-bit than with 8-bit. You really should stop trying to downplay the compressional benefits of 10-bit; it just makes you look like an ignorant luddite.
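If it helps, here's a rough way to see why the finer steps matter for gradients (a quick Python/numpy sketch, purely illustrative - this is not how x264 works internally): quantize the same shallow luma ramp to 8-bit and 10-bit levels and count how many distinct values survive.

    import numpy as np

    # A shallow luma ramp, like a dim sky or a softly lit wall - the kind of
    # gradient that bands easily.
    gradient = np.linspace(0.40, 0.45, 1920)  # values in [0, 1]

    def quantize(signal, bits):
        """Round the signal to the nearest representable level at the given bit depth."""
        levels = 2 ** bits - 1
        return np.round(signal * levels) / levels

    for bits in (8, 10):
        distinct = len(np.unique(quantize(gradient, bits)))
        step = 1.0 / (2 ** bits - 1)
        print(f"{bits}-bit: {distinct} distinct levels across the ramp (step size {step:.6f})")

    # Approximate output:
    #  8-bit: 14 distinct levels  -> clearly visible bands
    # 10-bit: 52 distinct levels  -> a much smoother ramp

Same ramp, same range, but with 10-bit levels the steps are roughly four times finer, which is exactly what you see in those gradient comparisons.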
--- Quote from: Aadieu on December 31, 2011, 11:26:56 PM ---but still, comparisons made by the encoder himself to prove his own point are to be taken with a grain of salt, no?
--- End quote ---
You really think people would even be switching to 10-bit if it didn't offer compressional benefits? I've done these comparisons as much for myself as for the public, and for the sake of both I do them properly. If all these comparisons were just "cheated" in favor of 10-bit and it didn't actually offer any improvements, I'd still be encoding everything in 8-bit. Stop being stupid.
Also, as other people mentioned already, I'm an encoder (among other things) in Underwater and UTW, the former of which I also lead.
Aerah:
--- Quote from: Bob2004 on December 31, 2011, 10:14:48 PM ---@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.
--- End quote ---
The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present in the bottom right of the fifth pic.
Those compression artifacts are the most annoying thing ever when they show up in massive amounts.
Unless you tell me what to look for, I am not going to find it quickly.
And if I am not going to find it while I am actually watching, the encode is of good quality.
That is not really a good argument for 10-bit if you have to point out what to look for and where.
For example, ed-6999-10bit-3000kbps.png is simply ed-6999-8bit-3000kbps.png dithered more, which can easily be done client-side. Additionally, why would the sharpness of the 8-bit encode not increase with added bitrate?
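To be concrete about what I mean by dithering client-side, here is a rough sketch (Python/numpy, purely illustrative - real players and renderers do this far more carefully): adding a little noise before rounding a smooth gradient down to 8-bit breaks the hard band edges up into fine grain that the eye averages away.

    import numpy as np

    # Sketch of dithering when reducing a smooth (higher-precision) gradient
    # to 8-bit for display. This only shows the principle that noise can hide
    # the hard band edges.
    rng = np.random.default_rng(0)

    def to_8bit(frame, dither=False):
        """frame: float array with values in [0, 1]. Returns uint8 pixel values."""
        scaled = frame * 255.0
        if dither:
            # Spread the rounding error with a little random noise before quantizing.
            scaled = scaled + rng.uniform(-0.5, 0.5, size=frame.shape)
        return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

    # A dark, slow gradient - the kind of area where banding is most visible.
    ramp = np.tile(np.linspace(0.10, 0.14, 1280), (720, 1))

    plain = to_8bit(ramp)                  # hard steps -> visible bands
    dithered = to_8bit(ramp, dither=True)  # steps broken up into fine noise

    # Averaged over a column of pixels (roughly what the eye does), the dithered
    # version tracks the true gradient much more closely than plain rounding.
    true = ramp[0] * 255.0
    print("max column-average error, plain:   ", np.abs(plain.mean(axis=0) - true).max())
    print("max column-average error, dithered:", np.abs(dithered.mean(axis=0) - true).max())

The trade-off is that the banding is replaced by faint noise, which is far less objectionable than visible steps on a slow gradient.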