BakaBT > Announcements

Hi10P and 8-bit encodes


Bob2004:

--- Quote from: Aerah on January 01, 2012, 12:29:42 AM ---
--- Quote from: Bob2004 on December 31, 2011, 10:14:48 PM ---@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

--- End quote ---

The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present - 5th pic, bottom right.
These compression artifacts are among the most annoying things ever when seen in massive amounts.

--- End quote ---

Both the 8-bit and 10-bit encodes are exactly the same bitrate. The reason the 8-bit looks more bitrate-starved is that the better compression granted by 10-bit means it can make much more efficient use of that bitrate. In other words, 10-bit looks like it's higher bitrate (i.e. better quality), even though it's actually not.

This means that when encoding in 10-bit, you can either encode it at a higher crf (and thus lower bitrate - and therefore lower filesize) and still get the same quality, or you can encode it at the same crf and get improved quality at a similar size.

Aerah:

--- Quote from: Bob2004 on January 01, 2012, 12:42:02 AM ---
--- Quote from: Aerah on January 01, 2012, 12:29:42 AM ---
--- Quote from: Bob2004 on December 31, 2011, 10:14:48 PM ---@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

--- End quote ---

The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present - 5th pic, bottom right.
These compression artifacts are among the most annoying things ever when seen in massive amounts.

--- End quote ---

Both the 8-bit and 10-bit encodes are exactly the same bitrate. The reason the 8-bit looks more bitrate-starved is that the better compression granted by 10-bit means it can make much more efficient use of that bitrate. In other words, 10-bit looks like it's higher bitrate (i.e. better quality), even though it's actually not.

This means that when encoding in 10-bit, you can either encode it at a higher crf (and thus lower bitrate - and therefore lower filesize) and still get the same quality, or you can encode it at the same crf and get improved quality at a similar size.

--- End quote ---

Well, yes. I know.
The issue is that there is nothing 10-bit exclusive about this - just up the bitrate on the 8-bit encode and what difference will remain then?
Show me the exclusive 10-bit quality.

Wolfpup:
10-bit is good for those that do not use a 'stand-alone' media player that uses hardware decoding, as most units like that are designed to only decode 8-bit. I have one of these devices connected to my TV, and I plug an external HDD into that device to watch the anime that I get here. My device is a WD TV Live Plus HD with a WD 1TB portable HDD plugged into it. The settings for the HDMI output are LOCKED to 8-bit color with no way to change this setting. So for me at least, I will only be getting the 8-bit releases due to hardware limitations.

doll_licca:

--- Quote from: ConsiderPhlebas on December 31, 2011, 09:31:10 PM ---
--- Quote from: Aadieu on December 31, 2011, 08:29:38 PM ---Neither have I ever seen anything tagged [Daiz]. So, unless this person is affiliated with some major fansubbing or RAW group or is somehow else a well-known authority, ...
--- End quote ---
I don't think group affiliation and the like have any relevance to what is good encoding or not, nor do I have any strong opinion on 10/8 (although I notice that 10-bit plays effortlessly on my old PC with little RAM), but as far as I know Daiz works for groups such as Underwater and UTW.

Otherwise I think this thread shows (an unusual amount of?) needless heat. Why not cool down and try to reason with rational arguments?  :)

--- End quote ---
When my group started doing Hi10P encodes, it resulted in one of the very few times I've had to moderate comments on the blog (mostly to keep the blog reasonably family-friendly).  People feel very passionate about this entire 8-bit vs. 10-bit debate to the point of written and verbal abuse.

This entire thread is merely a microcosm of that.  It's like certain political hot-topics here in the United States; you support one side of an issue, you alienate the other side.

That said, it's very simple for the encoder to placate both sides: just do both 8-bit and 10-bit encodes. It takes very minimal effort, and I've probably encoded a lot more episodes of Hi10P than most encoders.

slimcliffy:
To begin with, Daiz is entirely correct on nearly all points, but there is in fact a case where 10-bit is an improvement over 8-bit even when the source itself is only 8-bit.

To start, 8-bit means that for red (R), green (G), and blue (B), the values 0 to 255 can be represented. For 10-bit, the RGB values can range from 0 to 1023. This means that per component, 10-bit is 4 times as detailed as 8-bit. Therefore, if you had a raw image with 10-bit depth, it would have a color palette 64 times as large (4x4x4 = 64) to represent the image on your screen. In the case of high-definition video, with the exception of footage from EXTREMELY high-end cameras (starting with the RedOne cinema camera and upward), you will never come across media of this scale. The reason is that it would require a signal of 3.125 gigabits per second to properly transmit. TV networks with half-million-dollar cameras broadcast sports from the arena to the network at less than a third of this speed, with quality loss.
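The per-channel arithmetic above can be checked with a few lines of Python (a sketch; the function name is my own):

```python
# Per-channel levels and total palette size for 8-bit vs 10-bit color.

def palette_size(bits_per_channel, channels=3):
    # Each channel holds 2**bits distinct values; the total palette is
    # that count raised to the number of channels.
    return (2 ** bits_per_channel) ** channels

print(2 ** 8)    # 256 levels per channel (0..255) at 8-bit
print(2 ** 10)   # 1024 levels per channel (0..1023) at 10-bit
print(2 ** 10 // 2 ** 8)                    # 4x as many values per channel
print(palette_size(10) // palette_size(8))  # 64x as many total colors (4*4*4)
```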

The piddly 50 Mbit/s that you get from high-definition formats would almost certainly not benefit from higher bit depths, as it is already stretching itself quite far by employing 150:1 compression to begin with.

The case where 10-bit makes a big difference on a consumer screen is in upscaling video from a lower resolution. Each color channel (red, green, blue) is first multiplied by 4 to make it a 10-bit value. Then the image is scaled up by computing values in between "neighboring" pixels to insert between each pair of pixels.

If you work in 8-bit, and you have a pixel with the value 1 and the pixel next to it has the value 2, then if you were to double the size of the image, the pixel inserted in between is calculated by adding the two values together and dividing by two. So, (1 + 2) / 2 = 1.5.

1.5 is not a valid pixel value, so it would be rounded to either 1 or 2, although scaling systems are generally smart enough to use a more complex calculation that takes other pixels into account as well.

Using the same values in 10-bit - multiplying the 1 and 2 each by 4 - we get the values 4 and 8 to start with instead. So (4 + 8) / 2 = 6. 6 is obviously a valid value, so now instead of the 8-bit version, which would be either 1, 1, 2 or 1, 2, 2, we have a higher-quality scaling of 4, 6, 8 instead.
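The worked example above can be sketched in Python (function names are my own; the pixel values and the x4 promotion come from the post):

```python
# Midpoint interpolation at 8-bit vs 10-bit precision. Doubling an image
# inserts a new pixel between each pair of neighbors, valued at their average.

def midpoint_8bit(a, b):
    # 8-bit: the true average of 1 and 2 is 1.5, which is not a valid
    # integer pixel value, so it must be rounded to 1 or 2.
    return round((a + b) / 2)

def midpoint_10bit(a, b):
    # Promote the 8-bit values to 10-bit by multiplying by 4, then average:
    # (4 + 8) / 2 = 6, a valid value with no rounding loss.
    return (a * 4 + b * 4) // 2

print(midpoint_8bit(1, 2))   # the true value 1.5, rounded to 1 or 2
print(midpoint_10bit(1, 2))  # 6 - keeps the in-between value exactly
```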

The result is that the "sub-pixel sampling" - the pixels in between the encoded pixels - is of higher precision. The visible result, in special circumstances (you generally saw it more during the earlier jumps from 5 to 6 bits per channel), is much less color banding in the picture.
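The banding effect can be illustrated with a small Python sketch (the gradient values and 2x scale factor are made up for illustration; the function name is my own):

```python
# Upscaling a smooth 1D gradient at 8-bit vs promoted 10-bit precision.
# At 8-bit, the integer midpoints collapse onto existing levels, producing
# visible "steps" (banding); promoted to 10-bit, they land between levels.

def upscale_2x(row, promote=1):
    # Promote each sample, then insert the integer midpoint between
    # each pair of neighbors.
    row = [v * promote for v in row]
    out = []
    for a, b in zip(row, row[1:]):
        out += [a, (a + b) // 2]
    out.append(row[-1])
    return out

gradient = [10, 11, 12, 13]
print(upscale_2x(gradient, promote=1))  # [10, 10, 11, 11, 12, 12, 13] - duplicated levels
print(upscale_2x(gradient, promote=4))  # [40, 42, 44, 46, 48, 50, 52] - all distinct
```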

The quality is even further improved when linear and temporal color scaling is taken into consideration. This is when the previous pictures and pixels around each pixel are used to help scale the current picture. So the scaler has as much data as possible to help it guess the new value of each pixel when scaling.

To summarize, depending on the quality of the processor being used for scaling on the TV/PC, it is possible to greatly improve the quality of an SD, 720p, or even 1080i (during the deinterlacing phase) picture on a 1080p screen using 10-bit color channel resolution, since detail is filled in by guessing values for pixels that were not represented on the source media.

That being said, going from 1080p to a 1080p screen, you will not see any quality change.

Going from 16.7 million colors per pixel to a little over 1 billion colors per pixel is not as earth-shaking as it may sound. Thanks to motion in pictures, it's not likely to make a big enough difference to matter, especially in the case of back-lit screens, but that's an entirely separate discussion.

(Tip: For less file space stop encoding in flac)  :D
