Is 10-bit added in the encode, or is the source 10-bit?
Prideless:
Sorry, I am a bit confused about exactly what I am trying to say myself.
So basically detail retention = better lighting?
When encoders brighten up their encode, wouldn't the overall image just be brightened? Because from what I've seen, in 10-bit it doesn't appear to be just that. Videos seem to have much more illumination in various scenes, and in different ways, as opposed to the image simply being brightened. Sometimes the lighting has a fog effect, or a sharp glare, or a nice fade from weaker to stronger or stronger to weaker; it's very apparent compared to watching 8-bit.
I am just trying to understand, simply, why 10-bit has better lighting than 8-bit. Is it because of the source? Is it added in via the encode? Or was the detail there to begin with, and does simply converting from 8-bit to 10-bit let the codec display better lighting?
Thanks for reading, btw, and for trying to answer an odd question, guys.
Freedom Kira:
I'm pretty sure it's just a coincidence. The lighting of the image should not change between 8-bit and 10-bit encodes. You probably happened to download a 10-bit version that the encoders tweaked a bit, or perhaps the 8-bit version was tweaked.
The tweak was not necessarily just a general brightness increase. I'm sure there are programs designed to brighten intelligently.
Why don't you try comparing encodes from a different movie/show/whatever? Let me know what you find.
Ozzaharwood:
4 bit = 16 shades or states
5 bit = 32
6 bit = 64
7 bit = 128
8 bit = 256
9 bit = 512
10 bit = 1024
So 256 x 256 x 256 = 16,777,216 total colours, 8 bit
1024 x 1024 x 1024 = 1,073,741,824 total colours, 10 bit
Basically, you get 64 times more colours and shades. This means it looks a lot better: bright colours get brighter and dark colours get darker, which could be the difference in "lighting" you are observing. The fansub group may also add brightening effects, since 10-bit's smaller filesize gives them room to do so.
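The figures above follow directly from powers of two; a quick sketch to check them:

```python
# Shades per channel at each bit depth, and total R x G x B combinations.
for bits in range(4, 11):
    shades = 2 ** bits       # levels per colour channel
    total = shades ** 3      # all R/G/B combinations
    print(f"{bits:2d} bit = {shades:4d} shades, {total:,} total colours")

# How many times more colours 10-bit has over 8-bit:
print((2 ** 10) ** 3 // (2 ** 8) ** 3)   # 64
```

That last line confirms the "64 times more colours" figure: (1024/256)^3 = 4^3 = 64.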
Bob2004:
Ozzaharwood, you're forgetting that it gets dithered back down to 8-bit colour again before being displayed on the screen, so encoding in 10-bit does not actually mean more colours. The benefit of 10-bit is enhanced precision and greater compression.
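To illustrate the dithering step: here's a toy sketch (my own, not what any real player actually implements) of quantising a 10-bit sample down to 8 bits with random dither, so the extra precision turns into fine noise instead of visible banding:

```python
import random

def dither_to_8bit(value_10bit: int) -> int:
    """Quantise one 10-bit sample (0-1023) to 8 bits (0-255) with
    simple random dither. Illustrative only; real players use more
    sophisticated dithering (e.g. error diffusion)."""
    scaled = value_10bit / 4  # one 8-bit step = four 10-bit steps
    # Adding noise in [0, 1) before truncating means the rounding
    # direction varies pixel to pixel, preserving the fractional
    # level on average.
    return min(255, max(0, int(scaled + random.random())))

# A 10-bit level of 514 sits halfway between 8-bit levels 128 and 129.
# Over many pixels, dithering outputs roughly half of each:
samples = [dither_to_8bit(514) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 128.5
```

So the eye averages the noise back into the in-between shade, which is why 10-bit encodes can look smoother even on an 8-bit display.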
To the OP: what player/codecs are you using, and what version? I don't know the details, but IIRC there was a bug with the colour conversion in early 10-bit players. It was balanced out by another bug somewhere else, so most people didn't notice, but it may be related to what you're seeing, since the colours in a 10-bit encode should not really be any different from an 8-bit encode.
kitamesume:
^I'm guessing he's using an old version of CCCP, which had those weird colourings and lighting issues when I used to use it.