Author Topic: How do people play back 10-bit content w/o 10-bit HW (supposedly)?  (Read 1021 times)

Offline Astara

  • Member
  • Posts: 252
How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« on: September 02, 2016, 05:25:07 am »
I was asking about specs for the new GTX 1080 in an Nvidia support forum, including whether it supported 10-bit output.  The support engineer was very adamant that the GTX line didn't have 10-bit output and that only their professional Quadro line supported it.

I find it hard to believe that all the 10-bit videos can only be played back on their way-over-priced professional line.  But according to him:
Quote
Unfortunately we do not have support of 10-bit color with GeForce series of cards. This is due to the physical design of the card as it does not carry a display port connection which supports 10-bit color.
 
Best regards,
NVIDIA Customer Care

 ??? So how do people watch 10-bit?  Second -- what TV models support 10-bit?  I thought my Samsung did, but it turns out it only supports its own version of 10-bit, "Deep Color", and only when talking to a Samsung BR player -- not from a computer, for example.

I see that Intel's latest CPU offerings will have support for 10-bit encoding of videos -- but again -- what hardware can play it?

FWIW, my 64-bit MPC decodes it and has options to output in 10-bit or dither, but w/o the hardware, where does it display?

Thanks!

Online ridon428

  • Member
  • Posts: 1311
  • Dial Four-Two-Eight Toll Free!
    • Personal Site
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #1 on: September 02, 2016, 10:41:37 am »
Everything I state here is based on what I know:

10-bit isn't that special. It gives you more color values per channel, and therefore finer gradations. This is what most content creators use to see their projects with more accurate colors.

To output 10-bit color (output only, not decoding), you need a compatible display and a 10-bit-capable graphics card. GeForce series GPUs don't support 10-bit output, but Quadro cards do.

As for decoding, I'm not quite sure. I think the decoder converts 10-bit video to 8-bit before it gets displayed on your monitor.
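That guess is roughly how most setups behave. As a very rough sketch -- not how any particular player is actually implemented, and the random-noise dither below is cruder than the error diffusion a renderer like madVR uses -- this is conceptually what happens when the pipeline has 10-bit pixel data but an 8-bit output path:

Code: [Select]
import numpy as np

def to_8bit_truncate(plane10):
    # Naive conversion: drop the two extra bits (0..1023 -> 0..255).
    return (plane10 >> 2).astype(np.uint8)

def to_8bit_dither(plane10, seed=0):
    # Convert with simple random dithering so the lost precision becomes
    # fine noise instead of hard steps (real renderers use smarter dithers).
    noise = np.random.default_rng(seed).random(plane10.shape)
    return np.clip(np.floor(plane10 / 4.0 + noise), 0, 255).astype(np.uint8)

# A smooth 10-bit ramp (0..1023), standing in for what the decoder hands over.
ramp10 = np.tile(np.linspace(0, 1023, 1920).astype(np.uint16), (4, 1))

print(to_8bit_truncate(ramp10)[0, 600:612])  # long runs of identical codes -> visible steps
print(to_8bit_dither(ramp10)[0, 600:612])    # same average level, but the steps are broken up

Whether that conversion happens in the decoder, the renderer, or the display is a separate question from whether the card can output 10-bit at all.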
« Last Edit: September 02, 2016, 10:49:00 am by ridon428 »

Offline sneaker2

  • Member
  • Posts: 240
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #2 on: September 02, 2016, 09:02:14 pm »
The info is wrong. The GeForce series has supported 10-bit output for quite some time now. You should be able to set it in the driver and in a renderer like madVR. Is it worth it? You will likely not notice any difference. Most panels are 6-bit only and people don't complain.

For decoding: Nvidia Pascal can decode 12 bit and 10 bit HEVC/H.265 but only 8 bit AVC/H.264. No consumer PC hardware supports 10 bit AVC/H.264.

Offline Govna

  • Member
  • Posts: 135
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #3 on: September 10, 2016, 06:39:13 pm »
Because you're not actually outputting 10bit colors. Either your player is dithering the colors down, or your monitor is.

I don't know what crazy things everyone else is talking about in this thread, but they're wrong.

Offline Bozobub

  • Member
  • Posts: 1744
  • Demon Lord of Clowns
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #4 on: September 10, 2016, 11:45:06 pm »
Because you're not actually outputting 10bit colors. Either your player is dithering the colors down, or your monitor is.

I don't know what crazy things everyone else is talking about in this thread, but they're wrong.
Ironically, so are you, partially ^^'.  Almost no one is outputting 10-bit in the 1st place; as noted previously, just about every extant codec converts it to 8-bit before display. And even if they did, you are correct in that it wouldn't matter anyhow, since very few people have displays that can handle that much color info, although they certainly do exist.

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #5 on: September 11, 2016, 02:21:28 am »
The info is wrong. The GeForce series has supported 10-bit output for quite some time now. You should be able to set it in the driver and in a renderer like madVR. Is it worth it? You will likely not notice any difference. Most panels are 6-bit only and people don't complain.

For decoding: Nvidia Pascal can decode 12 bit and 10 bit HEVC/H.265 but only 8 bit AVC/H.264. No consumer PC hardware supports 10 bit AVC/H.264.
You say the GeForce series has been "supporting", and the others "decode".  I know the various players have been able to use the CUDA tech in the GeForce to decode H.264 -- but decoding and supporting aren't the same as having H/W that is able to view it.  Viewing it requires 10-bit HW.   When searching for 10-bit monitors, I find "professional monitors" priced ~$2000 (https://www.bhphotovideo.com/c/buy/computers/Ntt/10-bit+monitors/N/0/). 

The Dell monitor you point to doesn't have 10-bit color listed as a feature -- it lists 1073.7M colors, but like the ASUS (talked about here: http://www.tomshardware.com/forum/id-2823575/enable-10bit-color-asus-pb278q-monitor.html), it uses dithering (8-bit + FRC) to approximate 10-bit.  The best answer about 10-bit panels was:

Quote
"achieves 10-bit color by using 8-bits with FRC". It is not a true 10 bit panel, ASUS made a mistake intentionally or unintentionally with that description.

It seems that except for professional cards and monitors, 10-bit is the new snake oil.  Part of that is deliberate feature-clipping by Nvidia to force users to pay for $$$$ Quadros, which are underpowered for the price.  It used to be the case that Adobe disabled 3D acceleration in its products if your card wasn't on a list of Nvidia-approved cards (Quadro, mostly).  Eventually the deal w/Nvidia expired, and magically, Adobe products started supporting 3D graphics on GeForce cards overnight, when a bit was flipped.  It was very lame -- lots of customers complained, as they knew their GeForce cards could be used to accelerate 3D graphics, but Adobe products artificially limited this feature to Quadro cards.

Offline sneaker2

  • Member
  • Posts: 240
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #6 on: September 11, 2016, 02:16:05 pm »
You say the GeForce series has been "supporting", and the others "decode".  I know the various players have been able to use the CUDA tech in the GeForce to decode H.264 -- but decoding and supporting aren't the same as having H/W that is able to view it.  Viewing it requires 10-bit HW.   When searching for 10-bit monitors, I find "professional monitors" priced ~$2000 (https://www.bhphotovideo.com/c/buy/computers/Ntt/10-bit+monitors/N/0/). 

The Dell monitor you point to doesn't have 10-bit color listed as a feature -- it lists 1073.7M colors, but like the ASUS (talked about here: http://www.tomshardware.com/forum/id-2823575/enable-10bit-color-asus-pb278q-monitor.html), it uses dithering (8-bit + FRC) to approximate 10-bit.
My link was mostly about the GeForce Nvidia driver supporting 10-bit output/display and madVR, not about that specific Dell monitor. A number of new 4K displays support 10-bit since it's seen as a requirement for the new UltraHD features like HDR and BT.2020. The display needs it to carry the "Ultra HD Premium" logo. But at the moment there isn't really any way to access the little true HDR content that exists via PC anyway. For the typical anime fansub viewer it is irrelevant.

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #7 on: September 11, 2016, 05:28:49 pm »
My link was mostly about the GeForce Nvidia driver supporting 10 bit output/display and madvr, not about that specific Dell monitor.
But the base question was about hardware.
Quote
A number of new 4K displays support 10-bit since it's seen as a requirement for the new UltraHD features like HDR and BT.2020.
Do you have any links for these new displays?  When I search for them, Google comes up empty-handed except for the professional displays I previously mentioned.  I may have to replace a monitor -- not immediately (I hope), but I wanted to start researching 10-bit options other than expensive professional displays -- which are the only ones I've found so far.  And for 10-bit output, I haven't seen any GeForce cards listed by Nvidia as being able to do 10-bit.  To rub salt in the wound, supposedly many programs like Adobe Photoshop won't enable 10-bit color unless they detect a professional-line card like the Quadro (grr).
Quote
The display needs it to carry the "Ultra HD Premium" logo. But at the moment there isn't really any way to access the little true HDR content that exists via PC anyway. For the typical anime fansub viewer it is irrelevant.

HDR content?  What about 10-bit video content?  Also, some programs like Photoshop can use 10-bit hardware if it is available -- but that seems to be dreadfully limited to expensive & *slow* professional cards like the Quadro.

Offline cold_hell

  • Member
  • Posts: 351
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #8 on: September 11, 2016, 06:33:14 pm »
So how do people watch 10-bit? 
I don't have a 10-bit monitor, so I can't speak about GPU/driver support, but currently probably 90% of monitors are 8-bit, so in most cases the video is dithered down to 8-bit during playback.
As the new professional camcorders move to HEVC we'll probably see native 10-12-bit material soon, but if we ignore the MGVC discs, everything else currently comes from an 8-bit source, so in theory watching 10-bit content on a 10-bit screen won't make a huge difference.

So why do people encode in 10-bit, and why have professionals finally realized that we need higher bit depth? Basically, quantization has large rounding errors with 8-bit encoding, so encoding at a higher bit depth preserves quality and detail to some extent and gives much smoother gradients (but we can ignore this for high-quality live-action content, since by default it has so much noise that banding is hardly an issue).
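A rough sketch of the rounding-error point (the gradient and the numbers are made up purely for illustration): a slow, dark gradient may cross only a handful of 8-bit code values, so every shade in between gets rounded onto one of them and you see bands; at 10-bit the same gradient gets roughly four times as many levels and a quarter of the rounding error.

Code: [Select]
import numpy as np

# A slow, dark gradient: 1920 pixels covering only ~2% of the brightness range.
signal = np.linspace(0.10, 0.12, 1920)

levels_8  = np.round(signal * 255) / 255     # what an 8-bit encode can represent
levels_10 = np.round(signal * 1023) / 1023   # the same gradient at 10-bit

print(len(np.unique(levels_8)))    # ~6 distinct shades  -> clearly visible bands
print(len(np.unique(levels_10)))   # ~22 distinct shades -> much finer steps
print(np.abs(levels_8 - signal).max(), np.abs(levels_10 - signal).max())
# the worst-case rounding error is about 4x smaller at 10-bit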

Out of the few people I tested, not a single one could notice the difference between 8-bit output and 6/7-bit dithered output, so unless your work requires video mastering, some photo processing, or drawing, you shouldn't care much about 10-bit output for the moment.
« Last Edit: September 12, 2016, 12:06:58 am by cold_hell »

Offline Bob2004

  • Member
  • Posts: 3029
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #9 on: September 11, 2016, 07:30:23 pm »
Astara, the benefits of encoding anime in 10-bit have absolutely nothing to do with how many colours you see on screen. As a few people have touched on already, encoding video in 10-bit and then dithering it back down to 8-bit at playback allows for greater encoding accuracy. This greatly reduces artifacts like banding, and can also allow for slightly smaller file sizes. What hardware you are using to play it back is basically irrelevant.

Incidentally, the benefits only really apply to anime and similar videos, which have lots of smooth gradients and large blocks of colour, as this is where banding mostly occurs. It's much less of an issue with most live-action content, so the benefits gained there are minimal. This is why you mostly only hear about 10-bit video in relation to anime.

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #10 on: September 11, 2016, 09:21:50 pm »
Out of the few people I tested, not a single one could notice the difference between 8-bit output and 6/7-bit dithered output, so unless your work requires video mastering, some photo processing, or drawing, you shouldn't care much about 10-bit output for the moment.
---
Well, it's a combination of wanting to make sure video purchases are a bit more future-proofed, but also, I'd _like_ to eventually get 10-bit support in things like Photoshop as well (which I realize is a different animal than getting it to work in video).  The only reason is that when working with some pictures at high rez, you can easily see areas where one color bands into the next -- and the properties of the colors show them to be off by 1 in 8-bit RGB.  But those are rare cases.
Another place I run into it is LCD TVs -- my Samsung, for example, does "Deep Color" -- which some label 10-bit -- but *only* when talking to a Samsung BR player -- and it's also an old LCD TV (like 1st gen, with some internet capability, but nothing that works with any modern services like internet video).  So if I ever replace it, I'd also like video playback on that TV to work with more than a "same-brand" BR player -- with special interest in getting 10-bit out of my computer from 10-bit video sources.

So, I guess I need to go back into hibernation until the tech has caught back up or something like that... ;^)

The bit about increased accuracy reducing banding makes sense, though, as a "plus" for 10-bit video -- but I've been thinking my HW was more backward than it is, I suppose.

Thanks...
A*a

Offline Govna

  • Member
  • Posts: 135
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #11 on: September 11, 2016, 09:33:49 pm »
Because you're not actually outputting 10bit colors. Either your player is dithering the colors down, or your monitor is.

I don't know what crazy things everyone else is talking about in this thread, but they're wrong.
Ironically, so are you, partially ^^'.  Almost no one is outputting 10-bit in the 1st place; as noted previously, just about every extant codec converts it to 8-bit before display. And even if they did, you are correct in that it wouldn't matter anyhow, since very few people have displays that can handle that much color info, although they certainly do exist.

The codec has nothing to do with the process of it being dithered to 8bit.

Offline Krudda

  • Member
  • Posts: 10323
  • 私は 日本語 が 上手 じゃ ありません
    • My Anime List
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #12 on: September 11, 2016, 11:16:58 pm »
Don't trust a TV that "has deep color" and claims that means 10-bit.
Supporting it and displaying it are two different things. It's a typical company hijink to make you think the product is better than it is, when it is actually only able to receive 10-bit signals. It still processes them and outputs whatever the panel can actually display.

For example, my TV supports deep color, but the panel is only 6-bit.

In simple terms, it is false advertising with just enough legalese spin on it to make it juuuust legal.

Offline Bozobub

  • Member
  • Posts: 1744
  • Demon Lord of Clowns
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #13 on: September 12, 2016, 12:37:57 am »
Because you're not actually outputting 10bit colors. Either your player is dithering the colors down, or your monitor is.

I don't know what crazy things everyone else is talking about in this thread, but they're wrong.
Ironically, so are you, partially ^^'.  Almost no one is outputting 10-bit in the 1st place; as noted previously, just about every extant codec converts it to 8-bit before display. And even if they did, you are correct in that it wouldn't matter anyhow, since very few people have displays that can handle that much color info, although they certainly do exist.

The codec has nothing to do with the process of it being dithered to 8bit.
Sorry, but no.  Read the rest of the responses here for more details, but the codec is the first part of what processes the data as 8- or 10-bit, for starters (Hint:  You're not going to see 10-bit Ogg Vorbis any time soon :P).

And your initial statement is still incorrect:  If all intervening devices are 10-bit capable, yes, you can see true 10-bit output, and yes, those devices do exist.

Offline Krudda

  • Member
  • Posts: 10323
  • 私は 日本語 が 上手 じゃ ありません
    • My Anime List
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #14 on: September 12, 2016, 12:58:34 am »
Most audio is at least 16bit, so yes, you could see 10bit Ogg Vorbis, if someone used a 10bit Ogg encoder.
The codec only has influence on the initial encoding, nothing more. The rest is on the software playback level.

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #15 on: September 12, 2016, 11:40:12 am »
The codec only has influence on the initial encoding, nothing more. The rest is on the software playback level.

"Codec" stands for coder-decoder -- it is something that can both encode and decode.  Not all codecs go both ways, but many if not most do.  To play back a stream, a codec is used to decode it -- just as a codec is used to encode from raw to stream.  If it only does one, then it is an encoder or a decoder, but "codec" is supposed to mean something that does both.


Offline Krudda

  • Member
  • Posts: 10323
  • 私は 日本語 が 上手 じゃ ありません
    • My Anime List
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #16 on: September 12, 2016, 03:25:40 pm »
Wow, someone can skim Wikipedia.

I'm not quite sure that is correct; I don't know of any media codec that both encodes and decodes. The two are separate, and codecs are specialised for one or the other, not both.
As an example, LAME is used to encode MP3, but it cannot be used to play back or decode MP3.
x264 encodes H.264 video, but it cannot be used to decode it.
HEVC encoders encode H.265 video, but do not decode it, as far as I am aware.

Most codecs are specialised for encoding or decoding, not both.

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #17 on: September 12, 2016, 05:57:23 pm »
Wow, someone can skim Wikipedia.
Actually, that was from memory.  I didn't bother with Wikipedia, because it's pretty obvious that one of the prime uses of a codec is to decode -- why else would people download codec packs to help with or solve playback issues?  You could think of a codec as being something like a crypto plugin or device.  It encodes and decodes -- doing one w/o the other is not very useful.  That doesn't mean you can't produce SW that only does one.

Another example: compression.  Nearly all compression SW goes both ways, with one glaring exception: 'rar', where the author keeps the encoder proprietary so they can keep making money off it.  But the fact that nearly all compression SW does both is so taken for granted that people only talk about 'compression' (the encode side) algorithms; it's implicitly assumed that decoding is also part of the SW.
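To make that concrete, any general-purpose compression library shows the same symmetry -- Python's standard zlib module, for instance, ships both directions in one package:

Code: [Select]
import zlib

data = b"the same library handles both directions " * 100

packed   = zlib.compress(data, 9)    # the "encode" half
unpacked = zlib.decompress(packed)   # the "decode" half

assert unpacked == data
print(len(data), "->", len(packed), "bytes")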

Audio and video codecs are the same.  They default to doing both, but patent owners may make the availability of one direction different, like charging different prices for one but not the other.  Example: H.264 -- free to end users, but commercial use gets charged -- all HW BR players, for instance, have to have a license in the US.  There are very specific rules about what gets charged for and who pays (now that fact I did check on Wikipedia).
Quote
I'm not quite sure that is correct, I don't know of any media codec that both encodes and decodes. The two are separate and codecs are specialised for one or the other, not both.
As an example, LAME is used to encode MP3, but it is unable to be used to play back or decode MP3.
I know that's wrong.  The LAME code has a build-time switch to include decoding or not -- and the build docs say that option exists for patent purposes, since in some countries playback was charged patent royalties.  FLAC does both as well.  You can find specific examples where a codec is incomplete, but that's almost always due to patent and/or $$ issues.

Offline Govna

  • Member
  • Posts: 135
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #18 on: September 13, 2016, 02:00:20 am »
Because you're not actually outputting 10bit colors. Either your player is dithering the colors down, or your monitor is.

I don't know what crazy things everyone else is talking about in this thread, but they're wrong.
Ironically, so are you, partially ^^'.  Almost no one is outputting 10-bit in the 1st place; as noted previously, just about every extant codec converts it to 8-bit before display. And even if they did, you are correct in that it wouldn't matter anyhow, since very few people have displays that can handle that much color info, although they certainly do exist.

The codec has nothing to do with the process of it being dithered to 8bit.
Sorry, but no.  Read the rest of the responses here for more details, but the codec is the first part of what processes the data as 8- or 10-bit, for starters (Hint:  You're not going to see 10-bit Ogg Vorbis any time soon :P).

And your initial statement is still incorrect:  If all intervening devices are 10-bit capable, yes, you can see true 10-bit output, and yes, those devices do exist.

Oh so you have no idea what you're talking about.

I'll be on my way then.

Wow, someone can skim Wikipedia.

I'm not quite sure that is correct; I don't know of any media codec that both encodes and decodes. The two are separate, and codecs are specialised for one or the other, not both.
As an example, LAME is used to encode MP3, but it cannot be used to play back or decode MP3.
x264 encodes H.264 video, but it cannot be used to decode it.
HEVC encoders encode H.265 video, but do not decode it, as far as I am aware.

Most codecs are specialised for encoding or decoding, not both.

You're thinking about it wrong. The codec is a set of processing standards that any program must follow in order to be compliant. Data must be stored a certain way, and it must be decodable in a certain way. When a codec is developed, reference encoders/decoders are also developed to provide examples of how the authors intend things to be done.

Things like x264 are third-party software for encoding. There are also things like LAV Filters that do the same, but for decoding.

LAME is not a codec. x264 is not a codec. They are specific tools designed for a part of the process. The codec is the standard.
« Last Edit: September 13, 2016, 02:03:45 am by Govna »

Offline Astara

  • Member
  • Posts: 252
Re: How do people play back 10-bit content w/o 10-bit HW (supposedly)?
« Reply #19 on: September 13, 2016, 02:20:07 am »
LAME is not a codec. x264 is not a codec. They are specific tools designed for a part of the process. The codec is the standard.
Would it be more accurate to say the codec is an implementation of the standard -- with the implementation possibly being only a partial one?