Author Topic: Hi10P and 8-bit encodes  (Read 64139 times)

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #300 on: December 31, 2011, 08:53:06 PM »
...I don't think I have to say anything more to you, Aadieu. There's already plenty of proof in Daiz's screenshots. If your eyes can't see the reduction in banding, then that's a problem with your eyes, seeing that the rest of us (including the ones who will eventually make decisions on which torrents get accepted) can see it. I'm starting to think that you simply hate 10-bit and won't accept it no matter what, so I'll stop the argument here. Feel free to stick with your 8-bit version even when there's a superior 10-bit one, just because you think there's no quality gain.

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #301 on: December 31, 2011, 09:13:08 PM »
So much butthurt due to SupraGuy, prep-h is probably needed here.  :laugh:

I laughed at that comparison; it took me 10 seconds to notice that the on-cursor image was actually different, and that it wasn't someone trolling me by submitting the same image for both A and B.
If you said that A has x bitrate and B has x+1% bitrate, it would also be believable.
If you said that A is without client-side post-processing and B is with it, it would also be believable.
If you said that A is the Haali Renderer and B is madVR / EVR-CP (set to HQ), it would also be believable.
Intel / AMD / NVIDIA
MPC:HC

Offline ConsiderPhlebas

  • Member
  • Posts: 1282
  • CP @ Live-eviL
  • Awards Award for multiple donations to Kiva charity. Award for donating to Kiva charity.
    • Live-eviL
Re: Hi10P and 8-bit encodes
« Reply #302 on: December 31, 2011, 09:31:10 PM »
Neither have I ever seen anything tagged [Daiz]. So, unless this person is affiliated with some major fansubbing or RAW group, or is otherwise a well-known authority, ...
I don't think group affiliation and the like has any relevance to what is or isn't good encoding, nor do I have any strong opinion on 10/8 (although I notice that 10-bit plays effortlessly on my old PC with little RAM); but as far as I know, Daiz works, for example, for Underwater and UTW.

Otherwise I think this thread shows an (unusual?) amount of needless heat. Why not cool down and try to reason with rational arguments :)

Offline bobthedog

  • Member
  • Posts: 30
Re: Hi10P and 8-bit encodes
« Reply #303 on: December 31, 2011, 09:52:04 PM »
Seeing how eager people are to make this jump into the future (come on, I don't think 6 months is exactly a lot).
"Moving to the future" is fine in my book, but I don't think a torrent tracker / release aggregation / archival site is a good place to drive that. I would leave that to the sufficiently elitist fansubbers and rippers.
...
I don't think BakaBT really needs to take too active a role in this transition. Not even leaving a single fallback slot (which many people in the comments seem to be perfectly fine with) is outright bad imho.
(Come on, Xvid was only recently put under an "equal treatment" policy... about 5 years after the introduction of H.264?)


And again with the excellent points, OnDeed!


Talk about being elitist.. lmao.
I'm not really complaining about 10-bit, it's just that the sudden hatred for 8-bit is laughable considering that's what everyone was watching with no problem earlier this year.


Yup.  That's exactly the attitude I was pointing out in earlier posts:

"I got mine, Jack - and you can all kiss my arse if it's a problem for you.  If you don't watch anime the way I choose to watch it
then you aren't good enough to be watching it."

It's a selfish, snotty, elitist viewpoint which places the preferences of some people above widespread compatibility.

And they don't even see it, because of course their own egos are in the way...     :(

« Last Edit: December 31, 2011, 10:08:10 PM by bobthedog »

Offline Bob2004

  • Member
  • Posts: 2562
Re: Hi10P and 8-bit encodes
« Reply #304 on: December 31, 2011, 10:14:48 PM »
@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

@bobthedog: While I don't entirely disagree with you, remember that 8-bit releases will only be replaced with 10-bit versions when there is actually a benefit in doing so; it will not be due to personal preference alone (regardless of the situation in the wider fansubbing community). So, while I do agree that many people are a bit overenthusiastic about the wonders of 10-bit, this discussion is about BakaBT; and in practice, if staff did start treating 10-bit and 8-bit equally as proposed, 10-bit would only actually be used where it genuinely offers a benefit over the 8-bit releases.

Personally, I still don't believe 10-bit is mature enough to be an acceptable replacement for 8-bit, even where it does offer an improvement. As a few people have pointed out, it took 5 years before the dedicated Xvid slot was removed; and while I'm not saying we should wait that long for 10-bit too, 6-7 months is a bit too soon to be relying on it completely.

I think (and the majority of voters agree with me) that it is essential we make sure there is an 8-bit alternative available. Not necessarily for all categories, but there should be at least one 8-bit version of every anime on the site. It won't create any extra work for staff, since all releases need to be compared anyway, and I don't think BakaBT needs to worry too much about the seeding pool being split - the site has a clear surplus of seeders, and the systems already in place will ensure that no torrent will be unseeded.

Offline bobthedog

  • Member
  • Posts: 30
Re: Hi10P and 8-bit encodes
« Reply #305 on: December 31, 2011, 10:21:53 PM »
I agree with everything you say, Bob2004.  (Must be something wise about us Bobs, eh?     ;D    )

And if you look at the poll carefully, it's easy to see that the lower three choices all have in common the strong desire of the majority (as it stands at this moment, and as it has been throughout the duration of the poll) to maintain 8-bit encodes for the foreseeable future, so that the huge installed base of people with either less-than-cutting-edge hardware or a desire to watch their anime on televisions won't find themselves left out in the cold.

Thank you for your maturity, and have a happy New Year!

Offline Aadieu

  • Member
  • Posts: 103
Re: Hi10P and 8-bit encodes
« Reply #306 on: December 31, 2011, 11:26:56 PM »
@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

Difference? YES. Better? Not so sure - banding on the TV in #1 is major in both cases, it just looks somewhat different: 8-bit has a sort of hybrid banding/blocking, while 10-bit has more banding and zero blocking (just looking at the corner of the television in the pic). Btw, both are only really that noticeable because they're what changes on mouseover, thereby drawing attention. And in part, the TV banding might actually be intentional, since it depicts rays of sunshine falling on a surface.

Somebody else mentioned Daiz's participation in prominent groups. OK then... but still, comparisons made by the encoder himself to prove his own point are to be taken with a grain of salt, no? And his filesizes were pretty bitrate-starved, creating the very differences that he then highlighted as the most significant. Which still weren't that extreme.

Offline nstgc

  • Member
  • Posts: 7758
    • http://www.justfuckinggoogleit.com
Re: Hi10P and 8-bit encodes
« Reply #307 on: December 31, 2011, 11:40:56 PM »
I voted for the second option even though my system can play multiple Hi10P-encoded videos concurrently without a hiccup, even at 1080p. I remember the switch from XviD to H.264 and how my computer, although decent and relatively new, couldn't play the encodes at the very top end. Until there is decent GPU decoding, I vote that we keep slot A for 8-bit encodes.

Offline Daiz

  • Member
  • Posts: 297
  • 10-bit Librarian
    • Underwater
Re: 10-bit and 8-bit encodes
« Reply #308 on: December 31, 2011, 11:50:20 PM »
Difference? YES. Better? Not so sure - banding on the TV in #1 is major in both cases

Getting rid of the banding on the TV would require very heavy debanding that would end up killing details. In other words, the banding on the TV is a source issue. That doesn't change the fact that the 10-bit encode preserves gradients much better all over the place. Also, while the bitrate might be starved for 8-bit (as evidenced by the notable banding), it works quite fine for 10-bit. Denpa Onna is not a particularly hard source to compress. And being somewhat bitrate-starved wouldn't even hurt the comparison, really - it would just make the difference(s) easier to spot. Also, you've completely ignored the other comparison I posted. Compare this to this, for example - the huge gradient is very obviously better preserved with 10-bit than with 8-bit. You really should stop trying to downplay the compressional benefits of 10-bit; it just makes you look like an ignorant luddite.
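To put rough numbers on the gradient point - a toy sketch added for illustration, not any encoder's actual pipeline: quantize a subtle dark-scene ramp at each bit depth and count the distinct levels that survive. Fewer levels means wider, more visible bands.

```python
# Toy illustration of why 10-bit preserves gradients better: a subtle
# gradient spanning only 2% of the brightness range (think night sky)
# is stored at 8-bit and at 10-bit precision, and we count how many
# distinct levels each depth can represent across it.

def quantize(value, bits):
    """Map a 0.0-1.0 intensity to the nearest representable integer level."""
    levels = (1 << bits) - 1  # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# A slow ramp from 10% to 12% brightness, sampled densely.
ramp = [0.10 + 0.02 * i / 9999 for i in range(10000)]

steps_8 = len({quantize(v, 8) for v in ramp})
steps_10 = len({quantize(v, 10) for v in ramp})

print(steps_8, steps_10)  # 10-bit resolves roughly 4x as many steps
```

With only a handful of representable levels, the 8-bit version has to either band visibly or spend bits on dithering noise; the 10-bit version gets the finer steps for free.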

but still, comparisons made by the encoder himself to prove his own point are to be taken with a grain of salt, no?

You really think people would even be switching to 10-bit if it didn't offer compressional benefits? I've done these comparisons as much for myself as for the public, and for the sake of both I do them properly. If all these comparisons were just "cheated" in favor of 10-bit and it didn't actually offer any improvements, I'd still be encoding everything in 8-bit. Stop being stupid.

Also, as other people have mentioned already, I'm an encoder (among other things) in Underwater and UTW, the former of which I also lead.
« Last Edit: December 31, 2011, 11:52:20 PM by Daiz »

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #309 on: January 01, 2012, 12:29:42 AM »
@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present (5th pic, bottom right).
These compression artifacts are the most annoying thing ever when seen in massive amounts.

Unless you tell me what to look for, I am not going to find it quickly.
And if I am not going to find it while I am actually watching, the encode is of good quality.
That is not really a good argument for 10-bit if you have to point out what to look for and where.

For example, ed-6999-10bit-3000kbps.png is simply ed-6999-8bit-3000kbps.png dithered more, which can easily be done client-side. Additionally, why would the sharpness of the 8-bit encode not decrease with added bitrate?
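What "dithered more" means can be sketched in a few lines (a toy model for illustration only; no player's actual dithering filter works exactly like this): noise of up to half a quantization step is added before rounding, so no new levels appear, but the average of each small neighborhood tracks the true gradient instead of sitting on a band, which is what visually hides banding.

```python
import random

random.seed(42)
STEP = 16  # deliberately coarse quantization so banding is obvious

def quantize(x):
    """Hard-round to the nearest multiple of STEP: produces flat bands."""
    return round(x / STEP) * STEP

def dither_then_quantize(x):
    """Add up to half a step of noise first: bands dissolve into noise."""
    return quantize(x + random.uniform(-STEP / 2, STEP / 2))

ramp = [i * 255 / 4095 for i in range(4096)]  # smooth 0..255 gradient

def max_window_error(signal, window=64):
    """Worst deviation of a neighborhood average from the true ramp."""
    errors = []
    for start in range(0, len(ramp) - window + 1, window):
        true_mean = sum(ramp[start:start + window]) / window
        seen_mean = sum(signal[start:start + window]) / window
        errors.append(abs(true_mean - seen_mean))
    return max(errors)

banded_error = max_window_error([quantize(v) for v in ramp])
noisy_error = max_window_error([dither_then_quantize(v) for v in ramp])
print(round(banded_error, 2), round(noisy_error, 2))  # banded error is several times larger
```

The tradeoff is banding for noise: the dithered signal is noisier per pixel, but its neighborhood averages stay close to the true ramp, which is what the eye actually perceives.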
« Last Edit: January 01, 2012, 12:32:27 AM by Aerah »
Intel / AMD / NVIDIA
MPC:HC

Offline Bob2004

  • Member
  • Posts: 2562
Re: Hi10P and 8-bit encodes
« Reply #310 on: January 01, 2012, 12:42:02 AM »
@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present (5th pic, bottom right).
These compression artifacts are the most annoying thing ever when seen in massive amounts.

Both the 8-bit and 10-bit are at exactly the same bitrate. The reason the 8-bit looks more bitrate-starved is that the better compression granted by 10-bit means it can make much more efficient use of that bitrate. In other words, 10-bit looks like it's higher bitrate (i.e. better quality), even though it's actually not.

This means that when encoding in 10-bit, you can either encode at a higher CRF (and thus lower bitrate, and therefore a smaller filesize) and still match the 8-bit quality, or encode at the same CRF and get improved quality at a similar size.
« Last Edit: January 01, 2012, 01:23:19 AM by Bob2004 »

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #311 on: January 01, 2012, 01:10:51 AM »
@Aadieu/Aerah: I can see the differences between 10-bit and 8-bit in that comparison very clearly as well; it's very obvious, especially the TV in #1, the table in #5, and the night sky in #3. And Daiz is a pretty experienced encoder, he can be trusted to do the comparison properly.

The only thing I noticed is a clear lack of bitrate in the 8-bit pic - there is some blocking present (5th pic, bottom right).
These compression artifacts are the most annoying thing ever when seen in massive amounts.

Both the 8-bit and 10-bit are at exactly the same bitrate. The reason the 8-bit looks more bitrate-starved is that the better compression granted by 10-bit means it can make much more efficient use of that bitrate. In other words, 10-bit looks like it's higher bitrate (i.e. better quality), even though it's actually not.

This means that when encoding in 10-bit, you can either encode at a higher CRF (and thus lower bitrate, and therefore a smaller filesize) and still match the 8-bit quality, or encode at the same CRF and get improved quality at a similar size.

Well, yes, I know.
The issue is that there is nothing 10-bit-exclusive about this - just up the bitrate on the 8-bit encode, and what difference will remain then?
Show me the exclusive 10-bit quality.
Intel / AMD / NVIDIA
MPC:HC

Offline Wolfpup

  • Member
  • Posts: 9
Re: Hi10P and 8-bit encodes
« Reply #312 on: January 01, 2012, 01:13:21 AM »
10-bit is good for those who do not use a 'stand-alone' media player with hardware decoding, as most units like that are designed to decode only 8-bit. I have one of these devices connected to my TV, and I plug an external HDD into it to watch the anime that I get here. My device is a WD TV Live Plus HD with a WD 1TB portable HDD plugged into it. The settings for the HDMI output are LOCKED to 8-bit color, with no way to change this setting. So for me at least, I will only be getting the 8-bit releases, due to hardware limitations.

Offline doll_licca

  • Member
  • Posts: 39
  • Founder of Licca Fansubs
Re: Hi10P and 8-bit encodes
« Reply #313 on: January 01, 2012, 01:15:42 AM »
Neither have I ever seen anything tagged [Daiz]. So, unless this person is affiliated with some major fansubbing or RAW group, or is otherwise a well-known authority, ...
I don't think group affiliation and the like has any relevance to what is or isn't good encoding, nor do I have any strong opinion on 10/8 (although I notice that 10-bit plays effortlessly on my old PC with little RAM); but as far as I know, Daiz works, for example, for Underwater and UTW.

Otherwise I think this thread shows an (unusual?) amount of needless heat. Why not cool down and try to reason with rational arguments :)
When my group started doing Hi10P encodes, it resulted in one of the very few times I've had to moderate comments on the blog (mostly to keep the blog reasonably family-friendly).  People feel very passionate about this entire 8-bit vs. 10-bit debate to the point of written and verbal abuse.

This entire thread is merely a microcosm of that.  It's like certain political hot-topics here in the United States; you support one side of an issue, you alienate the other side.

That said, it's very simple to placate both sides on the encoder's part: just do both 8-bit and 10-bit encodes. It's very minimal effort to do so, and I've probably encoded more episodes in Hi10P than most encoders.


Offline slimcliffy

  • Member
  • Posts: 27
    • Slimcliffy Production's
Re: 10-bit and 8-bit encodes
« Reply #314 on: January 01, 2012, 01:29:54 AM »
To begin with, Daiz is entirely correct on nearly all points, but there is in fact a case where 10-bit is an improvement over 8-bit even when the source itself is only 8-bit.

To start, 8-bit means that for red (R), green (G), and blue (B), the values 0 to 255 can be represented. For 10-bit, the RGB values can be from 0 to 1023. This means that per component, 10-bit is 4 times as detailed as 8-bit. Therefore, if you had a raw image with 10-bit depth, it would have a color palette 64 times as large (4x4x4 = 64) to represent the image on your screen. In the case of high-definition video, with the exception of footage from EXTREMELY high-end cameras (starting with the RED One cinema camera and upward), you will never come across media of this scale. The reason is that it would require a signal of 3.125 gigabits per second to transmit properly. TV networks with half-million-dollar cameras broadcast sports from the arena to the network at less than a third of this speed, with quality loss.
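These figures are easy to check with plain arithmetic (a back-of-envelope sketch; the exact raw rate depends on frame rate and chroma format, which aren't specified above):

```python
# Per-channel levels at each bit depth, and the resulting palette growth.
levels_8, levels_10 = 2 ** 8, 2 ** 10
print(levels_8, levels_10)              # 256 vs 1024: 4x per channel
print((levels_10 // levels_8) ** 3)     # 4 x 4 x 4 = 64x larger palette

# Raw (uncompressed) bit rate for 1080p RGB at 10 bits per channel.
width, height, fps = 1920, 1080, 60
bits_per_pixel = 3 * 10                 # three channels, 10 bits each
gbps = width * height * fps * bits_per_pixel / 1e9
print(round(gbps, 2))                   # ~3.73 Gbit/s at 60 fps
```

At 60 fps full RGB this lands in the same ballpark as the 3.125 Gbit/s figure above; at lower frame rates or with chroma subsampling the number shrinks accordingly.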

The piddly 50 Mbit/s that you get from high-definition formats would almost certainly not benefit from higher bit depths, as it is already stretching itself quite far by employing roughly 150:1 compression to begin with.

The case where 10-bit makes a big difference on a consumer screen is in upscaling video from a lower resolution. Each color channel (red, green, blue) is first multiplied by 4 to make it a 10-bit value. Then the image is scaled up by computing values in between "neighboring" pixels to insert between each pair of pixels.

If you work in 8-bit, and you have a pixel with the value 1 and the pixel next to it has the value 2, then if you were to double the size of the image, the pixel inserted in between is calculated by adding the two values together and dividing by two: (1 + 2) / 2 = 1.5.

1.5 is not a valid pixel value, so it would become either 1 or 2 (though scaling systems are generally smart enough to use a more complex calculation that takes other pixels into account as well).

Using the same values in 10-bit - multiplying the 1 and the 2 each by 4 - we get the values 4 and 8 to start with instead. So (4 + 8) / 2 = 6, which is obviously a valid value. So now, instead of the 8-bit version's 1, 1, 2 or 1, 2, 2, we have a higher-quality scaling of 4, 6, 8.
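The worked numbers above can be reproduced directly (a toy sketch of the naive averaging step only; real scalers use multi-tap filters):

```python
def midpoint_8bit(a, b):
    """Average two 8-bit neighbors; the result must snap to an integer."""
    return round((a + b) / 2)

def midpoint_10bit(a, b):
    """Promote both neighbors to 10-bit (x4) first, so the halfway
    value is representable without rounding away precision."""
    return round((4 * a + 4 * b) / 2)

print(midpoint_8bit(1, 2))   # 1.5 is not representable, so it snaps to 2
print(midpoint_10bit(1, 2))  # 6: exactly halfway between 4 and 8
```
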

The result is that the "sub-pixel sampling" - the pixels in between the encoded pixels - is of a higher precision. The visible result, in special circumstances (you generally saw it more during the earlier jumps from 5 to 6 bits per channel), is that color banding in the picture is much reduced.

The quality is improved even further when spatial and temporal color scaling are taken into consideration. This is when the pixels around each pixel and the previous pictures are used to help scale the current picture, so the scaler has as much data as possible to help it estimate the new value of each pixel when scaling.

To summarize: depending on the quality of the processor being used for scaling on the TV/PC, it is possible to greatly improve the quality of an SD, 720p, or even 1080i (during the deinterlacing phase) picture on a 1080p screen using 10-bit color-channel resolution, since detail is filled in by estimating values for pixels that were not represented on the source media.

That being said, going from 1080p to a 1080p screen, you will not see any quality change.

Going from 16.7 million colors per pixel to a little over 1 billion colors per pixel is not as earth-shaking as it may sound. Thanks to motion in pictures, it's not likely to make a big enough difference to matter, especially in the case of back-lit screens, but that's an entirely separate discussion.

(Tip: for less file space, stop encoding in FLAC) :D

Offline doll_licca

  • Member
  • Posts: 39
  • Founder of Licca Fansubs
Re: 10-bit and 8-bit encodes
« Reply #315 on: January 01, 2012, 04:40:58 AM »
(Tip: for less file space, stop encoding in FLAC) :D
That's until you run into someone who can tell when audio has been compressed. 

(This would mostly be folks who deal with music for a living. Fortunately, the number of people who can do this isn't that large.)

Offline Aadieu

  • Member
  • Posts: 103
Re: 10-bit and 8-bit encodes
« Reply #316 on: January 01, 2012, 12:08:41 PM »
Difference? YES. Better? Not so sure - banding on the TV in #1 is major in both cases

Getting rid of the banding on the TV would require very heavy debanding that would end up killing details. In other words, the banding on the TV is a source issue. That doesn't change the fact that the 10-bit encode preserves gradients much better all over the place. Also, while the bitrate might be starved for 8-bit (as evidenced by the notable banding), it works quite fine for 10-bit. Denpa Onna is not a particularly hard source to compress. And being somewhat bitrate-starved wouldn't even hurt the comparison, really - it would just make the difference(s) easier to spot. Also, you've completely ignored the other comparison I posted. Compare this to this, for example - the huge gradient is very obviously better preserved with 10-bit than with 8-bit. You really should stop trying to downplay the compressional benefits of 10-bit; it just makes you look like an ignorant luddite.

The Denpa Onna ones had detectable differences, at least, though arguably beneficial ones.

These two? If you'd told me that they were consecutive frames from one encode, I'd be more likely to believe you. If someone else - someone who wasn't me and actually knew how to do all this - looked up the file info for colour bitrate/creation time/whatever and proved that you mistakenly uploaded two screenshots from the same source instead of a comparison, I'd be more likely to believe you. If you said these were excellent examples of how 8-bit and 10-bit look pretty much the same, I'd believe you :-)

...as it is, though: weigh the added ability to run hardware media players and old hardware, or to use good old non-beta players with time-tested codecs and all the possibilities of modern hardware to filter and process the hell out of an 8-bit source - reliably, without an issue or a crash in months of daily use - against lots of betas, issues, glitches, incompatibilities, stability problems, and a lack of choice in 10-bit decoders. Is it worth it? NOT WITH THESE PROOFPICS IT AIN'T.

Why? Simple answer: JUST SWITCHING DECODERS OFTEN PROVIDES A MUCH MORE VISIBLE QUALITY CHANGE. And that's without even touching post-processing options and filters, which appear to be somewhat scarce on the 10-bit side at the moment.

Complex answer: because an old, upscaled, bitrate-starved "720p" (from a blocky original with lots of straight lines and staircase edges all over the place, plus the occasional artifact), properly processed in KMPlayer, provides better image quality than either your 8-bit or your 10-bit screens, regardless of the bitrate version. And it's not yet worth the hassle of relearning to tweak the newer semi-beta PotPlayer to hopefully, after a certain learning curve, *maybe* achieve that sort of quality on a high-bitrate 10-bit video. Someday.
« Last Edit: January 01, 2012, 12:16:42 PM by Aadieu »

Offline doll_licca

  • Member
  • Posts: 39
  • Founder of Licca Fansubs
Re: Hi10P and 8-bit encodes
« Reply #317 on: January 01, 2012, 01:32:25 PM »
Also, I've found in my experience with Hi10P encoding, versus the regular 8-bit High Profile, that the source material strongly affects how much extra compression efficiency you'll get at the same bitrates. Hi10P doesn't work as well with really old material, but there are still some benefits.



Offline technomo12

  • Member
  • Posts: 1
Re: Hi10P and 8-bit encodes
« Reply #318 on: January 01, 2012, 04:59:31 PM »
Do them both, to please both crowds - since you can't please them all.

Anyway, yeah, Hi10P has smaller sizes, but new stuff does not always mean better.

Just a dime of thought.

Now, back to being a lurker.

Online AceD

  • Member
  • Posts: 2665
    • Facebook
Re: Hi10P and 8-bit encodes
« Reply #319 on: January 01, 2012, 05:01:20 PM »
Anyway, yeah, Hi10P has smaller sizes, but new stuff does not always mean better.

Just a dime of thought.

In this case, it does mean better... it's been proven already.