Author Topic: Hi10P and 8-bit encodes  (Read 64174 times)

Offline DmonHiro

  • Member
  • Posts: 797
  • Neko The World
Re: Hi10P and 8-bit encodes
« Reply #360 on: January 02, 2012, 09:04:39 PM »
I'm in the middle of downloading EveTaku's 1st Ben-To episode in both 8 and 10 bits. I will then present a series of comparison shots.
Demons run when a good man goes to war. Night will fall and drown the sun, when a good man goes to war. Friendship dies and true love lies, night will fall and the dark will rise, when a good man goes to war. Demons run but count the cost, the battle's won, but the child is lost.

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #361 on: January 02, 2012, 09:09:31 PM »
@DmonHiro: If it's not too much trouble, and to make sure there won't be any more bitching, please take the screenshots using a standard CCCP installation (without tweaking etc., meaning: ffdshow video decoder, standard VSFilter, and EVR-CP). This is to prevent people from saying that you'd need to tweak your player correctly to get correct results.

Offline mistie710

  • Member
  • Posts: 10
  • I be pointy!
    • Crashnet - Anime Reviews from Hell
Re: Hi10P and 8-bit encodes
« Reply #362 on: January 02, 2012, 09:12:02 PM »
This whole thread reminds me of somebody on rec.arts.anime.misc many years ago who referred to himself as "SpuerGenius" (yes, that is how he spelled it!) who stirred up the entire group because he was of the opinion that we should all upgrade our kit to the latest and greatest and that we shouldn't moan about our problems if we didn't. Of course, that was a long time ago now and we weren't talking about watching video on computers but the same thing comes to mind.

On one side, we have people arguing that we shouldn't adopt Hi10p as a standard because they can't play it. No doubt most of these folks are viewing on machines that are a little older and may struggle with higher-quality stuff, but can't or won't go out and do what is necessary to upgrade.

On the other side we have people arguing that those that can't play Hi10p should change what they view their downloads on. Chances are that many of these folks have already upgraded to higher spec machines and would like to get the best possible result out of them.

My own thought is that the very reason why we are having such a discussion, especially given the vehement exchanges on occasion, might indicate that the time isn't quite right yet, but that we must consider that the time is coming and cannot be avoided forever. I don't necessarily like to be an early adopter because I've been stung that way too many times, but I realise that I can't stand still.

That's how I see it, anyway.  ;)

--
Chika
(Yes, the one that used to post on raam)

Offline Aadieu

  • Member
  • Posts: 103
Re: Hi10P and 8-bit encodes
« Reply #363 on: January 02, 2012, 09:26:29 PM »
Download ep1 of EveTaku's Bento 720p Hi10p and find out.

Downloaded EveTaku's Ben-to 01 Hi10p, took a screenshot of frame 727...what banding?


No clue what your screenshot is from, but it certainly doesn't seem to be EveTaku's Hi10p release.

If you're looking for red circles and yellow text - that part was added by me.

Otherwise, your screenshot appears to be the same. With loads of banding.

@Aadieu: The problem is that you said 10-bit has more banding than 8-bit, but you only gave a screenshot of the 10-bit encode. If you really want to show people what you mean, you need to show a screenshot of the 8-bit encode as well. Preferably it should be encoded at the exact same settings, from the exact same source, with the exact same filters, but if you just post a screenshot of the same frame from EveTaku's 8-bit version as well, and it does show less banding than the 10-bit, then you will at least present your point in a more acceptable manner. You can't say "this looks worse than XXX" when we don't even know how XXX looks. Even then, that will most likely just be the case with EveTaku's version, and not something that applies globally, as I have seen many more releases where 10-bit is superior to 8-bit.


Let me put it another way (since someone asked for proof earlier):
I'm not saying 8-bit is better. I'm just saying that I, so far, fail to see any benefit of 10-bit. Certainly no jaw-dropping OMFG moments. The couple of times I've watched 10-bit stuff, it looked average. Mediocre. Nothing special. And the promised benefits? Better 10-bit colour? Nope, there are no 10-bit displays on the market. Less banding? I dunno, I've seen loads of banding in all the 10-bit rips I've watched. Since no one believed me, I provided proof pics. Space savings? I still say the best way to save space is to ask someone from Mazui to teach you how to encode 720p properly...

The purported benefits, if they even exist, appear to be minuscule. Meanwhile, the topic is causing a shitstorm. Is it worth it? Not from where I'm standing.
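For context on why bit depth matters for banding at all: an 8-bit encode has only 256 code values per channel, so a subtle gradient gets far fewer distinct shades to land on than it would at 10 bits. A minimal Python sketch (the gradient values here are made up purely for illustration):

```python
def quantize(x, bits):
    # Map a signal value in [0, 1] onto 2**bits - 1 integer code values.
    levels = (1 << bits) - 1
    return round(x * levels)

# A subtle dark gradient spanning only 2% of the brightness range,
# roughly the kind of night-sky backdrop where banding shows up.
signal = [0.10 + 0.02 * i / 999 for i in range(1000)]

steps_8 = len(set(quantize(x, 8) for x in signal))
steps_10 = len(set(quantize(x, 10) for x in signal))
print(steps_8, steps_10)  # 10-bit resolves several times as many distinct shades
```

Whether those extra shades actually survive the encoder and the playback chain is exactly what the screenshot arguments in this thread are about.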

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #364 on: January 02, 2012, 09:35:28 PM »
If you're looking for red circles and yellow text - that part was added by me.

Otherwise, your screenshot appears to be the same. With loads of banding.
It seems you have a problem with your display (or maybe your eyes). I certainly don't see any banding in cyberbeing's shots. Even if there is any, it certainly doesn't look like your screenshot, which suggests you have a problem with both your player and your display (or your eyes, maybe).

I'm not saying 8-bit is better.
What?
Quote from: Aadieu
In practice, it is consistently WORSE than 8-bit releases at the moment.
Sure, you're not saying 8-bit is better, but you did say 10-bit is worse than 8-bit. Wait...


I'm just saying that I, so far, fail to see any benefit of 10-bit. Certainly no jaw-dropping OMFG moments. The couple of times I've watched 10-bit stuff, it looked average. Mediocre. Nothing special. And the promised benefits? Better 10-bit colour? Nope, there are no 10-bit displays on the market. Less banding? I dunno, I've seen loads of banding in all the 10-bit rips I've watched. Since no one believed me, I provided proof pics. Space savings? I still say the best way to save space is to ask someone from Mazui to teach you how to encode 720p properly...
Do improvements have to be jaw-dropping? While in most comparisons I've seen, the difference is pretty jaw-dropping already (at least when seen with my eyes and my display), any improvement is still an improvement.

Oh yeah, Mazui's rips aren't that good anyways. And please do not speak about how to encode properly when you're not an encoder yourself.

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #365 on: January 02, 2012, 09:43:06 PM »
Any evidence that this banding cannot simply be fixed by using a dithering shader?
Intel / AMD / NVIDIA
MPC:HC

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #366 on: January 02, 2012, 09:47:40 PM »
Any evidence that this banding cannot simply be fixed by using a dithering shader?
I seem to recall saying that client-side post-processing is heavy, in which case, why were you complaining about 10-bit again? If I had to pick between a larger 8-bit encode that still requires an extra debanding pass and a smaller 10-bit encode that doesn't require any more post-processing, I'd definitely pick the latter.
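The dithering being argued over can be sketched in a few lines: convert a 10-bit gradient to 8-bit once by plain truncation and once with simple error-diffusion dithering, then measure how wide the flat "bands" come out. This only illustrates the principle; it is not any player's actual shader:

```python
def to_8bit_truncate(samples_10bit):
    # Naive 10-bit -> 8-bit conversion: drop the two low bits.
    # Neighbouring 10-bit values collapse onto the same 8-bit level,
    # which shows up as visible "bands" across smooth gradients.
    return [v >> 2 for v in samples_10bit]

def to_8bit_dither(samples_10bit):
    # Simple 1-D error-diffusion dithering: carry the rounding error
    # into the next sample, so the average level stays correct and
    # hard band edges dissolve into fine noise.
    out, err = [], 0.0
    for v in samples_10bit:
        exact = v / 4.0 + err               # ideal 8-bit value + carried error
        q = min(255, max(0, round(exact)))  # nearest representable level
        err = exact - q                     # remember what was lost
        out.append(q)
    return out

def longest_run(levels):
    # Length of the longest run of identical values: a crude proxy
    # for how wide the flat bands are.
    best = run = 1
    for a, b in zip(levels, levels[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

# A very gentle 10-bit ramp: +1 code value every 8 samples.
ramp = [i // 8 for i in range(4096)]

print(longest_run(to_8bit_truncate(ramp)))  # 32: each 8-bit level spans 32 samples
print(longest_run(to_8bit_dither(ramp)))    # far shorter: the bands are broken up
```

Truncation produces bands 32 samples wide on this ramp; dithering trades them for fine noise, which is why it hides banding but costs extra processing somewhere in the playback chain.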

Offline dragon191

  • Member
  • Posts: 213
Re: Hi10P and 8-bit encodes
« Reply #367 on: January 02, 2012, 09:53:22 PM »
Hey hey hey, you can't just let 8-bit encodes disappear! I watch anime on my tablet too, you know! I don't feel like re-encoding every single show I want to watch on it. This is just a war that some 10-bit fanboys want to win no matter what. All they do is watch their anime on their awesome PCs and don't care about anyone else who watches on something else. BakaBT is a community website that also allows less technologically gifted people to watch things on their tablets or other machines. Do you feel so elite that you can say "just re-encode it" to everyone who wants to watch on something other than a PC?

Offline OnDeed

  • Member
  • Posts: 448
  • Uploader account for #OnDeed@irc.rizon.net
Re: Hi10P and 8-bit encodes
« Reply #368 on: January 02, 2012, 09:57:21 PM »
In case you're not aware, this isn't a democracy [...] What the mods are doing now is just a favor to the community[...]

Man... are you totally sure you aren't saying that just because it coincides with your own opinion? Also, if it isn't a democracy, why do you post here with all the others? Unless you do it for the secret police's convenience (since this isn't a democracy).

P.S. Again, the hooray-10bit people have nothing to lose regardless of the outcome of the poll, so they should imho take their 8-bit enemies into consideration more than they do here. What you dudes are doing is more like offering assistance to your landlord when he plans to increase your neighbour's rent.

------------------------

To get 10bit working, you can go the minimal route: download and install CCCP and use MPC-HC.
If you want to get the best quality, you will have to give up some CPU and go the advanced route after getting CCCP:
- download and install lavfilters
- download and install madVR
- set up lav and mad to be used as default.

Seriously, stop recommending madVR to people. You are not going to help them after it doesn't work out, are you?
It takes a serious GPU to run it, so at least put a warning there. What if the poor soul you "helped" has some IGP or a low-end card like a Radeon 5400 or something? It won't even manage bilinear resizing, not to mention bicubic! Fun fact: my DirectX 9 Radeon does better scaling than bicubic using the bloody overlay mixer.

So do the world a favour and only recommend trusted and tested stuff, like CCCP.
Edit: Sorry for the overreaction, but I keep seeing these "tips" in many "guides" lately. People are way too confident when making those; in reality they are just unaware of the pitfalls.
« Last Edit: January 02, 2012, 10:22:50 PM by OnDeed »

Offline DmonHiro

  • Member
  • Posts: 797
  • Neko The World
Re: Hi10P and 8-bit encodes
« Reply #369 on: January 02, 2012, 10:18:00 PM »
Seriously, stop recommending madVR to people. You are not going to help them after it doesn't work out, are you?
It takes a serious GPU to run it, so at least put a warning there.
Don't get mad at me, I said quite clearly that if you want the best quality possible, you should use madVR, but you will have to sacrifice CPU to it. And that's true. madVR does give you the best possible quality for 10bit. The fact that it requires more CPU is the sacrifice you have to make. If you don't want to, or can't, then you stick with EVR or some other renderer.

Offline OnDeed

  • Member
  • Posts: 448
  • Uploader account for #OnDeed@irc.rizon.net
Re: Hi10P and 8-bit encodes
« Reply #370 on: January 02, 2012, 10:27:43 PM »
It's not about CPU at all. madVR needs a gaming GPU, period. It does its stuff with shaders, you know, meaning that if you run it on a slow card, you simply won't hit realtime playback. And people sort of expect a video renderer to always work, whatever the input resolution and framerate...
In this regard, madVR has the downsides of the Haali renderer of years ago, only 20x worse (since average GPU power has risen :P)

Offline Aadieu

  • Member
  • Posts: 103
Re: Hi10P and 8-bit encodes
« Reply #371 on: January 02, 2012, 10:34:05 PM »
If you're looking for red circles and yellow text - that part was added by me.

Otherwise, your screenshot appears to be the same. With loads of banding.
It seems you have a problem with your display (or maybe your eyes). I certainly don't see any banding in cyberbeing's shots. Even if there is any, it certainly doesn't look like your screenshot, which suggests you have a problem with both your player and your display (or your eyes, maybe).


Looked at it again, side by side, yep, less banding in his. Still some there (less on the thighs, but there; still some on the bag), but not as bad.

Filter? Different decoder?

Different frames? Cause I will freely admit that I scrolled through that scene frame by frame and went with the worst-offending banding I could find. And I didn't use any anti-banding filters or GPU-intensive decoders, just whatever Pot Player's default was.

But if it is the very same frame... See what I mean about the standard's software side not being very mature? Different software is producing significantly different results. And, chances are, neither of us knows exactly what he's doing right/wrong/differently to get where he is.

Offline DmonHiro

  • Member
  • Posts: 797
  • Neko The World
Re: Hi10P and 8-bit encodes
« Reply #372 on: January 02, 2012, 11:01:00 PM »
It's not about cpu at all. Madvr needs gaming GPU, period.
Perhaps I'm confusing something, but weren't current GPUs unable to process 10bit videos correctly, and as such everything fell on the CPU? Isn't that why DXVA cannot be used with 10bit and why people bitch that 10bit decoding is software only?

Offline OnDeed

  • Member
  • Posts: 448
  • Uploader account for #OnDeed@irc.rizon.net
Re: Hi10P and 8-bit encodes
« Reply #373 on: January 02, 2012, 11:17:48 PM »
Ummm...
You seem to be completely mixing two things: video decoding and video rendering.

1) video decoding takes in h.264 from the demuxer and decodes it to uncompressed yuv frames.
2) video rendering takes in uncompressed yuv frames (in the case of 10-bit, optionally downconverting them to plain 8-bit yuv), scales them to the desired resolution and finally converts them to the rgb colourspace. Then it tells the vga to put it on display at some timestamps.

DXVA does strictly 1).
Madvr does strictly 2). Madvr's goal is to do everything in step 2) using high-quality algorithms on the gpu, with shaders.

What 10-bit breaks about DXVA is stuff strictly belonging to 1) - decoding. DXVA can only do 8bit, because it is more or less fixed function hardware. Thus with 10bit, everything needs to be done on cpu in step 1).
Renderers don't just use gpu for doing step 2), they usually need some cpu power too, like games. It's probably just overlay renderer that can have negligible cpu usage. Without gpu acceleration though, they would be orders of magnitude slower. Software rendering of video wasn't even viable in the past. You can try it: have ffdshow use its scaler filter to get into fullscreen and force rgb output (there will still be some gpu work with drawing and refreshing, but...)

The only issue that 10-bit causes for step 2) is that most renderers aren't ready for input of 10-bit uncompressed frames (exceptions are MPlayer's -vo gl and madVR, afaik). This is easily solved by inserting a downconversion filter at the end of step 1), which most decoders do, eliminating the problem. Alternatively, you can pull the rgb conversion from step 2) into step 1), doing it there on the cpu (example: LAV Video or ffdshow with rgb32 output forced).

TL;DR
For doing its stuff (step 2), MadVR needs lots of gpu shader power (because it does it in a high-end uncompromising way).

Note that madvr has optional decoders to do step 1, but that is irrelevant, it is simply as if it bundled its own small personal ffdshow along. Nothing changes about the process.
« Last Edit: January 02, 2012, 11:23:28 PM by OnDeed »
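The decode/render split OnDeed describes can be modelled as a toy pipeline. The dict-based "frames" and function names below are invented purely for illustration (only the renderer names are real), but they capture where 10-bit actually bites:

```python
def decode(bits=10):
    # Step 1: h.264 bitstream -> uncompressed YUV frames.
    # For Hi10P this runs on the CPU, since DXVA's fixed-function
    # hardware only handles 8-bit streams.
    return {"colourspace": "yuv", "bits": bits}

def downconvert(frame):
    # Optional filter at the end of step 1: 10-bit YUV -> 8-bit YUV,
    # for renderers that can't accept 10-bit input.
    return {**frame, "bits": 8}

def render(frame, renderer="EVR"):
    # Step 2: scaling, YUV -> RGB conversion, presentation.
    # In this sketch only madVR accepts 10-bit frames directly;
    # the rest need 8-bit input.
    if frame["bits"] > 8 and renderer != "madVR":
        raise ValueError(f"{renderer} cannot accept {frame['bits']}-bit frames")
    return {"colourspace": "rgb", "bits": frame["bits"]}

# Typical chain: CPU decode + CPU downconvert, then any renderer works.
print(render(downconvert(decode(10)), "EVR")["bits"])   # 8

# madVR can take the 10-bit frames as-is and do step 2 on the GPU.
print(render(decode(10), "madVR")["bits"])              # 10
```

The point of the model: the 10-bit decoding cost lands on the CPU in step 1 either way; madVR's heavy GPU use belongs entirely to step 2 and exists for 8-bit content too.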

Offline DmonHiro

  • Member
  • Posts: 797
  • Neko The World
Re: Hi10P and 8-bit encodes
« Reply #374 on: January 02, 2012, 11:42:22 PM »
TL;DR
For doing its stuff (step 2), MadVR needs lots of gpu shader power (because it does it in a high-end uncompromising way).
Note that madvr has optional decoders to do step 1, but that is irrelevant, it is simply as if it bundled its own small personal ffdshow along. Nothing changes about the process.
Aha... thanks, now I get it. So are you saying that one does not need to use madVR to display 10bit properly? Because that's what I was led to believe.
« Last Edit: January 02, 2012, 11:51:13 PM by DmonHiro »

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #375 on: January 02, 2012, 11:52:26 PM »
Any evidence that this banding cannot simply be fixed by using a dithering shader?
I seem to recall saying that client-side post-processing is heavy, in which case, why were you complaining about 10-bit again? If I had to pick between a larger 8-bit encode that still requires an extra debanding pass and a smaller 10-bit encode that doesn't require any more post-processing, I'd definitely pick the latter.
It is a very nice attack argument you have - creationist style.

Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.

10-bit is NOT post-processing - 10-bit requires the CPU to decode and thus is extremely CPU heavy.

The 10-bit crowd seems well hardware equipped for some real-time dithering shaders - so the question remains,
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS compatible with most GPUs?

Seriously, stop recommending madVR to people. You are not going to help them after it doesn't work out, are you?
It takes a serious GPU to run it, so at least put a warning there.
Don't get mad at me, I said quite clearly that if you want the best quality possible, you should use madVR, but you will have to sacrifice CPU to it. And that's true. madVR does give you the best possible quality for 10bit. The fact that it requires more CPU is the sacrifice you have to make. If you don't want to, or can't, then you stick with EVR or some other renderer.
... and people like you are a negative for the 10-bit movement, because your recommendations simply alienate users whose hardware is fully capable of running 10-bit.

Recommending madVR is bad for two reasons:
madVR uses more CPU than any other renderer - thus the worst performance for CPU-heavy encodes (such as HD XviD and 10-bit)
madVR uses the GPU for some quality-improving shish kebab - 10-bit is CPU-only and any Intel GPU is PERFECT for it.

You are alienating users with borderline-acceptable CPUs and normal (less-than-gamer) GPUs.
« Last Edit: January 02, 2012, 11:56:24 PM by Aerah »

Offline DmonHiro

  • Member
  • Posts: 797
  • Neko The World
Re: Hi10P and 8-bit encodes
« Reply #376 on: January 02, 2012, 11:55:38 PM »
... and people like you are a negative for the 10-bit movement, because your recommendations simply alienate users whose hardware is fully capable of running 10-bit.
Recommending madVR is bad for two reasons:
madVR uses more CPU than any other renderer - thus the worst performance for CPU-heavy encodes (such as HD XviD and 10-bit)
madVR uses the GPU for some quality-improving shish kebab - 10-bit is CPU-only.
You are alienating users with borderline-acceptable CPUs and less-than-gamer GPUs.
Yes, I was misinformed as to how madVR actually works. OnDeed has been kind enough to explain it to me. Still, the fact that madVR does provide higher quality is correct, so my previous statement that you would have to sacrifice more resources to it if you want the highest quality output still stands. Basically, if you have a high-end PC, use madVR. If you don't, use EVR.

However, we are getting WAY off topic here, and should probably stop.

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #377 on: January 03, 2012, 12:14:39 AM »
It is a very nice attack argument you have - creationist style.

Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.

10-bit is NOT post-processing - 10-bit requires the CPU to decode and thus is extremely CPU heavy.

The 10-bit crowd seems well hardware equipped for some real-time dithering shaders - so the question remains,
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS compatible with most GPUs?
What the hell? You seem like you really don't know what you're talking about. Who ever said 10-bit is post-processing? 10-bit is heavier on the CPU, but 8-bit with this "dither shader" you mentioned will also require more resources. In that case, the hardware requirements will end up the same, maybe even higher (for sure I can't run any deband filter on a 1080p encode without some serious lag, but I can play 10-bit 1080p with only some unnoticeable frame drops). Also, why should we, as end-users, suffer through manually setting up dithering when the encoder can do it as well, and, if the encoder actually knows what he's doing, get better results?

Btw, as OnDeed mentioned, madVR doesn't take much more CPU power than any other renderer. At the very least, on an old 2.4GHz C2D there is almost no difference in CPU usage between madVR and EVR. It takes much more GPU, though.
« Last Edit: January 03, 2012, 12:22:49 AM by RedSuisei »

Offline bobthedog

  • Member
  • Posts: 30
Re: Hi10P and 8-bit encodes
« Reply #378 on: January 03, 2012, 12:56:33 AM »
Hey hey hey, you can't just let 8-bit encodes disappear! I watch anime on my tablet too, you know! I don't feel like re-encoding every single show I want to watch on it. This is just a war that some 10-bit fanboys want to win no matter what. All they do is watch their anime on their awesome PCs and don't care about anyone else who watches on something else. BakaBT is a community website that also allows less technologically gifted people to watch things on their tablets or other machines. Do you feel so elite that you can say "just re-encode it" to everyone who wants to watch on something other than a PC?

Yeah, that's pretty much what they're doing.  But y'know what...?  WE  -  ARE  -  THE 51%!

(At least, as of this moment.)     ;D

Offline Aerah

  • Member
  • Posts: 61
  • [h264][1080p][40Mbit][MKV][FLAC]
Re: Hi10P and 8-bit encodes
« Reply #379 on: January 03, 2012, 01:02:11 AM »
It is a very nice attack argument you have - creationist style.

Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.

10-bit is NOT post-processing - 10-bit requires the CPU to decode and thus is extremely CPU heavy.

The 10-bit crowd seems well hardware equipped for some real-time dithering shaders - so the question remains,
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS compatible with most GPUs?
What the hell? You seem like you really don't know what you're talking about. Who ever said 10-bit is post-processing? 10-bit is heavier on the CPU, but 8-bit with this "dither shader" you mentioned will also require more resources. In that case, the hardware requirements will end up the same, maybe even higher (for sure I can't run any deband filter on a 1080p encode without some serious lag, but I can play 10-bit 1080p with only some unnoticeable frame drops). Also, why should we, as end-users, suffer through manually setting up dithering when the encoder can do it as well, and, if the encoder actually knows what he's doing, get better results?

Btw, as OnDeed mentioned, madVR doesn't take much more CPU power than any other renderer. At the very least, on an old 2.4GHz C2D there is almost no difference in CPU usage between madVR and EVR. It takes much more GPU, though.
You said that or you worded your argument in such a convoluted manner that I thought you did.

You are seriously underestimating the power of modern GPUs - how old is the NV 9800 for example? - that is plenty of power for real time dithering (exactly like the cat picture on the wiki article). Again, video type does not effect the GPU's ability to apply shaders before output (even with DXVA as DXVA is just a chip on the GPU dedicated to h264).

Because that way - quality fetishists can use fancy filtering to achieve their boners and the rest can just watch movies in a standard video format?

Kinda contradictory though, you talking about the horrible performance of shaders on hardware while saying that you don't notice the massive performance differences between EVR/VMR and madVR and the Haali Renderer (installed with Haali Splitter).