BakaBT > Announcements
Hi10P and 8-bit encodes
Aerah:
--- Quote from: RedSuisei on January 02, 2012, 09:47:40 PM ---
--- Quote from: Aerah on January 02, 2012, 09:43:06 PM ---Any evidence that this banding cannot simply be fixed by using a dithering shader?
--- End quote ---
I seem to recall saying that client-side post-processing is heavy - in which case, why were you complaining about 10-bit again? Given the choice between a larger 8-bit encode that still requires deband post-processing and a smaller 10-bit encode that doesn't require any, I'd definitely pick the latter.
--- End quote ---
That's a very nice attack argument you have - creationist style.
Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.
10-bit is NOT post-processing - 10-bit requires the CPU to decode, and is thus extremely CPU-heavy.
The 10-bit crowd seems well-equipped, hardware-wise, for some real-time dithering shaders - so the question remains:
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS, compatible with most GPUs?
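For reference, the kind of real-time dithering shader being argued about here amounts to adding a small, patterned sub-LSB offset before quantization. A minimal sketch in Python/NumPy (an illustration of ordered dithering only - not any actual player's shader code):

```python
import numpy as np

# Classic 4x4 Bayer matrix, normalized to [0, 1): per-pixel thresholds
BAYER_4X4 = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0

def dither_to_8bit(frame):
    """Quantize a float frame in [0, 1] to 8-bit using ordered dithering."""
    h, w = frame.shape
    # Tile the threshold pattern across the frame, then crop to size
    thresholds = np.tile(BAYER_4X4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    # The sub-LSB offsets break smooth gradients into a fine pattern
    # instead of visible bands
    return np.clip(np.floor(frame * 255.0 + thresholds), 0, 255).astype(np.uint8)

# A shallow gradient that bands badly under plain truncation
gradient = np.tile(np.linspace(0.0, 0.1, 256), (16, 1))
naive = np.floor(gradient * 255.0).astype(np.uint8)
dithered = dither_to_8bit(gradient)
```

On average the dithered output tracks the true gradient value, while plain truncation is biased by up to a full code value - which is why dithering, wherever it runs, hides banding.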
--- Quote from: DmonHiro on January 02, 2012, 10:18:00 PM ---
--- Quote from: OnDeed on January 02, 2012, 09:57:21 PM ---Seriously, stop recommending madVR to people. You're not going to help them after it doesn't work out, are you?
It takes a serious GPU to run it, so at least put a warning there.
--- End quote ---
Don't get mad at me - I said quite clearly that if you want the best quality possible, you should use madVR, but you will have to sacrifice CPU to it. And that's true. madVR does give you the best possible quality for 10-bit. The fact that it requires more CPU is the sacrifice you have to make. If you don't want to, or can't, then you stick with EVR or some other filter.
--- End quote ---
... and people like you are a negative for the 10-bit movement, because your recommendations simply alienate users whose hardware is fully capable of running 10-bit.
Recommending madVR is bad for two reasons:
madVR uses more CPU than any other renderer - thus the worst performance for CPU-heavy encodes (such as HD XviD and 10-bit).
madVR uses the GPU for some quality-improving shish kebab - 10-bit is CPU-only, and any Intel GPU is PERFECT for it.
You are alienating users with borderline-acceptable CPUs and normal (less-than-gamer) GPUs.
DmonHiro:
--- Quote from: Aerah on January 02, 2012, 11:52:26 PM ---... and people like you are a negative for the 10-bit movement, because your recommendations simply alienate users whose hardware is fully capable of running 10-bit.
Recommending madVR is bad for two reasons:
madVR uses more CPU than any other renderer - thus the worst performance for CPU-heavy encodes (such as HD XviD and 10-bit).
madVR uses the GPU for some quality-improving shish kebab - 10-bit is CPU-only.
You are alienating users with borderline-acceptable CPUs and less-than-gamer GPUs.
--- End quote ---
Yes, I was misinformed as to how madVR actually works. OnDeed has been kind enough to explain it to me. Still, the fact that madVR does provide higher quality is correct, so my previous statement that you would have to sacrifice more resources to it if you want the highest quality output still stands. Basically, if you have a high-end PC, use madVR. If you don't, use EVR.
However, we are getting WAY off topic here, and should probably stop.
RedSuisei:
--- Quote from: Aerah on January 02, 2012, 11:52:26 PM ---That's a very nice attack argument you have - creationist style.
Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.
10-bit is NOT post-processing - 10-bit requires the CPU to decode, and is thus extremely CPU-heavy.
The 10-bit crowd seems well-equipped, hardware-wise, for some real-time dithering shaders - so the question remains:
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS, compatible with most GPUs?
--- End quote ---
What the hell? You really don't seem to know what you're talking about. Who ever said 10-bit is post-processing? 10-bit is heavier on the CPU, but 8-bit with this "dither shader" you mention will also require more resources. In that case, the hardware requirements end up the same, maybe even higher (I certainly can't run any deband filter on a 1080p encode without serious lag, but I can play 10-bit 1080p with only a few unnoticeable frame drops). Also, why should we, the end-users, suffer through setting up dithering manually when the encoder can do it as well - and, if the encoder actually knows what he's doing, get better results?
Btw, as OnDeed mentioned, madVR doesn't take much more CPU power than any other renderer. At the very least, an old 2.4GHz C2D shows almost no difference in CPU usage between madVR and EVR. It takes much more GPU, though.
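The encoder-side option described here - baking the dither in while downconverting - is essentially noise-shaped rounding done once at encode time. A hypothetical sketch in Python/NumPy (TPDF dither; an illustration only, not how x264 implements it):

```python
import numpy as np

rng = np.random.default_rng(42)

def downconvert_10bit_to_8bit(codes_10bit):
    """Reduce 10-bit code values to 8-bit with TPDF dither.

    Triangular noise of +/-1 LSB (the sum of two uniforms) is added
    before rounding, so the quantization error is decorrelated from
    the image and smooth gradients don't collapse into visible bands.
    """
    noise = (rng.uniform(-0.5, 0.5, codes_10bit.shape)
             + rng.uniform(-0.5, 0.5, codes_10bit.shape))
    scaled = codes_10bit / 4.0  # 1024 code values -> 256
    return np.clip(np.round(scaled + noise), 0, 255).astype(np.uint8)

# A shallow 10-bit ramp: plain truncation keeps only 8 distinct 8-bit codes
ramp = np.tile(np.arange(32, dtype=np.float64), (64, 1))
truncated = (ramp / 4.0).astype(np.uint8)
dithered = downconvert_10bit_to_8bit(ramp)
```

Because this runs once on the encoder's machine, it costs the viewer nothing at playback - which is the crux of the "let the encoder do it" argument.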
bobthedog:
--- Quote from: dragon191 on January 02, 2012, 09:53:22 PM ---Hey hey hey, you can't just let 8-bit encodes disappear! I watch anime on my tablet too, you know! I don't feel like re-encoding every single show I want to watch on it. This is just a war that some 10-bit fanboys want to win no matter what. All they do is watch their anime on their awesome PCs and don't care about anyone else who watches on something else. BakaBT is a community website that also allows the less technologically gifted to watch things on their tablets or other machines. Do you want to feel so elite that you say "just re-encode it" to everyone who wants to watch on something other than a PC?
--- End quote ---
Yeah, that's pretty much what they're doing. But y'know what...? WE - ARE - THE 51%!
(At least, as of this moment.) ;D
Aerah:
--- Quote from: RedSuisei on January 03, 2012, 12:14:39 AM ---
--- Quote from: Aerah on January 02, 2012, 11:52:26 PM ---That's a very nice attack argument you have - creationist style.
Dithering shouldn't be CPU territory.
You are, possibly intentionally, confusing CPU and GPU tasks.
10-bit is NOT post-processing - 10-bit requires the CPU to decode, and is thus extremely CPU-heavy.
The 10-bit crowd seems well-equipped, hardware-wise, for some real-time dithering shaders - so the question remains:
Why can't the quality / hardware fetishist crowd simply use dithering shaders and leave the video AS IS, compatible with most GPUs?
--- End quote ---
What the hell? You really don't seem to know what you're talking about. Who ever said 10-bit is post-processing? 10-bit is heavier on the CPU, but 8-bit with this "dither shader" you mention will also require more resources. In that case, the hardware requirements end up the same, maybe even higher (I certainly can't run any deband filter on a 1080p encode without serious lag, but I can play 10-bit 1080p with only a few unnoticeable frame drops). Also, why should we, the end-users, suffer through setting up dithering manually when the encoder can do it as well - and, if the encoder actually knows what he's doing, get better results?
Btw, as OnDeed mentioned, madVR doesn't take much more CPU power than any other renderer. At the very least, an old 2.4GHz C2D shows almost no difference in CPU usage between madVR and EVR. It takes much more GPU, though.
--- End quote ---
You did say that - or you worded your argument in such a convoluted manner that I thought you did.
You are seriously underestimating the power of modern GPUs - how old is the NV 9800, for example? That's plenty of power for real-time dithering (exactly like the cat picture in the wiki article). Again, the video type does not affect the GPU's ability to apply shaders before output (even with DXVA, since DXVA is just a chip on the GPU dedicated to h264).
Because that way the quality fetishists can use fancy filtering to achieve their boners, and the rest can just watch movies in a standard video format?
Kinda contradictory, though - you talk about the horrible performance of shaders on your hardware while saying you don't notice the massive performance differences between EVR/VMR, madVR, and the Haali Renderer (installed with the Haali Splitter).