BakaBT > Announcements
Hi10P and 8-bit encodes
OnDeed:
--- Quote from: RedSuisei on January 03, 2012, 12:14:39 AM ---Btw, as OnDeed mentioned, madVR doesn't take much more CPU power than any other renderer. At the very least, on an old 2.4GHz C2D there is almost no increase in CPU usage with madVR versus EVR. It takes much more GPU, though.
--- End quote ---
I didn't say that... actually, people do say it takes more CPU. And wait a moment: EVR has quite some CPU overhead compared to Overlay, so beating it isn't a huge win :) But as said, people seldom need to worry about madVR's CPU usage, since it needs a gaming-worthy GPU anyway (and thus it probably won't find itself in a PC with a slow CPU).
P.S.
Regarding re-encoding downloaded rips... that is, imho, silly, obnoxious and unreasonable. Just a PITA. If anyone really wants to recommend that to people, he or she had better stop and go right for the "get your stuff elsewhere" line. Getting other encodes from other places is a practical choice; re-encoding isn't.
parusit:
--- Quote from: OnDeed on January 03, 2012, 03:17:03 AM ---Regarding re-encoding downloaded rips... that is, imho, silly, obnoxious and unreasonable. Just a PITA. If anyone really wants to recommend that to people, he or she had better stop and go right for the "get your stuff elsewhere" line. Getting other encodes from other places is a practical choice; re-encoding isn't.
--- End quote ---
Totally agree.
P.S. Maybe there's too much self-centered thinking here (on all sides), to the point where it looks like discrimination.
Aadieu:
Time out!
Will someone tell me already why people involved in this discussion are getting radically different results from Hi10P playback, with anywhere from little to OVER9000-level banding, in screenshots of the same scene from the same file?
This is also a good time to figure out what part of that is attributable to decoders, what part to settings, what part to GPU/processing capabilities, and what part to filters.
Because while it's pretty easy to blame my screenshot on my "doing it wrong", isn't it just as easy to suspect the better results of "doing it TOO right" (i.e., introducing various technologies that vastly improve visuals and misrepresenting their results as those of the 8-bit to 10-bit switch, when in fact they might have shit-all to do with colour bit depth or encoding)?
...ARE WE EVEN SURE THAT ANY OF THE DIFFERENCE "SHOWN" ANYWHERE CAN BE ATTRIBUTED TO ANYTHING BUT GPU AND MADVR? Or to owners of fine-tuned high-end systems posting their top-of-the-line 10-bit screens, versus basic users with average systems and no fine-tuning providing the 8-bit shots for comparison?
Also, could somebody who got "pretty" results (screenshots) from a 10-bit release do comparison shots with the same group's similar-filesize 8-bit version of the same anime, using the same player and the same presets? (Yeah, I know that guy from UTW did that, but he was selecting screens of his own encodes to illustrate his own point, which is not exactly an unbiased setup; a clean experiment would have to be done by someone without a prior conclusion and without the encoder's knowledge. In any case, both of his encodes were badly bitrate-starved, yet both looked pretty good, and not radically different from each other, either.)
dragon191:
I think this community has a lot of people who can't play high-resolution 10-bit files properly, so I hope BakaBT will decide to keep an 8-bit SD slot. I mean, it took ages before XviD was waved goodbye, so I don't see a reason not to do the same here. :>
cyberbeing:
--- Quote from: Aadieu on January 03, 2012, 05:39:04 AM ---Time out!
Will someone tell me already why people involved in this discussion are getting radically different results from Hi10P playback, with anywhere from little to OVER9000-level banding, in screenshots of the same scene from the same file?
This is also a good time to figure out what part of that is attributable to decoders, what part to settings, what part to GPU/processing capabilities, and what part to filters.
Because while it's pretty easy to blame my screenshot on my "doing it wrong", isn't it just as easy to suspect the better results of "doing it TOO right" (i.e., introducing various technologies that vastly improve visuals and misrepresenting their results as those of the 8-bit to 10-bit switch, when in fact they might have shit-all to do with colour bit depth or encoding)?
--- End quote ---
If TV->PC levels conversion and/or YUV->RGB conversion is done poorly, you will create banding which doesn't exist in the source.
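To make the levels point concrete, here is a rough sketch (hypothetical illustration code, not any player's actual pipeline) of how a pure 8-bit TV-to-PC range expansion skips output codes, which is one way steps get introduced into a gradient that was smooth in the source:

```python
# Sketch: why a naive TV->PC levels expansion can introduce banding.
# TV ("limited") range puts luma in 16..235; PC ("full") range uses 0..255.
# Expanding 220 input levels onto 256 output levels in pure 8-bit integer
# math leaves gaps: some output codes are never produced, so a smooth
# gradient acquires visible steps.

def tv_to_pc_8bit(y):
    """Naive 8-bit expansion: (y - 16) * 255 / 219, rounded and clipped."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

ramp = list(range(16, 236))                    # a smooth TV-range gradient
expanded = [tv_to_pc_8bit(y) for y in ramp]

used = set(expanded)
gaps = [v for v in range(256) if v not in used]
print(f"output codes used: {len(used)} of 256; {len(gaps)} codes skipped")
```

Doing the conversion at higher internal precision and dithering back down to 8 bits avoids those hard steps, which is essentially what a good renderer does.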
You, or something you are using, is doing something that hurts quality. Since this actually shows up in your screenshots, I'd suspect PotPlayer, or how you have PotPlayer set up, may be to blame. Could you retest using MPC-HC + FFDShow from a standard CCCP install? I'd also like to see you take a screenshot with madVR, which will avoid anything your GPU driver may or may not be doing that hurts quality.
The other factor is your monitor. Especially in the past, there were many LCD displays which would show banding when displaying a smooth 8-bit gradient. Do you see banding in any of the following smooth gradients?
RGB Spectrum
RGB Linear Gradient
Gray Linear Gradient
Red Linear Gradient
Green Linear Gradient
Blue Linear Gradient
Gray Horizontal Gradient
Gray Vertical Gradient
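If those test images ever go missing, a smooth 8-bit gray gradient is easy to generate yourself. A minimal sketch (the filename and dimensions are arbitrary choices, not part of any standard test):

```python
# Write a 256-step horizontal gray gradient as a binary PGM (P5) file.
# Viewed full-screen, a display (or a lossy display path) that cannot
# resolve all 256 gray levels will show visible vertical bands in it.
WIDTH, HEIGHT = 1024, 256

row = bytes(x * 256 // WIDTH for x in range(WIDTH))  # 0..255, left to right
rows = row * HEIGHT

with open("gray_gradient.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))   # PGM header
    f.write(rows)
```

Most image viewers and editors open PGM directly; convert it to PNG if yours doesn't.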
--- Quote from: Aadieu on January 03, 2012, 05:39:04 AM ---...ARE WE EVEN SURE THAT ANY OF THE DIFFERENCE "SHOWN" ANYWHERE CAN BE ATTRIBUTED TO ANYTHING BUT GPU AND MADVR? Or to owners of fine-tuned high-end systems posting their top-of-the-line 10-bit screens, versus basic users with average systems and no fine-tuning providing the 8-bit shots for comparison?
Also, could somebody who got "pretty" results (screenshots) from a 10-bit release do comparison shots with the same group's similar-filesize 8-bit version of the same anime, using the same player and the same presets? (Yeah, I know that guy from UTW did that, but he was selecting screens of his own encodes to illustrate his own point, which is not exactly an unbiased setup; a clean experiment would have to be done by someone without a prior conclusion and without the encoder's knowledge. In any case, both of his encodes were badly bitrate-starved, yet both looked pretty good, and not radically different from each other, either.)
--- End quote ---
EveTaku Ben-To 01v2 8bit (359.3 MB) w/ EVR-CP
EveTaku Ben-To 01v2 10bit (304.1 MB) w/ EVR-CP
EveTaku Ben-To 01v2 8bit (359.3 MB) w/ madVR
EveTaku Ben-To 01v2 10bit (304.1 MB) w/ madVR
None of these look as bad as your screenshot, and I'm not doing anything special: decoded video -> renderer, that's it.
For the reasons stated above, madVR will show less banding than EVR-CP, but either way there is still a noticeable difference between 8-bit and 10-bit.
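The reason dithering helps can be shown with a toy example (a deliberately simplified sketch, not madVR's actual algorithm): quantizing a high-precision gradient straight to 8 bits yields flat runs that read as bands, while adding sub-LSB noise before rounding breaks them up into fine grain:

```python
import random

random.seed(0)  # deterministic for the example

# A smooth ramp with more precision than 8 bits can hold, e.g. the
# output of a 10-bit decode or a high-precision colorspace conversion.
fine = [i / 4.0 for i in range(1024)]                  # 0.0 .. 255.75

truncated = [int(v) for v in fine]                     # straight 8-bit truncation
dithered = [min(255, int(v + random.random()))         # +[0,1) noise, floor, clip
            for v in fine]

def longest_flat_run(vals):
    """Length of the longest run of identical consecutive values."""
    best = run = 1
    for a, b in zip(vals, vals[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

print("truncated:", longest_flat_run(truncated))  # long flat runs = visible bands
print("dithered :", longest_flat_run(dithered))   # runs broken up into grain
```

On average the dithered signal still equals the original ramp, so no levels are lost; the banding is just traded for noise that the eye tolerates far better.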
As far as discrete GPUs which work with madVR go, you are probably looking at ~$55-75 for something decent. madVR really doesn't need a particularly powerful GPU, just something above most entry-level discrete, mobile, and integrated GPUs. As RedSuisei stated, there is a slightly higher CPU load when using madVR (+1% to 5% depending on CPU speed), but unless you are constantly running your CPU at 90%+, it shouldn't be noticeable.
The computer I'm testing on is rather low-end by today's standards, and it handles 10-bit 1080p under ~35Mbps with FFDShow, LAV Video, or CoreAVC + madVR + xy-VSFilter just fine:
AMD X2 4800+ (Socket 939) @ 2.4GHz
2GB DDR400
NVIDIA 7800GTX 512
WinXP SP3 x86