Author Topic: Hi10P and 8-bit encodes  (Read 64117 times)

Offline Meomix

  • Member
  • Posts: 4993
  • For our glorious order
    • MAL
Re: Hi10P and 8-bit encodes
« Reply #260 on: December 28, 2011, 04:48:18 AM »
Quote from: ZomBie
I strongly oppose removing 8-bit releases.

My current system (Core 2 Quad 2.5GHz) can just barely play 1080p 8-bit releases, and I'm not planning to buy a new system for at least another 12 months. In fact, there's even one 8-bit release I can't play properly: Roberta's Blood Trail, in the flashback scenes where they add a lot of animated noise.

It would help if decoders utilized more cores, but currently they don't (I'm using CCCP).

I'm already disappointed that there's no 8-bit release anymore for Puella Magi Madoka Magica.

Well then you're out of luck, ZomBie; encoders only give a shit about their own convenience, not yours.
Did you know Satan was supposedly gods RIGHT HAND MAN, not his left. Blows your theory out of the water now doesn't it.

Offline parusit

  • Member
  • Posts: 48
Re: Hi10P and 8-bit encodes
« Reply #261 on: December 28, 2011, 07:23:00 AM »
Chill, guys. This isn't the usual 10-bit vs 8-bit war.

This is just about managing the transition for the sake of the community. It's going to happen for sure; the only questions are how long it takes (six months? three? zero?) and how it's done. That's all.


10-bit supporters, we've already got what we wanted, since 10-bit is fully supported now. No worries there.  8)
And personally I'm waiting for more 10-bit stuff to show up here on BakaBT; there isn't that much yet (still kinda new?).

It's just about the fellow anime watchers who are trying to follow you guys.  :)
« Last Edit: December 28, 2011, 07:36:42 AM by parusit »

Offline Cuan

  • NTR Janitor
  • Member
  • Posts: 488
  • Catholic schoolgirl
Re: Hi10P and 8-bit encodes
« Reply #262 on: December 28, 2011, 08:52:14 AM »
I'm already disappointed that there's no 8-bit release anymore for Puella Magi Madoka Magica.
Because nobody offers it. Same case as for OreImo: go ahead and do it yourself.

Offline Meomix

  • Member
  • Posts: 4993
  • For our glorious order
    • MAL
Re: Hi10P and 8-bit encodes
« Reply #263 on: December 28, 2011, 10:52:16 AM »
Ah E99, I can't believe CCCP was an inside joke.  :P

Offline defineANIME

  • Member
  • Posts: 10
Re: Hi10P and 8-bit encodes
« Reply #264 on: December 28, 2011, 10:23:23 PM »
I've got an Atom/ION2 board that can play 8-bit 1080p flawlessly, but it dies on 10-bit (artefacts), and even with CPU decoding, 720p playback is so-so. Subjectively, I am *not* getting better perceived quality from 10-bit releases. I also don't believe the few percent difference in encode sizes justifies this step, since I've got a 100/10 line.

But the community decided to do this and I can't say a thing. That's why I'm working out my re-encoding procedure and downloading 10-bit releases when there's no 8-bit one. So: keep 8-bit releases for a while and give them a special slot; it's way better than a home-made transcode. Hopefully, support will come with time.

As a side note, 10-bit SD is just silly.
Running Fedora, watching anime using mplayer or XBMC with vdpau, reading manga with feh. Google these out. ;]

Offline ryrynz

  • Member
  • Posts: 38
Re: Hi10P and 8-bit encodes
« Reply #265 on: December 28, 2011, 10:58:43 PM »
10-bit vidcard and a 10-bit display? This switch in terminology recently is hella confusing: does a Geforce 330M outputting 1080p @ 32-bit colour via HDMI to a Samsung LCD FullHD TV meet these criteria?

No, they don't. You need a professional graphics card and a high-quality monitor/TV.

Offline moe_imouto

  • Member
  • Posts: 6
Re: Hi10P and 8-bit encodes
« Reply #266 on: December 29, 2011, 01:50:56 AM »
I say we create more slots, or at least reserve one slot for 8-bit.

Here is my opinion on 10-bit vs 8-bit: 10-bit IS better than 8-bit. But is it worth it?
WARNING: the following content gets a little technical.

Here are some theoretical advantages of 10-bit (precisely speaking, of H.264/MPEG4-AVC Hi10P):
     1.1. 30 bits per pixel (bpp). (This is the only advantage specific to 10-bit; the rest come from H.264/MPEG4-AVC itself.)
     1.2. Compression is performed at 10-bit precision throughout, which saves bandwidth.
     1.3. Motion compensation can be performed on 4x4 blocks, whereas MPEG-2 uses 16x16. (This means that even if your monitor doesn't support 30 bpp, you still get slightly better quality.)
     1.4. Supports 4:4:4 chroma sub-sampling (via further extensions to MPEG4-AVC), whereas MPEG-2 is limited to 4:2:0.

The disadvantages:
      2.1. Higher CPU consumption
      2.2. Most LCDs support only 8 bits per channel

A viable solution is to choose an 8-bit codec that still gives you 1.3 and 1.4 (for example, H.264/MPEG4-AVC HiP).
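[Editor's note: to make point 1.1 concrete, a quick sketch in plain Python of how many levels each bit depth gives per channel, and why a subtle gradient bands at 8 bits but not at 10. The gradient sample values are hypothetical, chosen only for illustration.]

```python
# Levels per colour channel at each bit depth.
levels_8bit = 2 ** 8    # 256 shades per channel -> 24 bpp for RGB
levels_10bit = 2 ** 10  # 1024 shades per channel -> 30 bpp for RGB

def quantize(value, bits):
    """Quantize a brightness in [0.0, 1.0] to an integer code at the given bit depth."""
    return round(value * (2 ** bits - 1))

# A subtle dark gradient: at 8 bits all four samples collapse to the same
# code (a visible band); at 10 bits they stay distinct.
samples = [0.1000, 0.1005, 0.1010, 0.1015]
codes_8 = [quantize(v, 8) for v in samples]    # -> [26, 26, 26, 26]
codes_10 = [quantize(v, 10) for v in samples]  # -> [102, 103, 103, 104]
```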
« Last Edit: December 29, 2011, 12:18:53 PM by moe_imouto »

Offline orion1

  • Member
  • Posts: 13
  • Inuyasha+Kogome having sex = Demon Doggy Style!
    • Orion1
Re: Hi10P and 8-bit encodes
« Reply #267 on: December 29, 2011, 10:58:33 AM »
Quote from: PivosStaff
We are not able to overcome the physical realities of the AIOS' 1185 CPU/SoC. This means that 10-bit video playback is not in the cards for the Aios. In fact, there is NO hardware acceleration of 10-bit video on any consumer product on the market today.


Well, I suppose the party is over for this easy solution for watching high-definition files from your computer wirelessly on your big screen. I think the argument about the decoders is pretty much over. I've put off moving my PC into the living room for too long, and apparently spent a hundred dollars too much for the convenience of keeping my office and living room separate. Buy another computer for the living room? Not exactly in everyone's budget, but Pivos and other companies are working on new products that will play 10-bit encodes. How long until they hit the market? No idea, but moving the computer to the living room and hooking it up directly to the big screen seems like one of the only options left for high-end convenience. If I have to, I will; perhaps the PS3 or Xbox 360 route might also work.

I voted to keep 8-bit around (in the A slot), but I admit that's completely in my own self-interest and not in the best interest of the BakaBT staff or of BakaBT's direction of providing the highest-quality video when possible.  :o  Do what you must; it might not be convenient for everyone, but sometimes the hard decisions have to be made. I do appreciate the open conversation on the issue, though. Some very creative solutions to the problem have been mentioned in this thread.

As for the argument that fansub groups won't offer 8-bit encodes anymore this winter, I'm guessing people are still going to complain. But this fall I really didn't have any problems finding 8-bit encodes, and I was grateful. I suppose as long as there is still demand, groups will keep trying to please until it truly becomes too much of a burden. Of course, if it were up to me, 8-bit and 10-bit high-def encodes would become the new standard, and 480p would finally be laid to rest, except when it's the only known release available, of course.  ::)
« Last Edit: December 29, 2011, 11:56:05 AM by orion1 »

Offline Temuthril

  • Member
  • Posts: 1140
Re: Hi10P and 8-bit encodes
« Reply #268 on: December 29, 2011, 01:10:49 PM »
As a side note, 10-bit SD is just silly.
Elaborate.

What if I say H.264 SD is silly?

Offline moe_imouto

  • Member
  • Posts: 6
Re: Hi10P and 8-bit encodes
« Reply #269 on: December 29, 2011, 01:26:30 PM »
As a side note, 10-bit SD is just silly.
Elaborate.

What if I say H.264 SD is silly?

Standard H.264 is different from H.264 with the Hi10P extension.

Offline tyrionlannister

  • Member
  • Posts: 116
Re: Hi10P and 8-bit encodes
« Reply #270 on: December 29, 2011, 02:14:38 PM »
Actually, there is no standard or non-standard H.264. H.264 is a standard that includes a number of profiles (18), among which are HiP (High Profile) and Hi10P (High 10 Profile). Since both are part of the H.264 spec, you can't say that one is more standard than the other.
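[Editor's note: for anyone curious how those profiles differ in practice, a small reference sketch in Python. This is a simplified capability summary of a few common profiles, not an exhaustive table of all 18.]

```python
# Maximum bit depth and chroma subsampling for a few common H.264 profiles
# (simplified; the spec defines more profiles and finer distinctions).
H264_PROFILES = {
    "Baseline":              {"max_bit_depth": 8,  "chroma": "4:2:0"},
    "Main":                  {"max_bit_depth": 8,  "chroma": "4:2:0"},
    "High":                  {"max_bit_depth": 8,  "chroma": "4:2:0"},  # HiP
    "High 10":               {"max_bit_depth": 10, "chroma": "4:2:0"},  # Hi10P
    "High 4:2:2":            {"max_bit_depth": 10, "chroma": "4:2:2"},
    "High 4:4:4 Predictive": {"max_bit_depth": 14, "chroma": "4:4:4"},
}

# Both HiP and Hi10P live in the same standard; only their capabilities differ.
extra_bits = (H264_PROFILES["High 10"]["max_bit_depth"]
              - H264_PROFILES["High"]["max_bit_depth"])  # 2 extra bits per sample
```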

Offline Aadieu

  • Member
  • Posts: 103
Re: Hi10P and 8-bit encodes
« Reply #271 on: December 29, 2011, 02:45:38 PM »
10-bit vidcard and a 10-bit display? This switch in terminology recently is hella confusing: does a Geforce 330M outputting 1080p @ 32-bit colour via HDMI to a Samsung LCD FullHD TV meet these criteria?

No they do not, you need a professional graphics card and high quality monitor/TV.

OK kids, will somebody tell me (and the rest of the world) what the fuck you lot mean by "a professional graphics card" and "a high quality monitor/TV"?!! I've seen this mentioned half a billion times in relation to Hi10P by people who obviously know shit-all, because none of them ever elaborate; they just keep on mumbling about how bloody modern and wonderfully gorgeous this shit is. Or is this yet another dumb pitch by the Wintel alliance to get us all to prepare our cash for the next wave of costly gear, never mind the fact that they can't be bothered to write software that works with more than one core, more than 4 gigs of RAM, etc. etc.?!

Cuz if this doesn't even freakin show up on modern FullHD screens and non-onboard NVIDIA or ATI cards, then wtf is the point????

Btw, how the hell does 32 bit colour not support 3*10 = 30 bit colour?! And I've had 32 bit colour displays since back in the freaking 1990s, so what's this strange bull about some "modern" tech that supposedly requires "modern" hardware to supply less bits?

PS ...I thought this 10bit technology promised to end the banding problem?? So far, the few 10bit series that I've watched have had BY FAR the worst banding issues I've ever seen in 720p or FullHD

Offline RedSuisei

  • Member
  • Posts: 326
Re: Hi10P and 8-bit encodes
« Reply #272 on: December 29, 2011, 03:00:38 PM »
OK kids, will somebody tell me (and the rest of the world) what the fuck you lot mean by "a professional graphics card" and "a high quality monitor/TV"?!! I've seen this mentioned half a billion times in relation to Hi10P by people who obviously know shit-all, because none of them ever elaborate; they just keep on mumbling about how bloody modern and wonderfully gorgeous this shit is. Or is this yet another dumb pitch by the Wintel alliance to get us all to prepare our cash for the next wave of costly gear, never mind the fact that they can't be bothered to write software that works with more than one core, more than 4 gigs of RAM, etc. etc.?!

Cuz if this doesn't even freakin show up on modern FullHD screens and non-onboard NVIDIA or ATI cards, then wtf is the point????

Btw, how the hell does 32 bit colour not support 3*10 = 30 bit colour?! And I've had 32 bit colour displays since back in the freaking 1990s, so what's this strange bull about some "modern" tech that supposedly requires "modern" hardware to supply less bits?

PS ...I thought this 10bit technology promised to end the banding problem?? So far, the few 10bit series that I've watched have had BY FAR the worst banding issues I've ever seen in 720p or FullHD
It would have been better if you had posted in a calmer manner; that would make people more likely to help you.

Anyway, those professional graphics cards that support 10-bit output, and displays with 10 bits per color channel, aren't meant for consumer use; they're rarely found even in professional environments. But on a standard FullHD display with a standard consumer graphics card, 10-bit video will be dithered down to 8-bit (or whatever bit depth the display has) so it can be displayed properly. Before you go off and rage about what the point of 10-bit is if it's dithered down to 8-bit anyway: the dithering here is the advantage (google dithering).

Also, a 32-bit display actually uses 4 channels: 3 color channels and 1 alpha channel (8 bits per channel × 4 = 32). Usually the alpha channel is ignored, though.

I'm also wondering, since some people have mentioned this already: can you please point out which 10-bit release has worse banding than the 8-bit one? I've never seen one myself, especially among the ones uploaded to BakaBT.

In the end, though, 10-bit itself doesn't remove banding; it prevents more banding from being introduced during the encoding process. If the source already had banding, then the encoder would need to deband the video before encoding.
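[Editor's note: a small sketch in plain Python of why dithering down from 10-bit looks smoother than straight truncation. This is a toy 1-D error-diffusion dither, not any real decoder's algorithm, and the pixel values are hypothetical: the idea is that diffusing the rounding error keeps the average brightness tracking the true gradient instead of snapping to flat bands.]

```python
def truncate_to_8bit(row_10bit):
    """Naive conversion: drop the two low bits (10-bit code >> 2)."""
    return [v >> 2 for v in row_10bit]

def dither_to_8bit(row_10bit):
    """1-D error-diffusion dither: carry the rounding error to the next pixel."""
    out, err = [], 0.0
    for v in row_10bit:
        target = v / 4 + err  # ideal 8-bit value plus accumulated error
        code = round(target)
        err = target - code   # diffuse the leftover error forward
        out.append(code)
    return out

# A gentle 10-bit gradient that truncation collapses into flat bands.
row = [401, 402, 403, 404, 405, 406, 407, 408]
flat = truncate_to_8bit(row)  # -> [100, 100, 100, 101, 101, 101, 101, 102]
mixed = dither_to_8bit(row)
# The dithered row alternates codes, so its average brightness matches the
# true gradient; the eye blends the alternation instead of seeing a band edge.
```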

Next time, please ask in a more polite manner. I'm pretty sure many people here would be inclined to answer if you do.
« Last Edit: December 29, 2011, 03:02:32 PM by RedSuisei »

Offline moe_imouto

  • Member
  • Posts: 6
Re: Hi10P and 8-bit encodes
« Reply #273 on: December 29, 2011, 06:04:59 PM »
Actually there is no standard or nonstandard H264. H264 is a standard which includes a number of profiles (18), amongst which are HiP (High Profile) and Hi10P (High 10 Profile). Since both are part of the H264 spec, you can't say that one is more standard than the other.
Yeah. I should have used  'main profile' rather than 'standard'.
PS: Some profiles were added as the Fidelity Range Extensions. In this sense, the word 'standard' is OK here.

Offline pcmack101

  • Member
  • Posts: 1
Re: Hi10P and 8-bit encodes
« Reply #274 on: December 29, 2011, 07:09:50 PM »
A month ago I would have been against any change from 8-bit, but now I'm indifferent. I convert everything I download to MP4 to watch on my PS3 and 46" Sony LCD TV. November's update to CCCP made it so I can use XviD4PSP5 to re-encode even Hi10P to an MP4 that plays well on the PS3, and it looks damn good, too!

Offline Aadieu

  • Member
  • Posts: 103
Re: Hi10P and 8-bit encodes
« Reply #275 on: December 29, 2011, 09:19:55 PM »

In the end though, 10-bit itself doesn't remove banding, it prevents more banding to be introduced during the encoding process. If the source already had banding, then the encoder would need to deband the video first before encoding.

Next time, please ask in a more polite manner. I'm pretty sure many people here would be inclined to answer if you do.

Ben-To [EveTaku][Hi10P][720p] takes the crown for banding: very visible, in many different ways, within the first few seconds. It's not a bad release or anything, aside from the banding-everywhere part, but the banding is extra obvious, and just surprising considering that this is supposed to be 10-bit's one and only tangible benefit (cause c'mon, dithering???)...

Offline OnDeed

  • Member
  • Posts: 448
  • Uploader account for #OnDeed@irc.rizon.net
Re: Hi10P and 8-bit encodes
« Reply #276 on: December 29, 2011, 10:46:09 PM »

In the end though, 10-bit itself doesn't remove banding, it prevents more banding to be introduced during the encoding process. If the source already had banding, then the encoder would need to deband the video first before encoding.

Next time, please ask in a more polite manner. I'm pretty sure many people here would be inclined to answer if you do.

Ben-To [EveTaku][Hi10P][720P]  takes the crown for banding. Very visible in many different ways immediately in the first few seconds. It's not a bad release or anything, other than the banding everywhere part, but the banding is extra-obvious, and just surprising considering 10-bits one and only tangible benefit (cause c'mon, dithering???)...

Well, 10-bit encoding is only the last part of the combination here.
The factors at work:

1) Your source is banded, OR it gets banded easily once compressed (because the dither covering the gradients is faint and easily destroyed).

2) Your filtering: smoothers (especially the likes of dfttest and fft3dfilter) will happily create banding for you. Conversely, you can apply various anti-banding filters.
(The popular, stupid way: create your own banding with smoothers and then run gradfun to get rid of it, which usually throws away the compression benefit of the denoiser. Bonus points for sharpening afterwards, because your source got smoothed...)

3) Encoding. In this step you finally try to achieve a banding-free result, but your success naturally depends on steps 1 and 2. If the source had banding and you didn't do anything about it in step 2, you can't save it in step 3.

In this step you can, however, ruin a good result from step 2 by using the wrong options. Good options here won't save you if you're feeding in garbage; they merely help you avoid turning good input into garbage. Those good options are various: higher bitrate, higher AQ and psy-rd, and encoding in 10-bit is another of them. Best is probably to combine everything you've got, because in 8-bit it was often an uphill battle to succeed in this last step; x264 would often negate your efforts from step 2. It's precisely this past difficulty of getting a good result in this step that makes encoders so enthusiastic about 10-bit. Torturing yourself with gradfun only to find the banding back after encoding was a major PITA.

TL;DR
10bit is a tool, it's not a silver bullet.
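[Editor's note: the knobs OnDeed names map onto real x264 command-line flags. Below is a hedged sketch that assembles (but does not run) such a command in Python; the filenames and the specific values are hypothetical, not recommendations, and a 10-bit-capable x264 build is assumed (in 2011, 10-bit x264 was a separate compile).]

```python
def x264_cmd(src, out, bitrate_kbps=1800, aq_strength=1.0, psy_rd="1.0:0.0"):
    """Assemble an x264 invocation with the banding-relevant knobs from step 3.
    All values are illustrative; tune per source."""
    return [
        "x264",
        "--bitrate", str(bitrate_kbps),     # more bits = less quantization banding
        "--aq-strength", str(aq_strength),  # adaptive quantization helps flat gradients
        "--psy-rd", psy_rd,                 # psychovisual RD preserves dither/grain
        "--profile", "high10",              # High 10 profile (needs a 10-bit build)
        "--output", out,
        src,
    ]

# Hypothetical filenames, for illustration only.
cmd = x264_cmd("episode.y4m", "episode.264")
```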
« Last Edit: December 29, 2011, 10:48:04 PM by OnDeed »

Offline datora

  • Member
  • Posts: 1411
  • "Warning! Otaku logic powers in use!"
Re: Hi10P and 8-bit encodes
« Reply #277 on: December 30, 2011, 05:30:14 AM »
I've plowed through this thread a few times now, and there doesn't appear much more to be said.

What I would suggest at this point is that bakaBT staff make a decision on when they want the changeover to occur, something convenient for your schedule and other duties, then announce it.

My input on that is to nominate 01 July 2012 as the changeover date for torrent-evaluation policy. It's semi-arbitrary, but six months seems a fairly reasonable timeframe, and it's not like all the 8-bit torrents will get deleted at midnight on 30 June.

The idea here is that members have this time to download the hell out of the 8-bit torrents that are in danger of being deleted in favor of superior 10-bit encodes.  This same time frame allows many comparison threads to hash out which candidates get the axe first.

Possibly do it in stages: every Monday, post a list of (10? 15?) torrents that will get axed at midnight the following Sunday, so that folks have a last opportunity to grab a copy if they want it.

All new offers have to stand on their own merits for quality, with the 'normal' tradeoffs on video/audio/translation/filesize etc.; 8-bit vs 10-bit is not a factor in itself, only superior quality and any archival value for rarities.

I made a few other suggestions in an earlier post and stand by them as possibilities.  In particular, the guidelines and requirements for making new offers and maintaining old ones might very well benefit from a bit of an overhaul to improve quality of offers and reduce the workload on staff for the approval/rejection process.

This is a proposed framework, all factors within it can be adjusted to fit other considerations (01 July sucks?  Fine, maybe 01 September etc.).


For me, the big thing is just knowing when the effective date is, and having a substantial lead time to plan for it.  In my world, six months is exceptionally generous.
I win, once again, in my never-ending struggle against victory.

Offline speedfreek20

  • Member
  • Posts: 37
Re: Hi10P and 8-bit encodes
« Reply #278 on: December 30, 2011, 11:11:10 AM »
I've been browsing a few pages, and it's a lot of the same arguments over and over, so I'm not really going to go into most of them (especially the technical ones, which I have no clue about).

As for telling those with insufficient hardware to upgrade: it's not always within one's budget to do so, job or no job, because computers and laptops are generally a luxury item, at least as far as watching anime goes.

I'm just lucky I got a quad-core laptop earlier this year. I also have a WD Live Hub hooked up to my TV as an easier way to serve my media, since my laptop doesn't really have a proper spot to sit, plus there's the general hassle of keeping it hooked up and running. If it were a desktop, that would be different; I'd more than likely have it permanently hooked up to the TV.

I'm not even sure where I'm going with this train of thought; anyway, moving on.

As far as 8-bit vs 10-bit goes, I say have a slot for both, or at the very least a slot for 720p 8-bit. I have friends with a range of hardware; one doesn't watch HD at all because of his machine. I doubt many SD releases, if any, would be in 10-bit, but I do wonder how those would run on less powerful hardware.

Seeing as I tend to share my anime around (fansubs, anyway), I like to cover most bases: more often than not I'll get the version I want, and if I know someone who wants to watch it but doesn't have the hardware, I'll grab a lower-quality version too.

I don't think there should be a hard time limit on existing torrents either, more of a general phase-out: give it a couple of years. Cull the older, low-activity ones first, especially where several HQ releases exist, but keep at least one of each, since there will always be someone who can't play 10-bit properly.

If my WD Live Hub were compatible with 10-bit, or I had some other convenient, compatible solution, I'd personally worry about it even less; but regardless, keep 8-bit and 10-bit in tandem.

As far as HD releases go, even if 1080p were reserved for 10-bit and 720p for 8-bit, I wouldn't really see an issue with that. Honestly, there isn't that great a difference; sure, I can sometimes tell what's 1080p and what's 720p, but it depends on the source. In the end, as long as it's a clean, crisp HD image, it's all good as far as I'm concerned.

Offline 0squid0

  • Member
  • Posts: 8
  • Magical tractor turns into a field...
Re: Hi10P and 8-bit encodes
« Reply #279 on: December 30, 2011, 12:42:33 PM »
"My two cents"

I realize everyone is looking at what makes the best video output as far as codec + resolution, but are we talking about bumping off the old grading system we have now?

If so, I'm not at all okay with that! I almost exclusively download from the C group, yeah my PC is "ancient," but we're also talking about compatibility right? I mean, shouldn't we be just as concerned with using formats that have potential AND current usefulness?

Like, avi is a somewhat hated format because of its limitations, but it's soo freakin' compatible! Very few devices now (I can't think of any actually) can't play avi, and MP4 has some of the best compression (I realize I'm talking about the smallest scales here while everybody else talks about the high end) while maintaining excellent quality. Seriously. MP4. Not on here mind you, but I've DLed several series in that format with phenomenal compression and way better quality than the often bloated mkv.

But then again, it's just another torrent file, why not keep the groups as is, and add the high-end as an option 0, or something like that... A lot of anime on BakaBT don't even have more than one option, which can really suck if you have to choose your torrent based on whatever device you have to play it on... lets face it... money, and the technology you have available is a motivator here too.

(Please don't flame me if I said anything stupid, I'm tired, and I don't even use 720 and up slots. I wouldn't DL them even if I did have some high end device. Too much memory!)