Hi10P and 8-bit encodes
tyrionlannister:
Actually there is no standard or nonstandard H264. H264 is a standard which includes a number of profiles (18), amongst which are HiP (High Profile) and Hi10P (High 10 Profile). Since both are part of the H264 spec, you can't say that one is more standard than the other.
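If you're ever unsure which profile a given file actually uses, ffprobe will tell you. Here's a quick Python wrapper as a sketch (it assumes ffprobe is installed and on your PATH, and "input.mkv" is just a placeholder filename):
--- Code: ---
# Ask ffprobe which H264 profile and pixel format the first video stream uses.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=profile,pix_fmt",
     "-of", "default=noprint_wrappers=1", "input.mkv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # a Hi10P encode reports profile=High 10, pix_fmt=yuv420p10le
--- End code ---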
Aadieu:
--- Quote from: ryrynz on December 28, 2011, 10:58:43 PM ---
--- Quote from: Aadieu on December 28, 2011, 04:42:14 AM ---
10-bit vidcard and a 10-bit display? This switch in terminology recently is hella confusing: does a GeForce 330M outputting 1080p @ 32-bit colour via HDMI to a Samsung LCD FullHD TV meet these criteria?
--- End quote ---
No, they do not. You need a professional graphics card and a high-quality monitor/TV.
--- End quote ---
OK kids, will somebody tell me (and the rest of the world) what the fuck you lot mean by "a professional graphics card" and "a high quality monitor/TV"?!! I've seen this mentioned half a billion times in relation to Hi10P by people who obviously know shit-all, because none of them ever elaborate, but just keep on mumbling how bloody modern and wonderfully gorgeous this shit is. Or is this yet another dumb pitch by the Wintel alliance to get us all to prepare our cash for the next wave of costly gear, nevermind the fact that they can't be bothered to write up software to work with more than one core, more than 4 gigs of RAM, etc. etc....?!
Cuz if this doesn't even freakin show up on modern FullHD screens and non-onboard NVIDIA or ATI cards, then wtf is the point????
Btw, how the hell does 32 bit colour not support 3*10 = 30 bit colour?! And I've had 32 bit colour displays since back in the freaking 1990s, so what's this strange bull about some "modern" tech that supposedly requires "modern" hardware to supply less bits?
PS ...I thought this 10bit technology promised to end the banding problem?? So far, the few 10bit series that I've watched have had BY FAR the worst banding issues I've ever seen in 720p or FullHD
RedSuisei:
--- Quote from: Aadieu on December 29, 2011, 02:45:38 PM ---OK kids, will somebody tell me (and the rest of the world) what the fuck you lot mean by "a professional graphics card" and "a high quality monitor/TV"?!! I've seen this mentioned half a billion times in relation to Hi10P by people who obviously know shit-all, because none of them ever elaborate, but just keep on mumbling how bloody modern and wonderfully gorgeous this shit is. Or is this yet another dumb pitch by the Wintel alliance to get us all to prepare our cash for the next wave of costly gear, nevermind the fact that they can't be bothered to write up software to work with more than one core, more than 4 gigs of RAM, etc. etc....?!
Cuz if this doesn't even freakin show up on modern FullHD screens and non-onboard NVIDIA or ATI cards, then wtf is the point????
Btw, how the hell does 32 bit colour not support 3*10 = 30 bit colour?! And I've had 32 bit colour displays since back in the freaking 1990s, so what's this strange bull about some "modern" tech that supposedly requires "modern" hardware to supply less bits?
PS ...I thought this 10bit technology promised to end the banding problem?? So far, the few 10bit series that I've watched have had BY FAR the worst banding issues I've ever seen in 720p or FullHD
--- End quote ---
It would have been better if you had posted in a calmer manner; that would make people more likely to help you.
Anyway, those professional graphics cards that support 10-bit output, and displays with 10 bits per color channel, aren't meant for consumer use. They're rarely found even in professional environments. But on a standard FullHD display with a standard consumer graphics card, 10-bit video is dithered down to 8-bit (or whatever bit depth the display is) so it can be displayed properly. Before you go on and rage about what the point of 10-bit is if it's dithered down to 8-bit anyway: the dithering here is the advantage (google dithering).
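If you want to see what that dithering actually buys you, here's a rough numpy sketch of the idea: add a little noise before rounding 10-bit values down to 8-bit, so a smooth ramp doesn't collapse into flat steps. Purely illustrative, not what any real renderer does internally:
--- Code: ---
import numpy as np

rng = np.random.default_rng(0)

# A smooth 10-bit gradient (values 0..1023) across a 1920-pixel row.
gradient_10bit = np.linspace(0, 1023, 1920)

# Naive truncation to 8-bit: every 4 input levels collapse into 1 output
# level, so a smooth ramp turns into flat steps (banding).
truncated = (gradient_10bit / 4).astype(np.uint8)

# Dithered conversion: add noise of about one output step before rounding,
# so neighbouring pixels average out to the original 10-bit level.
noise = rng.uniform(-0.5, 0.5, size=gradient_10bit.shape)
dithered = np.clip(np.round(gradient_10bit / 4 + noise), 0, 255).astype(np.uint8)

print(truncated[:16])  # long flat runs: [0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1]
print(dithered[:16])   # noise mixes adjacent levels, which the eye averages out
--- End code ---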
Also, a 32 bit-depth display actually consists of 4 channels: 3 color channels and 1 alpha channel (8 bits per channel * 4 = 32). Usually the alpha channel is ignored, though.
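The arithmetic, spelled out (nothing here beyond powers of two):
--- Code: ---
# "32-bit colour" is 4 channels of 8 bits, not extra precision per channel.
bits_32 = 8 * 4         # R, G, B + alpha at 8 bits each = 32
bits_30 = 10 * 3        # "30-bit colour": R, G, B at 10 bits each, no alpha

levels_8bit = 2 ** 8    # 256 shades per colour channel
levels_10bit = 2 ** 10  # 1024 shades per colour channel (what Hi10P stores)
print(bits_32, bits_30, levels_8bit, levels_10bit)  # 32 30 256 1024
--- End code ---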
I'm also wondering, since some people have mentioned this already: can you please point out which 10-bit release has worse banding than its 8-bit counterpart? I've never seen one myself, especially among the releases uploaded to BakaBT.
In the end though, 10-bit itself doesn't remove banding; it prevents more banding from being introduced during the encoding process. If the source already has banding, the encoder needs to deband the video before encoding.
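Here's a toy illustration of why the extra bits matter during encoding: a very gentle luma ramp that changes by barely one 8-bit step gets flattened by an 8-bit round-trip but survives a 10-bit one. Purely illustrative; real encoder internals are far more involved:
--- Code: ---
import numpy as np

# A very gentle ramp: luma rises by about one 8-bit step over 1000 pixels.
ramp = np.linspace(0.500, 0.504, 1000)        # normalised luma, 0.0..1.0

round_trip_8 = np.round(ramp * 255) / 255     # stored at 8-bit precision
round_trip_10 = np.round(ramp * 1023) / 1023  # stored at 10-bit precision

print(np.unique(round_trip_8).size)   # 2 distinct values -> one hard band edge
print(np.unique(round_trip_10).size)  # ~5 distinct values -> much finer steps
--- End code ---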
Next time, please ask in a more polite manner. I'm pretty sure many people here would be inclined to answer if you do.
moe_imouto:
--- Quote from: tyrionlannister on December 29, 2011, 02:14:38 PM ---Actually there is no standard or nonstandard H264. H264 is a standard which includes a number of profiles (18), amongst which are HiP (High Profile) and Hi10P (High 10 Profile). Since both are part of the H264 spec, you can't say that one is more standard than the other.
--- End quote ---
Yeah. I should have used 'main profile' rather than 'standard'.
PS: Some profiles, including High 10, were added later as the Fidelity Range Extensions. In that sense, the word 'standard' is OK here.
pcmack101:
A month ago I would have been against any change from 8-bit, but now I'm indifferent. I convert everything I download to .MP4 to watch on my PS3 and 46" Sony LCD TV. November's update to CCCP made it so I can use XviD4PSP5 to re-encode even Hi10P to an MP4 that plays well on the PS3, and it looks damn good, too!
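For anyone who'd rather skip the GUI, something roughly equivalent can be done with plain ffmpeg. This is a sketch, not XviD4PSP5's actual settings, and the filenames are placeholders:
--- Code: ---
# Re-encode a Hi10P MKV to an 8-bit High Profile MP4 the PS3 can play.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "input_hi10p.mkv",
     "-c:v", "libx264", "-profile:v", "high", "-level", "4.1",
     "-pix_fmt", "yuv420p",          # downconvert 10-bit to 8-bit 4:2:0
     "-crf", "18",                   # reasonable quality/size trade-off
     "-c:a", "aac", "-b:a", "192k",  # PS3-friendly audio
     "output_ps3.mp4"],
    check=True,
)
--- End code ---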