
Hi10P and 8-bit encodes


Aerah:

--- Quote from: RedSuisei on December 27, 2011, 09:00:52 PM ---The madVR renderer (not the decoder) doesn't use the CPU, it uses the GPU. The decoder doesn't use much more CPU than ffdshow/LAVFilters (of course there may be some slight difference in CPU usage with madVR versus without). And it does dither the video down to 8-bit.

@Aerah: If your rig can't use the madVR renderer for 10-bit, then it also can't use it for 8-bit. Since madVR itself does its processing in 16-bit (IIRC), 10-bit and 8-bit put roughly the same load on the GPU (I say roughly because a few things do make the two differ slightly, but not by much, I believe). And really, an old nvidia 9400MG with a measly 256MB of onboard video memory can use madVR on 10-bit, so why can't you?

--- End quote ---
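
To make the dithering point concrete, here's a rough sketch (my own illustration, not madVR's actual algorithm, which uses fancier dithering) of what taking 10-bit down to 8-bit looks like with plain truncation versus simple random dithering:

--- Code: ---
import numpy as np

def truncate_10_to_8(plane: np.ndarray) -> np.ndarray:
    # Drop the two least significant bits; smooth gradients band visibly.
    return (plane >> 2).astype(np.uint8)

def dither_10_to_8(plane: np.ndarray) -> np.ndarray:
    # Add +/- half an 8-bit step of noise (2 in 10-bit units) before
    # rounding, which decorrelates the quantisation error: visible
    # banding is traded for fine, much less visible grain.
    noise = np.random.uniform(-2.0, 2.0, plane.shape)
    return np.clip(np.round((plane + noise) / 4.0), 0, 255).astype(np.uint8)

# A smooth horizontal 10-bit gradient, four rows tall
gradient = np.tile(np.arange(1024, dtype=np.uint16), (4, 1))
print(truncate_10_to_8(gradient)[0, 500:508])  # stair-steps: 125 125 125 125 126 ...
print(dither_10_to_8(gradient)[0, 500:508])    # the same steps broken up by noise
--- End code ---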

--- Quote from: DmonHiro on December 27, 2011, 08:12:20 PM ---madVR does use more CPU

--- End quote ---

madVR simply uses a lot more CPU (even with all the 'fancy' settings turned off or set to minimal) compared to EVR (Vista and up) and VMR (XP/2003).
10-bit (Hi10P) decoding has no hardware acceleration, so it is CPU-only; a CPU-hogging renderer will not help 10-bit performance. Using EVR/VMR CP is a much better choice.
That's all. People should avoid madVR if they have performance problems, as it only makes performance WORSE.
:)

IMO, madVR does ride the 'I-can-totally-see-it' quality-difference train.
Might as well start comparing EVR:CP vs. EVR:CP HalfFP vs. EVR:CP FullFP.
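
(For the curious, a quick sketch, purely my own illustration, of why that last comparison would be mostly academic: half-precision floats already carry ~11 significant bits, enough to represent every 8-bit or 10-bit video level distinctly, so the full-precision surface only helps against rounding drift across many processing steps.)

--- Code: ---
import numpy as np

# Normalised 10-bit code values, 0.0 .. 1.0 in 1023 steps.
levels = np.linspace(0.0, 1.0, 1024)

# Round-trip them through half- and full-precision surfaces.
half = np.unique(levels.astype(np.float16))
full = np.unique(levels.astype(np.float32))

# Both keep all 1024 levels distinct; fp16's ~11 significant bits
# cover even 10-bit video, so Half FP vs Full FP only starts to
# matter once rounding errors accumulate over many passes.
print(len(half), len(full))  # 1024 1024
--- End code ---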

straypup:
IMO, compare them as equals for now, but look into it again in the future. I am running an old PCI GeForce FX5500 video card and I can play 10-bit on my system, which is running: Win 7 Pro, 1.5GB RAM, and a 2.0GHz AMD 64 X2 3000+. It's an older computer, but don't let people say that older systems cannot play them.

slimcliffy:
I have not taken the time to read each and every post, so my bad if it has already been said. I was under the impression the whole point and reason BakaBT was made was to keep things as original as possible. This falls under the same line as images being sharpened, blurred, or upscaled, and yes, changing the original bit depth of the image.

bobthedog:

--- Quote from: RedSuisei on December 27, 2011, 11:03:59 PM ---As you said, you should've been adult enough to accept the differences in people's opinions and not demand that others do as you want.

--- End quote ---

Nahhh.  That's not what I said...

Are you, perchance, seeking the American Republican Presidential Nomination?

You are certainly well qualified...

 ::)

(And afterwards, you would make a perfect Fox News commentator.)

Aadieu:

--- Quote from: RedSuisei on December 27, 2011, 10:29:16 PM ---
--- Quote from: DmonHiro on December 27, 2011, 10:22:24 PM --- Also, madVR is the only one that DOESN'T dither to 8-bit for me.

--- End quote ---
It does; otherwise you wouldn't be able to watch anything without a video card that supports 10-bit output and a 10-bit display. madVR dithers the video to the display's bit depth after all of its processing is done. Unless, of course, you're talking about madVR being capable of accepting 10-bit input without it being dithered first by the video decoder, in which case madVR will still dither at the end.


--- End quote ---

A 10-bit video card and a 10-bit display? This recent switch in terminology is hella confusing: does a GeForce 330M outputting 1080p @ 32-bit colour over HDMI to a Samsung full-HD LCD TV meet these criteria?
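
(Working through the terminology, as my own aside: desktop "32-bit colour" counts bits per pixel, including alpha, while "10-bit output" counts bits per channel, so a standard 32-bit desktop is still only 8 bits per channel and would not qualify.)

--- Code: ---
# "32-bit colour" vs "10-bit output" are counted differently:
# the first is bits per *pixel* (including alpha), the second
# is bits per *channel*.
modes = [
    ("32-bit desktop colour", 8, 4),   # R, G, B + alpha, 8 bits each
    ("10-bit 'Deep Color'",  10, 3),   # R, G, B, 10 bits each
]
for name, bits_per_channel, channels in modes:
    print(f"{name}: {bits_per_channel} bits/channel x {channels} channels "
          f"= {bits_per_channel * channels} bits/pixel, "
          f"{2 ** bits_per_channel} shades per channel")
--- End code ---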
