2015-05-21, 18:25
(2015-05-21, 15:42)raul.orozco Wrote: Warner306, I've run the 10-bit test with my brand new OLED TV (LG 55EC930V), and to my eyes the TV can display 10 bits. I can clearly see that the banding goes away in the test pattern when 10-bit output, FSE, and Direct3D 11 are enabled. In that case, should I set Dither to None going forward?
My GPU is a modest Nvidia GT 730 that supports up to Jinc 3 + AR for both chroma and luma scaling without dropping frames, but only for non-interlaced content, which by the way is the vast majority of my video library. Rendering times are around 30-32 ms with this setup.
Thanks a lot !!!
No, you will still want dithering. All madVR processing is done at 16 bits, so the result still has to be dithered down to 10 bits for output. You can get away with Ordered dithering, though, to save processing power, because any dithering pattern is far less visible at the higher bit depth.
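To see why dithering is still needed at 10 bits, here is a small illustrative sketch (my own toy code, not madVR's actual implementation) that quantizes a 16-bit gradient to 10 bits two ways: plain truncation, which produces coarse banded steps, and ordered dithering with a standard 4x4 Bayer threshold matrix, which spreads the rounding error spatially so the average level tracks the true value:

```python
import numpy as np

# Illustrative sketch only -- not madVR's code. Shows why reducing
# 16-bit internal precision to 10-bit output still needs dithering:
# truncation quantizes smooth gradients into visible steps, while
# ordered (Bayer) dithering distributes the rounding error.

# Standard 4x4 Bayer threshold matrix, normalized to [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither_16_to_10(img16):
    """Quantize a 16-bit image (uint16 array) to 10 bits with ordered dithering."""
    h, w = img16.shape
    # Tile the threshold matrix over the whole image
    thresh = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    # 64 input codes map to one output code (16-bit -> 10-bit step)
    scaled = img16.astype(np.float64) / 64.0
    # Adding the per-pixel threshold before truncating makes fractional
    # values round up or down depending on position, so neighborhoods
    # average out to the true level instead of snapping to one step
    return np.clip(np.floor(scaled + thresh).astype(np.uint16), 0, 1023)

# A shallow gradient (16-bit codes 0..255) that bands badly if truncated
img = np.tile(np.arange(256, dtype=np.uint16), (8, 1))

dithered = ordered_dither_16_to_10(img)
truncated = img // 64  # naive quantization for comparison

print("unique levels, truncated:", len(np.unique(truncated)))
print("mean error, truncated:", (truncated.astype(float) * 64 - img).mean())
print("mean error, dithered: ", (dithered.astype(float) * 64 - img).mean())
```

Truncation collapses the 256-code gradient into just 4 flat bands with a large systematic error, while the dithered version has a mean error near zero; the eye integrates the dither pattern into the intended smooth ramp. At 10 bits the quantization step is 4x smaller than at 8 bits, which is why a cheap Ordered pattern is already effectively invisible there.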