2015-08-18, 15:51
(2015-08-18, 12:04)fritsch Wrote: Some GPUs (such as the Nvidia GTX 960, 2nd gen) have some internal circuits to "alter the 10 bit to 8 bit" and then decode it as 8 bit. So we need to be careful in the future if someone really claims: 10-bit decoding :-)
Or would it actually decode the full 10 bits and then scale the output down from 10 to 8 bits? What's the situation on the display side when it comes to 10-bit? Is it already commonly supported? And if the display does not support it, then this kind of downscaling also needs to happen on a fully 10-bit capable GPU.
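Just to illustrate what that downscaling amounts to (not how any particular GPU actually implements it): going from 10-bit to 8-bit is essentially dropping the two least significant bits of each sample. A minimal sketch, assuming simple rounding; real hardware may well dither instead:

Code:
#include <stdint.h>

/* Sketch only: convert one 10-bit video sample (0..1023) to 8-bit (0..255).
 * Real decoders/GPUs may dither rather than round; this just shows the idea. */
static uint8_t sample_10bit_to_8bit(uint16_t s10)
{
    /* +2 rounds to nearest instead of truncating the two low bits */
    uint16_t s8 = (uint16_t)((s10 + 2) >> 2);
    return (uint8_t)(s8 > 255 ? 255 : s8);
}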
I wonder what's next after 10-bit 4K@60fps... they'll improve something again so people keep buying new TVs.
Anyhow, there's no HW decoding of 10-bit content on Intel hardware, whether it gets scaled down to 8-bit or not, and to be honest, I don't see much need for it this year, probably not next year either. 2017 is starting to look like a good time to buy a 4K TV and a new HTPC.