On 20/10/16 19:07, ilker tezcan wrote:
Also, old GPUs don't support hardware-accelerated (GPU) playback of videos encoded as 10-bit x264, so CPU usage goes up while 10-bit videos are playing.
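For anyone who wants to check whether a given file is affected: a quick way to see if a video is 10-bit is to inspect its pixel format with ffprobe (input.mkv is a placeholder filename, and this assumes an ordinary ffmpeg build):

    ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt \
        -of default=noprint_wrappers=1 input.mkv

A 10-bit H.264 stream will report something like pix_fmt=yuv420p10le, while 8-bit material reports yuv420p.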

I think the question of when to use 10-bit can be reduced to "before encoding!". It will take a while before 10-bit or 12-bit finally arrives for consumers. I wouldn't be surprised if higher bit depths don't find wide acceptance until manufacturers have settled on 16-bit: I don't see consumers buying new displays for every 2-bit step. Marketing may try to sell it, but consumers will have a hard time telling the difference. Only at 16-bit will manufacturers find enough common ground to pour it into the consumer market (still-image processing, for example, has been at 16-bit for a while now), and only then may we see 8-bit getting pushed back. We will also likely have 1080p and UHD everywhere, at frame rates of 60 Hz and more, before this happens. Just my $0.02.
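To make the "before encoding" point concrete, here is a minimal sketch of producing a 10-bit encode from an 8-bit source (filenames are placeholders, and this assumes your ffmpeg was built against a 10-bit-capable libx264; older x264 builds shipped 8-bit and 10-bit as separate binaries):

    ffmpeg -i input.mp4 -c:v libx264 -pix_fmt yuv420p10le -crf 18 \
        -preset slow -c:a copy output.mkv

Even with an 8-bit source, encoding at 10-bit can reduce banding in smooth gradients, at the cost of the hardware-decode compatibility issue mentioned above.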
