We are moving from 8-bit to 10-bit video? I am curious: is the human eye really capable of detecting such fine colour resolution? Sure, a single photon may be enough to generate a signal, but that is not the same as actually being able to recognize the colour difference during video playback. For instance, I suspect that in current 8-bit RGB videos we could randomly tamper with the least significant bit of each channel and notice no quality change whatsoever; something like the sketch below would be an easy way to try it.
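
(A minimal sketch of that experiment, assuming NumPy and an 8-bit RGB frame already decoded into an array; the frame here is just random placeholder data, not a real video frame.)

import numpy as np

# Placeholder 8-bit RGB frame (height x width x 3); in practice this would be
# a frame decoded from an actual video.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Randomly overwrite the least significant bit of every channel value.
random_lsb = np.random.randint(0, 2, size=frame.shape, dtype=np.uint8)
tampered = (frame & 0xFE) | random_lsb

# The per-channel error is at most 1 out of 255 full-scale steps.
print("max difference:", np.abs(tampered.astype(int) - frame.astype(int)).max())

Displaying the original and the tampered frame side by side would be a direct way to check whether that last bit is perceptible.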
Though of course, computer vision can probably make use of the higher colour depth in ways that we cannot.

On 6 August 2013 20:33, Alexandr Kuznetsov <[email protected]> wrote:
> Hi,
>
> Also, the CPU doesn't support 16-bit floats or 10-bit integers. There is
> no instruction set to operate on them, so it will be extremely slow.
> However, those formats are used for storage. The GPU, on the other hand,
> might support halfs (16-bit floats).
>
> Best,
> Alex
>
> On 8/6/2013 3:38 AM, Mikhail Rachinskiy wrote:
> > Ohhh mgod! I almost got a heart attack.
> >
> > --
> >
> > Regards,
> > Mikhail Rachinskiy
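
To illustrate the storage-versus-compute point from the quoted reply (a minimal sketch, assuming NumPy; the buffers and the 10-bit-in-16-bit packing are purely illustrative, not any actual Blender format): half floats and 10-bit integers are typically read from storage and widened to a natively supported type before the CPU does any arithmetic on them.

import numpy as np

# Half-precision (16-bit float) pixel data as it might sit in a file buffer;
# np.float16 acts as a storage type here, not a compute type.
stored = np.linspace(0.0, 1.0, 8, dtype=np.float16)

# Widen to 32-bit floats, which the CPU handles natively, before any math.
working = stored.astype(np.float32) * 0.5

# 10-bit integer samples stored in 16-bit containers: mask off the ten low
# bits, then normalise to the usual 0..1 range.
packed = np.array([0, 512, 1023], dtype=np.uint16)
samples = (packed & 0x3FF).astype(np.float32) / 1023.0

print(working)
print(samples)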
