Hi, I have a Gem patch for an installation that basically maps a 1x12px image created with [pix_set] onto a fullscreen [rectangle]. When neighboring pixel values are close to each other, the gradients between them show visible steps that look like low bit depth (and probably are, due to low bit depth). I am looking for a way to display the Gem window at a higher bit depth. My external monitor advertises itself as capable of 30-bit color (which I assume means 10 bits per channel).
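For context, my understanding is that the patch asks Gem to do roughly the equivalent of the following plain C/OpenGL sketch (assuming Gem uses ordinary linear texture filtering; the GLFW setup and the hard-coded pixel values are just for illustration, not taken from my actual patch):

/* Sketch: a 1x12 RGB texture stretched over a fullscreen quad.
 * The GPU interpolates between texels at high precision, but the
 * result is quantized to the window framebuffer's depth on write,
 * which is presumably where the visible banding comes from. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(1920, 1080, "gradient", NULL, NULL);
    if (!win) return 1;
    glfwMakeContextCurrent(win);

    /* 12 nearly identical gray values, like the [pix_set] image */
    unsigned char pixels[12][3];
    for (int i = 0; i < 12; i++)
        pixels[i][0] = pixels[i][1] = pixels[i][2] = (unsigned char)(100 + i);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* rows are 3 bytes wide */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1, 12, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glEnable(GL_TEXTURE_2D);

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_QUADS);                    /* fullscreen textured quad */
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
        glEnd();
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}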
Here are my questions:

* Is it correct that in OpenGL, calculations are done with floats?
* Are the gradients calculated with high (>8-bit) precision?
* Is precision lost on the way to the display?
* What can be done to feed a monitor/projector with a higher bit depth? (see the GLFW sketch below)
* What can be done on macOS with an HDMI projector attached?

Here is a screenshot of the Gem display: https://netpd.org/~roman/tmp/12px-gradients.png
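To frame the last two questions: outside of Gem, a deeper default framebuffer is usually requested at window-creation time. A minimal sketch with GLFW follows (the specific hint values are my assumption; I don't know whether Gem exposes anything comparable, and whether macOS actually honors this over HDMI is exactly what I'm asking):

/* Sketch: asking the windowing system for a 10-bit-per-channel
 * framebuffer via GLFW. These are hints, not guarantees: the
 * driver may silently fall back to 8 bits, so the resulting
 * depth is queried back afterwards. Not Gem code. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) return 1;

    /* request 30-bit color (10 bits per channel, 2 bits alpha) */
    glfwWindowHint(GLFW_RED_BITS,   10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS,  10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow *win = glfwCreateWindow(640, 480, "10bit?", NULL, NULL);
    if (!win) return 1;
    glfwMakeContextCurrent(win);

    /* check what we actually got (legacy query, valid in a
     * compatibility context such as GLFW's default on macOS) */
    GLint r, g, b;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    printf("framebuffer bits: R%d G%d B%d\n", r, g, b);

    glfwTerminate();
    return 0;
}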
Roman
