Geoff Canyon wrote:
The docs say:

    Cross-platform note:  On Mac OS and OS X systems, the
    maximum width of an image is 16384 divided by the
    screen's bit depth. (For example, if the number of colors
    is "Millions", the maximum image width is 4096 pixels.)

First, to get 4096 out of 16384 you have to divide by 4. How is 4 related to "Millions"?

Second, any idea what the other options are?

Now that I think about it, could it mean _byte_ depth? It takes 4 bytes to store a pixel in millions of colors. So then Thousands would be 2, correct? And anything less than Thousands would be 1?

Any help much appreciated, as my bug-free code example has a bug until this is resolved ;-)
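
If that guess is right, the docs' arithmetic would work out something like this (a rough Python sketch; the bytes-per-pixel figures are my assumption, not anything the docs actually state):

    # Assumed bytes per pixel for each colour-depth setting
    BYTES_PER_PIXEL = {
        "Millions": 4,   # 8 bits each for R, G, B plus alpha/padding
        "Thousands": 2,  # 16-bit colour
        "256": 1,        # 8-bit indexed colour
    }

    def max_image_width(color_mode):
        """Maximum width if the limit is really 16384 divided by *bytes* per pixel."""
        return 16384 // BYTES_PER_PIXEL[color_mode]

    for mode in BYTES_PER_PIXEL:
        print(mode, "->", max_image_width(mode), "pixels")
    # Millions -> 4096, Thousands -> 8192, 256 -> 16384

That at least reproduces the 4096 figure the docs give for "Millions".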

You pretty much have it, I think, Geoff. I don't know whether "bit depth" is
the correct term here; it sounds technically wrong to me, since the
concept is byte-based. It may be one of those "casual usages".

Commonly, a pixel uses 4 bytes: one each for red, green and blue, and one for
alpha data. R, G and B having 256 values each, they multiply to give 16,777,216
possibilities according to my steam-driven calculator, hence "millions"
of colours. Thousands would be 2 bytes (256 * 256 = 65,536), quite correct; and one-byte colour, usually called "8-bit" colour, is an arbitrary set of up to 256 colours referenced by a lookup table with 256 entries.

Martin Baxter
