> > I found int ggiCrossBlit(ggi_visual *src, int sx, int sy,
> > int sw, int sh, ggi_visual *dst, int dx, int dy);
> > which has compiler-dependent sizes.
> > Is this intended?
Yes. Compilers choose the size of int to be the data size that the target
machine handles most efficiently.
> > GGI won't work on an 8-bit computer with a display wider than
> > 127 dots. For example, the Sinclair ZX Spectrum's display is
> > 256 dots wide. ;)
Sure, but 8-bit computers tend to have at most 16 bits of address space,
which is too small to run a full-blown LibGGI anyway.
So some hacking would be necessary in any case.
> No, LibGGI expects ANSI C at least. IIRC, in ANSI C 16 bits is as small
> as int can get.
To be precise, ANSI states that int must cover at least the range -32767
to 32767. (Note the symmetric range: it leaves room for one's-complement
and sign-magnitude machines, where one bit pattern is lost.)
CU, ANdy
--
= Andreas Beck | Email : <[EMAIL PROTECTED]> =