> 2) While looking at the GGI API it struck me that there are ggiPutHLine,
> ggiGetHLine, ggiPutVLine, ggiGetVLine.
> Why isn't there a ggiPutLine / ggiGetLine?
What would you need that for? Filling horizontal (or sometimes vertical)
lines with a pattern is a common operation for scanline-oriented games and
the like when building up their screen. Filling arbitrarily tilted lines
doesn't seem as useful, especially as the pattern gets distorted by an
amount that depends on the inclination of the line: a run of n pixels
along the major axis covers a geometric length of up to n*sqrt(2), so you
would have to precompensate for that for it to make any sense.
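For reference, here is a minimal sketch of how the horizontal variant is
typically used. I'm assuming an already open visual in an 8-bit mode, so
that one byte holds one pixel (ggiPutHLine expects the buffer in the
visual's native pixel format); pattern_hline is just a name I made up:

  #include <ggi/ggi.h>

  /* Fill one scanline with a repeating two-colour pattern. */
  static void pattern_hline(ggi_visual_t vis, int y, int w,
                            unsigned char a, unsigned char b)
  {
      unsigned char buf[1024];
      int i;

      if (w > 1024)                     /* keep within the buffer */
          w = 1024;
      for (i = 0; i < w; i++)
          buf[i] = (i & 1) ? b : a;     /* abab... pattern */

      ggiPutHLine(vis, 0, y, w, buf);   /* blit the scanline */
  }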
> Is this already implemented?
No. It would be relatively trivial to do, but I don't think it is worth
the bother. If you want it for things like dotted lines, I'd rather put
that in LibXMI (or LibGGI2D, if it were maintained).
The other reason for not implementing it is that it could not be
accelerated nicely on most graphics HW.
> It strikes me that this could be useful
Could you explain? Maybe I missed a useful case?
> and there doesn't seem to be any way for the user to implement these
> routines themselves.
He can fall back to using a Bresenham algorithm and draw the line
himself. I think the case is rare enough that the few users who need it
will take that trouble.
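To illustrate, here is a minimal sketch of such a fallback (my_put_line
is a made-up name; it takes one ggi_pixel per point along the line's
major axis and uses only the documented ggiPutPixel call). Note that its
rounding will not automatically match LibGGI's own renderer, which is
exactly the pitfall described below:

  #include <stdlib.h>
  #include <ggi/ggi.h>

  /* Hypothetical user-level "PutLine": walk the line with the
   * classic integer Bresenham algorithm and put one pixel from
   * buf at each step. buf must hold max(|dx|,|dy|)+1 pixels. */
  static void my_put_line(ggi_visual_t vis, int x0, int y0,
                          int x1, int y1, const ggi_pixel *buf)
  {
      int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
      int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
      int err = dx + dy, e2;
      int i = 0;

      for (;;) {
          ggiPutPixel(vis, x0, y0, buf[i++]);
          if (x0 == x1 && y0 == y1)
              break;
          e2 = 2 * err;
          if (e2 >= dy) { err += dy; x0 += sx; }  /* step in x */
          if (e2 <= dx) { err += dx; y0 += sy; }  /* step in y */
      }
  }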
> It is quite doable (even I could implement this).
Be careful. LibGGI is specified to draw "perfect" lines, i.e. lines whose
pixels are within +/- 0.5 of the mathematically correct points, with some
extra rules for resolving the "exactly 0.5 away" cases.
The reason for this is that hardware and software drawing have to match
up exactly: when hardware acceleration fails for some reason (like video
output having been turned on, or similar), software rendering must still
be able to erase a line that was just drawn by the hardware.
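To make the constraint concrete: for a mostly-horizontal line
(|dx| >= |dy|), every plotted pixel has to satisfy something like the
check below. This is my own formulation; the tie-break for the
exactly-0.5 case is governed by the spec's extra rules, which the sketch
glosses over:

  #include <math.h>

  /* Does the drawn y for column x lie within half a pixel of the
   * mathematically ideal line through (x0,y0)-(x1,y1)? */
  static int y_is_perfect(int x0, int y0, int x1, int y1,
                          int x, int y_drawn)
  {
      double ideal = y0 + (double)(y1 - y0) * (x - x0) / (x1 - x0);
      return fabs(y_drawn - ideal) <= 0.5;
  }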
I don't doubt you could do it, but it's not as easy as it seems, unless
of course you duplicate the existing line-drawing code.
CU, Andy
--
= Andreas Beck | Email : <[EMAIL PROTECTED]> =