On Sat, 18 Jan 2003, Andreas Beck wrote:
- Why do we use that complex X font structure which will also cause problems
for depths below 8, when we could just convert it once and for all
in misc.c to a bitmap format of our own?
The X font structure is used so that the backbuffer can be
This should fix the clipping problem. I'm going to commit it into
the devel tree and if no one finds it to cause trouble it can be
placed in the stable tree.
--
Brian
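The "convert it once and for all to a bitmap format of our own" idea above could be sketched as follows. This is purely illustrative: `glyph_to_ownfmt` and the one-byte-per-pixel layout are invented names, not the real LibGGI or misc.c structures; the input is assumed to be an X-style glyph row (one padded byte per 8 pixels, MSB first).

```c
#include <stdint.h>

#define GLYPH_W 8
#define GLYPH_H 8

/* Expand an X-style packed glyph (MSB-first bits) into a private
 * one-byte-per-pixel bitmap that any target depth can blit from
 * directly, so the conversion cost is paid once at load time. */
static void glyph_to_ownfmt(const uint8_t xrows[GLYPH_H],
                            uint8_t out[GLYPH_W * GLYPH_H])
{
    for (int y = 0; y < GLYPH_H; y++)
        for (int x = 0; x < GLYPH_W; x++)
            out[y * GLYPH_W + x] = (xrows[y] >> (7 - x)) & 1;
}
```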
? .deps
? .libs
? Makefile
? X.la
? box.lo
? buffer.lo
? color.lo
? fillscreen.lo
? gtext.lo
? hline.lo
? line.lo
On Mon, 20 Jan 2003, Fabio Alemagna wrote:
On Mon, 20 Jan 2003, Brian S. Julin wrote:
Stupid question from an outsider: couldn't it be possible to make the
application still run by making it use an offscreen buffer while not
visible because of VT switching? It would be really
On Mon, 20 Jan 2003, Fabio Alemagna wrote:
On Mon, 20 Jan 2003, Jos Hulzink wrote:
Why should you backup the whole gfx board's memory? Isn't there any way to
back up only the area actually used by the application?
You know, Amigas deal with full screen graphics and swappable screens
On Tue, 21 Jan 2003, Christoph Egger wrote:
How about using the memory target within libggi's kgi-target, when the
application runs in background?
This background mode can be done by first waiting until the accel
is idle, then copying the framebuffer content into the userspace memory
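The save-on-switch-away scheme described above might look roughly like this. All the names here (`fb_save`, `fb_restore`, `accel_wait_idle`) are invented for illustration, not real KGI or LibGGI calls; the framebuffer is modelled as plain memory.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Stand-in for draining pending accelerator commands; a real driver
 * would poll the engine-busy bit or wait on an IRQ. */
static void accel_wait_idle(void)
{
}

/* Copy the visible framebuffer into a malloc'ed shadow buffer
 * before the VT switch completes. */
static uint8_t *fb_save(const uint8_t *fb, size_t len)
{
    uint8_t *shadow;

    accel_wait_idle();          /* no accel/DMA may touch fb during the copy */
    shadow = malloc(len);
    if (shadow)
        memcpy(shadow, fb, len);
    return shadow;
}

/* Put the saved contents back when the VT is switched to again. */
static void fb_restore(uint8_t *fb, const uint8_t *shadow, size_t len)
{
    accel_wait_idle();
    memcpy(fb, shadow, len);
}
```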
On Tue, 21 Jan 2003, Christoph Egger wrote:
How about using the memory target within libggi's kgi-target, when the
application runs in background?
That's basically what I mean.
[...]
The big question is whether this will still work once libggiovl becomes usable
and the GGI apps request
On Tue, 21 Jan 2003, Christoph Egger wrote:
How about using the memory target within libggi's kgi-target, when the
application runs in background?
That's basically what I mean.
Then you were not precise enough. By using the word offscreen, everybody
thought you meant an offscreen area
Hi,
On Mon, 20 Jan 2003, Jos Hulzink wrote:
On Mon, 20 Jan 2003, Fabio Alemagna wrote:
On Mon, 20 Jan 2003, Brian S. Julin wrote:
Stupid question from an outsider: couldn't it be possible to make the
application still run by making it use an offscreen buffer while not
visible
On Tue, 21 Jan 2003, Christoph Egger wrote:
On Tue, 21 Jan 2003, Christoph Egger wrote:
How about using the memory target within libggi's kgi-target, when the
application runs in background?
That's basically what I mean.
Then you were not precise enough. By using the word
On Tue, 21 Jan 2003, Christoph Egger wrote:
Why not? If there's enough space in the gfx board memory then the
offscreen buffer should be allocated there.
If you mean both, then please say that... :-)
Well, offscreen to me just means non visible but still available, so it
didn't matter to me
Fabio Alemagna [EMAIL PROTECTED] wrote:
Stupid question from an outsider: couldn't it be possible to make the
application still run by making it use an offscreen buffer while not
visible because of VT switching? It would be really annoying, imho, if
the application stopped altogether...
Been
I've just again stumbled over another minor annoyance:
For the widget lib, the default behaviour of LibGGI to make the cursor small
is not quite good. As there is already a -nocursor flag, it would
be pretty nice to also have -keepcursor, i.e. don't touch it, keep
whatever it has now.
Shouldn't
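The proposed flag handling could be sketched as below. The `CURSOR_*` modes and `parse_cursor_flag` are invented for illustration; the real LibGGI option parsing works differently.

```c
#include <string.h>

enum cursor_mode {
    CURSOR_SMALL,   /* current default: make the cursor small */
    CURSOR_OFF,     /* existing -nocursor behaviour */
    CURSOR_KEEP     /* proposed -keepcursor: leave it untouched */
};

/* Map a command-line flag to a cursor policy; anything else keeps
 * the present default of shrinking the cursor. */
static enum cursor_mode parse_cursor_flag(const char *arg)
{
    if (arg && strcmp(arg, "-nocursor") == 0)
        return CURSOR_OFF;
    if (arg && strcmp(arg, "-keepcursor") == 0)
        return CURSOR_KEEP;
    return CURSOR_SMALL;
}
```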
On Tue, 21 Jan 2003, Andreas Beck wrote:
Why not? If there's enough space in the gfx board memory then the
offscreen buffer should be allocated there.
And not be available for another application I start on the switched-to
console?
Why not? Just make the bg app go back to use its own
On Tue, 21 Jan 2003, Andreas Beck wrote:
Fabio Alemagna [EMAIL PROTECTED] wrote:
Stupid question from an outsider: couldn't it be possible to make the
application still run by making it use an offscreen buffer while not
visible because of VT switching? It would be really annoying, imho,
Why not? If there's enough space in the gfx board memory then the
offscreen buffer should be allocated there.
And not be available for another application I start on the switched-to
console?
Why not? Just make the bg app go back to use its own offscreen buffer
That requires
Been there, done that. Annoying.
What exactly is annoying?
See below.
The hardware has quite some state, and not all of it can be retrieved
easily. Thus, basically, IMHO we will have to face the fact that the
application will have to cooperate a little when switching away.
If Amigas
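The "cooperate a little" idea above could amount to a hook the application registers, which the library invokes before the VT switch completes, so the app can save state only it knows about. All the names here (`switch_hook_t`, `register_switch_hook`, `on_switch_away`) are invented for this sketch.

```c
#include <stddef.h>

/* Callback the application supplies to save its own hardware-dependent
 * state (textures, palettes, pending accel setup) before switch-away. */
typedef void (*switch_hook_t)(void *priv);

static switch_hook_t switch_hook;
static void *switch_priv;

static void register_switch_hook(switch_hook_t fn, void *priv)
{
    switch_hook = fn;
    switch_priv = priv;
}

/* Called by the library when the VT is about to be switched away. */
static void on_switch_away(void)
{
    if (switch_hook)
        switch_hook(switch_priv);
}

/* Demo hook: just record that we were asked to save state. */
static int demo_saved;
static void demo_save_state(void *priv)
{
    (void)priv;
    demo_saved = 1;
}
```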
My general feelings towards the prospect of keeping tasks drawing in
the background, whether on memvisuals or in VRAM, is that it would be
a very nice feature but it would also be an incredibly hard and time
consuming thing to implement.
The challenges, most already noted by others in this
On Mon, 20 Jan 2003, Filip Spacek wrote:
On Tue, 21 Jan 2003, Andreas Beck wrote:
Bottom line:
Yes, it should be possible for an application to save its screen contents
on switchaway. However, I do not recommend trying to outsmart the application
and doing that behind its back. Tell
On Tue, 21 Jan 2003, Fabio Alemagna wrote:
C'mon, guys, it's not about outsmarting applications, other OSes can do
it pretty well.
Other OSes don't allow different applications to run in different video
modes. X doesn't either.
--
Brian
On Tue, 21 Jan 2003, Andreas Beck wrote:
Why not? If there's enough space in the gfx board memory then the
offscreen buffer should be allocated there.
And not be available for another application I start on the switched-to
console?
Why not? Just make the bg app go back to use its
On Mon, 20 Jan 2003, Brian S. Julin wrote:
On Tue, 21 Jan 2003, Fabio Alemagna wrote:
C'mon, guys, it's not about outsmarting applications, other OSes can do
it pretty well.
Other OSes don't allow different applications to run in different video
modes.
Really? What about AROS and
On Tue, 21 Jan 2003, Andreas Beck wrote:
The hardware has quite some state, and not all of it can be retrieved
easily. Thus, basically, IMHO we will have to face the fact that the
application will have to cooperate a little when switching away.
If Amigas can do it,
Amigas have a very
O.K. - last one for today.
Windows cannot. Look at any semi-crashed application (i.e. one in an
endless loop) there. If it does not redraw a window on its own anymore,
it will not revert to the previous state when you move something over it.
What has that to do with the matter we're
Fabio Alemagna [EMAIL PROTECTED] wrote:
Not necessarily, it just requires a way to centralize memory allocations.
You are only talking about memory. I have already shown that and how it can
be done.
How do you handle half-done accelerator commands? I have not yet heard a
convincing way to
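The "centralize memory allocations" idea mentioned above could be as simple as one arena that hands out offscreen regions from the card's memory and reports failure so the caller can fall back to a system-RAM shadow buffer. `vram_alloc`, the bump-pointer scheme, and the 4 MB pool size are all invented for this sketch.

```c
#include <stddef.h>

#define VRAM_SIZE (4 * 1024 * 1024)

static size_t vram_used;

/* Hand out the next free region of card memory as an offset, or
 * (size_t)-1 if it does not fit, in which case the caller falls
 * back to an offscreen buffer in system RAM. */
static size_t vram_alloc(size_t len)
{
    if (vram_used + len > VRAM_SIZE)
        return (size_t)-1;
    size_t off = vram_used;
    vram_used += len;
    return off;
}
```

A real allocator would also need to free and compact regions, and (per the objection above) draining half-done accelerator commands is a separate problem that memory management alone does not solve.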