GGI leak
I was fixing a lot of leaks in GGL, and I think I've found a leak in GGI itself. Here is the stack info:

    Leaked 0x806ecf0 (2048 bytes)
    _default_zero():       /tmp/degas/lib/ggi2-1.99.2.0b2.1/ggi/visual.c:46
    alloc_fb():            /tmp/degas/lib/ggi2-1.99.2.0b2.1/display/memory/mode.c:91
    _GGIdomode():          /tmp/degas/lib/ggi2-1.99.2.0b2.1/display/memory/mode.c:145
    _GGIdomode():          /tmp/degas/lib/ggi2-1.99.2.0b2.1/display/memory/mode.c:169
    _ggiCheck4Defaults():  /tmp/degas/lib/ggi2-1.99.2.0b2.1/ggi/mode.c:68
    ggl_visual_set_mode(): /home/ryu/devel/GPUL/ggl/src/graphics/gglvisual.c:105
    ggl_timeout_start():   /home/ryu/devel/GPUL/ggl/src/ggltimeout.c:152
    ggl_timeout_start():   /home/ryu/devel/GPUL/ggl/src/ggltimeout.c:152
    ggl_timeout_start():   /home/ryu/devel/GPUL/ggl/src/ggltimeout.c:152
    ggl_timeout_start():   /home/ryu/devel/GPUL/ggl/src/ggltimeout.c:152
    (???)
    _start():              (null):0

I thought that maybe this leak was produced by GGL code, but I couldn't find any more leaks in it. The code at line 105 of ggl_visual_set_mode() is:

    error = ggiSetMode(visual->visual, gmode);

and visual->visual is a successfully opened visual of the type shown below:

    struct _GglVisual {
        GglBase      base;    /* Base class       */
        int          id;      /* Class identifier */
        ggi_visual_t visual;  /* GGI visual       */
        [...]

The code at line 152 of ggl_timeout_start() does nothing but return a number (no mallocs). The reason the stack looks like this is that I catch the alarm signal and use Linux interval timers [setitimer()].

The example that I've tested creates 24 visuals (1 with target X and 23 with target memory) and never loses pointers to any of them. I hope this is enough information :)

--
 _
/_) \/ / /  email: mailto:[EMAIL PROTECTED]
/ \ / (_/   www  : http://programacion.mundivia.es/ryu
[ GGL developer ]  [ IRCore developer ]  [ GPUL member ]
Re: [linux-fbdev] Accelerated 2D graphics lib
James Simmons wrote:
> On Sat, 11 Dec 1999, Jeff Garzik wrote:
> > IMNSHO there is no need to waste time and duplicate effort by creating
> > an accelerated 2D gfx lib for fbdev... OpenGL already exists, and it
> > already has an optimized rendering pipeline for 2D as well as 3D.
>
> It already exists in OpenGL. Mesa-GGI uses libGGI, which in turn supports
> many targets, one of them being fbdev. Look into adding accel support to
> libGGI, as Mesa-GGI programs work on the console or in X.

Well, since it is so easy, there is no reason why unaccelerated [direct] fbdev support shouldn't be added to OpenGL. Like I said, it's an easy hack job off of src/SVGA/svgamesa.c.

As for 2D and 3D acceleration, I'll answer that question when I get to it. For S3, I would like to put all possible smarts into s3lib. That would allow sharing of S3 code between s3fb, XFree86, GGI, GLX, DRI, etc. Then people can pick and choose the flavor which suits them best.

> Last I heard GGI was working with XFree86 to support accels when using
> the X environment.

That sounds like a real mess to me, but it seems doable. GGI is a good idea, but since it attempts to be so generic, it often makes things more complex than they need to be...

--
Jeff Garzik         | Just once, I wish we would encounter
Building 1024       | an alien menace that wasn't immune to
MandrakeSoft, Inc.  | bullets.  -- The Brigadier, "Dr. Who"
Re: [linux-fbdev] Accelerated 2D graphics lib
James Simmons wrote:
> On Sun, 12 Dec 1999, Jeff Garzik wrote:
> > > It already exists in OpenGL. Mesa-GGI uses libGGI, which in turn
> > > supports many targets, one of them being fbdev. Look into adding
> > > accel support to libGGI, as Mesa-GGI programs work on the console
> > > or in X.
> >
> > Well, since it is so easy, there is no reason why unaccelerated
> > [direct] fbdev support shouldn't be added to OpenGL. Like I said,
> > it's an easy hack job off of src/SVGA/svgamesa.c.
>
> Why? It's already [there ...]

fbdev works on many more platforms than GGI. fbdev is the primary console for a lot of non-x86 Linux platforms, and KGI doesn't exist (or at least hasn't been tested/debugged) on all those platforms yet.

--
Jeff Garzik         | Just once, I wish we would encounter
Building 1024       | an alien menace that wasn't immune to
MandrakeSoft, Inc.  | bullets.  -- The Brigadier, "Dr. Who"