Sorry, the default refresh rate is actually 40 fps, not 60. That value is still
somewhat arbitrary; I'm not sure what the best strategy would be for picking a
more sensible default. We could run a quick benchmark the first time an
application starts and select the fastest rate at which the event loop can
still idle?
If your application relies on a very specific refresh rate, I would suggest a
custom event loop (see http://wiki.libagar.org/wiki/Custom_event_loop) instead
of relying on AG_SetRefreshRate(). Some drivers in the next release (1.4) will
ignore the refresh rate setting entirely (e.g., the GLX driver will redraw
only in response to "expose" events from X).

On Wed, Sep 23, 2009 at 03:10:21PM -0300, Paulo Pinheiro wrote:
> OK, I have new information.
>
> I made a simple example here:
>
>     #include <agar/core.h>
>     #include <agar/gui.h>
>
>     int main(int argc, char *argv[])
>     {
>         AG_Window *win;
>
>         AG_InitCore("player_demo", 0);
>         agVerbose = 1;
>         AG_InitVideo(640, 480, 0, AG_VIDEO_OPENGL);
>
>         win = AG_WindowNew(0);
>         AG_WindowMaximize(win);
>         AG_WindowShow(win);
>
>         AG_SetRefreshRate(50);
>         AG_BindGlobalKey(SDLK_ESCAPE, KMOD_NONE, AG_Quit);
>         AG_EventLoop_FixedFPS();
>         return (0);
>     }
>
> When NOT using AG_SetRefreshRate() in this example, the CPU load was OK,
> between 9% and 15%. Here's a top "screenshot":
>
>      PID COMMAND    %CPU  TIME    #TH #PRTS #MREGS RPRVT RSHRD RSIZE VSIZE
>     1176 player_dem 14.1% 0:04.03   2    95    211 4096K   23M   14M  393M
>
> When I set AG_SetRefreshRate(50) to 50 fps, the CPU usage was between
> 80% and 101%:
>
>      PID COMMAND    %CPU  TIME    #TH #PRTS #MREGS RPRVT RSHRD RSIZE VSIZE
>     1202 player_dem 99.7% 1:00.17   2    95    206 4616K   22M   14M  393M
>      264 firefox-bi  8.3% 4:11.66  15   145    812   59M   22M  121M  491M
>
> If Agar runs at 60 fps by default, why do I get such high CPU usage for a
> simple window when using 50 fps?
>
> Here is my GL info:
>
>     Video display is 32bpp (00ff0000,0000ff00,000000ff)
>     Reference surface is 32bpp (000000ff,0000ff00,00ff0000,ff000000)
>     GL Version: 1.2 APPLE-1.5.48
>     GL Vendor: Intel Inc.
>     GL Renderer: Intel GMA X3100 OpenGL Engine
>
> By the way, I'm using a Mac, but it happens on Linux too.
>
> I'm hoping that I am doing something wrong and waiting for clarifications.
>
> Thanks, Julien.
>
> On Wed, Sep 23, 2009 at 4:21 AM, Julien Nadeau <[email protected]> wrote:
>
> > On Wed, Sep 23, 2009 at 03:50:54AM -0300, Paulo Pinheiro wrote:
> > > Hello,
> > >
> > > I'm using Agar for a video player application, and I noticed high CPU
> > > usage. When using Agar, I saw the app eating up 90-100% of the CPU
> > > (tested with top), while with pure SDL it was only about 30%. I tried
> > > to profile the program: 30% of the time was in the draw routine and the
> > > rest in SDL_PumpEvents. Note that all of the player's complexity is in
> > > the draw routine. Is this normal behavior, or am I doing something
> > > wrong?
> >
> > The stock event loop will use all the CPU it needs to achieve your
> > target refresh rate; the default refresh rate is 60 fps. If you use SDL
> > mode, you'll probably want a much lower value so that the event loop
> > gets a chance to idle. See: http://libagar.org/man3/AG_SetRefreshRate
> >
> > In SDL mode, most of your CPU time is spent filling rectangles and
> > drawing pixels in software. This is why SDL mode is not recommended
> > except on platforms lacking hardware-accelerated graphics. In OpenGL
> > mode (pass AG_VIDEO_OPENGL or AG_VIDEO_OPENGL_OR_SDL to AG_InitVideo()),
> > all rendering is done efficiently in hardware, with almost no CPU usage
> > involved.
> >
> > It is always preferable for applications to support both modes: OpenGL
> > for performance, falling back to SDL for portability.
>
> --
> Paulo Victor de Almeida Pinheiro
> ------------------------------
> Laboratório de Redes de Comunicação e Segurança da Informação - LARCES
> Mestrado em Ciência da Computação - UECE

_______________________________________________
Agar mailing list
[email protected]
http://libagar.org/lists.html
