Failed to post via Gmane, so trying direct emailing instead: I need to figure out why my program uses 10% CPU when it is essentially idle, waiting either for a client to connect via TCP or for the start time of a new task to arrive. It is a command line program running as a service on Linux (Raspberry Pi 4). Developed on the RPi4 with Lazarus 2.0.12 and FPC 3.2.0.
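For reference, the per-process CPU load can be sampled from the shell before any profiler is involved; if sysstat's pidstat is available, its -w option also shows context switches per second, which is telling for a timeout-polling loop. The sleep process below is only a stand-in for the real service PID:

```shell
# A dummy idle process stands in for the real service PID here.
sleep 30 &
PID=$!

# One-shot CPU percentage (cputime/elapsed) as seen by ps:
ps -o %cpu= -p "$PID"

# If sysstat is installed, pidstat samples CPU once per second, and
# -w reports context switches/s; a loop waking every 5 ms should show
# on the order of 200 voluntary switches per second:
#   pidstat -p "$PID" 1 5
#   pidstat -w -p "$PID" 1 5

kill "$PID"
```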
This is the main loop in the program, following creation of the handler objects (which are idle at this point):

  while not (bSTerm or bSInt or bSHup) do
  begin  //Eternal loop to wait for system messages
    CheckSynchronize(5);  //Timeout here instead of using sleep
    if CheckRestartReq then
    begin
      Debug_Writeln('Restart timeout reached - exiting');
      FLogServ.StdLog('Restart timeout reached - exiting');
      Break;
    end;
  end;

The while loop exits on system messages sent by systemd when running as a service, or from the keyboard (Ctrl-C) when started in a terminal.

Can this loop by itself cause the 10% CPU usage? The CheckSynchronize() call is needed because some of the handler objects spawn communication threads that rely on it to transfer their data to the main object.

I have looked for a way to profile the program and found LazProfiler, which has a forum support thread in which I posted questions about profiling my problem:
https://forum.lazarus.freepascal.org/index.php/topic,38983.msg420564.html#msg420564

But it looks like an uphill task to do the profiling...

-- 
Bo Berglund
Developer in Sweden

_______________________________________________
lazarus mailing list
lazarus@lists.lazarus-ide.org
https://lists.lazarus-ide.org/listinfo/lazarus
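P.S. One cheap experiment regarding the loop above: CheckSynchronize(5) wakes the main thread roughly 200 times per second even when nothing happens. As far as I know, FPC's CheckSynchronize waits on an internal event that Synchronize/Queue calls from worker threads set immediately, so raising the timeout should not delay the data transfer from the communication threads; it only makes the CheckRestartReq poll coarser. A sketch of the same loop with a 250 ms timeout (assuming that wake-early behavior holds in FPC 3.2.0):

```pascal
{ Sketch: identical loop, but blocking up to 250 ms per iteration.
  Idle wakeups drop from ~200/s to ~4/s; queued thread data should
  still be processed promptly because CheckSynchronize returns early
  when a worker thread calls Synchronize/Queue. CheckRestartReq is
  now polled every ~250 ms, which is usually fine for a shutdown check. }
while not (bSTerm or bSInt or bSHup) do
begin
  CheckSynchronize(250);  //Block up to 250 ms; wakes early on queued work
  if CheckRestartReq then
  begin
    Debug_Writeln('Restart timeout reached - exiting');
    FLogServ.StdLog('Restart timeout reached - exiting');
    Break;
  end;
end;
```

If CPU usage stays near 10% after this change, the loop is likely not the culprit and the communication threads themselves would be the next place to look.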