So I have this service application which was ported from a Windows
Delphi 2007 TService app to a regular command line program for use on
Linux. It runs on a Raspberry Pi 3B+.

Normally it sits idle, doing nothing but waiting for a client to
connect via TCP, for a scheduled time to arrive at which it needs to
execute a task, or for system signals from systemd.

In the main program, after initialization is done, there is a loop
that looks like this:

  While not (bSTerm or bSInt or bSHup) do //Exit on system signals
  begin
    //Eternal loop waiting for system messages; the actual server
    //runs in its own threads as defined elsewhere
    CheckSynchronize(5); //Timeout here instead of using sleep
    if CheckRestartReq then //Check if we need to restart the service
    begin
      Debug_Writeln('Restart timeout reached - exiting');
      FLogServ.StdLog('Restart timeout reached - exiting');
      break;
    end;
  end;
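
For reference, the three exit flags are set from Unix signal handlers
installed with FpSignal from the BaseUnix unit. Roughly like this (a
simplified sketch, not verbatim from my code):

  uses
    BaseUnix;

  var
    bSTerm: Boolean = false;
    bSInt: Boolean = false;
    bSHup: Boolean = false;

  procedure HandleSignal(Sig: cint); cdecl;
  begin
    case Sig of
      SIGTERM: bSTerm := true; //systemd stop
      SIGINT:  bSInt := true;  //Ctrl+C when run from a terminal
      SIGHUP:  bSHup := true;
    end;
  end;

  //During initialization, before entering the loop:
  FpSignal(SIGTERM, @HandleSignal);
  FpSignal(SIGINT, @HandleSignal);
  FpSignal(SIGHUP, @HandleSignal);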

The actual work happens inside objects that are created before this
loop is reached, and it is triggered by the following events:

If a TCP connection is made from a client, a number of handler objects
are created and started for the client communication. This is done
using Indy. If there is no client connection, these objects do not
exist.
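
In essence the listener is set up like this (a trimmed sketch; the
class name, handler name and port are made up, and the real handler
classes are more involved):

  uses
    IdTCPServer, IdContext;

  //During initialization:
    FServer := TIdTCPServer.Create(nil);
    FServer.DefaultPort := 8080; //Port just for illustration
    FServer.OnExecute := @ServerExecute; //One thread per connected client
    FServer.Active := true;

  procedure TSvcMain.ServerExecute(AContext: TIdContext);
  begin
    //The per-client handler objects live here; when no client is
    //connected this event never fires and nothing executes
  end;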

On a timer timeout a check is made whether any programmed tasks are
waiting for execution. Tasks can be scheduled with one-minute
resolution. If no task is waiting, nothing happens.
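
The timer part is basically this (a sketch using TFPTimer from the
fptimer unit; HasTaskDueAt and RunDueTasks are made-up names standing
in for my scheduler class):

  uses
    SysUtils, fptimer;

  //During initialization:
    FTimer := TFPTimer.Create(nil);
    FTimer.UseTimerThread := true; //Fire from a background thread
    FTimer.Interval := 30000; //ms, so a whole minute cannot be skipped
    FTimer.OnTimer := @TimerTick;
    FTimer.Enabled := true;

  procedure TSvcMain.TimerTick(Sender: TObject);
  var
    ThisMinute: TDateTime;
  begin
    //Truncate the current time to minute resolution and look it up
    ThisMinute := Trunc(Now * MinsPerDay) / MinsPerDay;
    if HasTaskDueAt(ThisMinute) then
      RunDueTasks(ThisMinute);
  end;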

In this scenario I expected next to no CPU usage when this runs as a
service controlled by systemd, but in fact top shows the service
consuming 10% CPU while it is doing nothing: no client connected and
no task running...

How can I find out what is the cause of this?
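
One check I can think of: count the loop iterations to see whether
CheckSynchronize(5) actually blocks for about 5 ms or returns
immediately and lets the loop spin. Around 2000 iterations per 10 s
would mean it blocks; millions would mean it spins. A sketch
(LoopCount and LastReport are diagnostic additions only):

  uses
    SysUtils, DateUtils, Classes;

  var
    LoopCount: Int64 = 0;
    LastReport: TDateTime;

  LastReport := Now;
  While not (bSTerm or bSInt or bSHup) do
  begin
    CheckSynchronize(5);
    Inc(LoopCount);
    if SecondsBetween(Now, LastReport) >= 10 then
    begin
      Debug_Writeln(Format('%d loop iterations in the last 10 s', [LoopCount]));
      LoopCount := 0;
      LastReport := Now;
    end;
    //...rest of the loop body unchanged
  end;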

I am developing using Lazarus 2.0.12 and FPC 3.2.0 on the RPi3B+.


-- 
Bo Berglund
Developer in Sweden
