This is a known issue.

The basic memory consumption model is as follows:

Scan Daemon Parent - memory consumed to read in pertinent script info.
The amount of memory consumed is directly proportional to the number of
scripts you have.

Scan Task (child of Scan Daemon) - one forked for each connection opened
to service a client request.  Memory from parent would ideally be in
a "copy-on-write" mode, but it appears that all of the parent's memory
is copied, probably due to memory structure changes as the parent gets
ready to begin running a scan (building deps? setting other flags not
previously set?).

Scan-IP (child of Scan Task) - one forked for each IP address that is
to be scanned.

Script Execution (child of Scan-IP) - one forked for each new script.
Again, has copy-on-write memory, so while 'ps' will show high memory
usage, an overall system view of memory consumption shows only marginal
increase in memory usage for most scripts.

If we take a typical scan request of, say, a class C network, your
optimum platform memory consumption comes from a model where a client
connects to the scanner and passes all IP addresses to be scanned
in a single request.  If we were to allow all IP addresses to be scanned
simultaneously, we would have a total of 1+1+ConcIP+256*ConScript
processes running, where ConcIP is the concurrent number of IPs being
tested, and ConScript is the concurrent number of scripts executed at
any one time against a given IP.

The worst-case scenario is a separate client connection to the
scanner for each IP to be tested, in which case we would have
1+ConcIP+ConcIP+256*ConScript.  Since the memory consumption
is dominated by the first three terms, it becomes important to put
multiple targets into a single request (i.e. minimize client
to scanner daemon connections) so as to minimize memory usage.

Thomas


On 28/11/12 10:09 AM, Jan-Christopher Brand wrote:
> Hi,
> 
> I saw that with each task started, the OpenVAS scanner needs about 7 MB
> more memory. After I updated to the newest version from trunk - because
> I’ve seen something about fixed memory leaks - it got better, but still
> each start of a task adds 3.2 MB of memory usage to the scanner. Is this
> a known behavior? And will it be fixed? I’m starting thousands of tasks
> one after another, so the 3.2 MB adds up quickly ;-)
> 
> Best regards,
> 
> Jan-Christopher Brand
> 
> _______________________________________________
> Openvas-devel mailing list
> Openvas-devel@wald.intevation.org
> https://lists.wald.intevation.org/cgi-bin/mailman/listinfo/openvas-devel
