On Sun, Aug 28, 2022 at 8:24 AM Dale <[email protected]> wrote:
>
> What I would like to do is limit the amount of memory torrent
> software can use.

While ulimit/cgroups/etc will definitely do the job, they're probably
not the solution you want.  Those will cause memory allocation to
fail, and I'm guessing at that point your torrent software will just
die.

I'd see if you can do something within the settings of the program to
limit its memory use, and then use a resource limit at the OS level as
a failsafe, so that a memory leak doesn't eat up all your memory.
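As a sketch of that OS-level failsafe: if the client runs as a systemd service, a cgroup memory cap can go in a drop-in file. The unit name here is just an example (adjust it for whatever client you run), and the limits are arbitrary; MemoryHigh/MemoryMax are the cgroups-v2 controls documented in systemd.resource-control(5).

```ini
# /etc/systemd/system/transmission-daemon.service.d/memory.conf
# Drop-in fragment; unit name and limits are examples only.
[Service]
MemoryHigh=768M   ; soft cap: kernel reclaims aggressively above this
MemoryMax=1G      ; hard cap: beyond this, allocations fail / OOM kill
```

Run `systemctl daemon-reload` after adding the drop-in. MemoryHigh gives the kernel room to throttle and reclaim before the hard MemoryMax limit kills the process.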

Otherwise your next email will be asking how to automatically restart
a dead service.  Systemd has support for that built in, and there are
options for non-systemd setups too, but you'll be getting constant
restarts, and depending on how bad the problem is the program might
not stay up for long between them.  It is always best to tame memory
use within the application itself.
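For reference, the systemd auto-restart support mentioned above looks like this in a unit file (values are examples; see systemd.service(5) for Restart= and systemd.unit(5) for the start-limit settings):

```ini
[Unit]
# Give up if the service dies 5 times within 10 minutes.
StartLimitIntervalSec=10min
StartLimitBurst=5

[Service]
Restart=on-failure   ; restart on crashes/OOM kills, not clean exits
RestartSec=30        ; wait 30s between restart attempts
```

With a bad leak this just papers over the problem, per the above: the service will cycle through its restart budget and eventually stay down.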

Something I wish Linux supported was discardable memory, for
caches/etc.  A program should be able to allocate memory while passing
a hint to the kernel saying that the memory is discardable.  If the
kernel is under memory pressure it can then just deallocate the
memory, with some way to notify the process that the memory is no
longer allocated.  That might optionally involve giving warning first,
or it might be some kind of new trappable exception for segfaults on
discarded memory.  (Since memory access doesn't involve system calls,
more graceful error handling might be hard.  One option would be to
tell the kernel to lock the memory before accessing it, then release
the lock afterwards, so that the memory isn't discarded after the
program has checked that it is safe to use.)

-- 
Rich
