I deal with a lot of very large files on a regular basis.  I've noticed
that when I delve into these directories in mintty and issue the command
ls -l (or ls --color=auto), a very large chunk of memory is consumed.
The memory leak appears to be proportional to the number and size of the
files within the containing folder.

To reproduce:

generate or use a folder containing 50 (or more) files of 2 GB or
larger; one way to create such a folder is sketched below.
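
// One way to generate suitable test files (a sketch: it assumes GNU
// coreutils' truncate is available, as it is under Cygwin; truncate
// creates sparse files, which should be enough here since ls -l only
// stats each file and never reads its contents):

$> mkdir bigfiles && cd bigfiles
$> for i in $(seq 1 50); do truncate -s 2G "file$i.bin"; done

// If fully allocated files are needed instead, something like
// 'dd if=/dev/zero of=file1.bin bs=1M count=2048' can be used per file.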

// In this demonstration, I ran the command on a directory containing
143 files ranging in size from 2 GB to 5 GB.
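
// The file count and total size of the directory can be double-checked
// with, e.g. (field 5 of ls -l is the file size in bytes):

$> ls -1 | wc -l
$> ls -l | awk 'NR>1 {sum += $5} END {print sum/1e9 " GB"}'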


$> free
              total        used        free      shared  buff/cache   available
Mem:       50276004    16465148    33810856           0           0    33810856
Swap:      12058624      186468    11872156

 

$> ls -l --color=auto
(contents displayed after some delay)

$> free
              total        used        free      shared  buff/cache   available
Mem:       50276004    19844660    30431344           0           0    30431344
Swap:      12058624      186460    11872164


// After 10 consecutive executions of the 'ls -al --color=auto' command
in this directory, ls has consumed 86% of my system's real memory.

$> free
              total        used        free      shared  buff/cache   available
Mem:       50276004    43587560     6688444           0           0     6688444
Swap:      12058624      301068    11757556

// If I continue (usually unknowingly), my system becomes so depleted
of resources that my mouse will barely respond to movement.
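
// For anyone reproducing this, the repeated runs can be scripted while
// sampling memory after each pass (a sketch; bigfiles is the test
// directory from the sketch above, and --color=always is used because
// =auto disables coloring when output is redirected away from a tty):

$> for i in $(seq 1 10); do
     ls -al --color=always bigfiles > /dev/null
     echo "run $i: $(free | awk '/^Mem:/ {print "used=" $3}')"
   done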