Hi Michael,

I've seen this on occasion. One quick-and-dirty fix is simply to increase the
ulimit. The "too many open files" error comes from a combination of socket
connections (each of which takes up a file descriptor) and the writing of the
downloaded files. This isn't a pushpull issue per se, but simply a
reaching-the-bounds-of-the-OS situation that you run into when downloading
such a large number of files concurrently. One other option is to increase
the daemon wait time used to stage downloads, which is configurable per
remote site and per product type.
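For example, on the machine running pushpull you can check and raise the per-process file-descriptor limit for the current shell session (the value 4096 below is just an illustration; pick whatever your concurrency level actually needs, and note that raising it persistently usually means editing /etc/security/limits.conf on Linux):

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Raise the soft limit for this shell session (illustrative value);
# pushpull must then be launched from this same session to inherit it
ulimit -n 4096

# Confirm the new limit took effect
ulimit -n
```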

Brian F.'s advice on the termination seems great as well, so give that a try too.

HTH,
Chris

On Dec 15, 2011, at 2:13 PM, Starch, Michael D (388L) wrote:

> All,
> 
> We have been seeing several errors with the pushpull system.
> 
> We occasionally get the exception "Too many open files" and occasionally fail 
> to connect to an FTP site 4 times in a row, which causes pushpull to 
> terminate.
> 
> Has anyone else seen this and found workarounds?
> 
> Michael Starch
> Sounder PEATE Data System Operator
> NASA Jet Propulsion Laboratory
> [email protected]
> 


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Chris Mattmann, Ph.D.
Senior Computer Scientist
NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
Office: 171-266B, Mailstop: 171-246
Email: [email protected]
WWW:   http://sunset.usc.edu/~mattmann/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Adjunct Assistant Professor, Computer Science Department
University of Southern California, Los Angeles, CA 90089 USA
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
