Status: New
Owner: ----
Labels: Type-Defect Priority-Medium
New issue 1700 by mathias....@gmail.com: Scalability issue (too many open files)
http://code.google.com/p/robotframework/issues/detail?id=1700
I'm running tests that involve a lot of processes. I'm using the 'Run
Process' keyword for this, so only one process is actually running at a
time.
What I noticed is that as I scale up the number of processes, I start
running into 'too many open files' errors.
I believe this is because Robot Framework (2.8.4) doesn't clean up the
pipes it opens for 'Run Process'.
This is on Linux (Ubuntu 14.04).
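To illustrate the underlying pattern, here is a minimal standalone Python
sketch (a hypothetical reproduction, not the actual Process library code;
it assumes the library keeps a handle to every started process, which
keeps the pipe objects alive):

    import subprocess

    # Keeping a reference to each finished Popen keeps its pipe file
    # objects open, so the descriptors are never released.
    processes = []
    for i in range(2000):
        p = subprocess.Popen(['echo', 'hello'],
                             stdin=subprocess.PIPE,
                             stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE)
        p.wait()
        processes.append(p)
    # Each finished process still holds three open pipe ends, so with
    # the default ulimit of 1024 descriptors this loop dies with
    # 'OSError: [Errno 24] Too many open files' long before finishing.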
I've attached a test case which reproduces the issue. Depending on how
ulimit is configured, you'll either run into it with the attached test
case as-is, or you'll need to increase the loop counter.
Alternatively, you can add a sleep in the loop, look at the number of
entries in /proc/<pid of Robot Framework>/fd, and watch the count increase.
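A small helper for watching that count (a hypothetical utility, Linux
only, not part of the attached test case):

    import os

    def open_fd_count(pid):
        # Count the open file descriptors of a process by listing
        # the /proc/<pid>/fd directory (Linux only).
        return len(os.listdir('/proc/%d/fd' % pid))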
One workaround I have for now: I added a piece of code to close the
descriptors at the end of wait_for_process in robot/libraries/Process.py:
    if process.stdin:
        process.stdin.close()
    if process.stdout:
        process.stdout.close()
    if process.stderr:
        process.stderr.close()
I believe closing immediately is the correct solution for stdin, but not
for stdout and stderr: those require reading out any remaining data before
closing the stream, so no output is lost.
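A fix along those lines might look like this (a sketch only, with a
hypothetical helper name; the real fix would also need to store the
drained output on the library's result object):

    def close_streams(process):
        # Sketch of the suggested fix: stdin can be closed outright,
        # while stdout and stderr are drained to EOF before closing
        # so no output is lost.
        if process.stdin:
            process.stdin.close()
        remaining_out = remaining_err = ''
        if process.stdout:
            remaining_out = process.stdout.read()
            process.stdout.close()
        if process.stderr:
            remaining_err = process.stderr.read()
            process.stderr.close()
        return remaining_out, remaining_err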
Attachments:
TestFD.txt 148 bytes