On Fri, Sep 02, 2005 at 03:45:16AM +0100, Ian Lynagh wrote:
> On Thu, Sep 01, 2005 at 07:40:26AM -0400, David Roundy wrote:
> > On Wed, Aug 31, 2005 at 07:01:09PM +0100, Ian Lynagh wrote:
> > > On Tue, Aug 30, 2005 at 08:22:06AM -0400, David Roundy wrote:
> > > > + my $how_many = 1100;
> > >
> > > This test is quite slow for me; can we do "ulimit -n 15" on Windows?
> >
> > Hmmm.  That would be a good idea, it's slow for me also.  Isn't ulimit
> > a bash builtin, so it ought to be implemented in the windows shell even
> > if it doesn't have any effect?
>
> Sorry, should have said: my reason for asking is that my man page says:
>
>        -n     The maximum number of open file descriptors (most systems
>               do not allow this value to be set)
Well, if the test doesn't work properly on windows, I'm not sure that's
entirely critical.  We could "ulimit -n 15 || true" and then the test would
still *run* if the ulimit fails; it just wouldn't give a very interesting
result.  The current scheme of using >1024 patches also doesn't perform the
desired test unless the ulimit is smaller than that.

I suppose we could get (almost) the best of both worlds by running the
equivalent of

  ulimit -n 15
  $num_patches = `echo ulimit -n | sh` + 10;

(and then record a few extra patches to put us over the limit).  Then if
the ulimit succeeds we record only a few patches and have a fast test.  If
it fails, we record many patches and run a slow (but meaningful) test.

The catch is that I can't figure out how to do the equivalent of
"ulimit -n 15" in perl.  Running it from perl seems to just set the ulimit
of the spawned shell, and not affect the perl process at all.  The simplest
solution might be to switch to shell for this test, but I don't like
writing loops in shell. :(  I suppose we could also try running ulimit in
the driver that *calls* this script, but that seems fragile.
--
David Roundy
http://www.darcs.net

_______________________________________________
darcs-devel mailing list
[email protected]
http://www.abridgegame.org/cgi-bin/mailman/listinfo/darcs-devel
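For concreteness, here is a rough, untested sketch of how the
`echo ulimit -n | sh` idea above might look in the perl test script.  It
is not from the thread: the file names, the record loop, and the fallback
of 1024 for an "unlimited" answer are invented for illustration, and it
assumes an already-initialized darcs repository with a configured author.

    #!/usr/bin/perl
    # Sketch: ask a spawned shell what the fd limit is, then record enough
    # patches to go over it.  If something has managed to lower the limit
    # we get a fast test; otherwise the slow but meaningful one.
    use strict;
    use warnings;

    my $limit = `echo ulimit -n | sh`;      # the trick from the mail above
    chomp $limit;
    $limit = 1024 unless $limit =~ /^\d+$/; # e.g. "unlimited"; 1024 is a guess

    my $how_many = $limit + 10;             # a few extra to put us over

    # Hypothetical record loop, only to show where $how_many would be used.
    for my $i (1 .. $how_many) {
        open my $fh, '>', "file_$i" or die "open file_$i: $!";
        print $fh "change $i\n";
        close $fh;
        system('darcs', 'add', "file_$i") == 0
            or die "darcs add failed";
        system('darcs', 'record', '-a', '-m', "patch $i") == 0
            or die "darcs record failed";
    }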

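On the "can't do it in perl" point: one option not raised in the thread
would be the BSD::Resource module from CPAN, whose setrlimit() changes the
limit of the perl process itself, so any darcs processes it spawns inherit
the lowered limit.  A minimal sketch, assuming that module is available:

    # Sketch only; BSD::Resource is an extra dependency not mentioned
    # anywhere in this thread.
    use BSD::Resource;

    my ($soft, $hard) = getrlimit(RLIMIT_NOFILE);
    if (setrlimit(RLIMIT_NOFILE, 15, $hard)) {
        print "fd limit lowered to 15; a small number of patches will do\n";
    } else {
        warn "could not lower the fd limit; falling back to many patches\n";
    }

Whether an extra CPAN module is an acceptable dependency for a test script
is of course a separate question.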