It looks like I'm running into the 1024 open-file limit. As soon as the DFS opens file 1025, it starts getting failures, which cascade into the segmentation fault I'm seeing.
Is there a maximum file-handle count? A quick search of the code didn't turn up anything. On Linux systems we could run "ulimit -a | grep files" to get that limit and make sure the local DFS broker doesn't go above it. Reading the /proc entries directly, or having the broker query the limit itself at startup (rough sketch below), would be cleaner, or the limit could simply be part of the configuration file.

For now I've run "ulimit -n 2048" to raise the maximum open-file count to 2048 and will re-run the test.

Thanks,
Eric
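P.S. Here's a minimal sketch of the kind of startup check I had in mind, using the standard POSIX getrlimit() call rather than shelling out to ulimit or parsing /proc. None of this is in the current code; the 1024 handle ceiling is just a placeholder for whatever value the broker would actually be configured with.

    #include <sys/resource.h>
    #include <cstdio>

    int main() {
      struct rlimit rl;

      // Query the per-process open-file-descriptor limit
      // (the same value "ulimit -n" reports).
      if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("getrlimit");
        return 1;
      }

      printf("soft fd limit: %llu\n", (unsigned long long)rl.rlim_cur);
      printf("hard fd limit: %llu\n", (unsigned long long)rl.rlim_max);

      // Hypothetical sanity check: warn if the broker's configured
      // handle ceiling would exceed the soft fd limit.
      const unsigned long long max_handles = 1024;  // placeholder value
      if (max_handles >= rl.rlim_cur)
        fprintf(stderr, "warning: DFS handle ceiling (%llu) >= fd soft limit (%llu)\n",
                max_handles, (unsigned long long)rl.rlim_cur);

      return 0;
    }

The broker could either refuse to start, log a warning, or clamp its handle ceiling to the reported soft limit; that choice is a policy question I'll leave open.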
