Dear Romain, dear Gmsh developers and users,

I can confirm Romain's problem and his solution:
http://www.geuz.org/pipermail/gmsh/2008/003316.html

>> Romain Quey wrote:
>>> Dear Gmsh developers, dear all,
>>>
>>> Using gmsh 2.2.0 on Linux, I'm having the following error:
>>>
>>> $ gmsh -3 n1100-id1.geo
>>> Info  : Parsing file 'n1100-id1.geo'
>>> Error : [on processor 0] Unable to open file 'n1100-id1.msh'
>>>
>>> My geo file contains many 'Volume' definitions (1100) -- see the
>>> attached example. [...]
>
> You are right, the bug does not occur here, but when writing the mesh.
>
> However, I think the reason for the bug is really this one: when
> increasing the max number of openable files per process on my system
> (from 1024 to 1000000), I can successfully mesh the file. I think that
> if you run 'ulimit -n 1024' on your system, gmsh will crash too.

I observed similar problems with models containing many volumes a while
ago and reported them on the gmsh mailing list:

http://www.geuz.org/pipermail/gmsh/2007/002946.html
http://www.geuz.org/pipermail/gmsh/2007/002831.html
http://www.geuz.org/pipermail/gmsh/2007/002826.html
http://www.geuz.org/pipermail/gmsh/2007/002947.html
http://www.geuz.org/search/search-geuz.cgi?q=magpar&ul=%2Fpipermail%2Fgmsh%2F&ps=10

Based on Romain's observation, I tested his solution of increasing the
resource limit (the maximum number of open files) in the shell, and I can
confirm that this fixes my problems, too. Here is an example (from one of
my earlier posts) that exhibits the problem: mesh generation succeeds,
but the msh file cannot be saved:

http://www.geuz.org/pipermail/gmsh/attachments/20071214/5a438bd7/attachment-0005.gz

To make this increase permanent, modify the limits in
/etc/security/limits.conf by adding or changing the line

* hard nofile 10000

After that, it is necessary to log in to a new shell, or even to reboot
the machine, because new processes inherit this setting from the
process/shell that launches them.

So, finally, the question is why gmsh keeps so many files open (one for
each volume!?)
and how this could be fixed.

Thanks for the useful discussion and help on this mailing list, which
fixed my problem, too!

Werner

--
magpar - Parallel Finite Element Micromagnetics Package
WWW:   http://magnet.atp.tuwien.ac.at/scholz/magpar/
email: magpar at magnet.atp.tuwien.ac.at
list:  majordomo at magnet.atp.tuwien.ac.at

_______________________________________________
gmsh mailing list
[email protected]
http://www.geuz.org/mailman/listinfo/gmsh
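For reference, the shell-level workaround discussed above can be sketched as follows. This is a minimal sketch, not an official recipe: the limit value 4096 is an illustrative choice (Romain raised his to 1000000), and n1100-id1.geo is the file name from his example.

```shell
# Inspect the current per-process limits on open file descriptors.
ulimit -Sn   # soft limit -- the one gmsh actually runs into
ulimit -Hn   # hard limit -- the ceiling up to which the soft limit may go

# Raise the soft limit for this shell and its children. This only works
# up to the hard limit (configured system-wide in
# /etc/security/limits.conf). The value 4096 is illustrative: a model
# with ~1100 volumes needs at least one descriptor per volume if gmsh
# opens a file for each, plus whatever the shell already holds open.
ulimit -n 4096 2>/dev/null || \
    echo "hard limit too low; raise it in /etc/security/limits.conf"

# Run the mesher from the same shell so it inherits the raised limit.
# (Guarded so the snippet is harmless on machines without gmsh.)
if command -v gmsh >/dev/null; then
    gmsh -3 n1100-id1.geo
fi

# To see how many descriptors a running gmsh process actually holds
# (Linux): ls /proc/$(pidof gmsh)/fd | wc -l
```

Note that the raised limit applies only to the shell that ran `ulimit` and to processes started from it, which is why a new login (or reboot) is needed after editing limits.conf.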
