These aren't MPI problems; they are problems with your system. This type of
question comes up all the time on the mailing list, so do have a thorough
look through the archives for solutions.
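One thing worth checking before anything else: with 3.3.x, the -np you give
grompp has to match the number of processes you start with mpirun. A typical
sequence looks something like this (the file names here are just
placeholders for your own):

    # preprocess for 4 processes; -np must match the mpirun -np below
    grompp -np 4 -f heat.mdp -c em.gro -p topol.top -o heat.tpr
    # run mdrun across the same 4 processes
    mpirun -np 4 mdrun -s heat.tpr

If the counts match, that part of your setup is fine and the problem is
physical, not parallel.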
Also, read here for a bit of background:

http://wiki.gromacs.org/index.php/blowing_up

If you need further help fixing the problem, please provide a description of
your system, what you have done so far, and your .mdp file.
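Without seeing your files I can only guess, but for a heating stage
something along these lines is usually a gentler starting point (all values
are generic placeholders, not tuned to your system):

    integrator      = md
    dt              = 0.001     ; small time step while heating
    nsteps          = 50000
    define          = -DPOSRES  ; assumes your topology has the usual
                                ; position-restraint #ifdef
    tcoupl          = berendsen
    tc-grps         = System
    tau_t           = 0.1
    ref_t           = 300
    gen_vel         = yes
    gen_temp        = 200       ; start below the target temperature
    table-extension = 1.0       ; increase if the 1-4 table warnings persist

If the run still blows up at step 0, the starting structure most likely has
bad contacts, and you should minimize more thoroughly before heating.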
-Justin

Quoting fabracht sdf <[EMAIL PROTECTED]>:

> I finally managed to compile GROMACS with MPI support, thanks to this
> mailing list, but I encountered some problems during my heating process.
> I typed the command
>
>   grompp -np 4 ....
>
> and then
>
>   mpirun -np 4 mdrun -...
>
> That worked out just fine for my minimization, but my heating, well, I
> think it all collapsed. The warning comes out like this; the first line
> below is the last of many others that look almost the same. I am new at
> doing parallel runs, but I didn't use to have this type of problem when
> I used mdrun without parallel support. I would like, if possible, some
> advice on the problem. Thank you in advance.
>
> Fabrício Bracht
>
> 2780 2782 86.4 0.1251
> 10293535107740740517837563833443348857967447872767677507395291864535498202923000697921266631802318051677748694356690233134327724315919817320285963619926665436209667405260044702144728804850275940332715881909506291349877903079274820737742355599362145056132495305476834829176742035319916407745130558128128.0000
> 0.1250
> step 0
> Warning: 1-4 interaction between 1 and 8 at distance
> 43885648070156953668679170455683956179476920623456280845667077863696278941602310820546250146687554479097819048903759400566246209207791071928566884060437161360627600451028963901489648585765463696505095004653577791267904813562400454512943804592192332319886212529061114941178128342988027402519363840901120.000
> which is larger than the 1-4 table size 1.000 nm
> These are ignored for the rest of the simulation
> This usually means your system is exploding,
> if not, you should increase table-extension in your mdp file
>
> -------------------------------------------------------
> Program mdrun, VERSION 3.3.2
> Source code file: nsgrid.c, line: 220
>
> Fatal error:
> Number of grid cells is zero. Probably the system and box collapsed.
>
> -------------------------------------------------------
>
> "I'd Be Water If I Could" (Red Hot Chili Peppers)
>
> Error on node 0, will try to stop all the nodes
> Halting parallel program mdrun on CPU 0 out of 4
>
> [identical fatal-error and halt messages were printed for nodes 2, 1,
> and 3]
>
> -----------------------------------------------------------------------------
> One of the processes started by mpirun has exited with a nonzero exit
> code. This typically indicates that the process finished in error.
> If your process did not finish in error, be sure to include a "return
> 0" or "exit(0)" in your C code before exiting the application.
>
> PID 6888 failed on node n0 (127.0.0.1) with exit status 1.
> -----------------------------------------------------------------------------

========================================
Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
[EMAIL PROTECTED] | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin/
========================================

