Re: [gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)

2015-02-10 Thread Justin Lemkul



On 2/10/15 7:35 AM, Felipe Villanelo wrote:

Absolutely nothing is written in the log file, just the citations



That indicates that the simulation systems are totally unstable and crash 
immediately.  Test by running each job individually (not as part of REMD) and 
see if you can do any diagnosis and troubleshooting based on 
http://www.gromacs.org/Documentation/Terminology/Blowing_Up.
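
A minimal sketch of such a test, assuming each equil directory already holds
its own run input under the default name topol.tpr (adjust the name to match
your files):

cd equil0
mpirun -np 1 mdrun_mpi -v -s topol.tpr    # repeat for equil1, equil2, equil3

If any replica crashes on its own, the problem is in that system or its .mdp
settings rather than in REMD or MPI.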


-Justin


Felipe Villanelo Lizana
Biochemist
Laboratorio de Biología Estructural y Molecular
Universidad de Chile

2015-02-03 10:01 GMT-03:00 Felipe Villanelo el.maest...@gmail.com:


Hi,

I am trying to learn REMD following the tutorial on the GROMACS page
http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham,_Session_1B
on a 4-core computer.
However, when I try to use the command:
mpirun -np 4 mdrun_mpi -v -multidir equil[0123] (as the tutorial says)
the program crashes with the following error:
mpirun noticed that process rank 2 with PID 13013 on node debian exited on
signal 11 (Segmentation fault).

MPI runs fine with the 4 cores if I run a simple GROMACS simulation
(NPT equilibration) on the same machine, so I think it is not a problem
of MPI configuration (as I read in another thread).

This is with GROMACS version 5.0.2.

If I try to run the same with an older version of GROMACS (4.5.5), the
error is different (after adjusting the options in the .mdp file to match
the changes in syntax between versions):

[debian:23526] *** An error occurred in MPI_comm_size
[debian:23526] *** on communicator MPI_COMM_WORLD
[debian:23526] *** MPI_ERR_COMM: invalid communicator
[debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

But this version also works fine with MPI using the 4 cores on a simple
simulation.

Thanks
Bye

Felipe Villanelo Lizana
Biochemist
Laboratorio de Biología Estructural y Molecular
Universidad de Chile



--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)

2015-02-10 Thread Felipe Villanelo
Absolutely nothing is written in the log file, just the citations

Felipe Villanelo Lizana
Biochemist
Laboratorio de Biología Estructural y Molecular
Universidad de Chile

2015-02-03 10:01 GMT-03:00 Felipe Villanelo el.maest...@gmail.com:

 Hi,

 I am trying to learn REMD following the tutorial on the GROMACS page
 http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham,_Session_1B
 on a 4-core computer.
 However, when I try to use the command:
 mpirun -np 4 mdrun_mpi -v -multidir equil[0123] (as the tutorial says)
 the program crashes with the following error:
 mpirun noticed that process rank 2 with PID 13013 on node debian exited on
 signal 11 (Segmentation fault).

 MPI runs fine with the 4 cores if I run a simple GROMACS simulation
 (NPT equilibration) on the same machine, so I think it is not a problem
 of MPI configuration (as I read in another thread).

 This is with GROMACS version 5.0.2.

 If I try to run the same with an older version of GROMACS (4.5.5), the
 error is different (after adjusting the options in the .mdp file to match
 the changes in syntax between versions):

 [debian:23526] *** An error occurred in MPI_comm_size
 [debian:23526] *** on communicator MPI_COMM_WORLD
 [debian:23526] *** MPI_ERR_COMM: invalid communicator
 [debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

 But this version also works fine with MPI using the 4 cores on a simple
 simulation.

 Thanks
 Bye

 Felipe Villanelo Lizana
 Biochemist
 Laboratorio de Biología Estructural y Molecular
 Universidad de Chile



Re: [gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)

2015-02-06 Thread Mark Abraham
Hi,

What was the last thing written to the log files?
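
A quick way to see that, assuming the default md.log name in each replica
directory (adjust the names to your setup):

tail -n 20 equil0/md.log equil1/md.log equil2/md.log equil3/md.log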

Mark

On Tue, Feb 3, 2015 at 2:01 PM, Felipe Villanelo el.maest...@gmail.com
wrote:

 Hi,

 I am trying to learn REMD following the tutorial on the GROMACS page
 http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham,_Session_1B
 on a 4-core computer.
 However, when I try to use the command:
 mpirun -np 4 mdrun_mpi -v -multidir equil[0123] (as the tutorial says)
 the program crashes with the following error:
 mpirun noticed that process rank 2 with PID 13013 on node debian exited on
 signal 11 (Segmentation fault).

 MPI runs fine with the 4 cores if I run a simple GROMACS simulation
 (NPT equilibration) on the same machine, so I think it is not a problem
 of MPI configuration (as I read in another thread).

 This is with GROMACS version 5.0.2.

 If I try to run the same with an older version of GROMACS (4.5.5), the
 error is different (after adjusting the options in the .mdp file to match
 the changes in syntax between versions):

 [debian:23526] *** An error occurred in MPI_comm_size
 [debian:23526] *** on communicator MPI_COMM_WORLD
 [debian:23526] *** MPI_ERR_COMM: invalid communicator
 [debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

 But this version also works fine with MPI using the 4 cores on a simple
 simulation.

 Thanks
 Bye

 Felipe Villanelo Lizana
 Biochemist
 Laboratorio de Biología Estructural y Molecular
 Universidad de Chile


[gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)

2015-02-03 Thread Felipe Villanelo
Hi,

I am trying to learn REMD following the tutorial on the GROMACS page
http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham,_Session_1B
on a 4-core computer.
However, when I try to use the command:
mpirun -np 4 mdrun_mpi -v -multidir equil[0123] (as the tutorial says)
the program crashes with the following error:
mpirun noticed that process rank 2 with PID 13013 on node debian exited on
signal 11 (Segmentation fault).
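
For reference, -multidir starts one simulation in each listed directory, with
the MPI ranks split among them (one rank per replica here), so every equilN
directory needs its own run input. A minimal pre-flight check, assuming the
default topol.tpr name (adjust to your own file names):

for d in equil0 equil1 equil2 equil3; do
    ls "$d"/topol.tpr || echo "missing run input in $d"   # each replica needs its own .tpr
done
mpirun -np 4 mdrun_mpi -v -multidir equil[0123]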

MPI runs fine with the 4 cores if I run a simple GROMACS simulation
(NPT equilibration) on the same machine, so I think it is not a problem
of MPI configuration (as I read in another thread).

This is with GROMACS version 5.0.2.

If I try to run the same with an older version of GROMACS (4.5.5), the
error is different (after adjusting the options in the .mdp file to match
the changes in syntax between versions):

[debian:23526] *** An error occurred in MPI_comm_size
[debian:23526] *** on communicator MPI_COMM_WORLD
[debian:23526] *** MPI_ERR_COMM: invalid communicator
[debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

But this version also works fine with MPI using the 4 cores on a simple
simulation.

Thanks
Bye

Felipe Villanelo Lizana
Biochemist
Laboratorio de Biología Estructural y Molecular
Universidad de Chile
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.