Dear Mark, Dear Szilárd,
Thank you for your help.
I did try different I_MPI... options, without success.
Something I can't figure out is that I can run jobs with 2 or more OpenMP
threads per MPI process, but not with just one.
It crashes with one OpenMP thread per MPI process, even if I disable I_MPI_PIN.
Thanks Mark for having tried to help.
On 12/06/2014 10:08 PM, Mark Abraham wrote:
On Sat, Dec 6, 2014 at 9:29 AM, Éric Germaneau german...@sjtu.edu.cn
wrote:
Dear Mark, Dear Szilárd,
Thank you for your help.
I did try different I_MPI... options, without success.
Something I can't figure out is I
I don't think this is a sysconf issue. As you seem to have 16-core (hw
thread?) nodes, it looks like sysconf returned the correct value
(16), but the OpenMP runtime actually returned 1. This typically means
that the OpenMP runtime was initialized outside mdrun and for some
reason (which I'm not
On second thought (and after some quick googling), it _seems_ that this is an
issue caused by the following:
- the OpenMP runtime gets initialized outside mdrun and its threads
(or just the master thread) get their affinity set;
- mdrun then executes the sanity check, at which point
omp_get_num_procs(),