Nope. Need a backtrace.
Mark
On Mar 13, 2014 4:59 AM, "arrow50311" wrote:
> Hi,
>
> Gromacs 4.0.7 is a different story, though I did not test it with the
> same system.
>
> The job script is as follows
>
> #@ job_name = job4.6
> #@ comment = "Hello World Job"
> #@ error = $(job_name).$(jobi
Hi,
Gromacs 4.0.7 is a different story, though I did not test it with the same
system.
The job script is as follows
#@ job_name = job4.6
#@ comment = "Hello World Job"
#@ error = $(job_name).$(jobid).err
#@ output = $(job_name).$(jobid).out
#@ environment = COPY_ALL
#@ wall_clock_limit = 10:00
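For comparison, a complete LoadLeveler job command file for a Blue Gene partition usually also declares the job type, partition size, and a queue directive, and then launches the MPI binary. The sketch below is an assumption about a typical BG/P setup, not a verified script for this site; the mpirun path, partition size, rank count, and binary name will all differ per installation:

```
#@ job_name         = job4.6
#@ job_type         = bluegene
#@ bg_size          = 64
#@ error            = $(job_name).$(jobid).err
#@ output           = $(job_name).$(jobid).out
#@ environment      = COPY_ALL
#@ wall_clock_limit = 10:00
#@ queue
# Site-specific launcher path and mode flags (assumptions):
/bgsys/drivers/ppcfloor/bin/mpirun -np 256 -mode VN ./mdrun_mpi -deffnm md
```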
I've run gromacs v4.6.3 on BG/Q without problems, and my colleagues have run
older versions of gromacs on the BG/L, also without problems. (No BG/P
experience here, unfortunately.) Still, it's worth posting your job
submission script for us to take a look at.
Also, is it just gromacs 4.6.2
The real problem cannot be diagnosed without a valid stack trace from the
core file. No GROMACS developers have access to such a machine, so any
resolution depends on you providing that information.
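For reference, one common way to recover such a stack trace (a sketch only; the binary and core file names below are hypothetical, and the exact workflow depends on the site):

```
# Ordinary binary core file: load it into gdb with the same binary
gdb ./mdrun_mpi core.12345
(gdb) bt full

# Blue Gene "lightweight" cores are plain-text files listing raw
# addresses; translate those against the unstripped binary instead:
addr2line -e ./mdrun_mpi 0x01a2b3c4
```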
Mark
On Mar 12, 2014 10:44 PM, "arrow50311" wrote:
> Is there any follow-up for this question?
Is there any follow-up for this question?
I ran into exactly the same problem on Blue Gene/P.
Could anyone offer some help?
Thank you,
Prentice Bisbal wrote
> Mark,
>
> Since I was working with 4.6.2, I built 4.6.3 to see if this was the
> result of a bug in 4.6.2. It isn't; I get the same error
Hi,
As the message says, you need at least one MPI process doing PP per GPU, so
you need to arrange your MPI to have more than one process if you want to
use both GPUs. Depending on your simulation and hardware, you may do better
with any even number of processes, corresponding OpenMP setup, and m
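As a sketch of what that arrangement might look like on a two-GPU node (the rank count, the `-gpu_id` mapping, and the OpenMP thread count here are assumptions to adapt to your own hardware, not values taken from this system):

```
# Two PP ranks, one per GPU, each with 8 OpenMP threads (assumed counts)
mpirun -np 2 mdrun_mpi -deffnm md -gpu_id 01 -ntomp 8
```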
Hi users,
I would like to run my job on a GPU. I have compiled gromacs version 4.6.1
with CUDA 5 in parallel, changed the cutoff scheme to Verlet, generated a
new *.tpr, and am finally running:
mdrun -v -deffnm md
I can see following lines in my output:
changing nstlist from 5 to 40, rlist from 1 to
On 3/12/14, 7:51 AM, sujithkakkat . wrote:
Hello,
After a while I got back to the problem posted here. The issue was the
large value of the average pressure (~25 bar against the reference pressure
of 1 bar) in NPT simulations with the Parrinello-Rahman barostat. The system
studied is cyclohexane-w
On 3/12/14, 1:37 AM, Sathish Kumar wrote:
Hi
I am trying to calculate torsional angles in DNA. For that, I first
made an angle.ndx file using
mk_angndx -f prod.tpr -n angle.ndx -type dihedral
The output angle.ndx looks like the following:
[ Phi=0.0_3_1.31 ]
12 2 3
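As an aside, the dihedral such an index group refers to is just the angle between the two planes defined by four consecutive atoms. A minimal stand-alone sketch of that geometry (not GROMACS code, just the standard atan2 formulation) is:

```python
import math

def dihedral(p0, p1, p2, p3):
    """Dihedral angle in degrees for four points, using the
    standard atan2 formulation (0 = cis, +/-180 = trans)."""
    sub = lambda a, b: [a[i] - b[i] for i in range(3)]
    cross = lambda a, b: [a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0]]
    dot = lambda a, b: sum(a[i] * b[i] for i in range(3))
    # Bond vectors along the chain p0 -> p1 -> p2 -> p3
    b1, b2, b3 = sub(p1, p0), sub(p2, p1), sub(p3, p2)
    # Normals to the two planes (p0,p1,p2) and (p1,p2,p3)
    n1, n2 = cross(b1, b2), cross(b2, b3)
    nb2 = math.sqrt(dot(b2, b2))
    m1 = cross(n1, [x / nb2 for x in b2])
    return math.degrees(math.atan2(dot(m1, n2), dot(n1, n2)))
```

A planar cis arrangement gives 0 degrees and a trans arrangement gives 180 degrees, which is a quick sanity check before trusting it on real coordinates.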
Hello,
After a while I got back to the problem posted here. The issue was the
large value of the average pressure (~25 bar against the reference pressure
of 1 bar) in NPT simulations with the Parrinello-Rahman barostat. The system
studied is a cyclohexane-water system with an interface.
The forcefield is
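On the pressure question generally: the instantaneous pressure of a small simulation box fluctuates by hundreds of bar, so the statistical uncertainty of the time average matters before concluding the barostat is misbehaving. A small sketch with synthetic data (the ~200 bar fluctuation magnitude is an assumption, not taken from this system) that block-averages a pressure series to estimate the error of the mean:

```python
import random

def block_error(series, n_blocks=10):
    """Mean of `series` and the standard error of that mean,
    estimated from non-overlapping block averages."""
    block = len(series) // n_blocks
    means = [sum(series[i * block:(i + 1) * block]) / block
             for i in range(n_blocks)]
    mean = sum(means) / n_blocks
    var = sum((m - mean) ** 2 for m in means) / (n_blocks - 1)
    return mean, (var / n_blocks) ** 0.5

# Synthetic pressure trace: true average 1 bar, instantaneous
# fluctuations of ~200 bar (an assumed, but typical, magnitude).
random.seed(1)
p = [1.0 + random.gauss(0.0, 200.0) for _ in range(10000)]
mean, err = block_error(p)
```

With 10000 frames of 200 bar noise the error of the mean is on the order of a few bar, so a 25 bar average from a well-sampled run would indeed be significant rather than noise.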
On Wed, Mar 12, 2014 at 4:05 AM, Sucharita Dey wrote:
> I have done the subscription, hope my email will be accepted.
>
> Sucharita Dey,PhD, Research Fellow, Cancer Science Institute of Singapore,
> National University of Singapore. 14, Medical Drive, #12-01, Singapore
> 117599. Tel: (65) 903552
Sounds like a normal case of being too rough with a system during
equilibration, which can lead to
http://www.gromacs.org/Documentation/Terminology/Blowing_Up if you get
unlucky. There's general advice there.
Mark
On Wed, Mar 12, 2014 at 7:05 AM, Chetan Mahajan wrote:
> Hi Tsjerk,
>
> Thanks. I
On Wed, Mar 12, 2014 at 9:56 AM, Szilárd Páll wrote:
> That's a "normal" side-effect of mdrun trying to set thread affinities
> [1] on a platform which does not support this. Also, this is not an
> error, mdrun should have continued to run.
>
Indeed. To find the function in which the actual probl
True, if it's 4.6. If it's earlier, you will need to run mdrun and inspect
the log file.
Mark
On Wed, Mar 12, 2014 at 9:44 AM, Szilárd Páll wrote:
> Check the version header; it contains a "Precision" field (e.g. mdrun
> -version).
> --
> Szilárd
>
>
> On Wed, Mar 12, 2014 at 9:03 AM, Dario Corr
That's a "normal" side-effect of mdrun trying to set thread affinities
[1] on a platform which does not support this. Also, this is not an
error, mdrun should have continued to run.
Note that you would be much better off upgrading to the latest
version; there have been many improvements, espe
Check the version header; it contains a "Precision" field (e.g. mdrun -version).
--
Szilárd
On Wed, Mar 12, 2014 at 9:03 AM, Dario Corrada wrote:
> How can I check if a packaged version of GROMACS (such as those available
> from Fedora or Ubuntu repositories) has been pre-compiled with single or
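A small sketch of pulling that field out of the version header programmatically (the sample text below is an illustrative header, an assumption rather than output from any specific build):

```python
def gromacs_precision(version_output):
    """Return the value of the 'Precision' field from `mdrun -version`
    style output, or None if the field is absent."""
    for line in version_output.splitlines():
        if line.strip().lower().startswith("precision"):
            return line.split(":", 1)[1].strip()
    return None

# Illustrative header text (assumed layout, not from a specific build):
sample = """Gromacs version:    VERSION 4.6.5
Precision:          single
MPI library:        thread_mpi"""
```

For a distro package, `gromacs_precision(subprocess.check_output(["mdrun", "-version"], text=True))` would apply the same parse to the real header.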
How can I check if a packaged version of GROMACS (such as those
available from Fedora or Ubuntu repositories) has been pre-compiled with
single or double precision?
--
Dario CORRADA, PhD
Bioinformatics and Computational Chemistry specialist
URL..: http://it.linkedin.com/in/dariocorrada/
ma