vivek sharma wrote:
Sorry for the incomplete mail... I sent it by mistake.

What I want to add is that I am not able to run it with any of those options. Any help and suggestions would be highly appreciated.
FYI, my system contains around 45,000 atoms.


The performance will depend on a number of factors, the first of which is the version of GROMACS you're using. The new release (4.0rc1) is substantially faster than any previous release. The other factors are mostly related to hardware and the connectivity between nodes.

If you try to run on a very large number of processors, you wind up with a lot of communication latency, which counteracts the expected speed increase.

As a general rule (when using, e.g., GMX 3.3.x), a reasonable starting point for choosing the number of CPUs is (total atoms) / (# of atoms in the largest molecule), which gives good load balance across the nodes. The -shuffle and -sort options of grompp often improve scaling for heterogeneous systems such as membranes.
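For your 45,000-atom system, if the largest molecule were, say, around 2,800 atoms, that rule would suggest on the order of 45000/2800 ~ 16 CPUs. A minimal sketch of the corresponding 3.3.x invocation (file names are placeholders, and "mdrun_mpi" stands for whatever your MPI-enabled mdrun binary is called):

   # preprocess for 16 nodes; -shuffle/-sort reorder atoms for better load balance
   grompp -np 16 -shuffle -sort -f md.mdp -c conf.gro -p topol.top -o topol.tpr
   # run across 16 MPI processes
   mpirun -np 16 mdrun_mpi -np 16 -s topol.tpr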

-Justin

Thanks in advance,
Vivek

2008/9/25 vivek sharma <[EMAIL PROTECTED]>

    Hi friends,
    I am also facing a similar problem when trying to scale GROMACS to a
    larger number of processors.
    I have tried a job with GROMACS on EKA; in an attempt to scale it to
    more processors, I see a reduction in simulation time up to 20
    processors, but the same simulation takes longer on 40 processors,
    and with 60 processors it crashed with a segmentation fault.
    I have tried other options such as the constraint algorithm,
    coulombtype, and the shuffle option.
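    (For reference, a minimal sketch of the kind of .mdp lines being
    referred to here; the option names are standard .mdp options, but the
    values are only an illustration, not necessarily what was used in
    this run:)

        ; electrostatics: PME usually parallelizes better than a plain cut-off
        coulombtype          = PME
        fourierspacing       = 0.12
        ; constraint algorithm for bond constraints: lincs or shake
        constraint-algorithm = lincs
        constraints          = all-bonds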

    2008/9/25 Tiago Marques <[EMAIL PROTECTED]>

        We currently have no funds available to migrate to InfiniBand,
        but we will in the future.

        I thought about doing interface bonding, but I don't think that
        is really the problem here; there must be something I'm missing,
        since most applications scale well to 32 cores on GbE. I can't
        scale any application to more than 8, though.

        Best regards,

        Tiago Marques


        On Tue, Sep 23, 2008 at 6:30 PM, Diego Enry
        <[EMAIL PROTECTED]> wrote:

            Tiago, you can try merging two network interfaces with
            "channel bonding"; it is native in all recent (2.6.x) Linux
            kernels. You only need two network adapters (most dual-socket
            boards come with them) and two network switches (or two VLANs
            on the same switch).
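            A minimal sketch of what that looks like on a 2.6.x kernel
            (interface names and the IP address are placeholders;
            balance-rr is one common mode choice when the goal is raw
            throughput):

                # load the bonding driver: round-robin mode, link check every 100 ms
                modprobe bonding mode=balance-rr miimon=100
                # bring up the bond device and enslave the two physical NICs
                ifconfig bond0 192.168.0.10 netmask 255.255.255.0 up
                ifenslave bond0 eth0 eth1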

            To tell you the truth, you will not see much improvement even
            with the latest GROMACS version (the 4.0 beta). However,
            other software your group may use, such as NAMD or GAMESS,
            will benefit a lot from this approach (it almost doubles the
            network bandwidth).

            The best solution for GROMACS is to migrate to InfiniBand. Go
            for it; it is not super expensive anymore.


            On Tue, Sep 23, 2008 at 1:48 PM, Jochen Hub
            <[EMAIL PROTECTED]> wrote:
             > Tiago Marques wrote:
             >> I don't know how large the system is. I'm the cluster's
             >> system administrator and don't understand much of what's
             >> going on. The test was given to me by a person who works
             >> with it. I can ask him, or look at it myself, if you can
             >> point me to how to do it.
             >
             > Hi,
             >
             > You can count the number of atoms in the structure:
             >
             > grep -c ATOM protein.pdb
             >
             > Jochen
             >
             >>
             >> Thanks, I will look at some of his posts.
             >>
             >> Best regards,
             >>
             >>                         Tiago Marques
             >>
             >>
             >> On Tue, Sep 23, 2008 at 4:03 PM, Jochen Hub
             >> <[EMAIL PROTECTED]> wrote:
             >> Tiago Marques wrote:
             >>> Hi!
             >>>
             >>> I've been using GROMACS on dual-socket quad-core Xeons
             >>> with 8 GiB of RAM, connected with Gigabit Ethernet, and
             >>> I always seem to have problems scaling to more than one
             >>> node.
             >>>
             >>> When I run a test on 16 cores, it does run, but the
             >>> result is often slower than when running on only 8 cores
             >>> on the same machine. The best result I've managed is for
             >>> 16 cores not to be slower than 8.
             >>>
             >>> What am I missing here, or are the tests inappropriate
             >>> to run over more than one machine?
             >>
             >> How large is your system? Which GROMACS version are you
             >> using?
             >>
             >> And have a look at the messages by Carsten Kutzner on
             >> this list; he has written a lot about GROMACS scaling.
             >>
             >> Jochen
             >>
             >>> Best regards,
             >>>
             >>> Tiago Marques
             >
             >
             > --
             > ************************************************
             > Dr. Jochen Hub
             > Max Planck Institute for Biophysical Chemistry
             > Computational biomolecular dynamics group
             > Am Fassberg 11
             > D-37077 Goettingen, Germany
             > Email: jhub[at]gwdg.de
             > Tel.: +49 (0)551 201-2312
             > ************************************************

            --
            Diego Enry B. Gomes
            Laboratório de Modelagem e Dinamica Molecular
            Universidade Federal do Rio de Janeiro - Brasil.

--
========================================

Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================