2011/10/6 Matthew Zwier <[email protected]>

> Hi,
>
> If you don't get any takers, you could always just make a huge box of water
> (which usually dominates explicit-solvent MD costs) and run it.  That way,
> you could scale up the size of the box arbitrarily to achieve good
> parallelization across that many cores.  I'm not sure that'd be
> scientifically useful, but it sounds like it would fit your business needs
> just fine.
>
Thank you for your suggestion. Fortunately, Gromacs has an active community:
I've already received multiple replies, so anyone still thinking of replying
is too late.
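
For the archive, here is roughly the water-only benchmark Matthew describes,
using the standard Gromacs 4.5 command-line tools. The box size, md.mdp
settings, topology, and rank count below are placeholders picked for
illustration, not a tested setup:

  # Fill a large cubic box (edge lengths in nm) with SPC water
  genbox -cs spc216.gro -box 50 50 50 -o waterbox.gro
  # Prepare the run input; topol.top is a plain SPC water topology
  grompp -f md.mdp -c waterbox.gro -p topol.top -o bench.tpr
  # Run on all 113 x 12 = 1356 cores (MPI launcher depends on the cluster)
  mpirun -np 1356 mdrun_mpi -s bench.tpr -deffnm bench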

Regards,

Maik


> On Thu, Oct 6, 2011 at 9:43 AM, Maik Nijhuis <
> [email protected]> wrote:
>
>> Dear Gromacs users,
>>
>> For one of our customers, I have to test a cluster using a parallel
>> application that runs for one week on 113 nodes with 12 cores each. The
>> nodes have 20 GB of memory available. A large Gromacs simulation would be
>> ideal. Unfortunately, I do not have a suitably large input file for Gromacs.
>>
>> Since I don't like wasting power and CPU cycles, I'd like to ask whether
>> anyone has a large input file that will keep the cluster busy for one week.
>> I will run the simulation for free using Gromacs 4.5.5 and send you the
>> output.
>>
>> Please send me an email if you're interested. First come, first served.
>>
>> Regards,
>>
>> Maik Nijhuis
>>
>
-- 
Dr. Maik Nijhuis
HPC Benchmark Specialist


Direct: +31 20 407 7556
Skype: maiknijhuis
[email protected]


ClusterVision BV
Nieuw-Zeelandweg 15B
1045 AL Amsterdam
The Netherlands
Tel: +31 20 407 7550
Fax: +31 84 759 8389
www.clustervision.com

-- 
gmx-users mailing list    [email protected]
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to [email protected].
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
