We have been running 3.18.1 rc2 with the latest version of Open MPI with no
trouble (the OS is CentOS 7.2). What version of Open MPI are you planning to
use? And what OS and version?
Good luck.
--
Llolsten
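For pinning down exactly which MPI library a binary is built against, here is
a minimal C sketch using the standard MPI-3 query call (assuming an MPI-3
capable build; running ompi_info on the build node reports the same
information without compiling anything):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char version[MPI_MAX_LIBRARY_VERSION_STRING];
        int len = 0;

        MPI_Init(&argc, &argv);
        /* For Open MPI this prints a string like "Open MPI v..."
         * along with build details. */
        MPI_Get_library_version(version, &len);
        printf("%s\n", version);
        MPI_Finalize();
        return 0;
    }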
From: users [mailto:users-boun...@open-mpi.org] On Behalf Of Mehmet Belgin
Sent: Monday, June 13, 2016 7:05 PM
stories?) that you
won't mind sharing.
Thank you very much in advance!
-Mehmet
--
=========================================================
Mehmet Belgin, Ph.D. (mehmet.bel...@oit.gatech.edu)
Scientific Computing Consultant | OIT - Academic and Research Technologies
Georgia Institute of Technology
258 4th
Thank you, Brice, for your quick reply! We will give the BIOS upgrade a try
and share our findings with the list.
-Mehmet
On 5/9/16 6:10 PM, Brice Goglin wrote:
On 09/05/2016 23:58, Mehmet Belgin wrote:
Sorry for the typo in the subject, I meant "Topology" ;)
On 5/9/16 5:58 PM, Mehmet Belgin wrote:
Greetings!
We've been receiving this error for a while on our 64-core Interlagos
AMD machines:
* hwloc has encountered what looks like an error from the operating system.
*
* Socket (P#2 cpuset 0x,0x0)
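After a BIOS (or kernel) update, one way to double-check what hwloc detects is
to query its C API directly; a minimal sketch (HWLOC_OBJ_SOCKET is the
hwloc 1.x name, matching the hwloc bundled with Open MPI at the time; the
lstopo utility prints the same topology without any code):

    #include <hwloc.h>
    #include <stdio.h>

    int main(void)
    {
        hwloc_topology_t topo;

        hwloc_topology_init(&topo);
        hwloc_topology_load(topo);   /* discover this machine */
        printf("sockets: %d\n",
               hwloc_get_nbobjs_by_type(topo, HWLOC_OBJ_SOCKET));
        printf("cores:   %d\n",
               hwloc_get_nbobjs_by_type(topo, HWLOC_OBJ_CORE));
        printf("PUs:     %d\n",
               hwloc_get_nbobjs_by_type(topo, HWLOC_OBJ_PU));
        hwloc_topology_destroy(topo);
        return 0;
    }

Compile with -lhwloc and compare the counts against what the hardware should
report (e.g. 4 sockets x 16 cores on these 64-core Interlagos nodes).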
http://www.open-mpi.org/faq/?category=building#installdirs
On Feb 16, 2015, at 10:38 AM, Mehmet Belgin <mehmet.bel...@oit.gatech.edu> wrote:
I am sure the subject line is confusing, so let me try to clarify. We installed
Open MPI in “usr/local/packages” on a node that we use for compilations, but
this is actually a network-attached share, which is mounted under a different
name on the compute nodes.
I believe the installation path is
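The installdirs FAQ entry linked above addresses exactly this case: Open MPI
honors the OPAL_PREFIX environment variable, which points the runtime at
wherever its installation tree happens to be mounted on the current node.
A minimal wrapper sketch in C (the mount point is an invented placeholder,
not the poster's real path; exporting OPAL_PREFIX in the job environment
works just as well):

    #include <stdlib.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        (void)argc;
        /* Tell Open MPI where its tree is mounted on THIS node,
         * then hand off to the real launcher. The path below is
         * a placeholder. */
        setenv("OPAL_PREFIX", "/net/packages/openmpi", 1);
        argv[0] = "mpirun";
        execvp("mpirun", argv);
        return 1;  /* reached only if execvp() failed */
    }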
Hello everyone,
Open MPI aborts when doing parallel HDF5 I/O on both NFS and Panasas file systems:
On NFS, we are getting:
ADIOI_Set_lock:: No locks available
ADIOI_Set_lock:offset 69744, length 256
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 124
File locking failed in ADIOI_Set_lock(fd
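The locks ROMIO fails to take here come from its data-sieving optimization, so
a common workaround on NFS (besides NFSv3 with a running lockd and the noac
mount option) is to disable data sieving through MPI-IO hints. A minimal
sketch (the romio_ds_* hint names assume a ROMIO-based MPI-IO layer; with
HDF5, the same MPI_Info object can be passed via H5Pset_fapl_mpio):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Info info;

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        /* Data sieving issues the byte-range locks that fail with
         * "No locks available" on NFS; turn it off for both paths. */
        MPI_Info_set(info, "romio_ds_read",  "disable");
        MPI_Info_set(info, "romio_ds_write", "disable");
        if (MPI_File_open(MPI_COMM_WORLD, "out.dat",
                          MPI_MODE_CREATE | MPI_MODE_RDWR,
                          info, &fh) == MPI_SUCCESS)
            MPI_File_close(&fh);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }

Note the trade-off: without data sieving, noncontiguous writes turn into many
small requests, so treat this as a diagnostic workaround rather than a
universal fix.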