FWIW, if it would be easier, we can just pull a new hwloc tarball -- that's how 
we've done it in the past (vs. trying to pull individual patches).  It's also 
easier to pull a release tarball, because then we can say "hwloc vX.Y.Z is in 
OMPI vA.B.C", rather than having to examine/explain exactly which level of 
hwloc is in OMPI (based on patches, etc.).
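
As a quick illustration of the "which hwloc is in which OMPI" question, a 
minimal sketch (assuming an installed 1.8-series build, where ompi_info lists 
the hwloc MCA component; the exact output format may differ):

    # List the MCA components and pick out the hwloc entry; in the 1.8 series
    # the component name (e.g. hwloc191) typically reflects the bundled hwloc level.
    ompi_info | grep -i hwloc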


On Dec 15, 2014, at 4:39 AM, Brice Goglin <brice.gog...@inria.fr> wrote:

> Le 15/12/2014 10:35, Jorge D'Elia a écrit :
>> Hi Brice,
>> 
>> ----- Original Message -----
>>> From: "Brice Goglin" <brice.gog...@inria.fr>
>>> CC: "Open MPI Users" <us...@open-mpi.org>
>>> Sent: Thursday, December 11, 2014 19:46:44
>>> Subject: Re: [OMPI users] OpenMPI 1.8.4 and hwloc in Fedora 14 using a beta 
>>> gcc 5.0 compiler.
>>> 
>>> This problem was fixed in hwloc upstream recently.
>>> 
>>> https://github.com/open-mpi/hwloc/commit/790aa2e1e62be6b4f37622959de9ce3766ebc57e
>> Great! However, yesterday I downloaded versions 1.8.3 (stable) and 
>> 1.8.4rc3 of OpenMPI and tried their usual configuration. It was OK on 
>> ia64 (as before) but failed again on ia32. Once again, I had to use an 
>> external installation of hwloc in order to fix it. 
>> 
> 
> It's fixed in "upstream hwloc", not in OMPI yet. I have prepared a long
> branch of hwloc fixes that OMPI should pull, but it will take some time.
> thanks
> Brice
> 
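
For reference, the external-hwloc workaround mentioned above normally comes 
down to a configure option; a minimal sketch, assuming a 1.8-series source 
tree and an hwloc installation that configure can find (the "external" 
keyword and any path shown are illustrative):

    # Build OMPI against a separately installed hwloc instead of the bundled
    # copy; a directory (e.g. --with-hwloc=/opt/hwloc) can be given instead.
    ./configure --with-hwloc=external
    make all install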


-- 
Jeff Squyres
jsquy...@cisco.com
For corporate legal information go to: 
http://www.cisco.com/web/about/doing_business/legal/cri/
