woohoo!! Thanks!

> On Sep 9, 2015, at 11:59 AM, Howard Pritchard <hpprit...@gmail.com> wrote:
> 
> Hi Ralph,
> 
> mpirun works for me now on master on the NERSC systems.
> 
> Thanks,
> 
> Howard
> 
> ----------
> 
> sent from my smart phone so no good typing.
> 
> Howard
> 
> On Sep 8, 2015 7:49 PM, "Ralph Castain" <r...@open-mpi.org> wrote:
> Hi folks
> 
> I’ve poked around this evening and gotten the Slurm support in master to at 
> least build, and for mpirun to now work correctly under a Slurm job 
> allocation. This should all be committed as soon as auto-testing completes:
> 
> https://github.com/open-mpi/ompi/pull/877
> 
> Howard/Nathan: I believe I fixed mpirun for ALPS too - please check.
> 
> Direct launch under Slurm still segfaults, and I’m out of time chasing it 
> down. Could someone please take a look? It seems to have something to do with 
> the hash table support in the base, but I’m not sure of the problem.
> 
> Thanks
> Ralph
> 
> _______________________________________________
> devel mailing list
> de...@open-mpi.org
> Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/devel
> Link to this post: 
> http://www.open-mpi.org/community/lists/devel/2015/09/17987.php
> _______________________________________________
> devel mailing list
> de...@open-mpi.org
> Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/devel
> Link to this post: 
> http://www.open-mpi.org/community/lists/devel/2015/09/17991.php