Re: [OMPI users] MPI inside MPI (still)

2014-12-12 Thread Alex A. Schmidt
Hello Gilles, Ok, I believe I have a simple toy app running as I think it should: 'n' parent processes running under mpi_comm_world, each one spawning its own 'm' child processes (each child group works together nicely, returning the expected result for an mpi_allreduce call). Now, as I mentioned
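
[A minimal C sketch of that pattern -- not Alex's actual code; file names and the value of m are assumptions. Spawning over MPI_COMM_SELF gives every parent rank its own independent child group, and each child group reduces over its own private MPI_COMM_WORLD:

    /* parent.c -- each parent rank spawns m children of its own */
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm children;
        int m = 4;                    /* children per parent (assumed) */

        MPI_Init(&argc, &argv);
        /* MPI_COMM_SELF as the spawning communicator makes every
           parent rank the root of its own independent spawn */
        MPI_Comm_spawn("child", MPI_ARGV_NULL, m, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &children, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&children);
        MPI_Finalize();
        return 0;
    }

    /* child.c -- the spawned group shares its own MPI_COMM_WORLD,
       so the reduction involves only the m siblings */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm parent;
        int rank, sum;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Allreduce(&rank, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
        if (rank == 0) printf("sum of sibling ranks = %d\n", sum);
        /* disconnect from the parent so both sides can tear down */
        MPI_Comm_get_parent(&parent);
        MPI_Comm_disconnect(&parent);
        MPI_Finalize();
        return 0;
    }
]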

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Alex A. Schmidt
Gilles, Well, yes, I guess I'll do tests with the real third-party apps and let you know. These are huge quantum chemistry codes (dftb+, siesta and Gaussian) which greatly benefit from a parallel environment. My code is just a front end to use those, but since we have a lot of data to

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Gilles Gouaillardet
Alex, just to make sure ... this is the behavior you expected, right? Cheers, Gilles On 2014/12/12 13:27, Alex A. Schmidt wrote: > Gilles, > > Ok, very nice! > > When I execute > > do rank=1,3 > call MPI_Comm_spawn('hello_world',' >

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Alex A. Schmidt
Gilles, Ok, very nice! When I execute do rank=1,3 call MPI_Comm_spawn('hello_world',' ',5,MPI_INFO_NULL,rank,MPI_COMM_WORLD,my_intercomm,MPI_ERRCODES_IGNORE,status) enddo I do get 15 instances of the 'hello_world' app running: 5 for each parent rank 1, 2 and 3. Thanks a lot, Gilles. Best

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Gilles Gouaillardet
Alex, just ask MPI_Comm_spawn to start (up to) 5 tasks via the maxprocs parameter: int MPI_Comm_spawn(char *command, char *argv[], int maxprocs, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *intercomm, int array_of_errcodes[]) INPUT
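
[In C, the loop Alex quoted above maps onto this prototype roughly as follows -- a sketch, assuming a 'hello_world' binary on the PATH. MPI_Comm_spawn is collective over the parent communicator, so each of the three calls launches maxprocs=5 children, giving the 15 instances Alex reported above:

    /* spawn.c -- run with at least 4 parent ranks so roots 1..3 exist */
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm intercomm;
        int root;

        MPI_Init(&argc, &argv);
        for (root = 1; root <= 3; root++) {
            /* all parent ranks call this collectively; only the
               maxprocs value (5) on the root is significant */
            MPI_Comm_spawn("hello_world", MPI_ARGV_NULL, 5,
                           MPI_INFO_NULL, root, MPI_COMM_WORLD,
                           &intercomm, MPI_ERRCODES_IGNORE);
            /* free the handle; a real code would keep it to talk
               to the children */
            MPI_Comm_free(&intercomm);
        }
        MPI_Finalize();
        return 0;
    }
]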

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Alex A. Schmidt
Hello Gilles, Thanks for your reply. The "env -i PATH=..." stuff seems to work!!! call system("sh -c 'env -i PATH=/usr/lib64/openmpi/bin:/bin mpirun -n 2 hello_world' ") did produce the expected result with a simple openmpi "hello_world" code I wrote. It might be harder though with the real
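
[The same trick from C, for reference -- a sketch; the paths and the hello_world name follow Alex's command and would need adjusting to the local install:

    /* launch.c -- env -i wipes the environment Open MPI set up for
       the outer job, so the inner mpirun starts from a clean slate;
       PATH then has to be supplied by hand */
    #include <stdlib.h>

    int main(void)
    {
        return system("sh -c 'env -i PATH=/usr/lib64/openmpi/bin:/bin "
                      "mpirun -n 2 hello_world'");
    }
]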

Re: [OMPI users] MPI inside MPI (still)

2014-12-11 Thread Gilles Gouaillardet
Alex, can you try something like call system("sh -c 'env -i /.../mpirun -np 2 /.../app_name'") ? env's -i flag starts with an empty environment; that being said, you might need to set a few environment variables manually: env -i PATH=/bin ... and that also being said, this "trick" could be just a bad idea :