Hello,

In the context of my PhD research, I have been developing a run-time
performance analyzer for MPI-based applications.
My tool provides a controller process for each MPI task. In particular, when
an MPI job is started, a special wrapper script is generated that first
starts my controller processes; each controller then spawns an actual MPI
task (which performs MPI_Init etc.). I use a dynamic instrumentation API
(the DynInst API) to control and instrument the MPI tasks.
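For reference, each controller does roughly the following with DynInst; this
is only a sketch, the binary path and arguments are placeholders, and error
handling and the actual instrumentation are omitted:

    #include "BPatch.h"
    #include "BPatch_process.h"

    // Global DynInst mutator object.
    BPatch bpatch;

    int main(int argc, char *argv[]) {
        // Placeholder path/arguments for the actual MPI task binary.
        const char *app = "/path/to/mpi_app";
        const char *app_argv[] = { app, NULL };

        // Create the MPI task as a stopped child under DynInst control,
        // instrument it, then let it run (it will call MPI_Init itself).
        BPatch_process *task = bpatch.processCreate(app, app_argv);
        // ... insert instrumentation here ...
        task->continueExecution();

        // Wait for the instrumented task to terminate.
        while (!task->isTerminated())
            bpatch.waitForStatusChange();
        return 0;
    }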

The point is that I need my controller processes to communicate with each
other; in particular, I need point-to-point communication between arbitrary
pairs of controllers. So it seems reasonable to take advantage of MPI itself
and use it for this communication. However, I am not sure what the impact
would be of calling MPI_Init and communicating from the controller
processes, given that both the controllers and the actual MPI processes were
started with the same mpirun invocation. In fact, I would need to ensure
that the controllers have one MPI execution environment while the
application has a separate one.
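To make the intent concrete, the controller-side usage I have in mind would
look roughly like this, assuming the controllers could initialize MPI
independently of the application (the peer choice, tag and message are just
placeholders; whether this interferes with the application's MPI is exactly
my question):

    #include <mpi.h>

    int main(int argc, char *argv[]) {
        MPI_Init(&argc, &argv);

        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        // Point-to-point exchange with an arbitrary peer controller
        // (here simply the next rank, as a placeholder).
        int peer = (rank + 1) % size;
        int out = rank, in = -1;
        MPI_Sendrecv(&out, 1, MPI_INT, peer, 0,
                     &in, 1, MPI_INT, MPI_ANY_SOURCE, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Finalize();
        return 0;
    }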

Any suggestions on how to achieve that? Obviously, another option is to use
sockets for communication between controllers, but with MPI already
available this seems like overkill.

Thank you in advance for your help.

Regards,
--Oleg

PhD student, Universitat Autonoma de Barcelona, Spain
