[hwloc-devel] Create success (hwloc git 1.11.2-71-gcf1cfb0)

2016-03-19 Thread MPI Team
Creating nightly hwloc snapshot git tarball was a success.
Snapshot: hwloc 1.11.2-71-gcf1cfb0
Start time: Sat Mar 19 21:03:15 EDT 2016
End time: Sat Mar 19 21:04:51 EDT 2016
Your friendly daemon, Cyrador

[hwloc-devel] Create success (hwloc git dev-1065-gaa87e49)

2016-03-19 Thread MPI Team
Creating nightly hwloc snapshot git tarball was a success.
Snapshot: hwloc dev-1065-gaa87e49
Start time: Sat Mar 19 21:01:02 EDT 2016
End time: Sat Mar 19 21:02:55 EDT 2016
Your friendly daemon, Cyrador

Re: [OMPI devel] Scaling down open mpi for embedded application

2016-03-19 Thread Ralph Castain
There have been a couple of folks who did this before (one for a set-top cable TV box, another for a small satellite), and some folks run OMPI on small Raspberry Pi “clusters”, so it is indeed doable. I would suggest going with a newer version, as Gilles said, just so you start with something we

Re: [OMPI devel] Scaling down open mpi for embedded application

2016-03-19 Thread Gilles Gouaillardet
Monika, is there any reason why you use Open MPI 1.4.5? It is quite antique today; 1.10.2 is the latest stable version. Strictly speaking, Open MPI does not require Linux; it works fine on Solaris, BSD variants, Cygwin, and other architectures. The memory footprint is made of the library size, and the

[OMPI devel] Scaling down open mpi for embedded application

2016-03-19 Thread Monika Hemnani
I am building a multiprocessor system with a soft-core processor (MicroBlaze) and the operating system Xilkernel (an OS from Xilinx). I want to scale down Open MPI to mainly the functionality of sending and receiving only. I also want the MPI library to have a low memory footprint (in KBs). As my OS
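
For reference, a minimal sketch (not from the thread itself) of the send/receive-only usage described above. Everything it uses — MPI_Init, MPI_Comm_rank, MPI_Send, MPI_Recv, MPI_Finalize, and MPI_COMM_WORLD — is standard MPI, so a scaled-down build would only need to cover roughly this subset for such a program to run:

/* Minimal two-rank point-to-point example: rank 0 sends one
 * integer to rank 1, which receives and prints it. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, value = 0;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        value = 42;
        /* Send one int to rank 1 with tag 0. */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* Receive the int from rank 0 with tag 0. */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &status);
        printf("rank 1 received %d\n", value);
    }

    MPI_Finalize();
    return 0;
}

On a conventional Open MPI installation this would be built with mpicc and launched with mpirun -np 2; on the embedded target described above the compile and launch mechanism would of course differ.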