Tomorrow’s Open MPI call (7/4/2023) is cancelled due to the U.S. Holiday.
Open MPI developers,
I’ve got some bad news regarding the OMPI v5.0.0 release timeframe. IBM has asked
Austen and me (and our team) to focus 100% on another project for the next two
full weeks.
Open MPI v5.0.x still has a few remaining blocking items, including
documentation, the PRRTE 3.0 release,
Good meeting today on HAN / Adapt performance.
Joseph is going to run some more numbers. We will finalize our discussion
next week.
Same time, different Webex Info:
Agenda:
o Discuss Joseph's new results from the HAN run with the newly tuned coll_tuned parameters.
o Come to a consensus on plans for v5.0.0
August 18th, from 2-3pm Central US time, is the winner. Here’s the Webex info:
Open MPI - Discuss HAN / Adapt performance
Hosted by Geoff Paulsen
https://ibm.webex.com/ibm/j.php?MTID=m330b3d97ef8828d8b7dae0fa7105d47e
Thursday, Aug 18, 2022 2:00 pm | 1 hour | (UTC-05:00) Central Time (US & Canada)
Resending a new Doodle (https://doodle.com/meeting/participate/id/e76ENl1d) with
times for next week.
Sorry for the inconvenience.
Please add yourself and select the times you’re available. We’ll decide at
Tuesday’s Webex and reply to this email with the final time / dial-in info.
Description:
Discuss
I’ve created a doodle here: https://doodle.com/meeting/participate/id/axGpPgre
Please add yourself and select the times you’re available. We’ll decide at noon
Eastern time on Thursday and reply to this email with the final (I think we can
also update the doodle at that time to reflect the best time,
Open MPI v5.0.0rc3 is now available for testing
(https://www.open-mpi.org/software/ompi/v5.0/).
Please test and send feedback either to users@lists.open-mpi.org
or create an issue at https://github.com/open-mpi/ompi/issues/.
See https://github.com/open-mpi/ompi/blob/v5.0.x/NEWS,
Open MPI developers,
The v5.0 release managers would like to solicit opinions to come to a consensus around our three independent MCA frameworks in Open MPI v5.0.x.
As you know, Open MPI v5.0.x has abstracted the runtime away to use the Open PMIx Reference Runtime Environment (PRRTE) implementation.
Open MPI developers,
We have not been able to meet together face to face for quite some time. We'd like to schedule a few 2-hour blocks for detailed discussions on topics of interest.
Please fill out https://doodle.com/poll/rd7szze3agmyq4m5?utm_source=poll_medium=link and include your name
Open MPI developers,
We've created the Open MPI v5.0.x branch today, and it is now receiving bug fixes. Please cherry-pick any master PRs to v5.0.x once they've been merged to master.
We're targeting an aggressive but achievable release date of May 15th. If you're in charge of your organization's CI
The first release candidate for Open MPI v4.0.6, rc1, is now available for testing: https://www.open-mpi.org/software/ompi/v4.0/
Some fixes include:
- Update embedded PMIx to 3.2.2. This update addresses several
Open MPI v4.0.5rc2 is now available for download and test at: https://www.open-mpi.org/software/ompi/v4.0/
Please test and give feedback soon.
Thanks!
The Open MPI Team
Announcing Open MPI v4.0.5rc1, available for download and test at https://www.open-mpi.org/software/ompi/v4.0/
Please test and send feedback to devel@lists.open-mpi.org
Fixed in v4.0.5: When launching under SLURM's srun, Open MPI will honor SLURM's binding policy even if that would leave the
Open MPI v4.0.4rc2 is now available for download and test at: https://www.open-mpi.org/software/ompi/v4.0/
Changes from v4.0.4rc1 include:
- OPAL/UCX: enabling new API provided by UCX
- event/external: Fix typo in LDFLAGS vs LIBS var before check
Open MPI Developers,
At today's Webex we decided to push back the date for branching Open MPI v5.0 from master to May 14th. We're still targeting June 30th as the release date (see the v5.0.0 milestone: https://github.com/open-mpi/ompi/milestone/37). If possible, we're still interested in
Ben, Oops, looks like I may have pushed a v4.0.2 branch around March 10th. Fortunately the v4.0.2 tag is fine and unaltered. I've deleted the v4.0.2 branch. Thanks for bringing this to our attention. Geoff Paulsen
Open MPI v4.0.3rc4 has been posted to https://www.open-mpi.org/software/ompi/v4.0/.
Please test this on your systems, as it's likely to become v4.0.3.
4.0.3 -- March, 2020
Thanks so much for testing. If further testing reveals anything, please create an issue at https://github.com/open-mpi/ompi/.
---
Geoffrey Paulsen
Software Engineer, IBM Spectrum MPI
Email: gpaul...@us.ibm.com
Please test v4.0.3rc3:
https://www.open-mpi.org/software/ompi/v4.0/
Changes since v4.0.2 include:
4.0.3 -- January, 2020
- Add support for Mellanox ConnectX-6.
- Fix a problem with Fortran compiler wrappers ignoring use of the disable-wrapper-runpath configure option.
Please test v4.0.3rc1:
https://www.open-mpi.org/software/ompi/v4.0/
Changes since v4.0.2 include:
4.0.3 -- January, 2020
- Add support for Mellanox ConnectX-6.
- Improve dimensions returned by MPI_Dims_create for certain cases (a short usage sketch follows below). Thanks to @aw32 for reporting.
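For readers less familiar with that routine, here is a minimal usage sketch of MPI_Dims_create (standard MPI usage, not code taken from the release candidate) showing how it factors a process count into a balanced grid; entries left at zero mark dimensions the library is free to choose:

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int size, rank;
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Entries initialized to 0 are chosen by the library;
         * MPI_Dims_create factors 'size' into as balanced a 2-D grid
         * as it can. */
        int dims[2] = {0, 0};
        MPI_Dims_create(size, 2, dims);

        if (rank == 0) {
            printf("%d ranks -> %d x %d grid\n", size, dims[0], dims[1]);
        }

        MPI_Finalize();
        return 0;
    }

Running this with, for example, 12 ranks would typically report a 4 x 3 grid; the change in this rc improves the factorization returned in certain cases.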
The third (and possibly final) release candidate for the Open MPI v4.0.2 release is posted at
https://www.open-mpi.org/software/ompi/v4.0/
Fixes since 4.0.2rc2 include:
- Silent failure of OMPI over OFI with large message sizes.
- Conform MPIR_Breakpoint to MPIR standard.
- btl/vader: when using
Does anyone have any thoughts about the cache-alignment issue in osc/sm, reported in https://github.com/open-mpi/ompi/issues/6950?
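Background, in case it helps frame the discussion: cache-alignment problems in shared-memory code usually mean that per-process fields end up sharing a cache line, so one process's stores repeatedly invalidate the line its neighbors are reading (false sharing). A minimal, hypothetical C sketch of the usual mitigation, padding each per-process slot out to a full cache line, is below; the names and layout are illustrative only and are not taken from the osc/sm component or from issue 6950.

    #include <assert.h>
    #include <stdint.h>

    /* Assumed cache-line size; real code should detect this at configure
     * time or query it at runtime rather than hard-coding it. */
    #define CACHE_LINE_SIZE 64

    /* Hypothetical per-process slot placed in a shared-memory segment.
     * Padding each slot to a full cache line keeps one process's updates
     * from invalidating the line that holds a neighbor's slot. */
    typedef struct {
        volatile uint64_t post_count;
        volatile uint64_t complete_count;
        char pad[CACHE_LINE_SIZE - 2 * sizeof(uint64_t)];
    } shm_slot_t;

    int main(void)
    {
        /* Each slot should occupy exactly one (assumed) cache line. */
        assert(sizeof(shm_slot_t) == CACHE_LINE_SIZE);
        return 0;
    }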
Open MPI's autogen.pl has not had its minimum tool versions updated since sometime between v1.10 and v2.0.0. My colleague and I ran into this today when investigating incorrect CFLAGS being used. Jeff Squyres was able to root-cause this to our use of an old automake. Open MPI publishes its