[OMPI devel] Open MPI 4.1.6rc3 posted

2023-09-25 Thread Barrett, Brian via devel
Open MPI 4.1.6rc3 is posted at https://www.open-mpi.org/software/ompi/v4.1/. Assuming MTT looks good tonight and there are no other regressions found, we’re going to ship 4.1.6 later this week. Brian

[OMPI devel] Open MPI 4.1.5rc4 release candidate posted

2023-02-17 Thread Barrett, Brian via devel
The next (and hopefully last) release candidate for the 4.1.5 release is posted at https://www.open-mpi.org/software/ompi/v4.1/. Significant changes since the 2nd release candidate (we skipped the 3rd release candidate for logistical reasons) are: * Fixed a crash in the rdma osc component

[OMPI devel] Open MPI 4.1.5rc2 posted

2022-12-10 Thread Barrett, Brian via devel
Open MPI 4.1.5rc2 is now available at https://www.open-mpi.org/software/ompi/v4.1/. Barring any negative feedback, we're intending to release 4.1.5 at the end of this week, so please give things a go. Thanks, Brian & Jeff

Re: [OMPI devel] Open MPI v5.0.0rc9 is available for testing

2022-11-17 Thread Barrett, Brian via devel
--enable-mca-no-build=io-romio341 should still work. Or just --disable-io-romio. No comment on the RHEL7 part; that's pretty old, but I don't think we've officially said it is too old. Probably something worth filing a ticket for, so that we can run it to ground before the 5.0 release. Oddly,

[OMPI devel] Open MPI Submodule changes in main branch

2022-09-15 Thread Barrett, Brian via devel
Hi all - tl;dr Please run git submodule update --init --recursive next time you update your checkout of the main branch. This morning, we merged a change that adds another submodule to the main branch. This submodule (named oac, because we suck at naming) will contain a set of Autoconf

[OMPI devel] Component configure change proposal

2022-09-06 Thread Barrett, Brian via devel
Hi all - I filed https://github.com/open-mpi/ompi/pull/10769 this morning, proposing to remove two capabilities from OMPI's configure script: 1. The ability to take an OMPI tarball, drop in a component source tree, and then run configure to build OMPI with that component. 2. The

[OMPI devel] Open MPI 4.1.4rc1 posted

2022-04-27 Thread Barrett, Brian via devel
Open MPI 4.1.4rc1 has been posted. The significant difference from v4.1.3 is the addition of the UCC collectives component. Ideally, we would like to release 4.1.4 by the end of the month, although the end of next week might be more likely. Brian & Jeff

[OMPI devel] Master tree removed from nightly tarball archive

2022-04-18 Thread Barrett, Brian via devel
All - A week late, but I removed the (now 3 weeks out of date) master tree from the archive of nightly tarballs. Theoretically, this means that MTT tests against master (instead of main) will now start failing due to download failures. Please check your MTT runs and update any configs to

Re: [OMPI devel] Script-based wrapper compilers

2022-04-05 Thread Barrett, Brian via devel
> Brian, > My 0.02 US$: script-based wrapper compilers are very useful when cross compiling, so ideally, they should be maintained. > Cheers,

[OMPI devel] Open MPI 4.1.3rc2 posted

2022-03-30 Thread Barrett, Brian via devel
The second release candidate of Open MPI 4.1.3, a bug fix release, has been posted on the Open MPI web site. This RC fixes a potential crash when using the smcuda transport mechanism and cleans up some documentation. Assuming there are no major bugs found, we plan on releasing tomorrow,

[OMPI devel] Script-based wrapper compilers

2022-03-23 Thread Barrett, Brian via devel
Does anyone still use the script based wrapper compilers? I have been working on fixing a number of static library compile issues caused by us historically not having been great about tracking library dependencies and the OMPI/PMIx/PRRTE split. Part of this is some fairly significant

[OMPI devel] Open MPI 4.1.3rc1 posted

2022-03-17 Thread Barrett, Brian via devel
The first release candidate of Open MPI 4.1.3, a bug fix release, has been posted on the Open MPI web site. Assuming there are no regressions from 4.1.2 found in the next couple of days, we plan on releasing 4.1.3 at the end of next week. Please file issues on GitHub for any new issues you

Re: [OMPI devel] Announcing Open MPI v5.0.0rc2

2022-01-01 Thread Barrett, Brian via devel
Marco - There are some patches that haven't made it to the 5.0 branch to make this behavior better. I didn't get a chance to back port them before the holiday break, but they will be in the next RC. That said, the issue below is a warning, not an error, so you should still end up with a

Re: [OMPI devel] External PMIx/PRRTE and "make dist"

2021-11-12 Thread Barrett, Brian via devel
ule "3rd-party/openpmix" is missing. Perhaps you forgot to "git clone --recursive ...", or you need to "git submodule update --init --recursive"...? Even though I specified --with-pmix=/usr/local. -Original Message----- From: devel On

[OMPI devel] External PMIx/PRRTE and "make dist"

2021-11-12 Thread Barrett, Brian via devel
Just a quick heads up that I just committed https://github.com/open-mpi/ompi/pull/9649, which changes Open MPI's behavior around PMIX/PRRTE and external builds. Previously, the configure script for the internally packaged PMIX and PRRTE were always run. Now, if the user specifies

Re: [OMPI devel] Question regarding the completion of btl_flush

2021-09-29 Thread Barrett, Brian via devel
Thanks, George. I think we’re on the same page. I’d love for Nathan to jump in here, since I’m guessing he has opinions on this subject. Once we reach consensus, Wei or I will submit a PR to clarify the BTL documentation. Brian On 9/29/21, 7:40 AM, "George Bosilca"

Re: [OMPI devel] Question regarding the completion of btl_flush

2021-09-28 Thread Barrett, Brian via devel
George – Is your comment about the code path referring to the BTL code or the OSC RDMA code? The OSC code seems to expect remote completion, at least for the fence operation. Fence is implemented as a btl flush followed by a window-wide barrier. There’s no ordering specified between the
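
A user-level sketch of the fence pattern described above, for readers following the thread. This is my illustration, not the OSC RDMA source (which uses the internal btl_flush rather than these MPI calls), and it assumes a passive-target epoch opened elsewhere with MPI_Win_lock_all. The thread's open question is exactly whether the flush step guarantees remote completion, so that the barrier alone makes results visible:

    #include <mpi.h>

    /* Fence-like synchronization built from completion + barrier.
     * Illustrative only; assumes MPI_Win_lock_all(0, win) is active. */
    static void fence_like(MPI_Win win, MPI_Comm win_comm)
    {
        MPI_Win_flush_all(win);  /* complete all outstanding RMA operations */
        MPI_Barrier(win_comm);   /* window-wide synchronization across ranks */
    }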

[OMPI devel] Open MPI 4.1.2rc1 posted

2021-09-24 Thread Barrett, Brian via devel
The first release candidate for Open MPI 4.1.2 has been posted at https://www.open-mpi.org/software/ompi/v4.1/. Barring any major issues, we anticipate a short release candidate period before an official 4.1.2 release. Major changes from 4.1.1 include: - Fix oshmem_shmem_finalize() when

[OMPI devel] Configure --help results

2021-02-01 Thread Barrett, Brian via devel
Josh posted https://github.com/open-mpi/ompi/pull/8432 last week to propagate arguments from PMIx and PRRTE into the top level configure --help, and it probably deserves more discussion. There are three patches, the last of which renames arguments so you could (for example) specify different

[OMPI devel] UCX and older hardware

2020-10-21 Thread Barrett, Brian via devel
UCX folks - As part of https://github.com/open-mpi/ompi/issues/7968, a README item was added about the segfault in UCX for older IB hardware. That note said the issue would be fixed in UCX 1.10. Aurelien later added a note saying it was fixed in UCX 1.9.0-rc3. Which version should be

Re: [OMPI devel] Which compiler versions to test?

2020-10-19 Thread Barrett, Brian via devel
Sorry, keep forgetting to reply. I think your list is a reasonable starting point. I think AWS is still a little thin in its OS/compiler testing, we definitely have room to improve there. We probably won't pick up the older Intel compilers, for various reasons. We are running recent Intel

[OMPI devel] Open MPI 3rd party packaging changes

2020-10-01 Thread Barrett, Brian via devel
All - Only 6 months after I promised the code would be done, the changes we discussed in February around 3rd party packages (Libevent, HWLOC, PMIx, and PRRTE) are merged to master. With these changes, Open MPI will prefer an external version of any of those packages if a "new enough" version

[OMPI devel] Libevent changes

2020-07-06 Thread Barrett, Brian via devel
https://github.com/open-mpi/ompi/pull/6784 went in while I was on vacation. I'm a little confused; I thought we no longer patched libevent locally? This is certainly going to be a problem as we move to external dependencies; we won't have a way of pulling in this change (whether using the

[OMPI devel] Open MPI release update

2020-06-15 Thread Barrett, Brian via devel
Greetings - As you may know, Open MPI 5.0 is going to include an ambitious improvement in Open MPI's runtime system along with a number of performance improvements, and was targeted to ship this summer. While we are still going to make those improvements to our runtime system, it is taking us

[OMPI devel] Jenkins / Web server outage tonight (Pacific Time)

2020-04-29 Thread Barrett, Brian via devel
Hi all - As part of supporting Pandoc man pages, I'm going to update the base images used to run tests in AWS. I usually screw this up once or twice, so expect Jenkins to be offline for an hour or two this evening, starting at 7:00pm Pacific Time. At the same time, I am going to do some

Re: [OMPI devel] Add multi nic support for ofi MTL using hwloc

2020-03-20 Thread Barrett, Brian via devel
Ok, that makes total sense. I'm leaning towards us fixing this in the OFI MTL rather than making everyone load the topology. I agree with you that it probably doesn't matter, but let's not create a corner case. I'm also going to follow up with the dev who wrote this code, but my guess is that we should

Re: [OMPI devel] Add multi nic support for ofi MTL using hwloc

2020-03-20 Thread Barrett, Brian via devel
But that does raise the question: should we call get_topology() for belt and suspenders in OFI? Or will that cause your concerns from the start of this thread? Brian From: Ralph Castain Date: Friday, March 20, 2020 at 9:31 AM To: OpenMPI Devel Cc: "Barrett, Brian" Subject: RE: [EXTERNAL] [OMPI

Re: [OMPI devel] Add multi nic support for ofi MTL using hwloc

2020-03-20 Thread Barrett, Brian via devel
PMIx folks - When using mpirun for launching, it looks like opal_hwloc_topology isn't filled in at the point where we need the information (mtl_ofi_component_init()). This would end up being before the modex fence, since the goal is to figure out which address the process should publish. I'm

Re: [OMPI devel] Reachable framework integration

2020-01-02 Thread Barrett, Brian via devel
Ralph - Are you looking for the "best" single connection between two hosts, or the best set of pairings, or even "just any pairing that works"? The TCP BTL code is complicated because it's looking for the best overall set of pairings, to maximize the number (and quality) of links between the
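
To make the "best single connection" versus "best overall set of pairings" distinction concrete, here is a toy sketch. It is not the reachable framework's API or the TCP BTL code; the weight matrix and the greedy matching are invented purely for illustration:

    #include <stdio.h>

    #define NLOCAL  3   /* local interfaces  */
    #define NREMOTE 3   /* remote interfaces */

    /* "Best single connection": the one highest-weight pair. */
    static int best_single(int w[NLOCAL][NREMOTE], int *li, int *rj)
    {
        int best = 0;
        for (int i = 0; i < NLOCAL; i++)
            for (int j = 0; j < NREMOTE; j++)
                if (w[i][j] > best) { best = w[i][j]; *li = i; *rj = j; }
        return best;
    }

    /* "Best overall set": repeatedly take the best remaining pair and
     * retire both endpoints, maximizing the number and quality of links.
     * (The real code is more sophisticated; greedy shows the idea.) */
    static int greedy_set(int w[NLOCAL][NREMOTE])
    {
        int total = 0;
        for (int n = 0; n < NLOCAL; n++) {
            int li = 0, rj = 0;
            if (best_single(w, &li, &rj) == 0)
                break;                      /* nothing reachable remains */
            total += w[li][rj];
            for (int j = 0; j < NREMOTE; j++) w[li][j] = 0;
            for (int i = 0; i < NLOCAL; i++) w[i][rj] = 0;
        }
        return total;
    }

    int main(void)
    {
        int w[NLOCAL][NREMOTE] = { {10, 0, 4}, {0, 8, 0}, {3, 0, 6} };
        int li = 0, rj = 0;
        printf("best single link: weight %d\n", best_single(w, &li, &rj));
        printf("greedy set of links: total weight %d\n", greedy_set(w));
        return 0;
    }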

Re: [OMPI devel] Reachable framework integration

2020-01-02 Thread Barrett, Brian via devel
George, this is a great reminder to please review that code. William's being way too polite, and we know this helps with some of our problems :). Brian From: devel on behalf of George Bosilca via devel Reply-To: Open MPI Developers Date: Thursday, January 2, 2020 at 9:48 AM To: Open MPI

[OMPI devel] Open MPI 3.0.5 and 3.1.5 release plans

2019-11-05 Thread Barrett, Brian via devel
We have posted release candidates for both Open MPI 3.0.5 and 3.1.5, with plans to release on Friday, Nov 8, if we do not hear any new issues compared to the previous releases. Please give the releases a test, if you are so inclined. https://www.open-mpi.org/software/ompi/v3.0/

[OMPI devel] Open MPI 3.0.4 and 3.1.4 Now Available

2019-04-15 Thread Barrett, Brian via devel
The Open MPI Team, representing a consortium of research, academic, and industry partners, is pleased to announce the release of Open MPI 3.0.4 and 3.1.4. Both 3.0.4 and 3.1.4 are bug fix releases with largely the same set of bug fixes. All users of both series are encouraged to upgrade when

[OMPI devel] 3.0.4rc1/3.1.4rc1 posted

2019-03-27 Thread Barrett, Brian via devel
Both 3.0.4rc1 and 3.1.4rc1 are posted in the usual places. Given the late release cycle and the small set of changes, we are intending to release on Friday, unless someone reports a blocking issue. So have a go, give feedback, be happy. 3.0.4rc1: download:

[OMPI devel] Open MPI Jenkins down

2018-11-20 Thread Barrett, Brian via devel
All - A recent security patch on the Open MPI Jenkins setup appears to have caused some fairly significant instability. Something in our build configuration is causing Jenkins to become unresponsive to any scheduling changes (including completing jobs or starting new testing instances).

Re: [OMPI devel] Open-MPI backwards compatibility and library version changes

2018-11-16 Thread Barrett, Brian via devel
> for you under the hood (I have seen that, but I cannot remember which > one), and that would be a bad thing, at least from an Open MPI point > of view. > > > Cheers, > > Gilles > On Wed, Nov 14, 2018 at 6:46 PM Christopher Samuel > wrote: >> >>

Re: [OMPI devel] Open-MPI backwards compatibility and library version changes

2018-11-14 Thread Barrett, Brian via devel
Chris - When we look at ABI stability for Open MPI releases, we look only at the MPI and SHMEM interfaces, not the internal interfaces used by Open MPI internally. libopen-pal.so is an internal library, and we do not guarantee ABI stability across minor releases. In 3.0.3, there was a

[OMPI devel] 3.0.3/3.1.3 delay

2018-10-26 Thread Barrett, Brian via devel
Due to the shared memory one-sided issue (which was reverted today), I’m going to hold off on releasing 3.0.3 and 3.1.3 until at least Monday. Would like to see the weekend’s MTT runs run clean before we release. Brian

[OMPI devel] Open MPI 3.0.3rc2 and 3.1.3rc2 released

2018-10-23 Thread Barrett, Brian via devel
I have posted Open MPI 3.0.3rc2 as well as 3.1.3rc2 this afternoon. Both are minor changes over rc1: the SIGCHLD and opal_fifo race patches are the primary changes. Assuming no negative feedback, I’ll release both 3.0.3 and 3.1.3 on Friday. Brian

[OMPI devel] Patcher on MacOS

2018-09-28 Thread Barrett, Brian via devel
Is there any practical reason to have the memory patcher component enabled for MacOS? As far as I know, we don’t have any transports which require memory hooks on MacOS, and with the recent deprecation of the syscall interface, it emits a couple of warnings. It would be nice to crush said

[OMPI devel] Mac OS X 10.4.x users?

2018-09-28 Thread Barrett, Brian via devel
All - In trying to clean up some warnings, I noticed one (around pack/unpack in net/if.h) that is due to a workaround of a bug in MacOS X 10.4.x and earlier. The simple way to remove the warning would be to remove the workaround, which would break the next major version of Open MPI on 10.4.x

[OMPI devel] v3.1.2rc1 is posted

2018-08-15 Thread Barrett, Brian via devel
The first release candidate for the 3.1.2 release is posted at https://www.open-mpi.org/software/ompi/v3.1/ Major changes include fixing the race condition in vader (the same one that caused v2.1.5rc1 to be posted today) as well as: - Assorted Portals 4.0 bug fixes. - Fix for possible data

Re: [OMPI devel] v3.1.1rc2 posted

2018-07-02 Thread Barrett, Brian via devel
…any problem, and performance numbers look good. Thanks From: Barrett, Brian via devel <devel@lists.open-mpi.org> Date: July 1, 2018 at 6:31:26 PM EDT To: Open MPI Developers <devel@lists.open-mpi.org> Cc: Barrett, Brian

[OMPI devel] v3.1.1rc2 posted

2018-07-01 Thread Barrett, Brian via devel
v3.1.1rc2 is posted at the usual place: https://www.open-mpi.org/software/ompi/v3.1/ Primary changes are some important UCX bug fixes and a forward compatibility fix in PMIx. We’re targeting a release on Friday, please test and send results before then. Thanks, Brian

[OMPI devel] Open MPI 3.1.1rc1 posted

2018-06-14 Thread Barrett, Brian via devel
The first release candidate for Open MPI 3.1.1 is posted at https://www.open-mpi.org/software/ompi/v3.1/. We’re a bit behind on getting it out the door, so appreciate any testing feedback you have. Brian

Re: [OMPI devel] OMPI OS X CI offline

2018-05-25 Thread Barrett, Brian via devel
> > On May 25, 2018, at 12:44 PM, Jeff Squyres (jsquyres) > wrote: > > Looks like "piper" OS X OMPI CI machine is offline. I just marked it offline > in Jenkins and will bot:ompi:retest all the PR's that are obviously stuck. Which wasn’t going to do anything, because the

[OMPI devel] Open MPI 3.1.0 Release Update

2018-05-03 Thread Barrett, Brian via devel
It appears that we have resolved the outstanding issues with 3.1.0. I’ve posted 3.1.0rc6 at https://www.open-mpi.org/software/ompi/v3.1/. Please give it a go and let me know what you find. Barring anyone posting a blocking issue, I intend to post 3.1.0 (final) tomorrow morning Pacific Time.

Re: [OMPI devel] Open MPI 3.1.0rc4 posted

2018-04-17 Thread Barrett, Brian via devel
…support for PMIx v1 > Cheers, > Gilles "Barrett, Brian via devel" <devel@lists.open-mpi.org> wrote: >> In what we hope is the last RC for the 3.1.0 series, I’ve posted 3.1.0rc4 at: >> https://www.open-mpi.org/software/ompi/v3.1/

[OMPI devel] Open MPI 3.1.0rc4 posted

2018-04-17 Thread Barrett, Brian via devel
In what we hope is the last RC for the 3.1.0 series, I’ve posted 3.1.0rc4 at: https://www.open-mpi.org/software/ompi/v3.1/ Please give it a try and provide feedback asap; goal is to release end of the week if we don’t find any major issues. Brian

Re: [OMPI devel] Github CI stalls: ARM and/or SLES

2018-03-28 Thread Barrett, Brian via devel
To keep things moving, I removed ARM from the pull request checker until LANL and ARM can get their builders back online. You should be able to restart your pull request builds and have them complete now. Brian > On Mar 28, 2018, at 9:12 AM, Barrett, Brian via devel wrote:

Re: [OMPI devel] Github CI stalls: ARM and/or SLES

2018-03-28 Thread Barrett, Brian via devel
The ARM builders are all down; it was ARM that caused the problems. Brian > On Mar 28, 2018, at 6:48 AM, Jeff Squyres (jsquyres) wrote: > Several PRs from last night appear to be stalled in the community CI. I can't tell if they're stalled in ARM or SLES builds --

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-21 Thread Barrett, Brian via devel
Yeah, it’s failing for exactly the reason I thought it would (and why we didn’t support md5sum files in the first place). Josh and I updated the MTT client yesterday and I pushed the change today. Please update your client and it should work. Brian On Mar 20, 2018, at 10:35 PM, Boris Karasev

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-16 Thread Barrett, Brian via devel
Eventual consistency for the win. It looks like I forgot to set a short cache time for the CDN that fronts the artifact repository. So the previous day’s file was returned. I fixed that and flushed the cache on the CDN, so it should work now. Brian On Mar 15, 2018, at 10:41 PM, Boris

[OMPI devel] Open MPI 3.1.0rc3 Posted

2018-03-14 Thread Barrett, Brian via devel
Open MPI 3.1.0rc3 is now available at https://www.open-mpi.org/software/ompi/v3.1/. Assuming that there are no new negative bug reports, the plan is to release 3.1.0 next Monday. Changes since 3.1.0rc2 include: - Fixes to parallel debugger attach functionality - Fixes to the MPI I/O

[OMPI devel] Open MPI 3.0.1rc4 Posted

2018-03-14 Thread Barrett, Brian via devel
Open MPI 3.0.1rc4 is now available at https://www.open-mpi.org/software/ompi/v3.0/. Assuming that there are no new negative bug reports, the plan is to release 3.0.1 next Monday. Changes since 3.0.1rc3 include: - Fixes to parallel debugger attach functionality - Fixes to the MPI I/O

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-08 Thread Barrett, Brian via devel
Sorry about that; we screwed up understanding the “API” between MTT and the Open MPI download path. We made a change in the nightly tarball builder and things should be working now. Brian > On Mar 8, 2018, at 2:31 AM, Christoph Niethammer wrote: > > Hello, > > After the

Re: [OMPI devel] cannot push directly to master anymore

2018-01-31 Thread Barrett, Brian via devel
> On Jan 31, 2018, at 8:33 AM, r...@open-mpi.org wrote: > > > >> On Jan 31, 2018, at 7:36 AM, Jeff Squyres (jsquyres) >> wrote: >> >> On Jan 31, 2018, at 10:14 AM, Gilles Gouaillardet >> wrote: >>> >>> I tried to push some trivial

[OMPI devel] Open MPI 3.1.0 pre-release available

2018-01-23 Thread Barrett, Brian via devel
The Open MPI team is pleased to announce the first pre-release of the Open MPI 3.1 series, available at: https://www.open-mpi.org/software/ompi/v3.1/ RC1 has two known issues: - We did not complete work to support hwloc 2.x, even when hwloc is built as an external library. This may or

[OMPI devel] Open MPI 3.0.1rc2 available for testing

2018-01-23 Thread Barrett, Brian via devel
I’ve posted the first public release candidate of Open MPI 3.0.1 this evening. It can be downloaded for testing from: https://www.open-mpi.org/software/ompi/v3.0/ We appreciate any testing you can do in preparation for a release in the next week or two. Thanks, Brian & Howard

[OMPI devel] 32-bit builder in OMPI Jenkins

2018-01-09 Thread Barrett, Brian via devel
All - Just an FYI, as we previously discussed, there’s now a 32 bit builder in the OMPI Community Jenkins CI tests. The test passes on the current branches, so this shouldn’t impact anyone until such time as you break 32 bit x86 builds :). Sorry this took so long (I believe the original

[OMPI devel] 3.1.0 NEWS updates

2017-12-11 Thread Barrett, Brian via devel
All - We’re preparing to start the 3.1.0 release process. There have been a number of updates since the v3.0.x branch was created and we haven’t necessarily been great at updating the NEWS file. I took a stab at the update, can everyone have a look and see what I missed?

[OMPI devel] 3.0.1 NEWS updates

2017-12-11 Thread Barrett, Brian via devel
All - We’re preparing to start the 3.0.1 release process. There have been a number of updates since 3.0.0 and we haven’t necessarily been great at updating the NEWS file. I took a stab at the update, can everyone have a look and see what I missed?

Re: [OMPI devel] OSC module change

2017-11-30 Thread Barrett, Brian via devel
t;> In fact the communicator's group was already retained in the window >> structure. So everything was already in place. I pushed the last >> modifications, and everything seems ready to be merged in PR#4527. >> >> Jeff, the fixup commits are squashed :) >> >> Clément

Re: [OMPI devel] OSC module change

2017-11-29 Thread Barrett, Brian via devel
…every component? Clément On 11/28/2017 07:46 PM, Barrett, Brian via devel wrote: The following is perfectly legal: MPI_Comm_dup(some_comm, &tmp_comm); MPI_Win_create(…, tmp_comm, &win); MPI_Comm_free(&tmp_comm); So I don’t think stashing away a communicator is the solution. Is a group sufficient? I think
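
The quoted pattern, expanded into a self-contained program. This is a sketch for readers of the archive; the buffer size and variable names are illustrative, not from the thread:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm tmp_comm;
        MPI_Win  win;
        int      buf[16];

        MPI_Init(&argc, &argv);

        MPI_Comm_dup(MPI_COMM_WORLD, &tmp_comm);
        MPI_Win_create(buf, sizeof(buf), sizeof(int),
                       MPI_INFO_NULL, tmp_comm, &win);
        MPI_Comm_free(&tmp_comm);  /* legal: the window must remain usable */

        /* ... RMA on win is still valid here, after tmp_comm is gone ... */

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }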

Re: [OMPI devel] OSC module change

2017-11-28 Thread Barrett, Brian via devel
…simply because the existence of a communicator linked to the window is more or less enforced by the MPI standard. George. On Tue, Nov 28, 2017 at 1:02 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: The objection I have to

Re: [OMPI devel] OSC module change

2017-11-28 Thread Barrett, Brian via devel
The objection I have to this is that it forces an implementation where every one-sided component is backed by a communicator. While that’s the case today, it’s certainly not required. If you look at Portals 4, for example, there’s one collective call outside of initialization, and that’s a
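
A minimal sketch of the group-based alternative being discussed (illustrative, not the eventual PR): a component can retain just the communicator's group, which is also what MPI_Win_get_group has to be able to return, without keeping the communicator itself alive:

    #include <mpi.h>

    /* Capture the group at window-creation time so membership and rank
     * queries survive the communicator being freed.  The caller later
     * releases it with MPI_Group_free(&grp). */
    static MPI_Group retain_group(MPI_Comm comm)
    {
        MPI_Group grp;
        MPI_Comm_group(comm, &grp);  /* group reference, independent of comm */
        return grp;
    }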

[OMPI devel] 3.0 / 3.1 Pull Requests caught up?

2017-11-01 Thread Barrett, Brian via devel
All - I think we’re finally caught up on pull requests for the 3.0 and 3.1 release. There are a small number of PRs against both branches waiting for code reviews, but otherwise everything has been committed. If I somehow missed something, first please make sure it has a “Target: ” label on

[OMPI devel] Jenkins jobs offline

2017-10-27 Thread Barrett, Brian via devel
The ARM builder for CI tests is offline, meaning that all CI jobs will fail until Pasha can get it back online. It looks like the agent didn’t call back after Jenkins restarted this morning. Brian

[OMPI devel] Jenkins/MTT/Trac outage Sunday morning

2017-10-27 Thread Barrett, Brian via devel
All - There will be a 1-2 hour outage of the server that hosts Open MPI’s Jenkins / MTT / Trac servers on Sunday morning (likely starting around 8:30am PDT). In addition to the usual security updates, I’m going to repartition the root volume a little bit in an attempt to mitigate the blast

[OMPI devel] HWLOC / rmaps ppr build failure

2017-10-04 Thread Barrett, Brian via devel
It looks like a change in either HWLOC or the rmaps ppr component is causing Cisco build failures on master for the last couple of days: https://mtt.open-mpi.org/index.php?do_redir=2486 rmaps_ppr.c:665:17: error: ‘HWLOC_OBJ_NUMANODE’ undeclared (first use in this function); did you mean

[OMPI devel] Cuda build break

2017-10-04 Thread Barrett, Brian via devel
All - It looks like NVIDIA’s MTT started failing on 9/26 due to not finding CUDA. Given the error message, there’s a suspicious commit in the hwloc CUDA changes. Jeff and Brice, it’s your patch, can you dig into the build failures? Brian

Re: [OMPI devel] Jenkins nowhere land again

2017-10-03 Thread Barrett, Brian via devel
…test problem, but can't get it into the repo. On Oct 3, 2017, at 1:22 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: I'm not entirely sure what we want to do. It looks like both Nathan's and my OS X servers died on the same day. It looks like m

Re: [OMPI devel] Jenkins nowhere land again

2017-10-03 Thread Barrett, Brian via devel
I’m not entirely sure what we want to do. It looks like both Nathan’s and my OS X servers died on the same day. It looks like mine might be a larger failure than just Jenkins, because I can’t log into the machine remotely. It’s going to be a couple of hours before I can get home. Nathan, do you

[OMPI devel] Open MPI 3.0.0rc5 available for testing

2017-09-07 Thread Barrett, Brian via devel
Hi all - Open MPI 3.0.0rc5 is available for testing at the usual place: https://www.open-mpi.org/software/ompi/v3.0/ There are four major changes since the last RC: * Components built as DSOs now link against the library associated with their project, making it significantly easier to

[OMPI devel] Open MPI 3.1 Feature List

2017-09-05 Thread Barrett, Brian via devel
All - With 3.0 (finally) starting to wrap up, we’re starting discussion of the 3.1 release. As a reminder, we are targeting 2017 for the release, are going to cut the release from master, and are not going to have a feature whitelist for the release. We are currently looking at a timeline

[OMPI devel] Open MPI 3.0.0rc4 available

2017-08-29 Thread Barrett, Brian via devel
The fourth release candidate for Open MPI 3.0.0 is now available for download. Changes since rc2 include: * Better handling of OPAL_PREFIX for PMIx * Update to hwloc 1.11.7 * Sync with PMIx 2.0.1 * Revert atomics behavior to that of the 2.x series * usnic, openib, and portals4 bug fixes * Add

[OMPI devel] v3.0.0 blocker issues

2017-08-01 Thread Barrett, Brian via devel
Here’s the full list: https://github.com/open-mpi/ompi/issues?q=is%3Aissue%20is%3Aopen%20label%3A%22Target%3A%203.0.x%22%20label%3A%22blocker%22 There’s obviously a bunch of XLC issues in there, and IBM’s working on the right documentation / configure checks so that we feel comfortable

Re: [OMPI devel] Issue/PR tagging

2017-08-01 Thread Barrett, Brian via devel
Can you send the link? I can't find it. 2017-07-19 16:47 GMT-07:00 Barrett, Brian via devel <devel@lists.open-mpi.org>: I’ll update the wiki (and figure out where on our wiki to put more general information), but the basics are: If you find a

[OMPI devel] Open MPI 3.0.0 Release Candidate 2 available

2017-07-29 Thread Barrett, Brian via devel
Just in time for your weekend, the Open MPI team is releasing Open MPI 3.0.0rc2, the second release candidate of the 3.0.0 series. New in this release candidate is PMIx 2.0 integration, so we would appreciate any testing around run-time environments. There has also been a round of bug fixes

Re: [OMPI devel] Issue/PR tagging

2017-07-19 Thread Barrett, Brian via devel
I’ll update the wiki (and figure out where on our wiki to put more general information), but the basics are: If you find a bug, file an issue. Add Target:v??? labels for any branch it impacts. If we decide later not to fix the issue on a branch, we’ll remove the label. Open/find an issue for

[OMPI devel] Yoda SPML and master/v3.0.0

2017-07-13 Thread Barrett, Brian via devel
Mellanox developers - The btl sm header leak in the yoda spml brought up questions about the status of the yoda spml. My understanding was that Mellanox was going to remove it after the decision that we didn’t require supporting btl transports, given that Mellanox no longer wanted to support yoda.

[OMPI devel] v3.0.x / v3.x branch mixup

2017-07-07 Thread Barrett, Brian via devel
Hi all - Earlier this week, we discovered that a couple of pull requests had been posted against the deprecated v3.x branch (instead of the active v3.0.x branch). Worse, GitHub allowed me to merge those requests, despite GitHub reporting that nobody had permissions to write to the branch

[OMPI devel] 3.0.0 review reminder

2017-06-30 Thread Barrett, Brian via devel
There are a number of outstanding PRs on the 3.0.0 branch which are awaiting review. The door for these is rapidly closing, so if you have some time, please go review the PRs and take appropriate action. Brian

[OMPI devel] Open MPI 3.0.0 first release candidate posted

2017-06-28 Thread Barrett, Brian via devel
The first release candidate of Open MPI 3.0.0 is now available (https://www.open-mpi.org/software/ompi/v3.0/). We expect to have at least one more release candidate, as there are still outstanding MPI-layer issues to be resolved (particularly around one-sided). We are posting 3.0.0rc1 to get

Re: [OMPI devel] Abstraction violation!

2017-06-22 Thread Barrett, Brian via devel
…in a distro that didn’t have an MPI installed... > On Jun 22, 2017, at 5:02 PM, r...@open-mpi.org wrote: > It apparently did come in that way. We just never test -no-ompi and so it wasn’t discovered until a downstream projec

Re: [OMPI devel] Abstraction violation!

2017-06-22 Thread Barrett, Brian via devel
I’m confused; looking at history, there’s never been a time when opal/util/info.c hasn’t included mpi.h. That seems odd, but so does info being in opal. Brian > On Jun 22, 2017, at 3:46 PM, r...@open-mpi.org wrote: > I don’t understand what someone was thinking, but you CANNOT #include

Re: [OMPI devel] Mellanox Jenkins

2017-06-22 Thread Barrett, Brian via devel
…ience. Best, Josh Ladd On Wed, Jun 21, 2017 at 8:25 PM, Artem Polyakov <artpo...@gmail.com> wrote: Brian, I'm going to push for the fix tonight. If it won't work, we will do as you advised. 2017-06-21 17:23 GMT-07:00 Barrett, Brian via devel:

Re: [OMPI devel] Mellanox Jenkins

2017-06-21 Thread Barrett, Brian via devel
In the mean time, is it possible to disable the jobs that listen for pull requests on Open MPI’s repos? I’m trying to get people out of the habit of ignoring CI results, so no results are better than failed results :/. Brian > On Jun 21, 2017, at 1:49 PM, Jeff Squyres (jsquyres)

Re: [OMPI devel] SLURM 17.02 support

2017-06-19 Thread Barrett, Brian via devel
By the way, there was a change between 2.x and 3.0.x: 2.x: Hello, world, I am 0 of 1, (Open MPI v2.1.2a1, package: Open MPI bbarrett@ip-172-31-64-10 Distribution, ident: 2.1.2a1, repo rev: v2.1.1-59-gdc049e4, Unreleased developer copy, 148) Hello, world, I am 0 of 1, (Open MPI v2.1.2a1,

Re: [OMPI devel] Master MTT results

2017-06-01 Thread Barrett, Brian via devel
Also, we merged the RTE changes into 3.0.x this morning, so we should see how that works tonight on MTT. Thanks for all the work, Ralph! Brian > On Jun 1, 2017, at 8:34 AM, r...@open-mpi.org wrote: > Hey folks > I scanned the nightly MTT results from last night on master, and the RTE

Re: [OMPI devel] Time to remove Travis?

2017-06-01 Thread Barrett, Brian via devel
+1 On Jun 1, 2017, at 7:36 AM, Howard Pritchard wrote: I vote for removal too. Howard r...@open-mpi.org wrote on Thu, Jun 1, 2017 at 08:10: I’d vote to remove it - it’s

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-31 Thread Barrett, Brian via devel
> On May 31, 2017, at 7:52 AM, r...@open-mpi.org wrote: > >> On May 31, 2017, at 7:48 AM, Jeff Squyres (jsquyres) <jsquy...@cisco.com> >> wrote: >> >> On May 30, 2017, at 11:37 PM, Barrett, Brian via devel >> <devel@lists.open-mpi.org> wrote: &g

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-30 Thread Barrett, Brian via devel
…submitted before Saturday and doesn’t make it due to reviews, you’ll have to resubmit. Brian On May 5, 2017, at 4:21 PM, r...@open-mpi.org wrote: +1 Go for it :-) On May 5, 2017, at 2:34 PM, Barrett, Brian via devel wrote:

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-30 Thread Barrett, Brian via devel
…PRs on 3.x for the rest of the week. If something is submitted before Saturday and doesn’t make it due to reviews, you’ll have to resubmit. Brian On May 5, 2017, at 4:21 PM, r...@open-mpi.org wrote: +1 Go for it :-) On May 5, 2017, at 2:34 PM, Barrett, Brian via devel wrote:

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-23 Thread Barrett, Brian via devel
On May 5, 2017, at 2:34 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: To be clear, we’d do the move all at once on Saturday morning. Things that would change: 1) nightly tarballs would rename from openmpi-v3.x--.tar.gz to openmpi-v3.0.x--

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-05 Thread Barrett, Brian via devel
…naming will exist at some point. I'm in favor of doing it asap (but I have no stakes in the game, as UTK does not have an MTT). George. On Fri, May 5, 2017 at 1:53 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: Hi everyone - We’ve been

[OMPI devel] Open MPI 3.x branch naming

2017-05-05 Thread Barrett, Brian via devel
Hi everyone - We’ve been having discussions among the release managers about the choice of naming the branch for Open MPI 3.0.0 as v3.x (as opposed to v3.0.x). Because the current plan is that each “major” release (in the sense of the three release points from master per year, not necessarily

[OMPI devel] 3.0 merge backlog

2017-05-05 Thread Barrett, Brian via devel
Hi all - Howard’s out of office this week and I got swamped with a couple of internal issues, so we’ve been behind in getting merges pulled into 3.x. I merged a batch this morning and am going to let Jenkins / MTT catch up with testing. Assuming testing looks good, I’ll do another batch

Re: [OMPI devel] [open-mpi/ompi] ompi/opal: add support for HDR and NDR link speeds (#3434)

2017-04-28 Thread Barrett, Brian via devel
Just a reminder from my email last night... The Pull Request bots are going to be a little messed up until Howard and I finish the great Java 8 update of 2017 (hopefully another couple hours). Brian From: Jeff Squyres Sent:

[OMPI devel] TCP BTL's multi-link behavior

2017-04-26 Thread Barrett, Brian via devel
George - Do you remember why you adjusted both the latency and bandwidth for secondary links when using multi-link support with the TCP BTL [1]? I think I understand why, but your memory of 10 years ago is hopefully more accurate than my reverse engineering ;). Thanks, Brian [1]

Re: [OMPI devel] external hwloc causing libevent problems?

2017-04-07 Thread Barrett, Brian via devel
ote: >> It could be that a fairly recent change was made to fix that conflict. I >> believe Jeff and Gilles modified the internal hwloc header name(s) to ensure >> we got the right one of those. However, that didn’t get done for libevent >> and/or pmix, so the conflict likely still
