Re: [OMPI devel] master nightly tarballs stopped on 11/21

2016-11-23 Thread Barrett, Brian via devel
Odd, I'm getting the output of crontab still. Let me run by hand and see what's up :/. On Nov 23, 2016, at 08:28, "r...@open-mpi.org" wrote: I'll turn my crontab back on for the holiday, in case Brian isn't available -

Re: [OMPI devel] Pull request: LANL-XXX tests failing

2017-03-30 Thread Barrett, Brian via devel
Yeah, sorry about that; we had a configuration issue while trying to clean up some issues on the server which hosts the OMPI Jenkins install. It should be back to normal now. Brian On Mar 30, 2017, at 11:46 AM, Howard Pritchard wrote: well

[OMPI devel] external hwloc causing libevent problems?

2017-04-05 Thread Barrett, Brian via devel
All - On the telecon yesterday, there was discussion of external hwloc causing problems if libevent was also installed in the same location. Does anyone have details on exactly what the failure mode is? I tried what I think is the issue (./configure --with-hwloc=external with libevent

Re: [OMPI devel] external hwloc causing libevent problems?

2017-04-05 Thread Barrett, Brian via devel
t versions instead of the internal ones - and havoc ensues. > > >> On Apr 5, 2017, at 11:30 AM, Barrett, Brian via devel >> <devel@lists.open-mpi.org> wrote: >> >> All - >> >> On the telecon yesterday, there was discussion of external hwloc causin

Re: [OMPI devel] external hwloc causing libevent problems?

2017-04-07 Thread Barrett, Brian via devel
wrote: >> It could be that a fairly recent change was made to fix that conflict. I >> believe Jeff and Gilles modified the internal hwloc header name(s) to ensure >> we got the right one of those. However, that didn’t get done for libevent >> and/or pmix, so the conflict likely still

[OMPI devel] Open MPI 3.0.0 Release Candidate 2 available

2017-07-29 Thread Barrett, Brian via devel
Just in time for your weekend, the Open MPI team is releasing Open MPI 3.0.0rc2, the second release candidate of the 3.0.0 series. New in this release candidate is PMIx 2.0 integration, so we would appreciate any testing around run-time environments. There has also been a round of bug fixes

[OMPI devel] v3.0.0 blocker issues

2017-08-01 Thread Barrett, Brian via devel
Here’s the full list: https://github.com/open-mpi/ompi/issues?q=is%3Aissue%20is%3Aopen%20label%3A%22Target%3A%203.0.x%22%20label%3A%22blocker%22 There’s obviously a bunch of XLC issues in there, and IBM’s working on the right documentation / configure checks so that we feel comfortable

[OMPI devel] Yoda SPML and master/v3.0.0

2017-07-13 Thread Barrett, Brian via devel
Mellanox developers - The btl sm header leak in the yoda spml brought up questions about the status of the yoda spml. My understanding was that Mellanox was going to remove it after the decision that we didn’t require supporting btl transports and that Mellanox no longer wanted to support yoda.

[OMPI devel] v3.0.x / v3.x branch mixup

2017-07-07 Thread Barrett, Brian via devel
Hi all - Earlier this week, we discovered that a couple of pull requests had been posted against the deprecated v3.x branch (instead of the active v3.0.x branch). Worse, GitHub allowed me to merge those requests, despite GitHub reporting that nobody had permissions to write to the branch

Re: [OMPI devel] Issue/PR tagging

2017-07-19 Thread Barrett, Brian via devel
I’ll update the wiki (and figure out where on our wiki to put more general information), but the basics are: If you find a bug, file an issue. Add Target:v??? labels for any branch it impacts. If we decide later not to fix the issue on a branch, we’ll remove the label. Open/find an issue for

[OMPI devel] 3.0.0 review reminder

2017-06-30 Thread Barrett, Brian via devel
There are a number of outstanding PRs on the 3.0.0 branch which are awaiting review. The door for these is rapidly closing, so if you have some time, please go review the PRs and take appropriate action. Brian ___ devel mailing list

[OMPI devel] Open MPI 3.0.0 first release candidate posted

2017-06-28 Thread Barrett, Brian via devel
The first release candidate of Open MPI 3.0.0 is now available (https://www.open-mpi.org/software/ompi/v3.0/). We expect to have at least one more release candidate, as there are still outstanding MPI-layer issues to be resolved (particularly around one-sided). We are posting 3.0.0rc1 to get

[OMPI devel] TCP BTL's multi-link behavior

2017-04-26 Thread Barrett, Brian via devel
George - Do you remember why you adjusted both the latency and bandwidth for secondary links when using multi-link support with the TCP BTL [1]? I think I understand why, but your memory of 10 years ago is hopefully more accurate than my reverse engineering ;). Thanks, Brian [1]

Re: [OMPI devel] [open-mpi/ompi] ompi/opal: add support for HDR and NDR link speeds (#3434)

2017-04-28 Thread Barrett, Brian via devel
Just a reminder from my email last night... The Pull Request bots are going to be a little messed up until Howard and I finish the great Java 8 update of 2017 (hopefully another couple hours). Brian From: Jeff Squyres Sent:

Re: [OMPI devel] Issue/PR tagging

2017-08-01 Thread Barrett, Brian via devel
can you send the link - I can't find it. 2017-07-19 16:47 GMT-07:00 Barrett, Brian via devel <devel@lists.open-mpi.org>: I’ll update the wiki (and figure out where on our wiki to put more general information), but the basics are: If you find a

Re: [OMPI devel] SLURM 17.02 support

2017-06-19 Thread Barrett, Brian via devel
By the way, there was a change between 2.x and 3.0.x: 2.x: Hello, world, I am 0 of 1, (Open MPI v2.1.2a1, package: Open MPI bbarrett@ip-172-31-64-10 Distribution, ident: 2.1.2a1, repo rev: v2.1.1-59-gdc049e4, Unreleased developer copy, 148) Hello, world, I am 0 of 1, (Open MPI v2.1.2a1,

Re: [OMPI devel] Mellanox Jenkins

2017-06-21 Thread Barrett, Brian via devel
In the meantime, is it possible to disable the jobs that listen for pull requests on Open MPI’s repos? I’m trying to get people out of the habit of ignoring CI results, so no results are better than failed results :/. Brian > On Jun 21, 2017, at 1:49 PM, Jeff Squyres (jsquyres)

Re: [OMPI devel] Abstraction violation!

2017-06-22 Thread Barrett, Brian via devel
I’m confused; looking at history, there’s never been a time when opal/util/info.c hasn’t included mpi.h. That seems odd, but so does info being in opal. Brian > On Jun 22, 2017, at 3:46 PM, r...@open-mpi.org wrote: > > I don’t understand what someone was thinking, but you CANNOT #include

Re: [OMPI devel] Abstraction violation!

2017-06-22 Thread Barrett, Brian via devel
l in a distro that didn’t have an MPI >> installed... >> >>> On Jun 22, 2017, at 5:02 PM, r...@open-mpi.org wrote: >>> >>> It apparently did come in that way. We just never test -no-ompi and so it >>> wasn’t discovered until a downstream projec

Re: [OMPI devel] Mellanox Jenkins

2017-06-22 Thread Barrett, Brian via devel
ience. Best, Josh Ladd On Wed, Jun 21, 2017 at 8:25 PM, Artem Polyakov <artpo...@gmail.com> wrote: Brian, I'm going to push for the fix tonight. If it doesn't work - we will do as you advised. 2017-06-21 17:23 GMT-07:00 Barrett, Brian via devel <devel@list

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-23 Thread Barrett, Brian via devel
On May 5, 2017, at 2:34 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: To be clear, we’d do the move all at once on Saturday morning. Things that would change: 1) nightly tarballs would rename from openmpi-v3.x--.tar.gz to openmpi-v3.0.x--

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-30 Thread Barrett, Brian via devel
x PRs on 3.x for the rest of the week. If something is submitted before Saturday and doesn’t make it due to reviews, you’ll have to resubmit. Brian On May 5, 2017, at 4:21 PM, r...@open-mpi.org wrote: +1 Go for it :-) On May 5, 2017, at 2:34 PM, Barrett, Brian via devel

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-30 Thread Barrett, Brian via devel
submitted before Saturday and doesn’t make it due to reviews, you’ll have to resubmit. Brian On May 5, 2017, at 4:21 PM, r...@open-mpi.org wrote: +1 Go for it :-) On May 5, 2017, at 2:34 PM, Barrett, Brian via devel <devel@lists.open-mpi.org>

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-31 Thread Barrett, Brian via devel
> On May 31, 2017, at 7:52 AM, r...@open-mpi.org wrote: > >> On May 31, 2017, at 7:48 AM, Jeff Squyres (jsquyres) <jsquy...@cisco.com> >> wrote: >> >> On May 30, 2017, at 11:37 PM, Barrett, Brian via devel >> <devel@lists.open-mpi.org> wrote:

Re: [OMPI devel] Master MTT results

2017-06-01 Thread Barrett, Brian via devel
Also, we merged the RTE changes into 3.0.x this morning, so we should see how that works tonight on MTT. Thanks for all the work, Ralph! Brian > On Jun 1, 2017, at 8:34 AM, r...@open-mpi.org wrote: > > Hey folks > > I scanned the nightly MTT results from last night on master, and the RTE >

Re: [OMPI devel] Time to remove Travis?

2017-06-01 Thread Barrett, Brian via devel
+1 On Jun 1, 2017, at 7:36 AM, Howard Pritchard wrote: I vote for removal too. Howard r...@open-mpi.org wrote on Thu, Jun 1, 2017 at 08:10: I’d vote to remove it - it’s

[OMPI devel] 3.0 merge backlog

2017-05-05 Thread Barrett, Brian via devel
Hi all - Howard’s out of office this week and I got swamped with a couple of internal issues, so we’ve been behind in getting merges pulled into 3.x. I merged a batch this morning and am going to let Jenkins / MTT catch up with testing. Assuming testing looks good, I’ll do another batch

[OMPI devel] Open MPI 3.x branch naming

2017-05-05 Thread Barrett, Brian via devel
Hi everyone - We’ve been having discussions among the release managers about the choice of naming the branch for Open MPI 3.0.0 as v3.x (as opposed to v3.0.x). Because the current plan is that each “major” release (in the sense of the three release points from master per year, not necessarily

Re: [OMPI devel] Open MPI 3.x branch naming

2017-05-05 Thread Barrett, Brian via devel
naming will exist at some point. I am in favor of doing it asap (but I have no stakes in the game as UTK does not have an MTT). George. On Fri, May 5, 2017 at 1:53 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: Hi everyone - We’ve been

[OMPI devel] HWLOC / rmaps ppr build failure

2017-10-04 Thread Barrett, Brian via devel
It looks like a change in either HWLOC or the rmaps ppr component is causing Cisco build failures on master for the last couple of days: https://mtt.open-mpi.org/index.php?do_redir=2486 rmaps_ppr.c:665:17: error: ‘HWLOC_OBJ_NUMANODE’ undeclared (first use in this function); did you mean
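The undeclared constant here is a hwloc API generation issue: hwloc 1.11 renamed HWLOC_OBJ_NODE to HWLOC_OBJ_NUMANODE. A minimal compatibility shim (an editor's sketch keyed off HWLOC_API_VERSION, not necessarily the fix that was ultimately committed) looks like:

    #include <hwloc.h>

    /* hwloc 1.11 (API version 0x00010b00) renamed HWLOC_OBJ_NODE to
     * HWLOC_OBJ_NUMANODE; pre-1.11 headers only define the old name,
     * so alias the new name when building against an older hwloc. */
    #if HWLOC_API_VERSION < 0x00010b00
    #define HWLOC_OBJ_NUMANODE HWLOC_OBJ_NODE
    #endif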

[OMPI devel] Cuda build break

2017-10-04 Thread Barrett, Brian via devel
All - It looks like NVIDIA’s MTT started failing on 9/26, due to not finding CUDA. Given the error message, there’s a suspicious commit in the hwloc CUDA changes. Jeff and Brice, it’s your patch, can you dig into the build failures? Brian ___ devel

Re: [OMPI devel] Jenkins nowhere land again

2017-10-03 Thread Barrett, Brian via devel
test problem, but can't get it into the repo. On Oct 3, 2017, at 1:22 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: I'm not entirely sure what we want to do. It looks like both Nathan's and my OS X servers died on the same day. It looks like m

Re: [OMPI devel] Jenkins nowhere land again

2017-10-03 Thread Barrett, Brian via devel
I’m not entirely sure what we want to do. It looks like both Nathan’s and my OS X servers died on the same day. It looks like mine might be a larger failure than just Jenkins, because I can’t log into the machine remotely. It’s going to be a couple hours before I can get home. Nathan, do you

[OMPI devel] Open MPI 3.0.0rc4 available

2017-08-29 Thread Barrett, Brian via devel
The fourth release candidate for Open MPI 3.0.0 is now available for download. Changes since rc2 include: * Better handling of OPAL_PREFIX for PMIx * Update to hwloc 1.11.7 * Sync with PMIx 2.0.1 * Revert atomics behavior to that of the 2.x series * usnic, openib, and portals4 bug fixes * Add

[OMPI devel] Open MPI 3.1 Feature List

2017-09-05 Thread Barrett, Brian via devel
All - With 3.0 (finally) starting to wrap up, we’re starting discussion of the 3.1 release. As a reminder, we are targeting 2017 for the release, are going to cut the release from master, and are not going to have a feature whitelist for the release. We are currently looking at a timeline

[OMPI devel] Open MPI 3.0.0rc5 available for testing

2017-09-07 Thread Barrett, Brian via devel
Hi all - Open MPI 3.0.0rc5 is available for testing at the usual place: https://www.open-mpi.org/software/ompi/v3.0/ There are four major changes since the last RC: * Components built as DSOs now link against the library associated with their project, making it significantly easier to

Re: [OMPI devel] OSC module change

2017-11-28 Thread Barrett, Brian via devel
The objection I have to this is that it forces an implementation where every one-sided component is backed by a communicator. While that’s the case today, it’s certainly not required. If you look at Portals 4, for example, there’s one collective call outside of initialization, and that’s a

Re: [OMPI devel] OSC module change

2017-11-28 Thread Barrett, Brian via devel
simply because the existence of a communicator linked to the window is more or less enforced by the MPI standard. George. On Tue, Nov 28, 2017 at 1:02 PM, Barrett, Brian via devel <devel@lists.open-mpi.org> wrote: The objection I have t

Re: [OMPI devel] OSC module change

2017-11-30 Thread Barrett, Brian via devel
>> In fact the communicator's group was already retained in the window >> structure. So everything was already in place. I pushed the last >> modifications, and everything seems ready to be merged in PR#4527. >> >> Jeff, the fixup commits are squashed :) >> >> Clément

Re: [OMPI devel] OSC module change

2017-11-29 Thread Barrett, Brian via devel
ry component? Clément On 11/28/2017 07:46 PM, Barrett, Brian via devel wrote: The following is perfectly legal: MPI_Comm_dup(some_comm, &tmp_comm); MPI_Win_create(…., tmp_comm, &win); MPI_Comm_free(&tmp_comm); So I don’t think stashing away a communicator is the solution. Is a group sufficient? I think
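A self-contained version of the pattern quoted above (an illustrative sketch; buf and win are invented names for the example) shows why a one-sided component cannot simply keep the user's communicator, while the group stays recoverable via MPI_Win_get_group:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm tmp_comm;
        MPI_Win win;
        MPI_Group grp;
        int buf = 0;

        MPI_Init(&argc, &argv);

        /* Create a window on a duplicated communicator, then free the
         * communicator immediately; this is legal per the MPI standard. */
        MPI_Comm_dup(MPI_COMM_WORLD, &tmp_comm);
        MPI_Win_create(&buf, sizeof(buf), sizeof(buf), MPI_INFO_NULL,
                       tmp_comm, &win);
        MPI_Comm_free(&tmp_comm);

        /* The window must remain fully usable after the free... */
        MPI_Win_fence(0, win);
        MPI_Win_fence(0, win);

        /* ...and its group is still available, which is why retaining the
         * group (rather than the communicator) is sufficient. */
        MPI_Win_get_group(win, &grp);
        MPI_Group_free(&grp);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }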

[OMPI devel] 3.0 / 3.1 Pull Requests caught up?

2017-11-01 Thread Barrett, Brian via devel
All - I think we’re finally caught up on pull requests for the 3.0 and 3.1 release. There are a small number of PRs against both branches waiting for code reviews, but otherwise everything has been committed. If I somehow missed something, first please make sure it has a “Target: ” label on

[OMPI devel] 3.1.0 NEWS updates

2017-12-11 Thread Barrett, Brian via devel
All - We’re preparing to start the 3.1.0 release process. There have been a number of updates since the v3.0.x branch was created and we haven’t necessarily been great at updating the NEWS file. I took a stab at the update; can everyone have a look and see what I missed?

[OMPI devel] 3.0.1 NEWS updates

2017-12-11 Thread Barrett, Brian via devel
All - We’re preparing to start the 3.0.1 release process. There have been a number of updates since 3.0.0 and we haven’t necessarily been great at updating the NEWS file. I took a stab at the update; can everyone have a look and see what I missed?

Re: [OMPI devel] OMPI OS X CI offline

2018-05-25 Thread Barrett, Brian via devel
> > On May 25, 2018, at 12:44 PM, Jeff Squyres (jsquyres) > wrote: > > Looks like the "piper" OS X OMPI CI machine is offline. I just marked it offline > in Jenkins and will bot:ompi:retest all the PRs that are obviously stuck. Which wasn’t going to do anything, because the

[OMPI devel] Open MPI 3.1.1rc1 posted

2018-06-14 Thread Barrett, Brian via devel
The first release candidate for Open MPI 3.1.1 is posted at https://www.open-mpi.org/software/ompi/v3.1/. We’re a bit behind on getting it out the door, so we appreciate any testing feedback you have. Brian ___ devel mailing list

[OMPI devel] v3.1.1rc2 posted

2018-07-01 Thread Barrett, Brian via devel
v3.1.1rc2 is posted at the usual place: https://www.open-mpi.org/software/ompi/v3.1/ Primary changes are some important UCX bug fixes and a forward compatibility fix in PMIx. We’re targeting a release on Friday, please test and send results before then. Thanks, Brian

Re: [OMPI devel] v3.1.1rc2 posted

2018-07-02 Thread Barrett, Brian via devel
e any problem and performance numbers look good. Thanks From: Barrett, Brian via devel <devel@lists.open-mpi.org> Date: July 1, 2018 at 6:31:26 PM EDT To: Open MPI Developers <devel@lists.open-mpi.org> Cc: Barrett, Brian

[OMPI devel] Open MPI 3.1.0 Release Update

2018-05-03 Thread Barrett, Brian via devel
It appears that we have resolved the outstanding issues with 3.1.0. I’ve posted 3.1.0rc6 at https://www.open-mpi.org/software/ompi/v3.1/. Please give it a go and let me know what you find. Barring anyone posting a blocking issue, I intend to post 3.1.0 (final) tomorrow morning Pacific Time.

[OMPI devel] Jenkins/MTT/Trac outage Sunday morning

2017-10-27 Thread Barrett, Brian via devel
All - There will be a 1-2 hour outage of the server that hosts Open MPI’s Jenkins / MTT / Trac servers on Sunday morning (likely starting around 8:30am PDT). In addition to the usual security updates, I’m going to repartition the root volume a little bit in an attempt to mitigate the blast

[OMPI devel] Jenkins jobs offline

2017-10-27 Thread Barrett, Brian via devel
The ARM builder for CI tests is offline, meaning that all CI jobs will fail until Pasha can get it back online. It looks like the agent didn’t call back after Jenkins restarted this morning. Brian ___ devel mailing list devel@lists.open-mpi.org

[OMPI devel] 32-bit builder in OMPI Jenkins

2018-01-09 Thread Barrett, Brian via devel
All - Just an FYI, as we previously discussed, there’s now a 32 bit builder in the OMPI Community Jenkins CI tests. The test passes on the current branches, so this shouldn’t impact anyone until such time as you break 32 bit x86 builds :). Sorry this took so long (I believe the original

Re: [OMPI devel] cannot push directly to master anymore

2018-01-31 Thread Barrett, Brian via devel
> On Jan 31, 2018, at 8:33 AM, r...@open-mpi.org wrote: > > > >> On Jan 31, 2018, at 7:36 AM, Jeff Squyres (jsquyres) >> wrote: >> >> On Jan 31, 2018, at 10:14 AM, Gilles Gouaillardet >> wrote: >>> >>> I tried to push some trivial

[OMPI devel] Upcoming nightly tarball URL changes

2018-02-26 Thread Barrett, Brian via devel
On Sunday, March 18th, the Open MPI team is going to make a change in where nightly tarballs are stored that will likely impact MTT test configuration. If you have an automated system (including MTT) that fetches nightly tarballs, you will likely need to make a change in the next two weeks to

[OMPI devel] v3.1.2rc1 is posted

2018-08-15 Thread Barrett, Brian via devel
The first release candidate for the 3.1.2 release is posted at https://www.open-mpi.org/software/ompi/v3.1/ Major changes include fixing the race condition in vader (the same one that caused v2.1.5rc1 to be posted today) as well as: - Assorted Portals 4.0 bug fixes. - Fix for possible data

[OMPI devel] Open MPI 3.0.1rc2 available for testing

2018-01-23 Thread Barrett, Brian via devel
I’ve posted the first public release candidate of Open MPI 3.0.1 this evening. It can be downloaded for testing from: https://www.open-mpi.org/software/ompi/v3.0/ We appreciate any testing you can do in preparation for a release in the next week or two. Thanks, Brian & Howard

[OMPI devel] Open MPI 3.1.0 pre-release available

2018-01-23 Thread Barrett, Brian via devel
The Open MPI team is pleased to announce the first pre-release of the Open MPI 3.1 series, available at: https://www.open-mpi.org/software/ompi/v3.1/ RC1 has two known issues: - We did not complete work to support hwloc 2.x, even when hwloc is built as an external library. This may or

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-08 Thread Barrett, Brian via devel
Sorry about that; we screwed up understanding the “API” between MTT and the Open MPI download path. We made a change in the nightly tarball builder and things should be working now. Brian > On Mar 8, 2018, at 2:31 AM, Christoph Niethammer wrote: > > Hello, > > After the

[OMPI devel] Open MPI 3.0.1rc4 Posted

2018-03-14 Thread Barrett, Brian via devel
Open MPI 3.0.1rc4 is now available at https://www.open-mpi.org/software/ompi/v3.0/. Assuming that there are no new negative bug reports, the plan is to release 3.0.1 next Monday. Changes since 3.0.1rc3 include: - Fixes to parallel debugger attach functionality - Fixes to the MPI I/O

[OMPI devel] Open MPI 3.1.0rc3 Posted

2018-03-14 Thread Barrett, Brian via devel
Open MPI 3.1.0rc3 is now available at https://www.open-mpi.org/software/ompi/v3.1/. Assuming that there are no new negative bug reports, the plan is to release 3.1.0 next Monday. Changes since 3.1.0rc2 include: - Fixes to parallel debugger attach functionality - Fixes to the MPI I/O

Re: [OMPI devel] Open MPI 3.1.0rc4 posted

2018-04-17 Thread Barrett, Brian via devel
support for PMIx v1 > > Cheers, > > Gilles > > > > "Barrett, Brian via devel" <devel@lists.open-mpi.org> wrote: >> In what we hope is the last RC for the 3.1.0 series, I’ve posted 3.1.0rc4 at: >> >> https://www.open-mpi.o

[OMPI devel] Open MPI 3.1.0rc4 posted

2018-04-17 Thread Barrett, Brian via devel
In what we hope is the last RC for the 3.1.0 series, I’ve posted 3.1.0rc4 at: https://www.open-mpi.org/software/ompi/v3.1/ Please give it a try and provide feedback asap; the goal is to release by the end of the week if we don’t find any major issues. Brian

Re: [OMPI devel] Github CI stalls: ARM and/or SLES

2018-03-28 Thread Barrett, Brian via devel
To keep things moving, I removed ARM from the pull request checker until LANL and ARM can get their builders back online. You should be able to restart your pull request builds and have them complete now. Brian > On Mar 28, 2018, at 9:12 AM, Barrett, Brian via devel > <devel@l

Re: [OMPI devel] Github CI stalls: ARM and/or SLES

2018-03-28 Thread Barrett, Brian via devel
The ARM builders are all down; it was ARM that caused the problems. Brian > On Mar 28, 2018, at 6:48 AM, Jeff Squyres (jsquyres) > wrote: > > Several PR's from last night appear to be stalled in the community CI. I > can't tell if they're stalled in ARM or SLES builds --

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-16 Thread Barrett, Brian via devel
Eventual consistency for the win. It looks like I forgot to set a short cache time for the CDN that fronts the artifact repository. So the previous day’s file was returned. I fixed that and flushed the cache on the CDN, so it should work now. Brian On Mar 15, 2018, at 10:41 PM, Boris

Re: [OMPI devel] Upcoming nightly tarball URL changes

2018-03-21 Thread Barrett, Brian via devel
Yeah, it’s failing for exactly the reason I thought it would (and why we didn’t support md5sum files in the first place). Josh and I updated the MTT client yesterday and I pushed the change today. Please update your client and it should work. Brian On Mar 20, 2018, at 10:35 PM, Boris Karasev

[OMPI devel] 3.0.3/3.1.3 delay

2018-10-26 Thread Barrett, Brian via devel
Due to the shared memory one-sided issue (which was reverted today), I’m going to hold off on releasing 3.0.3 and 3.1.3 until at least Monday. Would like to see the weekend’s MTT runs run clean before we release. Brian ___ devel mailing list

[OMPI devel] Open MPI 3.0.3rc2 and 3.1.3rc2 released

2018-10-23 Thread Barrett, Brian via devel
I have posted Open MPI 3.0.3rc2 as well as 3.1.3rc2 this afternoon. Both are minor changes over rc1: the SIGCHLD and opal_fifo race patches are the primary changes. Assuming no negative feedback, I’ll release both 3.0.3 and 3.1.3 on Friday. Brian

Re: [OMPI devel] Open-MPI backwards compatibility and library version changes

2018-11-14 Thread Barrett, Brian via devel
Chris - When we look at ABI stability for Open MPI releases, we look only at the MPI and SHMEM interfaces, not the internal interfaces used by Open MPI internally. libopen-pal.so is an internal library, and we do not guarantee ABI stability across minor releases. In 3.0.3, there was a

[OMPI devel] Patcher on MacOS

2018-09-28 Thread Barrett, Brian via devel
Is there any practical reason to have the memory patcher component enabled for MacOS? As far as I know, we don’t have any transports which require memory hooks on MacOS, and with the recent deprecation of the syscall interface, it emits a couple of warnings. It would be nice to crush said

[OMPI devel] Mac OS X 10.4.x users?

2018-09-28 Thread Barrett, Brian via devel
All - In trying to clean up some warnings, I noticed one (around pack/unpack in net/if.h) that is due to a workaround of a bug in MacOS X 10.4.x and earlier. The simple way to remove the warning would be to remove the workaround, which would break the next major version of Open MPI on 10.4.x

Re: [OMPI devel] Open-MPI backwards compatibility and library version changes

2018-11-16 Thread Barrett, Brian via devel
> for you under the hood (I have seen that, but I cannot remember which > one), and that would be a bad thing, at least from an Open MPI point > of view. > > > Cheers, > > Gilles > On Wed, Nov 14, 2018 at 6:46 PM Christopher Samuel > wrote: >> >>

[OMPI devel] Open MPI Jenkins down

2018-11-20 Thread Barrett, Brian via devel
All - A recent security patch on the Open MPI Jenkins setup has appeared to have caused some fairly significant instability. Something in our build configuration is causing Jenkins to become unresponsive to any scheduling changes (including completing jobs or starting new testing instances).