Hi,

Sorry for my late summary.

I verified RC1 and fixed some problems.
Here are the remaining problems I found:

  * "yum install"/"dnf install" don't work. It seems that
    the Yum repository's metadata is broken. It may be a
    problem in our new binary upload script, perhaps in its
    retry logic.

  * "dev/release/verify-release-candidate.sh wheels 0.16.0 1"
    doesn't work. See the following log for details. I
    think that this is a problem in the verify script, but I
    don't know how to fix it. It may be Conda related.

https://github.com/apache/arrow/blob/master/dev/release/verify-release-candidate.sh#L613

----
+ conda install -y --file /home/kou/work/cpp/arrow.kou/ci/conda_env_python.yml pandas
+ '[' 5 -lt 1 ']'
+ local cmd=install
+ shift
+ case "$cmd" in
+ OLDPATH=/tmp/arrow-0.16.0.j9Uct/test-miniconda/envs/_verify_wheel-2.7mu/bin:/tmp/arrow-0.16.0.j9Uct/test-miniconda/condabin:/home/kou/work/go/bin:/bin:/home/kou/local/bin:/home/kou/.config/composer/vendor/bin:/var/lib/gems/2.5.0/bin:/usr/local/bin:/usr/bin:/usr/games
+ __add_sys_prefix_to_path
+ '[' -n '' ']'
++ dirname /tmp/arrow-0.16.0.j9Uct/test-miniconda/bin/conda
+ SYSP=/tmp/arrow-0.16.0.j9Uct/test-miniconda/bin
++ dirname /tmp/arrow-0.16.0.j9Uct/test-miniconda/bin
+ SYSP=/tmp/arrow-0.16.0.j9Uct/test-miniconda
+ '[' -n '' ']'
+ PATH=/tmp/arrow-0.16.0.j9Uct/test-miniconda/bin:/tmp/arrow-0.16.0.j9Uct/test-miniconda/envs/_verify_wheel-2.7mu/bin:/tmp/arrow-0.16.0.j9Uct/test-miniconda/condabin:/home/kou/work/go/bin:/bin:/home/kou/local/bin:/home/kou/.config/composer/vendor/bin:/var/lib/gems/2.5.0/bin:/usr/local/bin:/usr/bin:/usr/games
+ export PATH
+ /tmp/arrow-0.16.0.j9Uct/test-miniconda/bin/conda install -y --file /home/kou/work/cpp/arrow.kou/ci/conda_env_python.yml pandas
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - pytest-faulthandler
  - pytest-lazy-fixture

Current channels:

  - https://repo.anaconda.com/pkgs/main/linux-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.
----
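For the second problem, a possible fix (this is an assumption on my part:
pytest-faulthandler and pytest-lazy-fixture appear to be published on
conda-forge but not in Anaconda's default channels) would be to add
conda-forge as a channel in the verify script's install step. The failing
command from the log above, with the channel added:

```shell
# Same command as in the log, plus "-c conda-forge".
# Assumption: the two missing packages are available from conda-forge.
conda install -y -c conda-forge \
  --file /home/kou/work/cpp/arrow.kou/ci/conda_env_python.yml pandas
```

I haven't confirmed this against the verify script itself, so treat it as a
starting point rather than a known fix.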
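For the first problem, one way to sanity-check the Yum repository metadata
locally is to verify that every file listed in repodata/repomd.xml matches
its recorded SHA-256 checksum. This is only a rough sketch: it assumes GNU
grep/sha256sum and the usual repomd.xml layout, and the href/checksum
extraction is an approximation of the real schema (createrepo's own tooling
would be more thorough):

```shell
# Rough sketch: report every file listed in repodata/repomd.xml whose
# SHA-256 checksum does not match the recorded value.
check_repomd() {
  local repo_dir="$1"
  local repomd="$repo_dir/repodata/repomd.xml"
  # Pair each <checksum type="sha256"> value with the <location href="...">
  # that accompanies it (approximate extraction; not a real XML parse).
  paste -d' ' \
    <(grep -oP '<checksum type="sha256">\K[0-9a-f]+' "$repomd") \
    <(grep -oP '<location href="\K[^"]+' "$repomd") |
  while read -r expected href; do
    actual=$(sha256sum "$repo_dir/$href" | cut -d' ' -f1)
    if [ "$actual" != "$expected" ]; then
      echo "checksum mismatch: $href"
    fi
  done
}
```

If this reports mismatches against the RC1 repository, that would point at
the upload script rather than at yum/dnf on the client side.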

Thanks,
--
kou

In <cahm19a4jyxuoxqsfkcwkusrbhzsgbpoq5yok6skfrtnyhsn...@mail.gmail.com>
  "Re: [VOTE] Release Apache Arrow 0.16.0 - RC1" on Thu, 30 Jan 2020 21:52:39 +0100,
  Krisztián Szűcs <szucs.kriszt...@gmail.com> wrote:

> Hi,
> 
> RC2 is in progress. The source is uploaded, I'm waiting for the binaries.
> Thanks everyone for the help!
> 
> - Krisztian
> 
> On Thu, Jan 30, 2020 at 12:50 AM Krisztián Szűcs
> <szucs.kriszt...@gmail.com> wrote:
>>
>> Let's try to fix it, then we can cut RC2 afterwards
>>
>> Thanks for your help!
>>
>> On Wed, Jan 29, 2020 at 10:46 PM Wes McKinney <wesmck...@gmail.com> wrote:
>> >
>> > I just commented on the issue. Seems likely to be fallout from
>> > ARROW-3789 and so I think we should fix it
>> >
>> > On Wed, Jan 29, 2020 at 3:19 PM Bryan Cutler <cutl...@gmail.com> wrote:
>> > >
>> > > An update on Spark integration tests: the new error looks to be a
>> > > regression so I made https://issues.apache.org/jira/browse/ARROW-7723 and
>> > > marked as a blocker. It's possible to work around this bug, so I wouldn't
>> > > call it a hard blocker if we need to proceed with the release.
>> > >
>> > > On Wed, Jan 29, 2020 at 7:45 AM Neal Richardson 
>> > > <neal.p.richard...@gmail.com>
>> > > wrote:
>> > >
>> > > > The place where the segfault is triggered in the R nightlies is a 
>> > > > couple of
>> > > > tests after the one I added in that patch. If that patch is causing the
>> > > > segfaults, we can skip the new test (
>> > > >
>> > > > https://github.com/apache/arrow/blob/master/r/tests/testthat/test-parquet.R#L125
>> > > > )
>> > > > and investigate later. The patch is exercising previously existing
>> > > > codepaths that were not tested, so I don't think that identifying and
>> > > > fixing the segfault should be release blocking (though we should 
>> > > > clearly
>> > > > fix it).
>> > > >
>> > > > Neal
>> > > >
>> > > >
>> > > >
>> > > > On Wed, Jan 29, 2020 at 7:33 AM David Li <li.david...@gmail.com> wrote:
>> > > >
>> > > > > The Flight leak should be unrelated to that commit, the failing test
>> > > > > already existed before - it's a flaky test
>> > > > > https://issues.apache.org/jira/browse/ARROW-7721.
>> > > > >
>> > > > > I'm hoping to look at the issue this week but we might just want to
>> > > > > disable the test for now.
>> > > > >
>> > > > > David
>> > > > >
>> > > > > On 1/29/20, Krisztián Szűcs <szucs.kriszt...@gmail.com> wrote:
>> > > > > > Hi,
>> > > > > >
>> > > > > > - The fuzzit builds has been disabled by Neal on the current 
>> > > > > > master.
>> > > > > > - Created a PR to resolve occasionally failing python dataset 
>> > > > > > tests [1]
>> > > > > > - Merged the fix for C# timezone error [2]
>> > > > > > - Merged various fixes for the release scripts.
>> > > > > > - The nightly Gandiva OS X build is still failing, but because of a
>> > > > > travis
>> > > > > >   deployment timeout, which shouldn't block the release.
>> > > > > >
>> > > > > > We still have failing tests:
>> > > > > > - failing Spark integration test
>> > > > > > - failing nightly R builds (see the 2020-01-29 nightly report)
>> > > > > > - master reports a Java flight memory leak
>> > > > > >
>> > > > > > Spark:
>> > > > > > Joris created a fix for the immediate issue [3], but now we have a
>> > > > > > different
>> > > > > > spark test error, see the discussion in the PR [3].
>> > > > > > I put up a PR [6] to check the regressions 0.16 arrow release would
>> > > > > > introduce
>> > > > > > for spark interoperability, and it turns out that arrow 0.15.1 is 
>> > > > > > not
>> > > > > > compatible
>> > > > > > with neither spark 2.4.4 nor spark 2.4.5-rc1, so 0.16 arrow release
>> > > > could
>> > > > > > only be compatible with spark 3.0 or spark master which we have 
>> > > > > > tests
>> > > > > for.
>> > > > > > So I'm a bit confused how to interpret arrow backward compatibility
>> > > > with
>> > > > > > Spark, thus what should and what should not block the release.
>> > > > > > Either way we'll need to fix the remaining spark issues and add 
>> > > > > > nightly
>> > > > > > spark
>> > > > > > integration tests for both the next spark release and spark master.
>> > > > > >
>> > > > > > R:
>> > > > > > There is the same segfault in each R nightly builds [4]. There was 
>> > > > > > a
>> > > > > single
>> > > > > > change [5] which could introduce the regression compared to the
>> > > > previous
>> > > > > > builds.
>> > > > > > I've tried to reproduce the builds using docker-compose, but 
>> > > > > > locally
>> > > > > > 3.6-bionic
>> > > > > > has passed for me. I'm trying to wipe my local cache and rerun to 
>> > > > > > see
>> > > > > > whether I can reproduce it.
>> > > > > >
>> > > > > > Java/Flight leak:
>> > > > > > The current master reports memory leak [6] which I guess is 
>> > > > > > surfaced by
>> > > > > > change [7]
>> > > > > >
>> > > > > > If we manage to fix the issues above today than I can cut RC2 
>> > > > > > tomorrow.
>> > > > > >
>> > > > > > Thanks, Krisztian
>> > > > > >
>> > > > > > [1]: https://github.com/apache/arrow/pull/6319
>> > > > > > [2]: https://github.com/apache/arrow/pull/6309
>> > > > > > [3]: https://github.com/apache/arrow/pull/6312
>> > > > > > [4]: 
>> > > > > > https://github.com/ursa-labs/crossbow/branches/all?query=r-base
>> > > > > > [5]:
>> > > > > >
>> > > > >
>> > > > https://github.com/apache/arrow/commit/8b7911b086d120359e2000fbedb0c38c0f13f683
>> > > > > > [6]: https://github.com/apache/arrow/runs/415037585#step:5:1533
>> > > > > > [7]:
>> > > > > >
>> > > > >
>> > > > https://github.com/apache/arrow/commit/8b42288f58caa84a40bb7a13c1731ff919c934f2
>> > > > > >
>> > > > > > On Wed, Jan 29, 2020 at 11:06 AM Sutou Kouhei <k...@clear-code.com>
>> > > > > wrote:
>> > > > > >>
>> > > > > >> Hi,
>> > > > > >>
>> > > > > >> > Thank you.  After the C# download fix, I have the following C# 
>> > > > > >> > test
>> > > > > >> > failure:
>> > > > > >> > https://gist.github.com/pitrou/d82ed1ff80db43b63f0c3d5e5f2474a4
>> > > > > >>
>> > > > > >> https://github.com/apache/arrow/pull/6309
>> > > > > >> will fix it.
>> > > > > >>
>> > > > > >> I think that this is a test problem, not an implementation
>> > > > > >> problem.
>> > > > > >>
>> > > > > >>
>> > > > > >> Workaround:
>> > > > > >>
>> > > > > >>   TZ=UTC dev/release/verify-release-candidate.sh ...
>> > > > > >>
>> > > > > >>
>> > > > > >> Thanks,
>> > > > > >> --
>> > > > > >> kou
>> > > > > >>
>> > > > > >> In <2bd07a17-600b-7f49-3ea1-a0b1acc91...@python.org>
>> > > > > >>   "Re: [VOTE] Release Apache Arrow 0.16.0 - RC1" on Wed, 29 Jan 
>> > > > > >> 2020
>> > > > > >> 10:11:46 +0100,
>> > > > > >>   Antoine Pitrou <anto...@python.org> wrote:
>> > > > > >>
>> > > > > >> >
>> > > > > >> > Thank you.  After the C# download fix, I have the following C# 
>> > > > > >> > test
>> > > > > >> > failure:
>> > > > > >> > https://gist.github.com/pitrou/d82ed1ff80db43b63f0c3d5e5f2474a4
>> > > > > >> >
>> > > > > >> > Regards
>> > > > > >> >
>> > > > > >> > Antoine.
>> > > > > >> >
>> > > > > >> >
>> > > > > >> >
>> > > > > >> > Le 29/01/2020 à 00:42, Sutou Kouhei a écrit :
>> > > > > >> >> Hi,
>> > > > > >> >>
>> > > > > >> >>> Source verification succeeded on Java and C++ and then failed
>> > > > > >> >>> downloading some C# thing:
>> > > > > >> >>> https://gist.github.com/pitrou/5c4a98387153ef415ef64b8aa2457e63
>> > > > > >> >>
>> > > > > >> >> I've created a pull request to fix it:
>> > > > > >> >> https://github.com/apache/arrow/pull/6307
>> > > > > >> >>
>> > > > > >> >>
>> > > > > >> >> Thanks,
>> > > > > >> >> --
>> > > > > >> >> kou
>> > > > > >
>> > > > >
>> > > >
