I’ve tried a couple of times. The latest test run took more than 12 hours.

1 aborted suite:
00:53:25.769 WARN org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite: 
Failed to download Spark 2.3.2 from 
http://mirrors.koehn.com/apache//spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz:
 Error writing to server
00:53:25.812 WARN org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite: 
Failed to download Spark 2.3.2 from 
http://mirror.cc.columbia.edu/pub/software/apache//spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz:
 Error writing to server
00:53:25.838 WARN org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite: 
Failed to download Spark 2.3.2 from 
https://archive.apache.org/dist/spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz:
 Socket closed

org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite *** ABORTED ***
Exception encountered when invoking run on a nested suite - Unable to download 
Spark 2.3.2 (HiveExternalCatalogVersionsSuite.scala:97)

And then it stopped. I checked this morning and the archive link appears to be valid, so the failures were likely transient. I'll see if I can retry and resume the run from that point.
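For the retry, something like the sketch below is what I have in mind: walk the same mirror list the suite uses and send an HTTP Range header to continue a partially written tarball instead of starting over. This is just my own workaround sketch, not what HiveExternalCatalogVersionsSuite actually does (it re-downloads from scratch), and it assumes the mirrors honor Range requests.

```python
# Sketch: resumable download with mirror fallback for the Spark 2.3.2
# tarball that HiveExternalCatalogVersionsSuite needs. Assumption: the
# servers honor HTTP Range requests; if one ignores Range and returns
# the full body, the appended file would be corrupt, so treat this as a
# best-effort workaround only.
import os
import urllib.request

MIRRORS = [
    "http://mirrors.koehn.com/apache/spark",
    "http://mirror.cc.columbia.edu/pub/software/apache/spark",
    "https://archive.apache.org/dist/spark",  # archive as the last resort
]

def range_header(dest):
    """Build the Range header to resume a partial file, or {} for a fresh start."""
    if os.path.exists(dest) and os.path.getsize(dest) > 0:
        return {"Range": "bytes=%d-" % os.path.getsize(dest)}
    return {}

def fetch(version, dest):
    """Try each mirror in turn, appending to `dest` if it is partially written."""
    name = "spark-%s-bin-hadoop2.7.tgz" % version
    for base in MIRRORS:
        url = "%s/spark-%s/%s" % (base, version, name)
        req = urllib.request.Request(url, headers=range_header(dest))
        try:
            with urllib.request.urlopen(req, timeout=60) as resp, \
                 open(dest, "ab") as out:
                while True:
                    chunk = resp.read(1 << 20)
                    if not chunk:
                        return True
                    out.write(chunk)
        except OSError:
            continue  # "Error writing to server" / "Socket closed": next mirror
    return False
```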


________________________________
From: Takeshi Yamamuro <linguin....@gmail.com>
Sent: Sunday, January 20, 2019 6:45 PM
To: Sean Owen
Cc: Spark dev list
Subject: Re: [VOTE] Release Apache Spark 2.3.3 (RC1)

Oh, sorry about that; I misunderstood the Apache release policy.
Yes, it's fine to keep the RC1 vote open.

Best,
Takeshi

On Mon, Jan 21, 2019 at 11:07 AM Sean Owen 
<sro...@gmail.com> wrote:
OK, if it passes tests, I'm +1 on the release.
Can anyone else verify the tests pass?

What is the reason for a new RC? I didn't see any other issues reported.

On Sun, Jan 20, 2019 at 8:03 PM Takeshi Yamamuro 
<linguin....@gmail.com> wrote:
>
> Hi, all
>
> Thanks for the checks, Sean and Felix.
> I'll start the next vote as RC2 this Tuesday noon (PST).
>
> > Sean
> I re-ran JavaTfIdfSuite on my env and it passed.
> I used `-Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver -Pmesos -Psparkr` and
> ran the tests on the EC2 instance below (I launched a new instance for the
> tests):
> ----
> $ cat /etc/os-release
> NAME="Amazon Linux"
> VERSION="2"
> ID="amzn"
> ID_LIKE="centos rhel fedora"
> VERSION_ID="2"
> PRETTY_NAME="Amazon Linux 2"
> ANSI_COLOR="0;33"
> CPE_NAME="cpe:2.3:o:amazon:amazon_linux:2"
> HOME_URL="https://amazonlinux.com/"
> $ java -version
> openjdk version "1.8.0_191"
> OpenJDK Runtime Environment (build 1.8.0_191-b12)
> OpenJDK 64-Bit Server VM (build 25.191-b12, mixed mode)
>
>
>
>
> On Mon, Jan 21, 2019 at 9:53 AM Felix Cheung 
> <felixcheun...@hotmail.com> wrote:
>>
>> +1
>>
>> My focus is on R (sorry, I couldn't cross-validate what Sean is seeing)
>>
>> tested:
>> reviewed doc
>> R package test
>> win-builder, r-hub
>> Tarball/package signature
>>
>>
>>
>> ________________________________
>> From: Takeshi Yamamuro <linguin....@gmail.com>
>> Sent: Thursday, January 17, 2019 6:49 PM
>> To: Spark dev list
>> Subject: [VOTE] Release Apache Spark 2.3.3 (RC1)
>>
>> Please vote on releasing the following candidate as Apache Spark version 
>> 2.3.3.
>>
>> The vote is open until January 20 8:00PM (PST) and passes if a majority of
>> +1 PMC votes are cast, with
>> a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 2.3.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.3.3-rc1 (commit 
>> b5ea9330e3072e99841270b10dc1d2248127064b):
>> https://github.com/apache/spark/tree/v2.3.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1297
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.3.3-rc1-docs/
>>
>> The list of bug fixes going into 2.3.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12343759
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. In Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 2.3.3?
>> ===========================================
>>
>> The current list of open tickets targeted at 2.3.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> Version/s" = 2.3.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> --
>> ---
>> Takeshi Yamamuro
>
>
>
> --
> ---
> Takeshi Yamamuro


--
---
Takeshi Yamamuro
