I tested with Maven and `-Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive
-Phive-thriftserver` on CentOS/JDK8.
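
For reference, the invocation was roughly along these lines (a sketch; the
exact flags beyond the profile list, such as skipping tests on the package
pass, are illustrative):

  ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver \
    -DskipTests clean package
  ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver test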

The difference seems to be `-Pmesos -Psparkr` on your side and `-Pkinesis-asl`
on mine.

Do you think that's related? BTW, at least we have green balls on Jenkins:

https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-2.2-test-maven-hadoop-2.7/591/


On Wed, Jan 9, 2019 at 3:37 PM Sean Owen <sro...@apache.org> wrote:

> BTW, I wonder whether you ran with the same profiles; I generally test with
> -Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver -Pmesos -Psparkr
>
> I am checking mostly because that weird error would not happen at all
> without testing hive-thriftserver.
>
> The others are probably just flakiness or something else odd, and I'd
> look past them if others are not seeing them.
>
> The licenses and signatures looked fine, and it built correctly, at least.
>
> On Wed, Jan 9, 2019 at 5:09 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
> >
> > Hi, Sean.
> >
> > That looks strange. I didn't hit them. I'm not sure, but it looks like
> > some flakiness from the 2.2.x era. For me, those tests pass. (I ran them
> > twice from the source tar file: before starting the vote and during this
> > vote.)
> >
> > Bests,
> > Dongjoon
> >
> > On Wed, Jan 9, 2019 at 1:42 PM Sean Owen <sro...@apache.org> wrote:
> >>
> >> I wonder if anyone else is seeing the following issues, or whether
> >> it's specific to my environment:
> >>
> >> With -Phive-thriftserver, it compiles fine. However during tests, I get
> >> ...
> >> [error] /home/ubuntu/spark-2.2.3/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:64: error: package org.eclipse.jetty.server does not exist
> >> [error]   protected org.eclipse.jetty.server.Server httpServer;
> >> [error]                                     ^
> >>
> >> That's weird. I'd have to dig into the POM to see if this dependency
> >> for some reason would not be available at test time. But does this
> >> profile pass for anyone else?
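> >>
> >> One way to check whether jetty-server ends up on that module's classpath
> >> would be something like the following (a sketch; it assumes the upstream
> >> modules have already been built and installed locally):
> >>
> >>   ./build/mvn -Phive -Phive-thriftserver dependency:tree \
> >>     -pl sql/hive-thriftserver | grep -i jetty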
> >>
> >> I'm also seeing test failures like the following. Yes, there are more;
> >> I'm just asking whether anyone else sees these:
> >>
> >> - event ordering *** FAILED ***
> >>   The code passed to failAfter did not complete within 10 seconds.
> >> (StreamingQueryListenerSuite.scala:411)
> >>
> >> - HDFSMetadataLog: metadata directory collision *** FAILED ***
> >>   The await method on Waiter timed out. (HDFSMetadataLogSuite.scala:201)
> >>
> >> - recovery *** FAILED ***
> >>   == Results ==
> >>   !== Correct Answer - 1 ==   == Spark Answer - 0 ==
> >>   !struct<_1:int,_2:int>      struct<>
> >>   ![10,5]
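> >>
> >> If anyone wants to re-run one of these in isolation, something along
> >> these lines should work (a sketch; it assumes a prior full build/install
> >> with the same profiles):
> >>
> >>   ./build/mvn -Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver -pl sql/core \
> >>     -Dtest=none \
> >>     -DwildcardSuites=org.apache.spark.sql.streaming.StreamingQueryListenerSuite test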
> >>
> >>
> >>
> >> On Tue, Jan 8, 2019 at 1:14 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
> >> >
> >> > Please vote on releasing the following candidate as Apache Spark version 2.2.3.
> >> >
> >> > The vote is open until January 11 at 11:30 AM (PST) and passes if a
> >> > majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >> >
> >> > [ ] +1 Release this package as Apache Spark 2.2.3
> >> > [ ] -1 Do not release this package because ...
> >> >
> >> > To learn more about Apache Spark, please see http://spark.apache.org/
> >> >
> >> > The tag to be voted on is v2.2.3-rc1 (commit 4acb6ba37b94b90aac445e6546426145a5f9eba2):
> >> > https://github.com/apache/spark/tree/v2.2.3-rc1
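> >> >
> >> > For example, the tag can be checked against that commit roughly as
> >> > follows (a sketch; the clone location is up to you):
> >> >
> >> >   git clone https://github.com/apache/spark.git && cd spark
> >> >   git checkout v2.2.3-rc1
> >> >   git rev-parse HEAD   # expect 4acb6ba37b94b90aac445e6546426145a5f9eba2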
> >> >
> >> > The release files, including signatures, digests, etc. can be found at:
> >> > https://dist.apache.org/repos/dist/dev/spark/v2.2.3-rc1-bin/
> >> >
> >> > Signatures used for Spark RCs can be found in this file:
> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
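> >> >
> >> > For example, a signature can be verified roughly like this (a sketch;
> >> > the artifact name is illustrative):
> >> >
> >> >   curl -O https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> >   gpg --import KEYS
> >> >   # after fetching an artifact and its .asc from the -bin/ directory above
> >> >   gpg --verify spark-2.2.3-bin-hadoop2.7.tgz.asc spark-2.2.3-bin-hadoop2.7.tgz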
> >> >
> >> > The staging repository for this release can be found at:
> >> > https://repository.apache.org/content/repositories/orgapachespark-1295
> >> >
> >> > The documentation corresponding to this release can be found at:
> >> > https://dist.apache.org/repos/dist/dev/spark/v2.2.3-rc1-docs/
> >> >
> >> > The list of bug fixes going into 2.2.3 can be found at the following URL:
> >> > https://issues.apache.org/jira/projects/SPARK/versions/12343560
> >> >
> >> > FAQ
> >> >
> >> > =========================
> >> > How can I help test this release?
> >> > =========================
> >> >
> >> > If you are a Spark user, you can help us test this release by taking
> >> > an existing Spark workload, running it on this release candidate, and
> >> > reporting any regressions.
> >> >
> >> > If you're working in PySpark, you can set up a virtual env, install
> >> > the current RC, and see if anything important breaks. In the Java/Scala
> >> > world, you can add the staging repository to your project's resolvers
> >> > and test with the RC (make sure to clean up the artifact cache before
> >> > and after so you don't end up building with an out-of-date RC going
> >> > forward).
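> >> >
> >> > For example, a minimal PySpark smoke test could look roughly like this
> >> > (a sketch; the tarball name under the -bin/ directory is illustrative,
> >> > and a local JDK is assumed):
> >> >
> >> >   python -m venv rc-test && source rc-test/bin/activate
> >> >   pip install \
> >> >     https://dist.apache.org/repos/dist/dev/spark/v2.2.3-rc1-bin/pyspark-2.2.3.tar.gz
> >> >   python -c "from pyspark.sql import SparkSession; print(SparkSession.builder.master('local[2]').getOrCreate().range(10).count())"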
> >> >
> >> > ===========================================
> >> > What should happen to JIRA tickets still targeting 2.2.3?
> >> > ===========================================
> >> >
> >> > The current list of open tickets targeted at 2.2.3 can be found by
> >> > searching https://issues.apache.org/jira/projects/SPARK for
> >> > "Target Version/s" = 2.2.3
> >> >
> >> > Committers should look at those and triage. Extremely important bug
> >> > fixes, documentation, and API tweaks that impact compatibility should
> >> > be worked on immediately. Please retarget everything else to an
> >> > appropriate release.
> >> >
> >> > ==================
> >> > But my bug isn't fixed?
> >> > ==================
> >> >
> >> > In order to make timely releases, we will typically not hold the
> >> > release unless the bug in question is a regression from the previous
> >> > release. That being said, if there is a regression that has not been
> >> > correctly targeted, please ping me or a committer to help target the
> >> > issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>
