This is https://issues.apache.org/jira/browse/SPARK-20201.
On Fri, Nov 17, 2017 at 8:51 AM, Felix Cheung <felixche...@apache.org> wrote:
> I wasn't able to test this out.
>
> Is anyone else seeing this error? I see a few JVM fixes getting
> back-ported; are they related to this?
>
> This issue seems important enough to hold any update until we know more.
>
> On Wed, Nov 15, 2017 at 7:01 PM Sean Owen <so...@cloudera.com> wrote:
>>
>> The signature is fine, with your new sig. Updated hashes look fine too.
>> LICENSE is still fine to my knowledge.
>>
>> Is anyone else seeing this failure?
>>
>> - GenerateOrdering with ShortType
>> *** RUN ABORTED ***
>>   java.lang.StackOverflowError:
>>   at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:370)
>>   at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
>>   at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
>>   ... (same frame repeated)
>>
>> This looks like SPARK-16845 again; see
>> https://issues.apache.org/jira/browse/SPARK-16845?focusedCommentId=16018840&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16018840
>>
>> On Wed, Nov 15, 2017 at 12:25 AM Felix Cheung <felixche...@apache.org>
>> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.2.1. The vote is open until Monday, November 20, 2017 at 23:00 UTC and
>> passes if a majority of at least 3 PMC +1 votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.2.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.2.1-rc1
>> https://github.com/apache/spark/tree/v2.2.1-rc1
>> (41116ab7fca46db7255b01e8727e2e5d571a3e35)
>>
>> The list of JIRA tickets resolved in this release can be found here:
>> https://issues.apache.org/jira/projects/SPARK/versions/12340470
>>
>> The release files, including signatures, digests, etc., can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1256/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc1-docs/_site/index.html
>>
>> FAQ
>>
>> How can I help test this release?
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload, running it on this release candidate, and
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtualenv, install the
>> current RC, and see if anything important breaks. In Java/Scala, you can
>> add the staging repository to your project's resolvers and test with the
>> RC (make sure to clean up the artifact cache before and after so you
>> don't end up building with an out-of-date RC going forward).
>>
>> What should happen to JIRA tickets still targeting 2.2.1?
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else, please retarget to 2.2.2.
>>
>> But my bug isn't fixed!??!
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.2.0. That being said,
>> if there is something that is a regression from 2.2.0 that has not been
>> correctly targeted, please ping a committer to help target the issue
>> (you can see the open issues listed as impacting Spark 2.2.1 / 2.2.2
>> here).
>>
>> What are the unresolved issues targeted for 2.2.1?
>>
>> At the time of writing, there is one resolved issue, SPARK-22471, that
>> would help stability, and one in progress on joins, SPARK-22042.
>
-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
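
[Editor's note: the FAQ's Java/Scala testing step, "add the staging repository to your project's resolvers," could be sketched roughly as below in an sbt build. The staging URL is the one from the vote email; the `spark-sql` module is only an illustrative choice, and the assumption here is that staged artifacts carry the final `2.2.1` version string rather than an `-rc1` suffix — check the staged repository for the actual coordinates.]

```scala
// build.sbt — sketch: resolve the Spark 2.2.1 RC1 from the staging repository.
// Module choice (spark-sql) is illustrative; use whichever Spark modules
// your project actually depends on.
resolvers += "Spark 2.2.1 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1256/"

// Staged artifacts are assumed to use the final version number, not "-rc1".
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
```

As the FAQ notes, clear the local artifact cache (e.g. `~/.ivy2/cache/org.apache.spark`) before and after testing, so a later build does not silently pick up the RC artifacts.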