[ https://issues.apache.org/jira/browse/SPARK-4814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14340275#comment-14340275 ]
Sean Owen commented on SPARK-4814:
----------------------------------

General question on back-ports: at this stage, is it realistic to expect another 0.9.x release? 1.0.x? 1.1.x? That is, I'm wondering whether something marked for back-port to 0.9, 1.0, or 1.1 can simply be resolved now or not.

> Enable assertions in SBT, Maven tests / AssertionError from Hive's LazyBinaryInteger
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-4814
>                 URL: https://issues.apache.org/jira/browse/SPARK-4814
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 1.1.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>              Labels: backport-needed
>             Fix For: 1.3.0, 1.1.2, 1.2.2
>
>
> Follow-up to SPARK-4159, wherein we noticed that Java tests weren't running in Maven, in part because a Java test actually fails with {{AssertionError}}. That code/test was fixed in SPARK-4850.
> The reason it wasn't caught by SBT tests was that they don't run with assertions on, and Maven's Surefire does.
> Turning on assertions in the SBT build is trivial, adding one line:
> {code}
> javaOptions in Test += "-ea",
> {code}
> This reveals a test failure in Scala test suites though:
> {code}
> [info] - alter_merge_2 *** FAILED *** (1 second, 305 milliseconds)
> [info]   Failed to execute query using catalyst:
> [info]   Error: Job aborted due to stage failure: Task 1 in stage 551.0 failed 1 times, most recent failure: Lost task 1.0 in stage 551.0 (TID 1532, localhost): java.lang.AssertionError
> [info]   at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryInteger.init(LazyBinaryInteger.java:51)
> [info]   at org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.uncheckedGetField(ColumnarStructBase.java:110)
> [info]   at org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase.getField(ColumnarStructBase.java:171)
> [info]   at org.apache.hadoop.hive.serde2.objectinspector.ColumnarStructObjectInspector.getStructFieldData(ColumnarStructObjectInspector.java:166)
> [info]   at org.apache.spark.sql.hive.HadoopTableReader$$anonfun$fillObject$1.apply(TableReader.scala:318)
> [info]   at org.apache.spark.sql.hive.HadoopTableReader$$anonfun$fillObject$1.apply(TableReader.scala:314)
> [info]   at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> [info]   at org.apache.spark.sql.execution.Aggregate$$anonfun$execute$1$$anonfun$6.apply(Aggregate.scala:132)
> [info]   at org.apache.spark.sql.execution.Aggregate$$anonfun$execute$1$$anonfun$6.apply(Aggregate.scala:128)
> [info]   at org.apache.spark.rdd.RDD$$anonfun$13.apply(RDD.scala:615)
> [info]   at org.apache.spark.rdd.RDD$$anonfun$13.apply(RDD.scala:615)
> [info]   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> [info]   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:264)
> [info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:231)
> [info]   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> [info]   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:264)
> [info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:231)
> [info]   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
> [info]   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> [info]   at org.apache.spark.scheduler.Task.run(Task.scala:56)
> [info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:195)
> [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> [info]   at java.lang.Thread.run(Thread.java:745)
> {code}
> The items for this JIRA are therefore:
> - Enable assertions in SBT
> - Fix this failure
> - Figure out why Maven scalatest didn't trigger it - may need assertions explicitly turned on too.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
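As background to the description above: Java {{assert}} statements are compiled into the bytecode but are no-ops at runtime unless the JVM is started with {{-ea}}, which is why the same test can pass under SBT (assertions off by default) and fail under Maven's Surefire (assertions on). A minimal standalone sketch, with a hypothetical {{AssertionDemo}} class that is not part of Spark:

```java
public class AssertionDemo {

    // Returns "caught" if an always-false assert fired, "skipped" otherwise.
    static String probe() {
        try {
            assert 1 + 1 == 3 : "arithmetic is broken";
            // Without -ea the assert is a no-op, so we fall through here.
            return "skipped";
        } catch (AssertionError e) {
            // With -ea the false condition throws AssertionError -- the same
            // error class that surfaced from LazyBinaryInteger.init above.
            return "caught";
        }
    }

    public static void main(String[] args) {
        // desiredAssertionStatus() reports whether -ea applies to this class.
        System.out.println("assertions enabled: "
                + AssertionDemo.class.desiredAssertionStatus());
        System.out.println("probe: " + probe());
    }
}
```

Running this twice, once plain and once as {{java -ea AssertionDemo}}, shows the two behaviors; the SBT {{javaOptions in Test += "-ea"}} line in the description simply forks the test JVMs with that flag.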
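On the last bullet (Maven scalatest not triggering the failure): scalatest runs under its own Maven plugin rather than Surefire, so Surefire's assertion default does not apply to it. A hedged sketch of what "assertions explicitly turned on" could look like for the scalatest-maven-plugin, assuming its {{argLine}} configuration and not verified against Spark's actual {{pom.xml}}:

{code}
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <configuration>
    <!-- Forked test JVMs do not enable assertions unless told to. -->
    <argLine>-ea</argLine>
  </configuration>
</plugin>
{code}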