[ https://issues.apache.org/jira/browse/SPARK-37135?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-37135:
------------------------------------

    Assignee:     (was: Apache Spark)

> Fix some micro-benchmarks that fail to run
> ------------------------------------------
>
>                 Key: SPARK-37135
>                 URL: https://issues.apache.org/jira/browse/SPARK-37135
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 3.3.0
>            Reporter: Yang Jie
>            Priority: Minor
>
> Two micro-benchmarks fail to run:
>  
> org.apache.spark.serializer.KryoSerializerBenchmark
> {code:java}
> Running org.apache.spark.serializer.KryoSerializerBenchmark:
> Running benchmark: Benchmark KryoPool vs old "pool of 1" implementation
> Running case: KryoPool:true
> 21/10/27 16:09:26 ERROR SparkContext: Error initializing SparkContext.
> java.lang.AssertionError: assertion failed: spark.test.home is not set!
>   at scala.Predef$.assert(Predef.scala:223)
>   at org.apache.spark.deploy.worker.Worker.<init>(Worker.scala:148)
>   at org.apache.spark.deploy.worker.Worker$.startRpcEnvAndEndpoint(Worker.scala:954)
>   at org.apache.spark.deploy.LocalSparkCluster.$anonfun$start$2(LocalSparkCluster.scala:71)
>   at org.apache.spark.deploy.LocalSparkCluster.$anonfun$start$2$adapted(LocalSparkCluster.scala:65)
>   at scala.collection.immutable.Range.foreach(Range.scala:158)
>   at org.apache.spark.deploy.LocalSparkCluster.start(LocalSparkCluster.scala:65)
>   at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2971)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:562)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:138)
>   at org.apache.spark.serializer.KryoSerializerBenchmark$.createSparkContext(KryoSerializerBenchmark.scala:86)
>   at org.apache.spark.serializer.KryoSerializerBenchmark$.sc$lzycompute$1(KryoSerializerBenchmark.scala:58)
>   at org.apache.spark.serializer.KryoSerializerBenchmark$.sc$1(KryoSerializerBenchmark.scala:58)
>   at org.apache.spark.serializer.KryoSerializerBenchmark$.$anonfun$run$3(KryoSerializerBenchmark.scala:63)
>   at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
>   at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
>   at scala.util.Success.$anonfun$map$1(Try.scala:255)
>   at scala.util.Success.map(Try.scala:213)
>   at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
>   at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
>   at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
>   at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
>   at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
>   at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
>   at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
>   at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
>   at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
>   at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
> {code}
> org.apache.spark.sql.execution.benchmark.DateTimeBenchmark
> {code:java}
> Exception in thread "main" org.apache.spark.sql.catalyst.parser.ParseException: Cannot mix year-month and day-time fields: interval 1 month 2 day(line 1, pos 38)
>
> == SQL ==
> cast(timestamp_seconds(id) as date) + interval 1 month 2 day
> --------------------------------------^^^
>
>   at org.apache.spark.sql.errors.QueryParsingErrors$.mixedIntervalUnitsError(QueryParsingErrors.scala:214)
>   at org.apache.spark.sql.catalyst.parser.AstBuilder.constructMultiUnitsIntervalLiteral(AstBuilder.scala:2435)
>   at org.apache.spark.sql.catalyst.parser.AstBuilder.$anonfun$visitInterval$1(AstBuilder.scala:2479)
>   at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:133)
>   at org.apache.spark.sql.catalyst.parser.AstBuilder.visitInterval(AstBuilder.scala:2454)
>   at org.apache.spark.sql.catalyst.parser.AstBuilder.visitInterval(AstBuilder.scala:57)
>   at org.apache.spark.sql.catalyst.parser.SqlBaseParser$IntervalContext.accept(SqlBaseParser.java:17681)
> {code}
>  
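For the DateTimeBenchmark failure, the ParseException comes from the ANSI interval types: a single interval literal may no longer combine a year-month field (`month`) with a day-time field (`day`). A hedged sketch of a rewrite (not necessarily the fix committed for this ticket) splits the mixed literal into two separate additions:

```sql
-- Rejected: one literal mixing year-month and day-time fields
--   cast(timestamp_seconds(id) as date) + interval 1 month 2 day
-- Accepted: keep the fields in separate interval literals
SELECT cast(timestamp_seconds(id) AS date) + INTERVAL 1 MONTH + INTERVAL 2 DAY
FROM range(10);
```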
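For the KryoSerializerBenchmark failure, the assertion in the trace shows the Worker requires the `spark.test.home` JVM system property. A minimal workaround sketch, assuming the benchmark JVM is launched through `build/sbt` and that `SPARK_HOME` points at a local Spark checkout (both assumptions, not taken from this ticket), is to forward the property via `SBT_OPTS`:

```shell
# Hedged sketch, not the committed fix: the Worker constructor asserts that
# the spark.test.home system property is set, so derive it from SPARK_HOME
# (assumed to point at a local Spark checkout; falls back to the current dir)
# and forward it to the benchmark JVM.
SPARK_HOME="${SPARK_HOME:-$PWD}"
export SBT_OPTS="-Dspark.test.home=$SPARK_HOME"
echo "$SBT_OPTS"
```

The same `-Dspark.test.home=...` flag can be passed to any other launcher that starts the benchmark JVM; only the property name comes from the assertion itself.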



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
