[
https://issues.apache.org/jira/browse/SPARK-17155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon updated SPARK-17155:
---------------------------------
Labels: bulk-closed (was: )
> usage of a Dataset inside a Future throws MissingRequirementError
> -----------------------------------------------------------------
>
> Key: SPARK-17155
> URL: https://issues.apache.org/jira/browse/SPARK-17155
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 1.6.1
> Reporter: Mikael Valot
> Priority: Major
> Labels: bulk-closed
>
> The following code throws an exception in the DSE (DataStax Enterprise) Spark shell, launched as:
> {code}
> dse spark --master=local[2]
> {code}
> {code:java}
> case class A(i1: Int, i2: Int)
> import scala.concurrent.ExecutionContext.Implicits.global
> import scala.concurrent.Future
> import scala.concurrent.Await
> import scala.concurrent.duration.Duration
> import sqlContext.implicits._
> import org.apache.spark.sql.functions._
> val fut = Future{ Seq(A(1, 2)).toDS() }
> Await.result(fut, Duration.Inf).show()
> {code}
> {code}
> scala.reflect.internal.MissingRequirementError: object $line8.$read not found.
>     at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
>     at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
>     at scala.reflect.internal.Mirrors$RootsBase.ensureModuleSymbol(Mirrors.scala:126)
>     at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:161)
>     at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:21)
>     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1$$typecreator1$1.apply(<console>:70)
>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231)
>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231)
>     at org.apache.spark.sql.catalyst.ScalaReflection$class.localTypeOf(ScalaReflection.scala:654)
>     at org.apache.spark.sql.catalyst.ScalaReflection$.localTypeOf(ScalaReflection.scala:30)
>     at org.apache.spark.sql.catalyst.ScalaReflection$.dataTypeFor(ScalaReflection.scala:52)
>     at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:53)
>     at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:41)
>     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:70)
>     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:70)
>     at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>     at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>     at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)
>     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> {code}
> It looks like a different ClassLoader is involved, one that cannot load my case class. However, it works fine with a tuple, and it also works fine with the standalone version of Spark:
> {code:java}
> val fut = Future{ Seq((1, 2)).toDS() }
> Await.result(fut, Duration.Inf).show()
> +---+---+
> | _1| _2|
> +---+---+
> |  1|  2|
> +---+---+
> {code}
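> A possible workaround (untested here, and assuming the root cause is that the REPL wrapper objects holding the case class are not visible to the ClassLoader on the Future's thread): derive the encoder and build the Dataset eagerly on the main REPL thread, and only run transformations and actions on it inside the Future. This sketch reuses the `A`, imports, and `sqlContext.implicits._` from the repro above:
> {code:java}
> // Encoder for A is resolved here, on the main REPL thread
> val ds = Seq(A(1, 2)).toDS()
> // The Future only operates on an already-constructed Dataset,
> // so no reflective encoder derivation happens on the pool thread
> val fut = Future { ds.filter($"i1" > 0) }
> Await.result(fut, Duration.Inf).show()
> {code}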
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)