Was a solution ever found for this? I'm trying to run some test cases with sbt
test which use Spark SQL, and with the Spark 1.3.0 release and Scala 2.11.6 I
get this error. Setting fork := true in sbt seems to work, but it's a
less-than-ideal workaround.
On Tue, Mar 17, 2015 at 9:37 PM, Eric Charles wrote:
Launching from Eclipse (Scala IDE) as a Scala process gives this error,
but launching as a Java process (a Java main class) works fine.
Launching as a Scala process from IntelliJ works fine.
There is something wrong on the Eclipse side, not in Spark.
On 03/13/2015 11:47 AM, Jianshi Huang wrote:
Liancheng also found out that the Spark jars are not included in the
classpath of the URLClassLoader.
Hmm... we're very close to the truth now.
Jianshi
On Fri, Mar 13, 2015 at 6:03 PM, Jianshi Huang wrote:
I'm almost certain the problem is the ClassLoader.
So adding
fork := true
solves problems for test and run.
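For reference, that is a single line in build.sbt (assuming sbt 0.13 syntax);
unscoped, it applies to both run and test:

    // run `test` and `run` in a forked JVM, where Spark's classes are loaded
    // by an ordinary application classloader rather than sbt's layered loaders
    fork := true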
The problem is: how can I fork a JVM for sbt console? fork in console :=
true doesn't seem to work...
Jianshi
On Fri, Mar 13, 2015 at 4:35 PM, Jianshi Huang wrote:
I guess it's a ClassLoader issue. But I have no idea how to debug it. Any
hints?
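A quick first check (a hypothetical snippet, nothing beyond the JDK and the
Spark jars assumed) is to print which loaders are in play and compare them:

    // compare the loader that loaded Spark's classes with the context and
    // application classloaders; under sbt or Eclipse these often differ
    println(Class.forName("org.apache.spark.sql.catalyst.ScalaReflection").getClassLoader)
    println(Thread.currentThread().getContextClassLoader)
    println(getClass.getClassLoader)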
Jianshi
On Fri, Mar 13, 2015 at 3:00 PM, Eric Charles wrote:
I have the same issue running Spark SQL code from the Eclipse workspace. If
you run your code from the command line (with a packaged jar) or from
IntelliJ, I bet it should work.
IMHO this is somehow related to the Eclipse environment, but I would love to
know how to fix it (whether via Eclipse configuration or via a patch).
Forget about my last message. I was confused. Spark 1.2.1 + Scala 2.10.4
started by the SBT console command also failed with this error. However,
running from a standard spark-shell works.
Jianshi
On Fri, Mar 13, 2015 at 2:46 PM, Jianshi Huang wrote:
Hmm... it looks like the console command still starts Spark 1.3.0 with Scala
2.11.6 even though I changed them in build.sbt.
So the test with 1.2.1 is not valid.
Jianshi
On Fri, Mar 13, 2015 at 2:34 PM, Jianshi Huang wrote:
I've confirmed it only fails in a console started by SBT.
I'm using the sbt-spark-package plugin, and the initialCommands looks like
this (I added an implicit sqlContext to it):
> show console::initialCommands
[info] println("Welcome to\n" +
[info] " __\n" +
[info] " / __/__ _
BTW, I was running tests from SBT when I got the errors. One test turns a
Seq of case class instances into a DataFrame.
I also tried running similar code in the console, but it failed with the same error.
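The failing test was roughly of this shape (a hypothetical minimal
reproduction; Person and the values are made up, and toDF() is where
ScalaReflection kicks in):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    case class Person(name: String, age: Int)

    object Repro {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setMaster("local[*]").setAppName("repro"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._
        // toDF() uses ScalaReflection to derive Person's schema; under
        // sbt's classloader this is where the exception surfaces
        val df = Seq(Person("a", 1), Person("b", 2)).toDF()
        df.show()
        sc.stop()
      }
    }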
I tested both Spark 1.3.0-rc2 and 1.2.1, with Scala 2.11.6 and 2.10.4.
Any idea?
Jianshi
On Fri, Mar 13, 2015 at 2:23
Same issue here. But the classloader in my exception is somehow different.
scala.ScalaReflectionException: class
org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with
java.net.URLClassLoader@53298398 of type class java.net.URLClassLoader with
classpath
Jianshi
On Sun, Mar 1, 2015 at
I think it's possible that the problem is that the Scala compiler is not
being loaded by the primordial classloader (but instead by some child
classloader), and thus the Scala reflection mirror fails to initialize
when it can't find it. Unfortunately, the only solution that I know of is
to load
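To make the mechanism concrete (a minimal sketch, not from the thread): Scala
reflection resolves classes through the classloader its mirror was built from,
so a type living only in a child loader is invisible to a mirror rooted higher up.

    import scala.reflect.runtime.universe

    object MirrorCheck extends App {
      // a JavaMirror is tied to the classloader it was created from; types
      // that loader cannot see cannot be reflected
      val mirror = universe.runtimeMirror(getClass.getClassLoader)
      // this throws scala.ScalaReflectionException -- the error in this
      // thread -- when the loader cannot resolve the class
      val sym = mirror.staticClass("org.apache.spark.sql.catalyst.ScalaReflection")
      println(sym.fullName)
    }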
Also, can the Scala version play any role here?
I am using Scala 2.11.5, but all Spark packages depend on Scala 2.11.2.
Just wanted to make sure that the Scala version is not an issue here.
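For context, a hypothetical build.sbt excerpt of that setup (versions taken
from the message above):

    // Scala 2.11.x releases are binary compatible with one another, so
    // compiling with 2.11.5 against Spark artifacts built for 2.11.2
    // should be fine at the bytecode level
    scalaVersion := "2.11.5"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.2.1"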
On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam wrote:
Ted,
spark-catalyst_2.11-1.2.1.jar is present in the classpath. BTW, I am running
the code locally in the Eclipse workspace.
Here's the complete exception stack trace:
Exception in thread "main" scala.ScalaReflectionException: class
org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with pr
Have you verified that the spark-catalyst_2.10 jar was in the classpath?
Cheers
On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam wrote:
Hi,
I wrote a very simple program in Scala to convert an existing RDD to a
SchemaRDD.
But the createSchemaRDD function is throwing an exception:
Exception in thread "main" scala.ScalaReflectionException: class
org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial
classloader with boot
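The program was presumably of this shape (a hypothetical reconstruction,
assuming the Spark 1.2 API; Record and the values are made up):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.{SQLContext, SchemaRDD}

    case class Record(id: Int, name: String)

    object SchemaRDDExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setMaster("local[*]").setAppName("schema-rdd"))
        val sqlContext = new SQLContext(sc)
        // createSchemaRDD is an implicit RDD[A <: Product] => SchemaRDD; it
        // needs a TypeTag for Record, which is where ScalaReflection (and
        // hence this exception) comes in
        import sqlContext.createSchemaRDD
        val rdd = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))
        val schemaRDD: SchemaRDD = rdd // implicit conversion triggers reflection
        schemaRDD.registerTempTable("records")
        sc.stop()
      }
    }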