This will load the listed jars when the SparkContext is created.
In the case of the REPL, however, we define and import classes after the
SparkContext has been created. According to the above-mentioned site, the
Executor installs a class loader in its 'addReplClassLoaderIfNeeded' method
using the "spark.repl.class.uri" configuration.
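If I read that right, the driver only has to publish the compiled classes
over HTTP and advertise that URI before the executors start. A minimal
sketch of the wiring in Java, assuming Spark 0.9.x/1.x internals (the
master URL, host, and port below are made-up placeholders):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch only: point executors at an HTTP server that serves compiled
// .class files. In the Scala REPL this URI comes from SparkIMain's
// embedded class server; the value here is a hypothetical placeholder.
SparkConf conf = new SparkConf()
    .setAppName("repl-class-loading")
    .setMaster("spark://master:7077")                 // placeholder master
    .set("spark.repl.class.uri", "http://driver-host:51234");

// Executor.addReplClassLoaderIfNeeded sees this property and wraps its
// class loader so classes it cannot resolve are fetched from that URI.
JavaSparkContext sc = new JavaSparkContext(conf);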
Then I will try to m
> Add all your jars like this and pass them to the SparkContext:
>
> List<String> jars =
> Lists.newArrayList("/home/akhld/mobi/localcluster/x/spark-0.9.1-bin-hadoop2/assembly/target/scala-2.10/spark-assembly-0.9.1-hadoop2.2.0.jar",
> "/home/akhld/mobi/localcluster/codes/pig/build/ivy/lib/Pig/twitter4j-core-3.
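Filling in the truncated snippet, the whole thing would look roughly like
this in Java (the paths, master URL, and app name are placeholders, not
the original values):

import java.util.List;
import com.google.common.collect.Lists;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Ship the listed jars to the executors when the context is created.
List<String> jars = Lists.newArrayList(
    "/path/to/spark-assembly.jar",    // placeholder path
    "/path/to/twitter4j-core.jar");   // placeholder path

SparkConf conf = new SparkConf()
    .setAppName("jar-shipping")                       // placeholder name
    .setMaster("spark://master:7077")                 // placeholder master
    .setJars(jars.toArray(new String[0]));
JavaSparkContext sc = new JavaSparkContext(conf);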
I found a web page with a hint:
http://ardoris.wordpress.com/2014/03/30/how-spark-does-class-loading/
I learned that SparkIMain has an internal HTTP server to publish class
objects, but I can't figure out how to use it from Java.
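What I have in mind is roughly the following: run my own HTTP server that
serves .class files the way SparkIMain's embedded server does, and point
"spark.repl.class.uri" at it. A rough sketch using the JDK's built-in
server (the port and directory are made up, and the request-path layout
the executor expects is my assumption, not something I have verified):

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ClassFileServer {
    public static void main(String[] args) throws Exception {
        Path classDir = Paths.get("/tmp/generated-classes"); // made-up dir
        HttpServer server = HttpServer.create(new InetSocketAddress(51234), 0);
        server.createContext("/", exchange -> {
            // Assumption: executors request "<uri>/<package/path/Class>.class"
            Path file = classDir.resolve(
                exchange.getRequestURI().getPath().substring(1));
            if (Files.isRegularFile(file)) {
                byte[] bytes = Files.readAllBytes(file);
                exchange.sendResponseHeaders(200, bytes.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(bytes);
                }
            } else {
                exchange.sendResponseHeaders(404, -1); // no body on miss
            }
            exchange.close();
        });
        server.start(); // then set spark.repl.class.uri=http://<host>:51234
    }
}

Does that approach make sense, or is there a supported way to reuse the
REPL's own server?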
Any ideas?
Thanks,
Kevin