What Spark package are you using? In particular, which Hadoop version?
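If it isn't obvious from your build file, one quick way to find out is to print
what actually resolves on the test classpath. This is just a sketch (the class
name is made up, and it assumes hadoop-common is reachable from your test code):

import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Version string reported by whatever Hadoop jars Spark ended up with
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        // Which jar that class was actually loaded from
        System.out.println("Loaded from: " + VersionInfo.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}

Running "mvn dependency:tree" (or your build tool's equivalent) and looking for
the hadoop-client artifact that spark-core pulls in tells you the same thing
from the build side.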

On Mon, Sep 21, 2015 at 9:14 AM, ekraffmiller
<ellen.kraffmil...@gmail.com> wrote:
> Hi,
> I’m trying to run a simple test program to access Spark through Java. I’m
> using JDK 1.8 and Spark 1.5. I’m getting an Exception from the
> JavaSparkContext constructor. My initialization code matches all the sample
> code I’ve found online, so I’m not sure what I’m doing wrong.
>
> Here is my code:
>
> SparkConf conf = new SparkConf().setAppName("Simple Application");
> conf.setMaster("local");
> conf.setAppName("my app");
> JavaSparkContext sc = new JavaSparkContext(conf);
>
> The stack trace of the Exception:
>
> java.lang.ExceptionInInitializerError: null
>         at java.lang.Class.getField(Class.java:1690)
>         at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:220)
>         at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
>         at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
>         at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:189)
>         at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
>         at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
>         at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:147)
>         at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:54)
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:75)
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:173)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:345)
>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
>         at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:276)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:441)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>         at edu.harvard.iq.text.core.spark.SparkControllerTest.testMongoRDD(SparkControllerTest.java:63)
>
> Thanks,
> Ellen
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exception-initializing-JavaSparkContext-tp24755.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
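For what it's worth, the initialization you quoted looks fine; calling
setAppName twice just overwrites the name, so that part is harmless. Below is a
self-contained version of the same thing, as a sketch (the class name
SparkInitCheck is only illustrative), assuming nothing beyond a plain
spark-core 1.5 dependency on the classpath. If this minimal version fails the
same way in your test, that points at the jars on the classpath rather than the
code, which is why the Spark package and Hadoop version matter.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkInitCheck {
    public static void main(String[] args) {
        // Local master, single app name; setters chain on SparkConf
        SparkConf conf = new SparkConf()
                .setAppName("my app")
                .setMaster("local");
        // This is the constructor that throws ExceptionInInitializerError for you
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            // Tiny smoke test once the context is up
            long n = sc.parallelize(Arrays.asList(1, 2, 3)).count();
            System.out.println("count = " + n);
        } finally {
            sc.stop();  // release local resources cleanly
        }
    }
}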



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
