Could it be that you have an older version of JavaSparkContext (i.e., from an
older version of Spark) on your path? Please check that two versions of Spark
haven't accidentally ended up on the classpath that Eclipse is using. That
would not cause import errors (the imported packages and classes are still
found), but it would cause errors like this one, because the compiler may
unfortunately be resolving JavaSparkContext to an older version of the class
on the classpath.
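One way to confirm which jar JavaSparkContext is actually being resolved from
is to print the class's code source. This is just a sketch; the class name
WhichSpark is made up for illustration:

  import org.apache.spark.api.java.JavaSparkContext;

  // Prints the location of the jar that JavaSparkContext was loaded from.
  public class WhichSpark {
    public static void main(String[] args) {
      // getCodeSource() can be null for bootstrap classes, but for a jar on the
      // application classpath it normally points at the jar file on disk.
      System.out.println(
          JavaSparkContext.class.getProtectionDomain().getCodeSource().getLocation());
    }
  }

If the printed path points at an older Spark jar, removing that jar from the
Eclipse build path should make the compile error go away.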

TD


On Wed, Jan 15, 2014 at 4:14 PM, arjun biswas <arjunbiswas....@gmail.com> wrote:

> Hello All,
>
> I have installed Spark on my machine and was successful in running sbt/sbt
> package as well as sbt/sbt assembly. I am trying to run the Java examples
> from Eclipse, to be precise the JavaLogQuery example. The issue is that I am
> unable to resolve a compilation problem: *jarOfClass is not available inside
> the JavaSparkContext*. I am using Spark 0.8.1-incubating, which is the
> latest version, with Scala 2.9.3. I have added all the jars to the
> classpath, to the point that I do not get any import errors. However,
> JavaSparkContext.jarOfClass gives the above error, saying the jarOfClass
> method is unavailable in JavaSparkContext. Has anyone tried to run the Java
> example programs from Eclipse? Please note that this is a compile-time
> error in Eclipse.
>
> Regards
> Arjun
>
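For reference, here is roughly how the Spark Java examples (such as
JavaLogQuery) use jarOfClass, assuming a Spark version in which
JavaSparkContext.jarOfClass is available; the "local" master URL and the class
name JarOfClassCheck are placeholders for illustration, not taken from the
example:

  import java.util.Arrays;
  import org.apache.spark.api.java.JavaSparkContext;

  public class JarOfClassCheck {
    public static void main(String[] args) {
      // jarOfClass returns the jar(s) containing the given class, so the
      // driver can ship the application jar to the workers.
      String[] jars = JavaSparkContext.jarOfClass(JarOfClassCheck.class);
      System.out.println("jars: " + Arrays.toString(jars));

      // Constructor taking master, app name, Spark home, and jars to ship.
      JavaSparkContext sc = new JavaSparkContext(
          "local", "JarOfClassCheck", System.getenv("SPARK_HOME"), jars);
      sc.stop();
    }
  }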
