Try adding these jars to SPARK_CLASSPATH as well.
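
ADD_JARS ships the jar to the executors, but the shell's compiler on the driver may still not have it on its classpath; setting SPARK_CLASSPATH to the same path should let the import resolve. A minimal sketch of the launch command with both variables set (reusing the paths from your original command, untested on my end):

MASTER="spark://n001:7077" \
ADD_JARS=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar \
SPARK_CLASSPATH=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar \
SPARK_MEM="24G" ./spark-shell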

2014-02-04 Soumya Simanta <[email protected]>:

> Hi,
>
> I have a Spark cluster where I want to use classes from a 3rd-party jar in my
> shell.
>
> I'm starting my spark shell using the following command.
>
>
> MASTER="spark://n001:7077"
> ADD_JARS=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar
> SPARK_MEM="24G" ./spark-shell
>
>
> I also see the following in the logs.
>
> 14/02/04 16:09:25 INFO SparkContext: Added JAR
> /home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar at
> http://10.27.112.32:59460/jars/twitter4j-core-3.0.5.jar with timestamp
> 1391548165483
>
>
> However, when I try to import one of the classes in that jar file, I get
> the following error.
>
>
> scala> import twitter4j.Status
>
> <console>:10: error: not found: value twitter4j
>
>        import twitter4j.Status
>
>               ^
>
