I've tried that and the same error occurs. Do you have any other suggestions?

Thanks!
Rahul

From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Wednesday, December 3, 2014 at 3:55 AM
To: Rahul Swaminathan <rahul.swaminat...@duke.edu>
Cc: "u...@spark.incubator.apache.org" <u...@spark.incubator.apache.org>
Subject: Re: WordCount fails in .textFile() method
Resent-From: <rahul.swaminat...@duke.edu>

Try running it in local mode. It looks like a jar conflict or a missing jar.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("JavaWordCount");
conf.set("spark.io.compression.codec", "org.apache.spark.io.LZ4CompressionCodec");
conf.setMaster("local[2]").setSparkHome(System.getenv("SPARK_HOME"));
JavaSparkContext jsc = new JavaSparkContext(conf);
jsc.addJar("myJar.jar");
new JavaWordCount(jsc).doJob();
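If local mode still fails, that NoSuchMethodError on Guava's hashInt() usually points to an older Guava release that predates hashInt (Hadoop's Guava 11, for example) shadowing the newer Guava that Spark was built against. As a quick check, here is a minimal sketch (the GuavaCheck class name is just for illustration) that prints which jar is actually supplying Guava at runtime:

import com.google.common.hash.HashFunction;

// Minimal classpath check: print the jar that provides Guava's HashFunction.
// If this shows an older Guava (e.g. 11.x pulled in by a Hadoop dependency)
// rather than the version Spark expects, that is the conflict to resolve.
public class GuavaCheck {
    public static void main(String[] args) {
        java.net.URL source = HashFunction.class
                .getProtectionDomain().getCodeSource().getLocation();
        System.out.println("Guava loaded from: " + source);
    }
}

If that prints an older Guava jar, excluding or upgrading that dependency in your build should make the conflict go away.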

Thanks
Best Regards

On Wed, Dec 3, 2014 at 2:49 AM, Rahul Swaminathan <rahul.swaminat...@duke.edu> wrote:
Hi,

I am trying to run JavaWordCount without using the spark-submit script. I have 
copied the source code for JavaWordCount and am creating a JavaSparkContext with 
the following:

SparkConf conf = new SparkConf().setAppName("JavaWordCount");
conf.set("spark.io.compression.codec","org.apache.spark.io.LZ4CompressionCodec");
conf.setMaster("spark://127.0.0.1:7077<http://127.0.0.1:7077>").setSparkHome(System.getenv("SPARK_HOME"));
JavaSparkContext jsc = new JavaSparkContext(conf);
jsc.addJar("myJar.jar");
new JavaWordCount(jsc).doJob();

I am getting the following error in the .textFile() method:


Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;

at org.apache.spark.util.collection.OpenHashSet.org$apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)


What can I do to solve this issue? It works fine when run from the command line 
with the spark-submit script.


Thanks,

Rahul

