I am having issues trying to run a test job on a built version of Spark with a 
custom Hadoop JAR. My custom Hadoop version runs without issues, and I can run 
jobs from a precompiled version of Spark (with Hadoop) with no problem. 
However, whenever I try to run the same Spark example on the Spark version 
built with my custom Hadoop JAR, I get this error:

    Exception in thread "main" java.lang.RuntimeException:
    java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.Hdfs not found
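For context, this is roughly how I am launching the job. The paths, master 
URL, and example JAR name are placeholders for my local layout, and the 
SPARK_DIST_CLASSPATH export is how I point a "Hadoop free" Spark build at 
the custom Hadoop jars:

```shell
# Point the Spark build at the custom Hadoop installation's jars.
# /opt/hadoop is a placeholder for my local Hadoop install.
export SPARK_DIST_CLASSPATH=$(/opt/hadoop/bin/hadoop classpath)

# Run the bundled SparkPi example (JAR name/version is a placeholder).
/opt/spark/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  /opt/spark/examples/jars/spark-examples_*.jar 10
```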
Does anybody know why this is happening?
Thanks,
Charles.