You should have just tried it and let us know what your experience had
been! Anyway, after spending long hours on this problem I realized this is
actually a ClassLoader problem.
If you use spark-submit this exception should go away, but you haven't told
us how you are submitting the job such that you run into it.
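For example, assuming a Maven build (the jar name below is just a
placeholder, not something from your project):

  mvn package
  /path/to/spark-2.0.1-bin-hadoop2.7/bin/spark-submit \
    --class SparkTest \
    --master spark://192.168.10.174:7077 \
    target/your-app.jar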
Hi,
I am currently running this code from my IDE (Eclipse). I tried adding the
scope "provided" to the dependency, without any effect. Should I build this
and submit it using the spark-submit command?
Thanks
Vaibhav
On 11 October 2016 at 04:36, Jakob Odersky wrote:
Just thought of another potential issue: you should use the "provided"
scope when depending on Spark, i.e. in your project's pom:
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.1</version>
    <scope>provided</scope>
  </dependency>
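(With "provided" scope, Maven still compiles against Spark but leaves it
out of your packaged jar, so at runtime the application uses the Spark jars
already installed on the cluster rather than bundling a second, possibly
mismatched copy.)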
On Mon, Oct 10, 2016 at 2:00 PM, Jakob Odersky wrote:
How do you submit the application? A version mismatch between the launcher,
driver and workers could lead to the bug you're seeing. A common reason for
a mismatch is if the SPARK_HOME environment variable is set. This will
cause the spark-submit script to use the launcher determined by that
environment variable.
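For what it's worth, a quick way to check, assuming a Unix shell:

  echo $SPARK_HOME
  # if this points at a different Spark version than the cluster (2.0.1 here),
  # unset it so spark-submit uses the distribution it ships with:
  unset SPARK_HOME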
+1. Woohoo, I have the same problem. I have been trying hard to fix this.
On Mon, Oct 10, 2016 at 3:23 AM, vaibhav thapliyal <vaibhav.thapliyal...@gmail.com>
wrote:
Hi,
If I change the parameter inside setMaster() to "local", the program
runs. Is there something wrong with the cluster installation?
I used the spark-2.0.1-bin-hadoop2.7.tgz package to install on my cluster
with default configuration.
Thanks
Vaibhav
On 10 Oct 2016 12:49, "vaibhav thapliyal" wrote:
Here is the code that I am using:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkTest {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("spark://192.168.10.174:7077").setAppName("TestSpark");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD textFile // (the original message is cut off here)
    }
}