Re: Error submitting Spark Job in yarn-cluster mode on EMR

2018-05-08 Thread Marco Mistroni
Did you by any chance leave a sparkSession.setMaster("local") lurking in your code? Last time I checked, to run on YARN you have to package a 'fat jar'. Could you make sure the Spark dependencies in your jar match the version you are running on YARN? Alternatively, please share code including
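
A minimal sketch of the pitfall Marco describes (app name and jar name are hypothetical, not from the thread): a master hardcoded in code overrides the --master yarn flag passed to spark-submit, so the job silently runs locally.

    import org.apache.spark.sql.SparkSession

    object MyJob {
      def main(args: Array[String]): Unit = {
        // Problematic: a hardcoded master wins over the --master flag,
        // so the job runs locally even when submitted to YARN.
        // val spark = SparkSession.builder().appName("MyJob").master("local[*]").getOrCreate()

        // Preferred: leave the master unset in code and supply it at submit time:
        //   spark-submit --master yarn --deploy-mode cluster my-fat-assembly.jar
        val spark = SparkSession.builder().appName("MyJob").getOrCreate()

        // ... job logic ...
        spark.stop()
      }
    }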

Error submitting Spark Job in yarn-cluster mode on EMR

2018-05-08 Thread SparkUser6
I have a simple program that works fine in local mode, but I am having issues when I try to run it in yarn-cluster mode. I know a NoSuchMethodError usually happens when the compile-time and runtime versions mismatch, but I made sure I used the same version. 205 [main] INFO
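
One common way to rule out the version mismatch mentioned above (a sketch assuming an sbt build; the versions are placeholders to be aligned with the EMR cluster's Spark): mark the Spark artifacts as "provided" so the fat jar does not bundle a second, conflicting copy of Spark.

    // build.sbt -- versions are illustrative; match them to the cluster.
    name := "simple-program"
    scalaVersion := "2.11.12"

    // "provided" keeps Spark classes out of the assembly jar, so the job
    // runs against the cluster's own Spark instead of a bundled copy.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.3.0" % "provided"
    )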

Re: Error in Spark job

2016-07-12 Thread Yash Sharma
Looks like the write to Aerospike is taking too long. Could you try writing the rdd directly to the filesystem, skipping the Aerospike write? foreachPartition at WriteToAerospike.java:47, took 338.345827 s - Thanks, via mobile, excuse brevity. On Jul 12, 2016 8:08 PM, "Saurav Sinha"
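
A sketch of the isolation test Yash suggests (input and output paths are hypothetical): swap the Aerospike foreachPartition for a plain filesystem write; if that finishes quickly, the time is going into the Aerospike client rather than the upstream computation.

    import org.apache.spark.{SparkConf, SparkContext}

    object IsolateWrite {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("IsolateWrite"))
        val rdd = sc.textFile("hdfs:///tmp/input")  // hypothetical input

        // Replace the foreachPartition-based Aerospike write with a plain
        // filesystem write to time the upstream stages on their own.
        rdd.saveAsTextFile("hdfs:///tmp/aerospike-debug-output")

        sc.stop()
      }
    }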

Error in Spark job

2016-07-12 Thread Saurav Sinha
Hi, I am running into an issue where the job is running with a large number of partitions, around 21,000 parts. Settings: driver memory = 5G, executor memory = 10G, total executor cores = 32. It is failing when I try to write to Aerospike; earlier it was working fine. I suspect the number of partitions is the reason.
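
If the ~21,000 partitions are indeed the culprit, one common mitigation (a sketch; the input path is hypothetical and the target partition count is a placeholder to tune): coalesce before the external write so each task carries a sensible amount of data and far fewer client connections are opened.

    import org.apache.spark.{SparkConf, SparkContext}

    object CompactedWrite {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("CompactedWrite"))
        val rdd = sc.textFile("hdfs:///tmp/input")  // hypothetical input

        // Collapse ~21,000 partitions to something near the total core count
        // (32 here, matching the 32 executor cores mentioned above).
        // coalesce avoids a full shuffle; use repartition(n) if data is skewed.
        val compacted = rdd.coalesce(32)

        compacted.foreachPartition { records =>
          // open one client per partition, write its records, close the client
          records.foreach(_ => ())  // placeholder for the Aerospike write
        }
        sc.stop()
      }
    }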

[Error]Run Spark job as hdfs user from oozie workflow

2016-03-09 Thread Divya Gehlot
Hi, I have a non-secure Hadoop 2.7.2 cluster on EC2 with Spark 1.5.2. I am submitting my Spark Scala script through a shell script using an Oozie workflow. I submit the job as the hdfs user, but it runs as user = "yarn", so all the output gets stored under the user/yarn directory only. When

Re: How to catch error during Spark job?

2015-11-02 Thread Akhil Das
… wrote: > Hello, > > I had a question about error handling in a Spark job: if an exception occurs > during the job, what is the best way to get notification of the failure? > Can Spark jobs return with different exit codes? > > For example, I wrote a dummy Spar

How to catch error during Spark job?

2015-10-27 Thread Isabelle Phan
Hello, I had a question about error handling in a Spark job: if an exception occurs during the job, what is the best way to get notification of the failure? Can Spark jobs return with different exit codes? For example, I wrote a dummy Spark job that just throws an Exception, as follows: import
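
A minimal sketch of one way to do what Isabelle asks (this is not her original code, which is truncated above): catch the failure in the driver and translate it into a nonzero exit code that the submitting shell script, or spark-submit in client mode, can inspect.

    import org.apache.spark.{SparkConf, SparkContext}

    object DummyFailingJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("DummyFailingJob"))
        try {
          // Dummy action that always fails, standing in for real job logic;
          // the executor-side exception surfaces here as a SparkException.
          sc.parallelize(1 to 10).foreach(_ => throw new RuntimeException("boom"))
          sc.stop()
        } catch {
          case e: Exception =>
            // Log, optionally notify, then exit nonzero so the caller sees failure.
            System.err.println(s"Job failed: ${e.getMessage}")
            sc.stop()
            sys.exit(1)
        }
      }
    }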