Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
> ... cluster mode.
>
> spark-submit --master yarn-cluster ...
>
> On Thu, Sep 10, 2015 at 7:50 PM, Raghavendra Pandey <raghavendra.pan...@gmail.com> wrote:
>> What is the value of spark master conf ...

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
> ... that means only one thread can run and that is why your job is stuck.
> Specify it as local[*] to make the thread pool equal to the number of cores...
>
> Raghav
>
> On Sep 11, 2015 6:06 AM, "Atul Kulkarni" wrote:
>> ...

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
> ... What is the value of spark master conf? By default it is local, that means
> only one thread can run and that is why your job is stuck. Specify it as local[*]
> to make the thread pool equal to the number of cores...
>
> Raghav
>
> On Sep 11, 2015 6:06 AM, "Atul Kulkarni" wrote: ...

Re: Spark based Kafka Producer

2015-09-11 Thread Raghavendra Pandey
> ... By default it is local, that means only one thread can run and that is why
> your job is stuck. Specify it as local[*] to make the thread pool equal to the
> number of cores...
>
> Raghav
>
> On Sep 11, 2015 6:06 AM, "Atul Kulkarni" wrote:
>> Hi Folks, ...
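
[For reference, a minimal sketch of the setting being discussed; the app name is a placeholder and the snippet is not taken from the thread's code.]

  import org.apache.spark.SparkConf

  // Sketch only: "FileToKafka" is a placeholder app name, not from the original code.
  // Plain "local" runs a single worker thread, which is why the job appears stuck;
  // "local[*]" sizes the thread pool to the number of available cores.
  val sparkConf = new SparkConf()
    .setAppName("FileToKafka")
    .setMaster("local[*]")

[A master set in code takes precedence over the spark-submit flag, so when submitting with --master yarn-cluster, as elsewhere in the thread, setMaster should be left out entirely.]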

Re: Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
> ... and that is why your job is stuck. Specify it as local[*] to make the thread
> pool equal to the number of cores...
>
> Raghav
>
> On Sep 11, 2015 6:06 AM, "Atul Kulkarni" wrote:
>> Hi Folks,
>>
>> Below is the code I have for a Spark based Kafka Producer to take advantage ...

Re: Spark based Kafka Producer

2015-09-10 Thread Raghavendra Pandey
> ... Below is the code I have for a Spark based Kafka Producer to take advantage
> of multiple executors reading files in parallel on my cluster, but I am stuck:
> the program is not making any progress.
>
> Below is my scrubbed code:
>
> val sparkConf = new SparkConf().setAppName(applicationName) ...

Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
Hi Folks,

Below is the code I have for a Spark based Kafka Producer to take advantage of
multiple executors reading files in parallel on my cluster, but I am stuck: the
program is not making any progress.

Below is my scrubbed code:

  val sparkConf = new SparkConf().setAppName(applicationName)
  val ssc ...
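
[The scrubbed code is truncated in the archive. For orientation, below is a minimal sketch of one common pattern for producing to Kafka from a Spark Streaming job, creating one producer per partition so the executors write in parallel. The input path, broker list, and topic name are hypothetical placeholders, not the code from this thread.]

  // Minimal sketch only -- input path, brokers, and topic are placeholders,
  // not the original (scrubbed) code from this thread.
  import java.util.Properties

  import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object FileToKafkaSketch {
    def main(args: Array[String]): Unit = {
      val sparkConf = new SparkConf().setAppName("FileToKafkaSketch")
      val ssc = new StreamingContext(sparkConf, Seconds(10))

      // Each micro-batch picks up files newly dropped into the monitored directory.
      val lines = ssc.textFileStream("hdfs:///path/to/input")

      lines.foreachRDD { rdd =>
        // Create the producer inside foreachPartition: KafkaProducer is not
        // serializable, and this way each executor writes its partitions in parallel.
        rdd.foreachPartition { records =>
          val props = new Properties()
          props.put("bootstrap.servers", "broker1:9092,broker2:9092")
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
          val producer = new KafkaProducer[String, String](props)
          records.foreach(line => producer.send(new ProducerRecord[String, String]("my-topic", line)))
          producer.close()
        }
      }

      ssc.start()
      ssc.awaitTermination()
    }
  }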