Can someone please help, either by explaining or by pointing to documentation,
with the relationship between the number of executors needed and how to let the
concurrent jobs created by the above parameter run in parallel?
On Thu, Sep 24, 2015 at 11:56 PM, Atul Kulkarni
wrote:
> Hi Folks,
>
> I am
I am curious if there is a requirement that #Executors be >= a particular
number (a calculation based on how many repartitions happen after a union of
DStreams, etc. - I don't know, I am grasping at straws here.)
I would appreciate some help in this regard. Thanks in advance.
--
Regards,
Atul Kulkarni
really happening inside would be helpful in understanding how this works, so
that I don't make such a mistake again.
Regards,
Atul.
On Fri, Sep 11, 2015 at 11:32 AM, Atul Kulkarni
wrote:
> Folks,
>
> Any help on this?
>
> Regards,
> Atul.
>
>
> On Fri, Sep 11, 2015 at
Folks,
Any help on this?
Regards,
Atul.
On Fri, Sep 11, 2015 at 8:39 AM, Atul Kulkarni
wrote:
> Hi Raghavendra,
>
> Thanks for your answers. I am passing 10 executors, and I am not sure if
> that is the problem. It is still hung.
>
> Regards,
> Atul.
>
>
> On
command line option
> --num-executors. You need more than 2 executors to make Spark Streaming
> work.
>
> For more details on command line option, please go through
> http://spark.apache.org/docs/latest/running-on-yarn.html.
>
>
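[Editor's note: the advice above about needing more than 2 executors comes from how Spark Streaming receivers work — each receiver permanently occupies a task slot, so with too few executors nothing is left to process the batches. A minimal sketch of the submission command, assuming YARN as in the linked page; the application class and jar names below are hypothetical placeholders, not from this thread.]

```shell
# Sketch: submit a Spark Streaming job to YARN with more than the
# default number of executors, so receivers do not starve processing.
# --num-executors, --executor-cores, and --executor-memory are standard
# spark-submit flags documented on the running-on-yarn page; the main
# class and jar below are hypothetical placeholders.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 2g \
  --class example.StreamingApp \
  streaming-app.jar
```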
> On Fri, Sep 11, 2015 at 10:52 AM, Atul Kulkarni wrote:
and that is why your job is stuck.
> Specify it as local[*] to make the thread pool equal to the number of
> cores...
>
> Raghav
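[Editor's note: the local[*] advice concerns the master URL. With local or local[1], the single worker thread is taken by the streaming receiver, so batch processing never runs and the job appears hung. A sketch of the equivalent local run; the class and jar names are hypothetical placeholders.]

```shell
# Sketch: run locally with one worker thread per core ("local[*]"),
# so the receiver thread does not starve the processing tasks.
# The main class and jar names are hypothetical placeholders.
spark-submit \
  --master "local[*]" \
  --class example.StreamingApp \
  streaming-app.jar
```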
> On Sep 11, 2015 6:06 AM, "Atul Kulkarni" wrote:
>
>> Hi Folks,
>>
>> Below is the code I have for a Spark-based Kafka Producer to take advan
Streaming context is not able to read *.gz files?
I am not sure what more details I can provide to help explain my problem.
--
Regards,
Atul Kulkarni