Hey Aviem,
Thank you for noticing. You were right, I was able to run all my jobs in Spark
1.6.3. You are awesome!!!
Regards,
Sathish. J
On 16-Aug-2017, at 3:23 PM, Aviem Zur wrote:

[…] 2.0.0. I really doubt if it's a Spark setup issue. I wrote a […]
Hi Jayaraman,
Thanks for reaching out.
We run Beam with the Spark runner daily on a YARN cluster.
It appears from many of the logs you sent that the job hangs when
connecting to certain servers on certain ports. Could this be a network
issue, or an issue with your Spark setup?
Could you please
Hi,
Thanks for trying it out.
I was running the job in a local single-node setup. I also spawned an HDInsight
cluster on the Azure platform just to test the WordCount program. It's the same
result there too: stuck at the Evaluating ParMultiDo step. It runs fine via mvn
compile exec, but when bundled
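For context, the Maven invocation I mean is along these lines (the -Pspark-runner profile and the example arguments follow the Beam examples quickstart; paths and file names are placeholders for my project):

```shell
# Run WordCount through the Maven exec plugin with the Spark runner.
# The -Pspark-runner profile comes from the Beam examples archetype
# and may be named differently in other projects.
mvn compile exec:java \
  -Dexec.mainClass=org.apache.beam.examples.WordCount \
  -Dexec.args="--runner=SparkRunner --inputFile=pom.xml --output=counts" \
  -Pspark-runner
```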
Hi,
Was anyone able to run a Beam application on Spark at all?
I tried all possible options and still have no luck. No executors get assigned
to the job submitted by the command below, even though they are explicitly
specified:
$ ~/spark/bin/spark-submit --class org.apache.beam.examples.WordCount --master
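For comparison, the full shape of the submit command I am attempting looks roughly like this (the master, executor settings, and shaded-jar name are placeholders for my setup, not the exact values I used):

```shell
# Hypothetical complete spark-submit invocation for a bundled Beam
# WordCount jar on YARN; adjust master, resources, and jar path.
~/spark/bin/spark-submit \
  --class org.apache.beam.examples.WordCount \
  --master yarn \
  --deploy-mode client \
  --num-executors 2 \
  --executor-memory 2g \
  target/word-count-beam-bundled-0.1.jar \
  --runner=SparkRunner \
  --inputFile=pom.xml \
  --output=counts
```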
Hi Sathish,
Do you see the tasks submitted on the history server?
Regards
JB
On 08/01/2017 11:51 AM, Sathish Jayaraman wrote:
Hi,
I am trying to execute the Beam example in a local Spark setup. When I try to submit
the sample WordCount jar via spark-submit, the job just hangs at 'INFO […]'
Hi,
I am trying to execute the Beam example in a local Spark setup. When I try to submit
the sample WordCount jar via spark-submit, the job just hangs at 'INFO
SparkRunner$Evaluator: Evaluating ParMultiDo(ExtractWords)'. But it runs fine
when executed directly. Below is the command I used to submit