Thanks for your response.
I gave the correct master URL. Moreover, as I mentioned in my post, I was
able to run the sample program using spark-submit, but it does not work
when I run it from my machine. Any clue on this?
Thanks in advance.
Wouldn't it likely be the opposite? Too much memory / too many cores being
requested relative to the resources that YARN makes available?
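One way to test that theory is to deliberately request very modest resources when building the context; if the job is then accepted, the original request probably exceeded what YARN could offer. A minimal sketch, assuming Spark 1.x on YARN in client mode (the app name and the specific values are placeholders, not recommendations):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Deliberately small, illustrative settings for a resource-availability check.
val conf = new SparkConf()
  .setAppName("ResourceCheck")            // hypothetical app name
  .setMaster("yarn-client")               // client mode against YARN (Spark 1.x style)
  .set("spark.executor.memory", "512m")   // far below typical defaults
  .set("spark.executor.cores", "1")
  .set("spark.executor.instances", "1")

val sc = new SparkContext(conf)
// Trivial job: if this completes, YARN granted the (small) request.
println(sc.parallelize(1 to 10).sum())
sc.stop()
```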
On Nov 24, 2014 11:00 AM, "Akhil Das" wrote:
> This can happen mainly because of the following:
>
> - Wrong master url (Make sure you give the master url which is listed on
> top left corner of the webui - running on 8080)
This can happen mainly because of the following:
- Wrong master URL (make sure you give the master URL listed in the top-left
corner of the web UI, running on port 8080).
- More memory/cores requested while creating the SparkContext than the
cluster can provide.
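As a sketch of the first point, for a standalone cluster: copy the exact `spark://` URL from the master's web UI (the host and port below are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// "spark://master-host:7077" is a placeholder; use the exact spark:// URL
// shown in the top-left corner of the master web UI (served on port 8080).
val conf = new SparkConf()
  .setAppName("MasterUrlCheck")           // hypothetical app name
  .setMaster("spark://master-host:7077")

val sc = new SparkContext(conf)
```

Note that the web UI port (8080) is only where the URL is displayed; the master itself listens on 7077 by default.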
Thanks
Best Regards
On Mon, Nov 24, 2014 at 4:13 PM, vdiwaka
Hi,
When I try to execute the program from my laptop by connecting to the HDP
environment (on which Spark is also configured), I get the warning
("Initial job has not accepted any resources; check your cluster UI to
ensure that workers are registered and have sufficient memory") and the job
is being