Re: Spark Job submit

2014-12-01 Thread Matt Narrell
Or by setting the HADOOP_CONF_DIR variable. Either way, you must have the YARN 
configuration available to the submitting application to allow for the use of 
“yarn-client” or “yarn-cluster”.

The attached stack trace below doesn’t provide any information as to why the 
job failed.

mn
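
A minimal Java sketch of that prerequisite, assuming the submitting JVM is meant 
to see the YARN configuration either through HADOOP_CONF_DIR or on its classpath 
(the class name and messages are illustrative, not from the thread):

public final class YarnConfVisibilityCheck {
    public static void main(String[] args) {
        // Either HADOOP_CONF_DIR points at the cluster's configuration directory...
        String confDir = System.getenv("HADOOP_CONF_DIR");
        // ...or yarn-site.xml (which carries the ResourceManager address) is on the classpath.
        java.net.URL yarnSite = Thread.currentThread()
                .getContextClassLoader()
                .getResource("yarn-site.xml");

        System.out.println("HADOOP_CONF_DIR            = " + confDir);
        System.out.println("yarn-site.xml on classpath = " + yarnSite);

        if (confDir == null && yarnSite == null) {
            System.err.println("No YARN configuration visible; a yarn-client or "
                    + "yarn-cluster submit will not find the cluster.");
        }
    }
}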

> On Nov 27, 2014, at 12:14 AM, Akhil Das  wrote:
> 
> Try to add your cluster's core-site.xml, yarn-site.xml, and hdfs-site.xml to 
> the CLASSPATH (and on SPARK_CLASSPATH) and submit the job.
> 
> Thanks
> Best Regards
> 
> On Thu, Nov 27, 2014 at 12:24 PM, Naveen Kumar Pokala 
> <npok...@spcapitaliq.com> wrote:
> The code is on my Windows machine and the cluster is on another network, on 
> UNIX. In that case, how will it identify the cluster? For a Spark cluster we 
> can clearly specify the URL, like spark://ip:port, but how do we specify that 
> for Hadoop?
> 
>  
> 
> What I have done is copy the Hadoop configuration files from the network to my 
> local machine and create a dummy Hadoop directory (on the Windows machine).
> 
>  
> 
> I submitted via spark-submit, pointing the HADOOP_CONF_DIR variable at the 
> location of those dummy files. Attaching the error.
> 
>  
> 
>  
> 
> 
> 
>  
> 
> Please suggest how I should proceed from the code, and how to execute 
> spark-submit from a Windows machine.
> 
>  
> 
> Please provide me sample code if you have any.
> 
>  
> 
> -Naveen
> 
>  
> 
> From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
> Sent: Wednesday, November 26, 2014 10:03 PM
> To: Naveen Kumar Pokala
> Cc: user@spark.apache.org
> Subject: Re: Spark Job submit
> 
>  
> 
> How about?
> 
>  
> 
> - Create a SparkContext 
> 
> - setMaster as yarn-cluster
> 
> - Create a JavaSparkContext with the above SparkContext
> 
>  
> 
> And that will submit it to the yarn cluster.
> 
> 
> 
> Thanks
> 
> Best Regards
> 
>  
> 
> On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <npok...@spcapitaliq.com> wrote:
> 
> Hi.
> 
>  
> 
> Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?
> 
>  
> 
> -Naveen
> 
>  
> 
> 



Re: Spark Job submit

2014-11-26 Thread Akhil Das
Try to add your cluster's core-site.xml, yarn-site.xml, and hdfs-site.xml
to the CLASSPATH (and on SPARK_CLASSPATH) and submit the job.

Thanks
Best Regards
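
As a quick way to verify that those files are actually picked up from the 
classpath and describe the intended cluster, a sketch along these lines may help 
(the class name is made up; the property names are the standard Hadoop/YARN ones):

import org.apache.hadoop.conf.Configuration;

public final class ClusterConfCheck {
    public static void main(String[] args) {
        // core-site.xml is loaded from the classpath by default;
        // hdfs-site.xml and yarn-site.xml are added explicitly here.
        Configuration conf = new Configuration();
        conf.addResource("hdfs-site.xml");
        conf.addResource("yarn-site.xml");

        System.out.println("fs.defaultFS                 = " + conf.get("fs.defaultFS"));
        System.out.println("yarn.resourcemanager.address = " + conf.get("yarn.resourcemanager.address"));
    }
}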

On Thu, Nov 27, 2014 at 12:24 PM, Naveen Kumar Pokala <
npok...@spcapitaliq.com> wrote:

> The code is on my Windows machine and the cluster is on another network, on
> UNIX. In that case, how will it identify the cluster? For a Spark cluster we
> can clearly specify the URL, like spark://ip:port, but how do we specify that
> for Hadoop?
>
>
>
> What I have done is copy the Hadoop configuration files from the network to
> my local machine and create a dummy Hadoop directory (on the Windows machine).
>
>
>
> I submitted via spark-submit, pointing the HADOOP_CONF_DIR variable at the
> location of those dummy files. Attaching the error.
>
>
>
>
>
>
>
> Please suggest how I should proceed from the code, and how to execute
> spark-submit from a Windows machine.
>
>
>
> Please provide me sample code if you have any.
>
>
>
> -Naveen
>
>
>
> *From:* Akhil Das [mailto:ak...@sigmoidanalytics.com]
> *Sent:* Wednesday, November 26, 2014 10:03 PM
> *To:* Naveen Kumar Pokala
> *Cc:* user@spark.apache.org
> *Subject:* Re: Spark Job submit
>
>
>
> How about?
>
>
>
> - Create a SparkContext
>
> - setMaster as *yarn-cluster*
>
> - Create a JavaSparkContext with the above SparkContext
>
>
>
> And that will submit it to the yarn cluster.
>
>
> Thanks
>
> Best Regards
>
>
>
> On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <
> npok...@spcapitaliq.com> wrote:
>
> Hi.
>
>
>
> Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?
>
>
>
> -Naveen
>
>
>


RE: Spark Job submit

2014-11-26 Thread Naveen Kumar Pokala
The code is on my Windows machine and the cluster is on another network, on 
UNIX. In that case, how will it identify the cluster? For a Spark cluster we can 
clearly specify the URL, like spark://ip:port, but how do we specify that for 
Hadoop?

What I have done is copy the Hadoop configuration files from the network to my 
local machine and create a dummy Hadoop directory (on the Windows machine).

I submitted via spark-submit, pointing the HADOOP_CONF_DIR variable at the 
location of those dummy files. Attaching the error.


[attached screenshot of the error: image001.png]

Please suggest how I should proceed from the code, and how to execute 
spark-submit from a Windows machine.

Please provide me sample code if you have any.

-Naveen
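
On "how to specify that": for YARN, the rough equivalent of spark://ip:port is 
the ResourceManager address in yarn-site.xml (yarn.resourcemanager.address), 
which Spark reads from the directory HADOOP_CONF_DIR points at. The same kind of 
check as the classpath sketch above, but aimed directly at the copied dummy 
directory (class name and setup are illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public final class DummyConfDirCheck {
    public static void main(String[] args) {
        // The dummy directory on the Windows machine, taken here from HADOOP_CONF_DIR.
        String confDir = System.getenv("HADOOP_CONF_DIR");

        Configuration conf = new Configuration(false);  // skip the default resources
        conf.addResource(new Path(confDir, "core-site.xml"));
        conf.addResource(new Path(confDir, "yarn-site.xml"));

        System.out.println("fs.defaultFS                 = " + conf.get("fs.defaultFS"));
        System.out.println("yarn.resourcemanager.address = " + conf.get("yarn.resourcemanager.address"));
    }
}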

From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, November 26, 2014 10:03 PM
To: Naveen Kumar Pokala
Cc: user@spark.apache.org
Subject: Re: Spark Job submit

How about?

- Create a SparkContext
- setMaster as yarn-cluster
- Create a JavaSparkContext with the above SparkContext

And that will submit it to the yarn cluster.

Thanks
Best Regards

On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala 
<npok...@spcapitaliq.com> wrote:
Hi.

Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?

-Naveen



Re: Spark Job submit

2014-11-26 Thread Sandy Ryza
I think that actually would not work - yarn-cluster mode expects a specific
deployment path that uses SparkSubmit. Setting master as yarn-client should
work.

-Sandy
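
A minimal sketch of that yarn-client approach in Java, assuming the cluster's 
core-site.xml, yarn-site.xml, and hdfs-site.xml are visible to this JVM (via 
HADOOP_CONF_DIR or the classpath); the app name and the toy job are illustrative, 
and depending on the Spark version further settings (such as the Spark assembly 
location) may also be needed:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class YarnClientSubmitExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("yarn-client-submit-example")
                .setMaster("yarn-client");   // yarn-cluster would require going through spark-submit

        JavaSparkContext jsc = new JavaSparkContext(conf);
        try {
            // A trivial job just to confirm the application reaches the cluster.
            long count = jsc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
            System.out.println("count = " + count);
        } finally {
            jsc.stop();
        }
    }
}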

On Wed, Nov 26, 2014 at 8:32 AM, Akhil Das 
wrote:

> How about?
>
> - Create a SparkContext
> - setMaster as *yarn-cluster*
> - Create a JavaSparkContext with the above SparkContext
>
> And that will submit it to the yarn cluster.
>
> Thanks
> Best Regards
>
> On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <
> npok...@spcapitaliq.com> wrote:
>
>> Hi.
>>
>>
>>
>> Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?
>>
>>
>>
>> -Naveen
>>
>
>


Re: Spark Job submit

2014-11-26 Thread Akhil Das
How about?

- Create a SparkContext
- setMaster as *yarn-cluster*
- Create a JavaSparkContext with the above SparkContext

And that will submit it to the yarn cluster.

Thanks
Best Regards
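
Spelled out in Java, the steps above look roughly like the sketch below. Note 
Sandy's reply earlier in this thread: yarn-cluster set this way expects the 
spark-submit deployment path, and yarn-client is the mode that should work when 
submitting from code (names here are illustrative):

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

public final class ProgrammaticSubmitSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("programmatic-submit-sketch")
                .setMaster("yarn-cluster");  // per Sandy's reply, prefer "yarn-client" from code

        // Create a SparkContext, then wrap it in a JavaSparkContext as described above.
        SparkContext sc = new SparkContext(conf);
        JavaSparkContext jsc = new JavaSparkContext(sc);

        // ... define and run the job with jsc ...

        jsc.stop();
    }
}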

On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <
npok...@spcapitaliq.com> wrote:

> Hi.
>
>
>
> Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?
>
>
>
> -Naveen
>