Re: issue while creating spark context

2015-03-24 Thread Sachin Singh
Thanks Sean and Akhil. I changed the permissions of
*/user/spark/applicationHistory*, and now it works.
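
For readers hitting the same error: the fix was applied on HDFS, not the local
filesystem. A minimal operational sketch of the permission change (assumptions:
the `hdfs` superuser account name and the world-writable sticky-bit mode
commonly used for a shared Spark event-log directory; adjust users and paths to
your cluster):

```
# Run as the HDFS superuser (often 'hdfs' on CDH clusters -- an assumption;
# substitute your own superuser account).
sudo -u hdfs hdfs dfs -mkdir -p /user/spark/applicationHistory
sudo -u hdfs hdfs dfs -chown spark:spark /user/spark/applicationHistory
# 1777 = world-writable with the sticky bit, so every submitting user can
# create its application subdirectory but cannot delete other users' logs.
sudo -u hdfs hdfs dfs -chmod 1777 /user/spark/applicationHistory
```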




Re: issue while creating spark context

2015-03-24 Thread Sachin Singh
thanks Sean,
could you please point me to the file or configuration property where I
should set the proper path? Some elaboration would help.

thanks,

Regards
Sachin
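
In case others land here with the same question: the property involved is
Spark's event-log directory. A sketch of the relevant entries in
`spark-defaults.conf` (on CDH typically under `/etc/spark/conf/` -- the exact
path is an assumption; check your distribution):

```
# spark-defaults.conf
spark.eventLog.enabled  true
# Use an explicit HDFS URI: a bare path such as /user/spark/applicationHistory
# can be resolved against the local filesystem (file:), which is exactly what
# the exception in this thread shows.
spark.eventLog.dir      hdfs:///user/spark/applicationHistory
```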




Re: issue while creating spark context

2015-03-24 Thread Sean Owen
That's probably the problem; the intended path is on HDFS but the
configuration specifies a local path. See the exception message.
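
The tell-tale is the `file:` scheme in the exception message. As an
illustration (a hypothetical helper, not Spark's actual code): a URI without a
scheme falls back to the local filesystem, which is how a bare
`/user/spark/applicationHistory` setting ends up on `file:` instead of HDFS.

```python
from urllib.parse import urlparse

def effective_scheme(uri: str, default: str = "file") -> str:
    """Illustrative: which filesystem a log-dir URI points at.
    A bare path carries no scheme, so it falls back to the local FS."""
    return urlparse(uri).scheme or default

print(effective_scheme("/user/spark/applicationHistory"))         # file
print(effective_scheme("hdfs:///user/spark/applicationHistory"))  # hdfs
```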


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: issue while creating spark context

2015-03-24 Thread Akhil Das
It's in your local file system, not in HDFS.

Thanks
Best Regards
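
A quick way to see the distinction (an operational sketch; these commands need
a cluster shell, so treat them as illustrative):

```
# The exception path starts with 'file:', so check the LOCAL filesystem:
ls -ld /user/spark/applicationHistory   # likely missing or unwritable locally
# The directory that was actually prepared lives on HDFS:
hdfs dfs -ls /user/spark
```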



Re: issue while creating spark context

2015-03-24 Thread Sachin Singh
hi,
I can see that the required permissions are granted on this directory, as
shown below:

 hadoop dfs -ls /user/spark
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

Found 1 items
drwxrwxrwt   - spark spark  0 2015-03-20 01:04 /user/spark/applicationHistory

regards
Sachin
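
As a side note on reading that listing: `drwxrwxrwt` is octal mode 1777,
i.e. world-writable with the sticky bit set, which is what a shared event-log
directory needs. A small illustration using only the Python standard library:

```python
import stat

# drwxrwxrwt = directory, rwx for owner/group/other, sticky bit set.
# Octal 1777 is the conventional mode for shared scratch-style directories.
mode = 0o1777 | stat.S_IFDIR
print(stat.filemode(mode))        # drwxrwxrwt
print(bool(mode & stat.S_ISVTX))  # True: sticky bit is set
```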




Re: issue while creating spark context

2015-03-24 Thread Akhil Das
It needs write permission, as the exception clearly states:

java.io.IOException: Error in creating log directory:
file:*/user/spark/*applicationHistory/application_1427194309307_0005

Thanks
Best Regards



Re: issue while creating spark context

2015-03-24 Thread Sachin Singh
Hi Akhil,
thanks for your quick reply. Could you please elaborate on what kind of
permissions are required?

thanks in advance,

Regards
Sachin



Re: issue while creating spark context

2015-03-24 Thread Akhil Das
It's an IOException; just make sure you have the correct permissions on the
*/user/spark* directory.

Thanks
Best Regards

On Tue, Mar 24, 2015 at 5:21 PM, sachin Singh 
wrote:

> hi all,
> All of a sudden I am getting the error below when submitting a Spark job
> with master yarn: it is not able to create the Spark context, though it was
> previously working fine.
> I am using CDH 5.3.1 and creating a JavaHiveContext.
> spark-submit --jars \
>   ./analiticlibs/mysql-connector-java-5.1.17.jar,./analiticlibs/log4j-1.2.17.jar \
>   --master yarn --class myproject.com.java.jobs.Aggregationtask sparkjob-1.0.jar
>
> Error message:
> java.io.IOException: Error in creating log directory:
> file:/user/spark/applicationHistory/application_1427194309307_0005
>     at org.apache.spark.util.FileLogger.createLogDir(FileLogger.scala:133)
>     at org.apache.spark.util.FileLogger.start(FileLogger.scala:115)
>     at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:74)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:353)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>     at myproject.com.java.core.SparkAnaliticEngine.getJavaSparkContext(SparkAnaliticEngine.java:77)
>     at myproject.com.java.core.SparkAnaliticTable.evmyprojectate(SparkAnaliticTable.java:108)
>     at myproject.com.java.core.SparkAnaliticEngine.evmyprojectateAnaliticTable(SparkAnaliticEngine.java:55)
>     at myproject.com.java.core.SparkAnaliticEngine.evmyprojectateAnaliticTable(SparkAnaliticEngine.java:65)
>     at myproject.com.java.jobs.CustomAggregationJob.main(CustomAggregationJob.java:184)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/issue-while-creating-spark-context-tp22196.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>