[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled

2017-06-21 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16058666#comment-16058666
 ] 

Apache Spark commented on SPARK-20338:
--

User 'zuotingbing' has created a pull request for this issue:
https://github.com/apache/spark/pull/17638

> Spaces in spark.eventLog.dir are not correctly handled
> --
>
> Key: SPARK-20338
> URL: https://issues.apache.org/jira/browse/SPARK-20338
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.1.0
>Reporter: zuotingbing
>Assignee: zuotingbing
> Fix For: 2.3.0
>
>
> Set spark.eventLog.dir=/home/mr/event log and submit an app; we get the following error:
> 2017-04-14 17:28:40,378 INFO org.apache.spark.SparkContext: Successfully 
> stopped SparkContext
> Exception in thread "main" ExitCodeException exitCode=1: chmod: cannot access 
> `/home/mr/event%20log/app-20170414172839-.inprogress': No such file or 
> directory
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
>   at org.apache.hadoop.util.Shell.run(Shell.java:478)
>   at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:712)
>   at 
> org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:506)
>   at 
> org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:125)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
>   at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2258)
>   at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$9.apply(SparkSession.scala:879)
>   at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$9.apply(SparkSession.scala:871)
>   at scala.Option.getOrElse(Option.scala:121)
>   at 
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:871)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:288)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:137)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
>   at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
>   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
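
The "%20" in the chmod error above is the give-away: the space in the configured directory gets percent-encoded once the value is turned into a URI, and that encoded string is then treated as a literal local path. A minimal, self-contained sketch of the mechanism (the /tmp directory and object name are invented for illustration; this is not the Spark code path itself):

import java.io.File
import java.net.URI

object EventLogPathDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical directory mirroring the report (spark.eventLog.dir=/home/mr/event log),
    // placed under /tmp so the demo can actually create it.
    val logBaseDir = "/tmp/event log"
    new File(logBaseDir).mkdirs()

    // Building a URI from the directory percent-encodes the space.
    val uri = new URI("file", null, logBaseDir, null)
    println(uri.toString) // file:/tmp/event%20log   (encoded form)
    println(uri.getPath)  // /tmp/event log          (decoded form)

    // Treating the encoded form as a plain local path names a directory that
    // does not exist on disk, which is what the chmod on
    // ".../event%20log/app-....inprogress" in the trace above runs into.
    println(new File(uri.toString.stripPrefix("file:")).exists()) // false
    println(new File(uri.getPath).exists())                       // true
  }
}

The last two println calls print false and true respectively: the encoded string points at a non-existent directory, while the decoded path points at the real one.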




[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled

2017-06-13 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16047501#comment-16047501
 ] 

Apache Spark commented on SPARK-20338:
--

User 'zuotingbing' has created a pull request for this issue:
https://github.com/apache/spark/pull/18285


[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled

2017-04-19 Thread zuotingbing (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15976103#comment-15976103
 ] 

zuotingbing commented on SPARK-20338:
-

Not sure who exactly to ping here. [~jerryshao], could you please take a quick look and review it? Thanks!



[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled

2017-04-19 Thread zuotingbing (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15974395#comment-15974395
 ] 

zuotingbing commented on SPARK-20338:
-

cc [~srowen]


[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled

2017-04-14 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15968882#comment-15968882
 ] 

Apache Spark commented on SPARK-20338:
--

User 'zuotingbing' has created a pull request for this issue:
https://github.com/apache/spark/pull/17638
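
As a reading aid only (a hypothetical helper, not the change made in the pull request): the kind of handling that sidesteps the problem is to go through URI.getPath, which returns the decoded local path, before handing the directory to local filesystem operations such as the chmod in the trace above. It assumes an absolute local directory.

import java.io.File
import java.net.URI

object LocalEventLogDir {
  // Hypothetical helper: turn a configured directory such as
  // spark.eventLog.dir into a java.io.File without letting a space
  // become "%20". Assumes an absolute local path.
  def asLocalFile(configuredDir: String): File = {
    val uri = new URI("file", null, configuredDir, null)
    new File(uri.getPath) // decoded: "/home/mr/event log", not ".../event%20log"
  }
}

With the directory from this report, asLocalFile("/home/mr/event log") points at the real directory, whereas a path built from uri.toString would still carry the %20 escape.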
