[jira] [Commented] (SPARK-17591) Fix/investigate the failure of tests in Scala On Windows

2016-11-20 Thread Hyukjin Kwon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15682578#comment-15682578
 ] 

Hyukjin Kwon commented on SPARK-17591:
--

I will close this once I am able to proceed further with the tests on Windows and 
see error logs beyond the ones already described in the description.

> Fix/investigate the failure of tests in Scala On Windows
> 
>
> Key: SPARK-17591
> URL: https://issues.apache.org/jira/browse/SPARK-17591
> Project: Spark
>  Issue Type: Test
>  Components: Build, DStreams, Spark Core, SQL
>Reporter: Hyukjin Kwon
>
> {code}
> Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 17.53 sec 
> <<< FAILURE! - in org.apache.spark.JavaAPISuite
> wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.313 sec  <<< 
> FAILURE!
> java.lang.AssertionError: 
> expected: > but was:
>   at org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1089)
> {code}
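> One plausible Windows-specific cause for this kind of mismatch is the path form: 
> {{wholeTextFiles}} reports {{file:/C:/...}} URI keys, while a test may build raw 
> {{C:\...}} strings. A minimal sketch of normalizing a local path to the URI form; the 
> helper below is purely illustrative and not the actual fix:
> {code}
> import java.io.File
>
> // On Windows, wholeTextFiles reports keys like "file:/C:/tmp/part-00000",
> // while a raw local path looks like "C:\tmp\part-00000". Normalizing the
> // local path to a URI before comparing removes that difference.
> def toSparkKey(localPath: String): String =
>   new File(localPath).toURI.toString
> {code}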
> {code}
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec <<< 
> FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
> testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time 
> elapsed: 0.047 sec  <<< FAILURE!
> java.lang.AssertionError: expected:<0> but was:<1>
>   at 
> org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:177)
> {code}
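> For context, this test appears to launch a child Spark JVM through the launcher library 
> and assert a zero exit code (hence {{expected:<0> but was:<1>}}). A minimal sketch of 
> that kind of launch via {{SparkLauncher}}; the jar path and main class are placeholders, 
> not the ones used by {{SparkLauncherSuite}}:
> {code}
> import org.apache.spark.launcher.SparkLauncher
>
> object ChildProcSketch {
>   def main(args: Array[String]): Unit = {
>     val proc = new SparkLauncher()
>       .setSparkHome(sys.env.getOrElse("SPARK_HOME", "."))
>       .setMaster("local")
>       .setAppResource("C:/path/to/example-app.jar") // hypothetical application jar
>       .setMainClass("com.example.ExampleApp")       // hypothetical main class
>       .launch()
>     // The suite expects 0 here; on Windows the child exited with 1.
>     println(s"child exited with ${proc.waitFor()}")
>   }
> }
> {code}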
> {code}
> Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec 
> <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
> testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time 
> elapsed: 3.418 sec  <<< ERROR!
> java.io.IOException: Failed to delete: 
> C:\projects\spark\streaming\target\tmp\1474255953021-0
>   at 
> org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
> Running org.apache.spark.streaming.JavaDurationSuite
> {code}
> {code}
> Running org.apache.spark.streaming.JavaAPISuite
> Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec 
> <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
> testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time 
> elapsed: 3.418 sec  <<< ERROR!
> java.io.IOException: Failed to delete: 
> C:\projects\spark\streaming\target\tmp\1474255953021-0
>   at 
> org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
> {code}
> {code}
> Results :
> Tests in error: 
>   JavaAPISuite.testCheckpointMasterRecovery:1808 » IO Failed to delete: 
> C:\proje...
> Tests run: 74, Failures: 0, Errors: 1, Skipped: 0
> {code}
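> The streaming failure above ({{java.io.IOException: Failed to delete}}) is typical on 
> Windows when a file in the checkpoint/temp directory is still held open by another 
> handle. A minimal illustrative sketch of a retrying recursive delete; this is not 
> Spark's actual cleanup code, and the real fix is usually to close the offending streams 
> in the test itself:
> {code}
> import java.io.File
>
> // Delete children first, then retry the delete a few times to ride out
> // transient Windows file locks. Returns true once the path is gone.
> def deleteWithRetry(f: File, attempts: Int = 5): Boolean = {
>   if (f.isDirectory) {
>     Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteWithRetry(_, attempts))
>   }
>   (1 to attempts).exists { _ =>
>     if (f.delete() || !f.exists()) true
>     else { Thread.sleep(200); false }
>   }
> }
> {code}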
> The tests were aborted for an unknown reason during the SQL tests - 
> {{BroadcastJoinSuite}} kept emitting the exception below continuously:
> {code}
> 20:48:09.876 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error 
> running executor
> java.io.IOException: Cannot run program "C:\Progra~1\Java\jdk1.8.0\bin\java" 
> (in directory "C:\projects\spark\work\app-20160918204809-\0"): 
> CreateProcess error=206, The filename or extension is too long
>   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
>   at 
> org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:167)
>   at 
> org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
> Caused by: java.io.IOException: CreateProcess error=206, The filename or 
> extension is too long
>   at java.lang.ProcessImpl.create(Native Method)
>   at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
>   at java.lang.ProcessImpl.start(ProcessImpl.java:137)
>   at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
>   ... 2 more
> {code}
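> {{CreateProcess error=206}} is the Windows command-line length limit (roughly 32K 
> characters); most likely the executor launch command embeds the full test classpath and 
> exceeds it. A common workaround is to wrap the classpath in a "pathing jar" whose 
> manifest {{Class-Path}} points at the real entries; the sketch below is illustrative 
> only, not what {{ExecutorRunner}} actually does:
> {code}
> import java.io.{File, FileOutputStream}
> import java.util.jar.{Attributes, JarOutputStream, Manifest}
>
> // Build an otherwise-empty jar whose manifest carries the classpath, so the
> // java command line only needs "-cp pathing.jar".
> def writePathingJar(classpath: Seq[String], out: File): File = {
>   val manifest = new Manifest()
>   val attrs = manifest.getMainAttributes
>   attrs.put(Attributes.Name.MANIFEST_VERSION, "1.0")
>   // Class-Path entries are space-separated URLs.
>   attrs.put(Attributes.Name.CLASS_PATH,
>     classpath.map(p => new File(p).toURI.toString).mkString(" "))
>   val jar = new JarOutputStream(new FileOutputStream(out), manifest)
>   jar.close()
>   out
> }
> {code}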
> Here is the full log for the test - 
> https://ci.appveyor.com/project/spark-test/spark/build/15-scala-tests
> We may have to create sub-tasks if these are actual issues on Windows rather 
> than just mistakes in tests.
> I am willing to test this again after fixing some of the issues here, in particular 
> the last one.
> I triggered the build with the commands below:
> {code}
> mvn -DskipTests -Phadoop-2.6 -Phive -Phive-thriftserver package
> mvn -Phadoop-2.6 -Phive -Phive-thriftserver --fail-never test
> {code}






[jira] [Commented] (SPARK-17591) Fix/investigate the failure of tests in Scala On Windows

2016-09-18 Thread Jagadeesan A S (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15502356#comment-15502356
 ] 

Jagadeesan A S commented on SPARK-17591:


[~hyukjin.kwon] I would like to work on this issue. Maybe I can try to add a 
pull request along with your PR. 







[jira] [Commented] (SPARK-17591) Fix/investigate the failure of tests in Scala On Windows

2016-09-18 Thread Hyukjin Kwon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15502252#comment-15502252
 ] 

Hyukjin Kwon commented on SPARK-17591:
--

Please cc me if any of you are interested and would like to submit a PR. I 
will manually run the tests.



