Hyukjin Kwon created SPARK-18718:
------------------------------------

             Summary: Skip some test failures due to the path length limitation 
and fix tests to pass on Windows
                 Key: SPARK-18718
                 URL: https://issues.apache.org/jira/browse/SPARK-18718
             Project: Spark
          Issue Type: Sub-task
          Components: Tests
            Reporter: Hyukjin Kwon
            Priority: Minor


Several tests fail on Windows due to incorrectly formatted paths and the path 
length limitation, as shown below:


- {{InsertSuite}}

  {code}
  Exception encountered when attempting to run a suite with class name: 
org.apache.spark.sql.sources.InsertSuite *** ABORTED *** (12 seconds, 547 
milliseconds)
      org.apache.spark.sql.AnalysisException: Path does not exist: 
file:/C:projectsspark        arget   
mpspark-177945ef-9128-42b4-8c07-de31f78bbbd6;
      at 
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:382)
      at 
org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:370)
      at 
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
{code}
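The mangled path above (note the tabs where {{\t}} used to be, and the missing backslashes elsewhere) suggests the Windows path went through C-style escape processing, such as SQL string-literal unescaping. A minimal sketch reproducing the symptom with a hypothetical, simplified unescape routine (this is an illustration of the failure mode, not Spark's actual code):

```java
public class PathMangleDemo {
    // Hypothetical helper mimicking C-style/SQL-literal unescaping (simplified).
    static String unescape(String s) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c == '\\' && i + 1 < s.length()) {
                char next = s.charAt(++i);
                switch (next) {
                    case 't': sb.append('\t'); break;   // "\t" becomes a tab
                    case 'n': sb.append('\n'); break;
                    default:  sb.append(next);          // unknown escape: '\' is dropped
                }
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // The separators are silently eaten, matching the log output above:
        // "C:\projects\spark\target\tmp" -> "C:projectsspark<TAB>arget<TAB>mp"
        System.out.println(unescape("C:\\projects\\spark\\target\\tmp"));
    }
}
```

Forward slashes (or escaping the backslashes before such processing) avoid the problem.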


- {{BroadcastJoinSuite}}
{code}
  04:09:40.882 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error 
running executor
  java.io.IOException: Cannot run program "C:\Progra~1\Java\jdk1.8.0\bin\java" 
(in directory "C:\projects\spark\work\app-20161205040542-0000\51658"): 
CreateProcess error=206, The filename or extension is too long
      at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
      at 
org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:167)
      at 
org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
  Caused by: java.io.IOException: CreateProcess error=206, The filename or 
extension is too long
      at java.lang.ProcessImpl.create(Native Method)
      at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
      at java.lang.ProcessImpl.start(ProcessImpl.java:137)
      at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
      ... 2 more
  04:09:40.929 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error 
running executor

    (the same error message apparently repeats indefinitely)
  
  ...
  {code}
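CreateProcess error 206 typically means the command line, usually a {{java}} invocation with a very long classpath, exceeds the documented Windows limit of 32767 characters. A sketch of a pre-flight check that surfaces this explicitly instead of as an opaque {{IOException}} (the class and method names are hypothetical, not Spark APIs):

```java
import java.util.List;

public class CommandLineCheck {
    // Documented Windows CreateProcess lpCommandLine limit.
    static final int WINDOWS_CMD_LINE_LIMIT = 32767;

    // Throws with a clear message if the joined command would be too long.
    static void requireLaunchable(List<String> command) {
        int length = command.stream().mapToInt(String::length).sum()
                + command.size(); // rough allowance for separating spaces
        if (length > WINDOWS_CMD_LINE_LIMIT) {
            throw new IllegalArgumentException(
                "command line is " + length + " chars, over the Windows limit of "
                + WINDOWS_CMD_LINE_LIMIT);
        }
    }
}
```

Shortening the classpath (e.g. via a short working directory such as {{C:\Progra~1}}-style 8.3 names, or a pathing jar) is the usual workaround.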

- {{PathOptionSuite}}
{code}
  - path option also exist for write path *** FAILED *** (1 second, 93 
milliseconds)
    "C:[projectsspark   arget   mp]spark-5ab34a58-df8d-..." did not equal 
"C:[\projects\spark\target\tmp\]spark-5ab34a58-df8d-..." 
(PathOptionSuite.scala:93)
    org.scalatest.exceptions.TestFailedException:
        at 
org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
        at 
org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
    ...
{code}
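Here the expected value is built with Windows {{\}} separators while the observed value uses {{/}}. One common way to make such assertions separator-agnostic, shown as a sketch and not necessarily the exact fix applied in Spark, is to normalize both sides before comparing:

```java
public class PathCompare {
    // Normalize Windows separators so comparisons work on any platform.
    static String normalize(String path) {
        return path.replace('\\', '/');
    }

    public static void main(String[] args) {
        String expected = normalize("C:\\projects\\spark\\target\\tmp");
        String actual   = normalize("C:/projects/spark/target/tmp");
        System.out.println(expected.equals(actual)); // prints: true
    }
}
```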

- {{SparkLauncherSuite}}

{code}
  Test org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher 
failed: java.lang.AssertionError: expected:<0> but was:<1>, took 0.062 sec
    at 
org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:177)
      ...
{code}

- {{UDFSuite}}
{code}
  - SPARK-8005 input_file_name *** FAILED *** (2 seconds, 234 milliseconds)
    
"file:///C:/projects/spark/target/tmp/spark-e4e5720a-2006-48f9-8b11-797bf59794bf/part-00001-26fb05e4-603d-471d-ae9d-b9549e0c7765.snappy.parquet"
 did not contain 
"C:\projects\spark\target\tmp\spark-e4e5720a-2006-48f9-8b11-797bf59794bf" 
(UDFSuite.scala:67)
    org.scalatest.exceptions.TestFailedException:
      at 
org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
      at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
    ...
{code}
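This failure is a {{file:}} URI (forward slashes) being checked for containment of a raw Windows path (backslashes). A sketch of making the containment check hold on both platforms by stripping the scheme and normalizing separators (illustrative only; the helper name is hypothetical):

```java
public class FileNameCheck {
    // Reduce either a "file:" URI or a native path to a comparable form.
    static String comparable(String p) {
        return p.replace('\\', '/')
                .replaceFirst("^file:/*", "/"); // drop the scheme, keep one leading slash
    }
}
```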

This JIRA will complete SPARK-17591 for now, as it allows me to proceed with 
further tests on Windows.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
