Github user mengxr commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9749#discussion_r45118179
  
    --- Diff: mllib/src/test/scala/org/apache/spark/ml/util/DefaultReadWriteTest.scala ---
    @@ -19,23 +19,30 @@ package org.apache.spark.ml.util
     
     import java.io.{File, IOException}
     
    +import org.apache.hadoop.fs.Path
     import org.scalatest.Suite
     
     import org.apache.spark.SparkFunSuite
    +import org.apache.spark.ml.{Model, Estimator}
     import org.apache.spark.ml.param._
     import org.apache.spark.mllib.util.MLlibTestSparkContext
    +import org.apache.spark.sql.DataFrame
     
     trait DefaultReadWriteTest extends TempDirectory { self: Suite =>
     
       /**
        * Checks "overwrite" option and params.
     +   * This saves to and loads from [[tempDir]], but creates a subdirectory with a random name
     +   * in order to avoid conflicts from multiple calls to this method.
        * @param instance ML instance to test saving/loading
        * @tparam T ML instance type
        * @return  Instance loaded from file
        */
       def testDefaultReadWrite[T <: Params with Writable](instance: T): T = {
         val uid = instance.uid
    -    val path = new File(tempDir, uid).getPath
    +    val subdirName = Identifiable.randomUID("test")
    +    val subdir = new Path(tempDir.getPath, subdirName).toString
    --- End diff ---
    
    We don't need to pull in `Path` from Hadoop. This should work:
    
    ~~~
    val subdir = new File(tempDir, subdirName)
    val path = new File(subdir, uid).getPath
    ~~~
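    
    With that change, the relevant lines of `testDefaultReadWrite` would read roughly as follows (a sketch combining the diff above with the suggestion; since `java.io.File` is already imported in this file, the new `org.apache.hadoop.fs.Path` import could then be dropped):
    
    ~~~
    val uid = instance.uid
    // Random subdirectory name to avoid conflicts across multiple calls to this method.
    val subdirName = Identifiable.randomUID("test")
    // Build the path with java.io.File instead of org.apache.hadoop.fs.Path.
    val subdir = new File(tempDir, subdirName)
    val path = new File(subdir, uid).getPath
    ~~~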


