LuciferYang commented on code in PR #40945:
URL: https://github.com/apache/spark/pull/40945#discussion_r1176397320


##########
core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:
##########
@@ -623,15 +623,13 @@ private[spark] object SparkHadoopUtil extends Logging {
       fs.create(path)
     } else {
       try {
-        // Use reflection as this uses APIs only available in Hadoop 3
-        val builderMethod = fs.getClass().getMethod("createFile", classOf[Path])
         // the builder api does not resolve relative paths, nor does it create parent dirs, while
         // the old api does.
         if (!fs.mkdirs(path.getParent())) {
           throw new IOException(s"Failed to create parents of $path")
         }
         val qualifiedPath = fs.makeQualified(path)
-        val builder = builderMethod.invoke(fs, qualifiedPath)
+        val builder = fs.createFile(qualifiedPath)
         val builderCls = builder.getClass()
         // this may throw a NoSuchMethodException if the path is not on hdfs
         val replicateMethod = builderCls.getMethod("replicate")
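The remaining `builderCls.getMethod("replicate")` call is a reflection-based capability probe: try to look a method up by name, and fall back when it does not exist. A minimal, self-contained sketch of that pattern, using hypothetical `Widget`/`Gadget` stand-ins rather than the real Hadoop builder classes:

```scala
// `Widget` and `Gadget` are hypothetical stand-ins, not Hadoop classes.
// Only `Widget` actually defines replicate().
class Widget { def replicate(): String = "replicated" }
class Gadget

def probeReplicate(obj: AnyRef): String =
  try {
    // Mirrors `builderCls.getMethod("replicate")` from the diff:
    // resolve the method reflectively, then invoke it.
    obj.getClass.getMethod("replicate").invoke(obj).asInstanceOf[String]
  } catch {
    // Absent method means the capability is unsupported; fall back.
    case _: NoSuchMethodException => "fallback"
  }
```

This is the same idiom the surrounding code relies on when the path is not on HDFS: the lookup throws `NoSuchMethodException` and the caller takes the non-replicated path instead.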

Review Comment:
   I originally intended to fix it as follows:
   ```scala
   fs.createFile(qualifiedPath) match {
     case hb: HdfsDataOutputStreamBuilder => hb.replicate().build()
     case _ => fs.create(path)
   }
   ```
   But I'm not sure whether there are other `FSDataOutputStreamBuilder` subclasses that also support the `replicate` method. Do you have any better suggestions? @pan3793 @sunchao 
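The proposed type-test fallback can be sketched with stand-in classes (hypothetical names, not the real `org.apache.hadoop` builder hierarchy):

```scala
// Hypothetical stand-ins for the Hadoop builder hierarchy,
// used only to illustrate the match-with-fallback shape.
class FSDataOutputStreamBuilder { def build(): String = "plain stream" }

class HdfsDataOutputStreamBuilder extends FSDataOutputStreamBuilder {
  // Only the HDFS-flavored builder exposes replicate().
  def replicate(): this.type = this
  override def build(): String = "replicated hdfs stream"
}

// Use replicate() when the builder is the HDFS variant; otherwise
// take the plain create path (stubbed here as a string).
def openFor(builder: FSDataOutputStreamBuilder): String = builder match {
  case hb: HdfsDataOutputStreamBuilder => hb.replicate().build()
  case _ => "fallback create"
}
```

The trade-off the comment raises is real: a static type test only covers the subclasses named in the match, whereas the reflective `getMethod("replicate")` probe works for any builder that happens to define the method.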



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
