HyukjinKwon commented on a change in pull request #32184:
URL: https://github.com/apache/spark/pull/32184#discussion_r692191737



##########
File path: docs/job-scheduling.md
##########
@@ -252,10 +252,11 @@ properties:
 
 The pool properties can be set by creating an XML file, similar to 
`conf/fairscheduler.xml.template`,
 and either putting a file named `fairscheduler.xml` on the classpath, or 
setting `spark.scheduler.allocation.file` property in your
-[SparkConf](configuration.html#spark-properties).
+[SparkConf](configuration.html#spark-properties). The file path can either be 
a local file path or HDFS file path.

Review comment:
       Actually, this line isn't completely true. An unqualified path is treated as a local file path only when the default filesystem (`fs.defaultFS`) uses the `file://` scheme, which is usually not the case in production.
   
   So, if users coming from old Spark versions keep using an unqualified path like `/path/to/file`, the file will be looked up in HDFS (the default filesystem) after the upgrade, not on the local filesystem.
   
   Can we at least update the migration guide? We should also mention that it 
respects Hadoop properties now.
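
   As a hedged illustration of the workaround implied above (the path is illustrative, not from the PR): users who want to keep the old local-file behavior regardless of `fs.defaultFS` could qualify the path with an explicit scheme, e.g. in `spark-defaults.conf`:

   ```
   # Hypothetical example: an explicit file:// scheme pins the allocation
   # file to the local filesystem even when fs.defaultFS points at HDFS.
   spark.scheduler.allocation.file  file:///path/to/fairscheduler.xml
   ```

   An unqualified path would instead be resolved against the default filesystem, which is the behavior change the migration guide should call out.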




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


