Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18124
I don't think we'd change this, then, if it complicates the code
and is only for a non-standard usage anyway.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user nonsleepr commented on the issue:
https://github.com/apache/spark/pull/18124
@jerryshao You're right, it won't be empty if I run the app using
spark-submit. The whole reason I encountered this bug is that in my project I'm
running a Spark job remotely (via YARN's REST API) f
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18124
I'm not sure how this could happen. `SPARK_YARN_STAGING_DIR` is a
Spark internal environment variable which should not be empty, unless you
deliberately unset it.
In the JIRA you mention
Github user nonsleepr commented on the issue:
https://github.com/apache/spark/pull/18124
@srowen Because `Path` performs multiple checks, e.g. it checks for `null`
and the empty string, and then reports a URI syntax exception on malformed input.
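For context, here is a minimal sketch (illustrative Java, not Hadoop's actual source) of the kind of validation `org.apache.hadoop.fs.Path` applies to its string argument: reject `null`, reject the empty string, then parse the rest as a URI, surfacing bad syntax as an `IllegalArgumentException`:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class PathArgCheck {
    // Mirrors the checks a Hadoop Path constructor performs on a raw
    // string argument (sketch only; class and method names are ours).
    static URI checkPathArg(String path) {
        if (path == null) {
            throw new IllegalArgumentException("Can not create a Path from a null string");
        }
        if (path.isEmpty()) {
            throw new IllegalArgumentException("Can not create a Path from an empty string");
        }
        try {
            return new URI(path);
        } catch (URISyntaxException e) {
            // Malformed input is reported as an IllegalArgumentException
            throw new IllegalArgumentException(e);
        }
    }
}
```

So constructing a `Path` from `System.getenv("SPARK_YARN_STAGING_DIR")` already fails fast when the variable is unset or empty, which is the behavior being relied on here.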
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18124
Can one of the admins verify this patch?
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18124
Why, instead of checking the value of the env variable directly?
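A hedged sketch of the direct check being suggested (illustrative only; the name `isUsable` is ours, not from the patch): guard on the environment variable's value before ever constructing a `Path`:

```java
public class StagingDirGuard {
    // True only when the value is present and non-empty, e.g.
    // isUsable(System.getenv("SPARK_YARN_STAGING_DIR")).
    static boolean isUsable(String value) {
        return value != null && !value.isEmpty();
    }
}
```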
---