[ https://issues.apache.org/jira/browse/SPARK-43235?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17720979#comment-17720979 ]
Pralabh Kumar commented on SPARK-43235:
---------------------------------------

Can anyone please look into this? If it looks OK, I can create a PR for it.

> ClientDistributedCacheManager doesn't set the LocalResourceVisibility.PRIVATE
> if isPublic throws exception
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-43235
>                 URL: https://issues.apache.org/jira/browse/SPARK-43235
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: Pralabh Kumar
>            Priority: Minor
>
> Hi Spark team,
>
> Currently the *ClientDistributedCacheManager* *getVisibility* method checks
> whether a resource's visibility should be set to public or private.
>
> To set *LocalResourceVisibility.PUBLIC*, isPublic checks the execute
> permission of every ancestor directory of the resource, walking all the way
> up to the root (ancestorsHaveExecutePermissions). For each ancestor,
> checkPermissionOfOther calls getFileStatus to check the permissions.
>
> If getFileStatus throws an exception, spark-submit fails: the visibility is
> never set to PRIVATE.
>
>   if (isPublic(conf, uri, statCache)) {
>     LocalResourceVisibility.PUBLIC
>   } else {
>     LocalResourceVisibility.PRIVATE
>   }
>
> In particular, if the user doesn't have permission to check the root folder,
> which happens specifically on cloud file systems such as GCS at the bucket
> level, the method throws an IOException ("Error accessing Bucket").
>
> *Ideally, if isPublic throws an error, meaning Spark cannot determine the
> execute permission of all the parent directories, it should fall back to
> LocalResourceVisibility.PRIVATE. Instead, the exception currently propagates
> out of isPublic, and hence spark-submit fails.*

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
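The fallback proposed above can be sketched as follows. This is a minimal, self-contained illustration in Java (the actual Spark code is Scala): the enum, the isPublic stub, and the GCS failure path are simplified stand-ins for the real Hadoop/YARN types and for Spark's ClientDistributedCacheManager, not the actual implementation.

```java
import java.io.IOException;

public class VisibilitySketch {

    // Stand-in for org.apache.hadoop.yarn.api.records.LocalResourceVisibility.
    enum LocalResourceVisibility { PUBLIC, PRIVATE }

    // Stand-in for the isPublic check: in Spark it walks the execute
    // permission of every ancestor directory and may throw IOException
    // (e.g. "Error accessing Bucket" when a GCS bucket is not readable).
    static boolean isPublic(String uri) throws IOException {
        if (uri.startsWith("gs://")) {
            throw new IOException("Error accessing Bucket");
        }
        return uri.startsWith("/public/");
    }

    // Proposed behavior: if public visibility cannot be determined,
    // fall back to PRIVATE instead of failing spark-submit.
    static LocalResourceVisibility getVisibility(String uri) {
        try {
            return isPublic(uri)
                ? LocalResourceVisibility.PUBLIC
                : LocalResourceVisibility.PRIVATE;
        } catch (IOException e) {
            // Cannot verify execute permission of all ancestors,
            // so be conservative and localize the resource privately.
            return LocalResourceVisibility.PRIVATE;
        }
    }

    public static void main(String[] args) {
        System.out.println(getVisibility("/public/app.jar"));    // PUBLIC
        System.out.println(getVisibility("/home/user/app.jar")); // PRIVATE
        // Previously a spark-submit failure; now a safe fallback:
        System.out.println(getVisibility("gs://bucket/app.jar")); // PRIVATE
    }
}
```

The key design choice is that the catch clause treats "cannot determine" the same as "not public": a PRIVATE resource is always safe (it is only shared with the submitting user), whereas failing the whole submission over an unreadable ancestor directory is needlessly strict.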