[ https://issues.apache.org/jira/browse/HADOOP-18330?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17676735#comment-17676735 ]

ASF GitHub Bot commented on HADOOP-18330:
-----------------------------------------

steveloughran commented on PR #4572:
URL: https://github.com/apache/hadoop/pull/4572#issuecomment-1382267572

   The first release candidate is up for testing. Please download and test it; 
this is the only way to be sure that the actual shipped version does what you 
need. 




> S3AFileSystem removes Path when calling createS3Client
> ------------------------------------------------------
>
>                 Key: HADOOP-18330
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18330
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.3.0, 3.3.1, 3.3.2, 3.3.3
>            Reporter: Ashutosh Pant
>            Assignee: Ashutosh Pant
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 3.3.5
>
>          Time Spent: 3h 50m
>  Remaining Estimate: 0h
>
> When using Hadoop and Spark to read/write data from an S3 bucket such as 
> s3a://bucket/path with a custom credentials provider, the path is removed 
> from the s3a URI, and the credentials provider fails because the full path 
> is gone.
> In Spark 3.2 the client factory was invoked as:
> s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>     .createS3Client(name, bucket, credentials);
> But in Spark 3.3.3 it is invoked as:
> s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
>     .createS3Client(getUri(), parameters);
> getUri() removes the path from the s3a URI.
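
A minimal sketch of the effect described above: the filesystem URI returned
by getUri() carries only the scheme and bucket, so any path component the
user supplied is no longer visible to the credentials provider. The class
name and the root-URI reconstruction below are illustrative, not the actual
S3AFileSystem code.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriPathLossDemo {
    public static void main(String[] args) throws URISyntaxException {
        // The full URI a user might pass when reading/writing data.
        URI full = new URI("s3a://bucket/path/to/data");

        // Rebuild what a filesystem-root URI (scheme + bucket only)
        // looks like -- analogous to what getUri() hands the factory.
        URI fsRoot = new URI(full.getScheme(), full.getHost(), null, null);

        System.out.println(full.getPath());   // /path/to/data
        System.out.println(fsRoot);           // s3a://bucket -- path is gone
    }
}
```

A provider that keys off the object path would see an empty path in the
second case, which matches the failure reported here.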



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
