jerqi opened a new issue, #9506:
URL: https://github.com/apache/gravitino/issues/9506

   ### Version
   
   main branch
   
   ### Describe what's wrong
   
   If a fileset has multiple locations, the Gravitino server can't provide the 
correct credential vending token: it only vends the credential for the default 
location, while the client may need a credential for another location.
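   To make the mismatch concrete, here is a minimal, hypothetical sketch (the 
class and method names below are illustrative, not Gravitino's actual API): the 
server keeps a credential per location but always vends the default one, so a 
client accessing `location1` presents the wrong token and S3 answers 403.

   ```java
import java.util.Map;

// Hypothetical sketch of the bug; all names here are illustrative,
// not Gravitino's real API.
public class VendingSketch {

  // The server knows one credential per storage location.
  static final Map<String, String> CREDENTIALS_BY_LOCATION = Map.of(
      "default", "token-for-default-location",
      "location1", "token-for-location1");

  static final String DEFAULT_LOCATION_NAME = "default";

  // Current (buggy) behavior: the requested location name is ignored and
  // the default location's credential is always returned.
  static String vendCredentialBuggy(String requestedLocationName) {
    return CREDENTIALS_BY_LOCATION.get(DEFAULT_LOCATION_NAME);
  }

  // Desired behavior: vend the credential for the location the client is
  // actually using, falling back to the default location's credential.
  static String vendCredentialFixed(String requestedLocationName) {
    return CREDENTIALS_BY_LOCATION.getOrDefault(
        requestedLocationName,
        CREDENTIALS_BY_LOCATION.get(DEFAULT_LOCATION_NAME));
  }

  public static void main(String[] args) {
    // A client working against "location1" gets the wrong token today,
    // which S3 rejects with 403 Forbidden.
    System.out.println(vendCredentialBuggy("location1"));  // token-for-default-location
    System.out.println(vendCredentialFixed("location1"));  // token-for-location1
  }
}
```

   A fix along these lines would look up the credential by the location name the 
client reports (e.g. the GVFS `FS_GRAVITINO_CURRENT_LOCATION_NAME` setting) 
instead of always using the default location.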
   
   ### Error message and/or stacktrace
   
   s3a://iceberg-test-strato/test_location_selector_5d2a9724_1: getFileStatus on 
s3a://iceberg-test-strato/test_location_selector_5d2a9724_1: 
com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon 
S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 2B89081TTBGW4404; 
S3 Extended Request ID: 
4Z1rA+drsnRqUZcY0Qxe/OIKzwmIttJEKZXN+fdKShdj+fJgxRTbQVgEXF7n/wYNGuMYOI+VTkNYYZqnHBCCDMJBTpxeVtOFgHtFPV7hJ8I=;
 Proxy: null), S3 Extended Request ID: 
4Z1rA+drsnRqUZcY0Qxe/OIKzwmIttJEKZXN+fdKShdj+fJgxRTbQVgEXF7n/wYNGuMYOI+VTkNYYZqnHBCCDMJBTpxeVtOFgHtFPV7hJ8I=:403
 Forbidden
   java.nio.file.AccessDeniedException: 
s3a://iceberg-test-strato/test_location_selector_5d2a9724_1: getFileStatus on 
s3a://iceberg-test-strato/test_location_selector_5d2a9724_1: 
com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon 
S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 2B89081TTBGW4404; 
S3 Extended Request ID: 
4Z1rA+drsnRqUZcY0Qxe/OIKzwmIttJEKZXN+fdKShdj+fJgxRTbQVgEXF7n/wYNGuMYOI+VTkNYYZqnHBCCDMJBTpxeVtOFgHtFPV7hJ8I=;
 Proxy: null), S3 Extended Request ID: 
4Z1rA+drsnRqUZcY0Qxe/OIKzwmIttJEKZXN+fdKShdj+fJgxRTbQVgEXF7n/wYNGuMYOI+VTkNYYZqnHBCCDMJBTpxeVtOFgHtFPV7hJ8I=:403
 Forbidden
        at 
org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:249)
        at 
org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:170)
        at 
org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3286)
        at 
org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3185)
        at 
org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:3053)
        at 
org.apache.gravitino.filesystem.hadoop.DefaultGVFSOperations.getFileStatus(DefaultGVFSOperations.java:152)
        at 
org.apache.gravitino.filesystem.hadoop.GravitinoVirtualFileSystem.lambda$getFileStatus$5(GravitinoVirtualFileSystem.java:240)
        at org.apache.gravitino.file
   
   ### How to reproduce
   
   You can see the existing test case:
   ```java
  @Test
  public void testCurrentLocationName() throws IOException {
    // create a fileset with multiple locations
    String filesetName = GravitinoITUtils.genRandomName("test_location_selector");
    NameIdentifier filesetIdent = NameIdentifier.of(schemaName, filesetName);
    Catalog catalog = metalake.loadCatalog(catalogName);
    String defaultStorageLocation = genStorageLocation(filesetName);
    String storageLocation1 = genStorageLocation(filesetName + "_1");
    String locationName1 = "location1";
    catalog
        .asFilesetCatalog()
        .createMultipleLocationFileset(
            filesetIdent,
            "fileset comment",
            Fileset.Type.MANAGED,
            ImmutableMap.of(
                LOCATION_NAME_UNKNOWN, defaultStorageLocation, locationName1, storageLocation1),
            ImmutableMap.of(PROPERTY_DEFAULT_LOCATION_NAME, LOCATION_NAME_UNKNOWN));
    Assertions.assertTrue(catalog.asFilesetCatalog().filesetExists(filesetIdent));

    // set location1 as the current location
    Configuration configuration = new Configuration(conf);
    configuration.set(
        GravitinoVirtualFileSystemConfiguration.FS_GRAVITINO_CURRENT_LOCATION_NAME, locationName1);

    Path hdfsPath1 = new Path(storageLocation1);
    try (FileSystem fs =
        hdfsPath1.getFileSystem(convertGvfsConfigToRealFileSystemConfig(configuration))) {
      Path gvfsPath = genGvfsPath(filesetName);
      try (FileSystem gvfs = gvfsPath.getFileSystem(configuration)) {
        if (!gvfs.exists(gvfsPath)) {
          gvfs.mkdirs(gvfsPath);
        }
        String fileName = "test.txt";
        Path createPath = new Path(gvfsPath + "/" + fileName);
        // GCS needs the stream to be closed to actually create the file.
        gvfs.create(createPath).close();

        Assertions.assertTrue(gvfs.exists(createPath));
        Assertions.assertTrue(fs.exists(new Path(storageLocation1 + "/" + fileName)));
        Assertions.assertFalse(fs.exists(new Path(defaultStorageLocation + "/" + fileName)));
      }
    }

    catalog.asFilesetCatalog().dropFileset(filesetIdent);
  }
   ```
   After enabling credential vending and removing the AK/SK influence (I have 
drafted a PR for this), the test fails with the same 403 Forbidden 
`AccessDeniedException` shown in the stacktrace section above.
   
   ### Additional context
   
   _No response_


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
