yuqi1129 commented on code in PR #5806:
URL: https://github.com/apache/gravitino/pull/5806#discussion_r1898259764


##########
docs/how-to-use-gvfs.md:
##########
@@ -77,16 +77,15 @@ Apart from the above properties, to access fileset like S3, GCS, OSS and custom
 | `s3-access-key-id`             | The access key of the AWS S3.               | (none)        | Yes if it's a S3 fileset.| 0.7.0-incubating |
 | `s3-secret-access-key`         | The secret key of the AWS S3.               | (none)        | Yes if it's a S3 fileset.| 0.7.0-incubating |
 
-At the same time, you need to place the corresponding bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
-
+At the same time, you need to place the corresponding bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gravitino-aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
 
 #### GCS fileset
 
 | Configuration item             | Description                                 | Default value | Required                  | Since version    |
 |--------------------------------|---------------------------------------------|---------------|---------------------------|------------------|
 | `gcs-service-account-file`     | The path of GCS service account JSON file.  | (none)        | Yes if it's a GCS fileset.| 0.7.0-incubating |
 
-In the meantime, you need to place the corresponding bundle jar [`gravitino-gcp-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gcp-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
+In the meantime, you need to place the corresponding bundle jar [`gravitino-gcp-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gravitino-gcp-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).

Review Comment:
   done
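
For readers following this review, a minimal sketch of how the properties in these tables plug into a GVFS client, assuming the relevant bundle jar is already on the classpath. The server URI, metalake, catalog, and fileset names below are placeholders, not values from this PR:

```java
// Minimal sketch: GVFS client configuration for an S3-backed fileset.
// Assumes gravitino-aws-bundle-${version}.jar is on the classpath; the
// URI, metalake, and fileset path are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class GvfsS3Sketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.AbstractFileSystem.gvfs.impl", "org.apache.gravitino.filesystem.hadoop.Gvfs");
    conf.set("fs.gvfs.impl", "org.apache.gravitino.filesystem.hadoop.GravitinoVirtualFileSystem");
    conf.set("fs.gravitino.server.uri", "http://localhost:8090"); // placeholder
    conf.set("fs.gravitino.client.metalake", "my_metalake");      // placeholder
    // Credential properties from the S3 table above; for a GCS fileset,
    // set gcs-service-account-file instead.
    conf.set("s3-access-key-id", "<access-key>");
    conf.set("s3-secret-access-key", "<secret-key>");

    Path fileset = new Path("gvfs://fileset/s3_catalog/my_schema/my_fileset");
    FileSystem fs = fileset.getFileSystem(conf);
    System.out.println("fileset exists: " + fs.exists(fileset));
  }
}
```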



##########
docs/how-to-use-gvfs.md:
##########
@@ -137,8 +137,13 @@ You can configure these properties in two ways:
     ```
    
 :::note
-If you want to access the S3, GCS, OSS or custom fileset through GVFS, apart from the above properties, you need to place the corresponding bundle jar in the Hadoop environment. 
-For example if you want to access the S3 fileset, you need to place the S3 bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`) or add it to the classpath. 
+If you want to access the S3, GCS, OSS or custom fileset through GVFS, apart from the above properties, you need to place the corresponding bundle jars in the Hadoop environment. 
+For example, if you want to access the S3 fileset, you need to place
+1. The S3 hadoop bundle jar [`gravitino-aws-bundle-${gravitino-version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gravitino-aws-bundle/)

Review Comment:
   done
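
As a quick way to confirm that the jar placement described in this note actually took effect, here is a hedged sanity check. The probe class assumes the AWS bundle repackages hadoop-aws, which may not hold for every bundle version:

```java
// Hedged sanity check: fail fast if the AWS bundle jar is missing from the
// classpath. The probe class name assumes the bundle includes hadoop-aws;
// adjust it if the bundle layout differs in your version.
public class BundleCheck {
  public static void main(String[] args) {
    try {
      Class.forName("org.apache.hadoop.fs.s3a.S3AFileSystem");
      System.out.println("AWS bundle classes found on the classpath.");
    } catch (ClassNotFoundException e) {
      System.err.println("AWS bundle not found: copy gravitino-aws-bundle-${version}.jar"
          + " into ${HADOOP_HOME}/share/hadoop/common/lib/ or add it to the classpath.");
    }
  }
}
```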


