This is an automated email from the ASF dual-hosted git repository.

jshao pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/gravitino.git


The following commit(s) were added to refs/heads/main by this push:
     new c7980c50e [Minor] fix(docs): Fix doc format problem of environment variables. (#5421)
c7980c50e is described below

commit c7980c50e762314d6bf76dd14670fd14ba4b42ec
Author: Qi Yu <[email protected]>
AuthorDate: Fri Nov 1 15:11:55 2024 +0800

    [Minor] fix(docs): Fix doc format problem of environment variables. (#5421)
    
    ### What changes were proposed in this pull request?
    
    Quoted environment variables with backticks in the document.
    
    ### Why are the changes needed?
    
    Variables like `${xxx}` should be wrapped in backticks in the document so that they render correctly.
    
    ### Does this PR introduce _any_ user-facing change?
    
    N/A.
    
    ### How was this patch tested?
    
    N/A.
---
 docs/hadoop-catalog.md  | 6 +++---
 docs/how-to-use-gvfs.md | 8 ++++----
 2 files changed, 7 insertions(+), 7 deletions(-)

diff --git a/docs/hadoop-catalog.md b/docs/hadoop-catalog.md
index 46b22f4de..cad11fdc5 100644
--- a/docs/hadoop-catalog.md
+++ b/docs/hadoop-catalog.md
@@ -52,7 +52,7 @@ Apart from the above properties, to access fileset like HDFS, S3, GCS, OSS or cu
 | `s3-access-key-id` | The access key of the AWS S3. | (none) | Yes if it's a S3 fileset. | 0.7.0-incubating |
 | `s3-secret-access-key` | The secret key of the AWS S3. | (none) | Yes if it's a S3 fileset. | 0.7.0-incubating |
 
-At the same time, you need to place the corresponding bundle jar [gravitino-aws-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the directory ${GRAVITINO_HOME}/catalogs/hadoop/libs.
+At the same time, you need to place the corresponding bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the directory `${GRAVITINO_HOME}/catalogs/hadoop/libs`.
 
 #### GCS fileset
 
@@ -62,7 +62,7 @@ At the same time, you need to place the corresponding bundle jar [gravitino-aws-
 | `default-filesystem-provider` | The name default filesystem providers of this Hadoop catalog if users do not specify the scheme in the URI. Default value is `builtin-local`, for GCS, if we set this value, we can omit the prefix 'gs://' in the location. | `builtin-local` | No | 0.7.0-incubating |
 | `gcs-service-account-file` | The path of GCS service account JSON file. | (none) | Yes if it's a GCS fileset. | 0.7.0-incubating |
 
-In the meantime, you need to place the corresponding bundle jar [gravitino-gcp-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/gcp-bundle/) in the directory ${GRAVITINO_HOME}/catalogs/hadoop/libs.
+In the meantime, you need to place the corresponding bundle jar [`gravitino-gcp-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gcp-bundle/) in the directory `${GRAVITINO_HOME}/catalogs/hadoop/libs`.
 
 #### OSS fileset
 
@@ -74,7 +74,7 @@ In the meantime, you need to place the corresponding bundle jar [gravitino-gcp-b
 | `oss-access-key-id` | The access key of the Aliyun OSS. | (none) | Yes if it's a OSS fileset. | 0.7.0-incubating |
 | `oss-secret-access-key` | The secret key of the Aliyun OSS. | (none) | Yes if it's a OSS fileset. | 0.7.0-incubating |
 
-In the meantime, you need to place the corresponding bundle jar [gravitino-aliyun-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/aliyun-bundle/) in the directory ${GRAVITINO_HOME}/catalogs/hadoop/libs.
+In the meantime, you need to place the corresponding bundle jar [`gravitino-aliyun-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aliyun-bundle/) in the directory `${GRAVITINO_HOME}/catalogs/hadoop/libs`.
 
 :::note
 - Gravitino contains builtin file system providers for local file system(`builtin-local`) and HDFS(`builtin-hdfs`), that is to say if `filesystem-providers` is not set, Gravitino will still support local file system and HDFS. Apart from that, you can set the `filesystem-providers` to support other file systems like S3, GCS, OSS or custom file system.
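The jar-placement step that the changed lines above describe can be sketched as follows. This is a hypothetical example, not part of the commit: the install location, version, and the locally present jar file are all assumptions.

```shell
# Sketch of placing the AWS bundle jar for an S3 fileset; all paths are assumptions.
GRAVITINO_HOME="/tmp/gravitino-demo"   # assumed install location; adjust to yours
VERSION="0.7.0-incubating"             # assumed Gravitino version
JAR="gravitino-aws-bundle-${VERSION}.jar"

# Stand-in for the jar previously downloaded from Maven Central.
touch "${JAR}"

# The Hadoop catalog loads extra file system providers from this directory.
mkdir -p "${GRAVITINO_HOME}/catalogs/hadoop/libs"
cp "${JAR}" "${GRAVITINO_HOME}/catalogs/hadoop/libs/"
ls "${GRAVITINO_HOME}/catalogs/hadoop/libs/"
```

After the copy, restarting the Gravitino server would be needed for the catalog to pick up the new jar.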
diff --git a/docs/how-to-use-gvfs.md b/docs/how-to-use-gvfs.md
index e3e03e131..4c1c5b943 100644
--- a/docs/how-to-use-gvfs.md
+++ b/docs/how-to-use-gvfs.md
@@ -78,7 +78,7 @@ Apart from the above properties, to access fileset like S3, GCS, OSS and custom
 | `s3-access-key-id` | The access key of the AWS S3. | (none) | Yes if it's a S3 fileset. | 0.7.0-incubating |
 | `s3-secret-access-key` | The secret key of the AWS S3. | (none) | Yes if it's a S3 fileset. | 0.7.0-incubating |
 
-At the same time, you need to place the corresponding bundle jar [gravitino-aws-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
+At the same time, you need to place the corresponding bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
 
 
 #### GCS fileset
@@ -88,7 +88,7 @@ At the same time, you need to place the corresponding bundle jar [gravitino-aws-
 | `fs.gvfs.filesystem.providers` | The file system providers to add. Set it to `gs` if it's a GCS fileset, or a comma separated string that contains `gs` like `gs,s3` to support multiple kinds of fileset including `gs`. | (none) | Yes if it's a GCS fileset. | 0.7.0-incubating |
 | `gcs-service-account-file` | The path of GCS service account JSON file. | (none) | Yes if it's a GCS fileset. | 0.7.0-incubating |
 
-In the meantime, you need to place the corresponding bundle jar [gravitino-gcp-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/gcp-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
+In the meantime, you need to place the corresponding bundle jar [`gravitino-gcp-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/gcp-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
 
 
 #### OSS fileset
@@ -100,7 +100,7 @@ In the meantime, you need to place the corresponding bundle jar [gravitino-gcp-b
 | `oss-access-key-id` | The access key of the Aliyun OSS. | (none) | Yes if it's a OSS fileset. | 0.7.0-incubating |
 | `oss-secret-access-key` | The secret key of the Aliyun OSS. | (none) | Yes if it's a OSS fileset. | 0.7.0-incubating |
 
-In the meantime, you need to place the corresponding bundle jar [gravitino-aliyun-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/aliyun-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
+In the meantime, you need to place the corresponding bundle jar [`gravitino-aliyun-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aliyun-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`).
 
 #### Custom fileset 
 Since 0.7.0-incubating, users can define their own fileset type and configure the corresponding properties, for more, please refer to [Custom Fileset](./hadoop-catalog.md#how-to-custom-your-own-hcfs-file-system-fileset).
@@ -136,7 +136,7 @@ You can configure these properties in two ways:
    
 :::note
 If you want to access the S3, GCS, OSS or custom fileset through GVFS, apart from the above properties, you need to place the corresponding bundle jar in the Hadoop environment.
-For example if you want to access the S3 fileset, you need to place the S3 bundle jar [gravitino-aws-bundle-{version}.jar](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`) or add it to the classpath.
+For example if you want to access the S3 fileset, you need to place the S3 bundle jar [`gravitino-aws-bundle-${version}.jar`](https://repo1.maven.org/maven2/org/apache/gravitino/aws-bundle/) in the Hadoop environment(typically located in `${HADOOP_HOME}/share/hadoop/common/lib/`) or add it to the classpath.
 :::
 
 2. Configure the properties in the `core-site.xml` file of the Hadoop environment:
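A `core-site.xml` configuration like the one the last context line mentions could look like the sketch below. The property names are taken from the tables in this diff; that they go verbatim into `core-site.xml`, and the placeholder values, are assumptions for illustration only.

```xml
<!-- Hypothetical core-site.xml entries for a GVFS S3 fileset; values are placeholders. -->
<configuration>
  <property>
    <name>fs.gvfs.filesystem.providers</name>
    <value>s3</value>
  </property>
  <property>
    <name>s3-access-key-id</name>
    <value>YOUR_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>s3-secret-access-key</name>
    <value>YOUR_SECRET_ACCESS_KEY</value>
  </property>
</configuration>
```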