This is an automated email from the ASF dual-hosted git repository.

fanng pushed a commit to branch branch-0.8
in repository https://gitbox.apache.org/repos/asf/gravitino.git


The following commit(s) were added to refs/heads/branch-0.8 by this push:
     new 7a3f8ea70c [MINOR] fix(dos): Fix minor error about spelling and web link (#6294)
7a3f8ea70c is described below

commit 7a3f8ea70c68aa2cf2c72f7e74f6f40500d938ef
Author: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
AuthorDate: Thu Jan 16 16:06:32 2025 +0800

    [MINOR] fix(dos): Fix minor error about spelling and web link (#6294)
    
    ### What changes were proposed in this pull request?
    
    Change `Hadop` -> `Hadoop` and fix a link error.
    
    ### Why are the changes needed?
    
    These are spelling mistakes.
    
    ### Does this PR introduce _any_ user-facing change?
    
    N/A
    
    ### How was this patch tested?
    
    N/A
    
    Co-authored-by: Qi Yu <[email protected]>
---
 docs/hadoop-catalog-with-s3.md | 2 +-
 docs/hadoop-catalog.md         | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/hadoop-catalog-with-s3.md b/docs/hadoop-catalog-with-s3.md
index 2c8f8131b5..e5bd4c41f7 100644
--- a/docs/hadoop-catalog-with-s3.md
+++ b/docs/hadoop-catalog-with-s3.md
@@ -446,7 +446,7 @@ In order to access fileset with S3 using the GVFS Python client, apart from [bas
 | `s3_secret_access_key` | The secret key of the AWS S3. | (none) | Yes | 0.7.0-incubating |
 
 :::note
-- `s3_endpoint` is an optional configuration for GVFS **Python** client but a required configuration for GVFS **Java** client to access Hadop with AWS S3, and it is required for other S3-compatible storage services like MinIO.
+- `s3_endpoint` is an optional configuration for GVFS **Python** client but a required configuration for GVFS **Java** client to access Hadoop with AWS S3, and it is required for other S3-compatible storage services like MinIO.
 - If the catalog has enabled [credential vending](security/credential-vending.md), the properties above can be omitted.
 :::
 
diff --git a/docs/hadoop-catalog.md b/docs/hadoop-catalog.md
index abd8dfefb5..978efad90d 100644
--- a/docs/hadoop-catalog.md
+++ b/docs/hadoop-catalog.md
@@ -9,7 +9,7 @@ license: "This software is licensed under the Apache License version 2."
 ## Introduction
 
 Hadoop catalog is a fileset catalog that using Hadoop Compatible File System (HCFS) to manage
-the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 0.7.0-incubating, Gravitino supports [S3](hadoop-catalog-with-S3.md), [GCS](hadoop-catalog-with-gcs.md), [OSS](hadoop-catalog-with-oss.md) and [Azure Blob Storage](hadoop-catalog-with-adls.md) through Hadoop catalog.
+the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 0.7.0-incubating, Gravitino supports [S3](hadoop-catalog-with-s3.md), [GCS](hadoop-catalog-with-gcs.md), [OSS](hadoop-catalog-with-oss.md) and [Azure Blob Storage](hadoop-catalog-with-adls.md) through Hadoop catalog.
 
 The rest of this document will use HDFS or local file as an example to illustrate how to use the Hadoop catalog. For S3, GCS, OSS and Azure Blob Storage, the configuration is similar to HDFS, please refer to the corresponding document for more details.
 
