Copilot commented on code in PR #7455:
URL: https://github.com/apache/gravitino/pull/7455#discussion_r2160907243


##########
docs/docker-image-details.md:
##########
@@ -399,6 +399,7 @@ Changelog
 - datastrato/gravitino-ci-ranger:0.1.0
   - Docker image `datastrato/gravitino-ci-ranger:0.1.0`
   - Support Apache Ranger 2.4.0
-  - Use environment variable `RANGER_PASSWORD` to set up Apache Ranger admin password, Please notice Apache Ranger Password should be minimum 8 characters with min one alphabet and one numeric.
+  - Use environment variable `RANGER_PASSWORD` to -set up Apache Ranger admin password, Please 

Review Comment:
  [nitpick] Remove the extra hyphen before 'set up' and lowercase 'please' to correct grammar: "...to set up Apache Ranger admin password, please notice..."
   ```suggestion
  - Use environment variable `RANGER_PASSWORD` to set up Apache Ranger admin password, please 
   ```
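As context for the changelog entry under review: the stated rule (minimum 8 characters, at least one letter and one digit) can be pre-checked before launching the container. This is an illustrative sketch, not part of the image's documentation; the `docker run` line and the `valid_ranger_password` helper are assumptions, and the password value is made up.

```shell
# Illustrative only: the image name and the RANGER_PASSWORD variable come from
# the changelog entry; the password value is an example.
#   docker run -d -e RANGER_PASSWORD='gravitino123' datastrato/gravitino-ci-ranger:0.1.0

# Pre-flight check mirroring the stated rule: at least 8 characters,
# at least one letter, and at least one digit.
valid_ranger_password() {
  case "$1" in *[A-Za-z]*) ;; *) return 1 ;; esac   # needs at least one letter
  case "$1" in *[0-9]*)    ;; *) return 1 ;; esac   # needs at least one digit
  [ "${#1}" -ge 8 ]                                 # needs length >= 8
}
```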



##########
docs/fileset-catalog.md:
##########
@@ -1,39 +1,43 @@
 ---
-title: "Hadoop catalog"
-slug: /hadoop-catalog
+title: "Fileset catalog"
+slug: /fileset-catalog
 date: 2024-4-2
-keyword: hadoop catalog
+keyword: fileset catalog
 license: "This software is licensed under the Apache License version 2."
 ---
 
 ## Introduction
 
-Hadoop catalog is a fileset catalog that using Hadoop Compatible File System (HCFS) to manage
-the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 0.7.0-incubating, Gravitino supports [S3](hadoop-catalog-with-s3.md), [GCS](hadoop-catalog-with-gcs.md), [OSS](hadoop-catalog-with-oss.md) and [Azure Blob Storage](hadoop-catalog-with-adls.md) through Hadoop catalog. 
+Fileset catalog is a fileset catalog that using Hadoop Compatible File System (HCFS) to manage
+the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 
+0.7.0-incubating, Gravitino supports [S3](fileset-catalog-with-s3.md), [GCS](fileset-catalog-with-gcs.md),
+[OSS](fileset-catalog-with-oss.md) and [Azure Blob Storage](fileset-catalog-with-adls.md) through Fileset catalog. 
 
-The rest of this document will use HDFS or local file as an example to illustrate how to use the Hadoop catalog. For S3, GCS, OSS and Azure Blob Storage, the configuration is similar to HDFS, please refer to the corresponding document for more details.
+The rest of this document will use HDFS or local file as an example to illustrate how to use the Fileset catalog.
+For S3, GCS, OSS and Azure Blob Storage, the configuration is similar to HDFS, 
+please refer to the corresponding document for more details.
 
-Note that Gravitino uses Hadoop 3 dependencies to build Hadoop catalog. Theoretically, it should be
+Note that Gravitino uses Hadoop 3 dependencies to build Filesest catalog. Theoretically, it should be

Review Comment:
   Typo: 'Filesest' should be 'Fileset'.
   ```suggestion
  Note that Gravitino uses Hadoop 3 dependencies to build Fileset catalog. Theoretically, it should be
   ```
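As a usage aside (not part of the diff): a fileset catalog like the one this doc describes is typically created through Gravitino's REST API. A hedged sketch, assuming a server at `localhost:8090`, a metalake named `metalake_demo`, and the `hadoop` provider string; any of these names may differ in a given deployment, especially after this Hadoop-to-Fileset rename, so verify against the API docs for your Gravitino version.

```shell
# Assumed endpoint, metalake name, provider, and HDFS location; all illustrative.
curl -X POST -H "Content-Type: application/json" \
  http://localhost:8090/api/metalakes/metalake_demo/catalogs \
  -d '{
    "name": "fileset_catalog",
    "type": "FILESET",
    "provider": "hadoop",
    "comment": "HCFS-backed fileset catalog",
    "properties": { "location": "hdfs://namenode:9000/user/gravitino" }
  }'
```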



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
