This is an automated email from the ASF dual-hosted git repository.

yuqi4733 pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/gravitino-playground.git


The following commit(s) were added to refs/heads/main by this push:
     new b9d6d78  [MINOR] fix minor docs (#70)
b9d6d78 is described below

commit b9d6d78f70e134c91f9dfc6c1a88efc9e7802113
Author: roryqi <[email protected]>
AuthorDate: Thu Aug 29 16:24:51 2024 +0800

    [MINOR] fix minor docs (#70)
---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index d0f6725..1793314 100644
--- a/README.md
+++ b/README.md
@@ -63,7 +63,7 @@ the full functionality of the playground.
 
 ### Using Trino CLI in Docker Container
 
-1. Log in to the Gravitino playground Trino Docker container using the following command:
+1. Login to the Gravitino playground Trino Docker container using the following command:
 
 ```shell
 docker exec -it playground-trino bash
@@ -85,7 +85,7 @@ trino@container_id:/$ trino
 
 ## Using Spark client
 
-1. Log in to the Gravitino playground Spark Docker container using the following command:
+1. Login to the Gravitino playground Spark Docker container using the following command:
 
 ```shell
 docker exec -it playground-spark bash
@@ -174,7 +174,7 @@ GROUP BY e.employee_id,  given_name, family_name;
 
 You might consider generating data with SparkSQL and then querying this data using Trino. Give it a try with Gravitino:
 
-1. login Spark container and execute the SQLs:
+1. Login Spark container and execute the SQLs:
 
 ```sql
 // using Hive catalog to create Hive table
 INSERT OVERWRITE TABLE employees PARTITION(department='Engineering') VALUES (1,
 INSERT OVERWRITE TABLE employees PARTITION(department='Marketing') VALUES (3, 'Mike Brown', 32);
 ```
 
-2. login Trino container and execute SQLs:
+2. Login Trino container and execute SQLs:
 
 ```sql
 SELECT * FROM catalog_hive.product.employees WHERE department = 'Engineering';
