This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.2
in repository https://gitbox.apache.org/repos/asf/flink-table-store.git


The following commit(s) were added to refs/heads/release-0.2 by this push:
     new ae4fb439 [FLINK-28670] Documentation of spark2 is wrong
ae4fb439 is described below

commit ae4fb439bc2a569557a12876d500d87711992134
Author: Jingsong Lee <[email protected]>
AuthorDate: Mon Jul 25 18:20:09 2022 +0800

    [FLINK-28670] Documentation of spark2 is wrong
    
    This closes #235
---
 docs/content/docs/engines/spark2.md | 22 +++++++---------------
 1 file changed, 7 insertions(+), 15 deletions(-)

diff --git a/docs/content/docs/engines/spark2.md b/docs/content/docs/engines/spark2.md
index d84342d8..ec9593d0 100644
--- a/docs/content/docs/engines/spark2.md
+++ b/docs/content/docs/engines/spark2.md
@@ -48,21 +48,13 @@ spark-sql ... --jars flink-table-store-spark2-{{< version >}}.jar
 
 Alternatively, you can copy `flink-table-store-spark2-{{< version >}}.jar` under `spark/jars` in your Spark installation.
 
-## Create Temporary View
+## Read
 
-Use the `CREATE TEMPORARY VIEW` command to create a Spark mapping table on top of
-an existing Table Store table.
+Table Store with Spark 2.4 does not support DDL; instead, you can use the Dataset reader
+and register the Dataset as a temporary table. In the Spark shell:
 
-```sql
-CREATE TEMPORARY VIEW myTable
-USING tablestore
-OPTIONS (
-  path "file:/tmp/warehouse/default.db/myTable"
-)
-```
-
-## Query Table
-
-```sql
-SELECT * FROM myTable;
+```scala
+val dataset = spark.read.format("tablestore").load("file:/tmp/warehouse/default.db/myTable")
+dataset.createOrReplaceTempView("myTable")
+spark.sql("SELECT * FROM myTable")
 ```

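The spark-shell session added by this commit can be extended with ordinary Spark SQL once the view is registered. A minimal sketch, assuming the `flink-table-store-spark2-{{< version >}}.jar` is on the classpath and reusing the placeholder warehouse path from the docs page (`COUNT(*)` here is an illustrative query, not part of the commit):

```scala
// In spark-shell with flink-table-store-spark2-<version>.jar on the classpath.
// The path below is the docs page's placeholder, not a real table.
val dataset = spark.read
  .format("tablestore")
  .load("file:/tmp/warehouse/default.db/myTable")

// Inspect the schema inferred from the Table Store table before querying.
dataset.printSchema()

// Register the Dataset as a temporary view so it is queryable with Spark SQL.
dataset.createOrReplaceTempView("myTable")

// Standard Spark SQL then works against the view, e.g. filters and aggregates.
spark.sql("SELECT COUNT(*) FROM myTable").show()
```

Because Spark 2.4 support has no DDL path, the temporary view only lives for the duration of the session; it must be re-registered in each new spark-shell.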