LadyForest commented on code in PR #210:
URL: https://github.com/apache/flink-table-store/pull/210#discussion_r919649696


##########
docs/content/docs/engines/spark.md:
##########
@@ -41,17 +41,33 @@ Download [flink-table-store-spark-{{< version >}}.jar](https://repo.maven.apache
 If you are using an unreleased version of Table Store, you need to manually [Build Spark Bundled Jar]({{< ref "docs/engines/build" >}}) from the source code.
 {{< /unstable >}}
 
-Copy Table Store Spark bundle jar to `spark/jars`.
+Use `--jars` in spark-sql:
+```bash
+spark-sql ... --jars flink-table-store-spark-{{< version >}}.jar
+```
 
-## Table Store Catalog
+You can also copy `flink-table-store-spark-{{< version >}}.jar` to `spark/jars` in your Spark installation.
+
+## Catalog
 
 The following command registers Table Store's Spark catalog with the name `table_store`:
 
 ```bash
-spark-sql --conf spark.sql.catalog.table_store=org.apache.flink.table.store.spark.SparkCatalog \
+spark-sql ... \
+    --conf spark.sql.catalog.table_store=org.apache.flink.table.store.spark.SparkCatalog \
     --conf spark.sql.catalog.table_store.warehouse=file:/tmp/warehouse
 ```
 
+If you are using the Hive Metastore, you will need to add some configuration:

Review Comment:
   ```suggestion
   Some extra configurations are needed if your Spark application uses the Hive Metastore to manage metadata.
   ```
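To make the suggested sentence concrete, here is a hedged sketch of what those extra configurations could look like. The `metastore` and `uri` catalog option keys and the thrift address are assumptions for illustration, not confirmed by this PR; check the Table Store documentation for the actual keys.

```bash
# Sketch only: the .metastore/.uri option names and the metastore address
# are assumptions, not taken from this PR.
spark-sql ... \
    --conf spark.sql.catalog.table_store=org.apache.flink.table.store.spark.SparkCatalog \
    --conf spark.sql.catalog.table_store.warehouse=file:/tmp/warehouse \
    --conf spark.sql.catalog.table_store.metastore=hive \
    --conf spark.sql.catalog.table_store.uri=thrift://<hive-metastore-host>:9083
```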


