This is an automated email from the ASF dual-hosted git repository.
fokko pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/iceberg-docs.git
The following commit(s) were added to refs/heads/main by this push:
new b5db4c79 Fixed some naming issues (#201)
b5db4c79 is described below
commit b5db4c79b251d7485710aaab56400dbc8096a064
Author: J·Y <[email protected]>
AuthorDate: Wed Mar 1 21:05:15 2023 +0800
Fixed some naming issues (#201)
---
landing-page/content/common/spark-quickstart.md | 20 ++++++++++----------
1 file changed, 10 insertions(+), 10 deletions(-)
diff --git a/landing-page/content/common/spark-quickstart.md b/landing-page/content/common/spark-quickstart.md
index 76caf009..2047a1d1 100644
--- a/landing-page/content/common/spark-quickstart.md
+++ b/landing-page/content/common/spark-quickstart.md
@@ -294,10 +294,10 @@ spark-sql --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:{{% icebe
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
--conf spark.sql.catalog.spark_catalog=org.apache.iceberg.spark.SparkSessionCatalog \
--conf spark.sql.catalog.spark_catalog.type=hive \
- --conf spark.sql.catalog.demo=org.apache.iceberg.spark.SparkCatalog \
- --conf spark.sql.catalog.demo.type=hadoop \
- --conf spark.sql.catalog.demo.warehouse=$PWD/warehouse \
- --conf spark.sql.defaultCatalog=demo
+ --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
+ --conf spark.sql.catalog.local.type=hadoop \
+ --conf spark.sql.catalog.local.warehouse=$PWD/warehouse \
+ --conf spark.sql.defaultCatalog=local
```
{{% /tabcontent %}}
{{% tabcontent "spark-defaults" %}}
@@ -306,17 +306,17 @@ spark.jars.packages org.apache.iceberg:iceberg-
spark.sql.extensions    org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql.catalog.spark_catalog    org.apache.iceberg.spark.SparkSessionCatalog
spark.sql.catalog.spark_catalog.type hive
-spark.sql.catalog.demo    org.apache.iceberg.spark.SparkCatalog
-spark.sql.catalog.demo.type hadoop
-spark.sql.catalog.demo.warehouse $PWD/warehouse
-spark.sql.defaultCatalog demo
+spark.sql.catalog.local    org.apache.iceberg.spark.SparkCatalog
+spark.sql.catalog.local.type hadoop
+spark.sql.catalog.local.warehouse $PWD/warehouse
+spark.sql.defaultCatalog local
```
{{% /tabcontent %}}
{{% /codetabs %}}
{{< hint info >}}
-If your Iceberg catalog is not set as the default catalog, you will have to switch to it by executing `USE demo;`
+If your Iceberg catalog is not set as the default catalog, you will have to switch to it by executing `USE local;`
{{< /hint >}}
### Next steps
@@ -355,4 +355,4 @@ You can download the runtime by visiting to the [Releases](https://iceberg.apach
#### Learn More
-Now that you're up an running with Iceberg and Spark, check out the [Iceberg-Spark docs](../docs/latest/spark-ddl/) to learn more!
\ No newline at end of file
+Now that you're up an running with Iceberg and Spark, check out the [Iceberg-Spark docs](../docs/latest/spark-ddl/) to learn more!