This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/iceberg-docs.git
The following commit(s) were added to refs/heads/asf-site by this push:
new bc9956f1 deploy: b5db4c79b251d7485710aaab56400dbc8096a064
bc9956f1 is described below
commit bc9956f14d76c304089b7bd5836023011d01b433
Author: Fokko <[email protected]>
AuthorDate: Wed Mar 1 13:05:40 2023 +0000
deploy: b5db4c79b251d7485710aaab56400dbc8096a064
---
getting-started/index.html | 20 +-------------------
landingpagesearch.json | 2 +-
spark-quickstart/index.html | 18 +++++++++---------
3 files changed, 11 insertions(+), 29 deletions(-)
diff --git a/getting-started/index.html b/getting-started/index.html
index b3c6e0d8..52cc0335 100644
--- a/getting-started/index.html
+++ b/getting-started/index.html
@@ -1,19 +1 @@
-<!--
- - Licensed to the Apache Software Foundation (ASF) under one or more
- - contributor license agreements. See the NOTICE file distributed with
- - this work for additional information regarding copyright ownership.
- - The ASF licenses this file to You under the Apache License, Version 2.0
- - (the "License"); you may not use this file except in compliance with
- - the License. You may obtain a copy of the License at
- -
- - http://www.apache.org/licenses/LICENSE-2.0
- -
- - Unless required by applicable law or agreed to in writing, software
- - distributed under the License is distributed on an "AS IS" BASIS,
- - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- - See the License for the specific language governing permissions and
- - limitations under the License.
- -->
-<head>
- <meta http-equiv="Refresh" content="0; url='/docs/latest/getting-started'" />
-</head>
+<!doctype html><html
lang=en-us><head><title>https://iceberg.apache.org/spark-quickstart/</title><link
rel=canonical href=https://iceberg.apache.org/spark-quickstart/><meta
name=robots content="noindex"><meta charset=utf-8><meta http-equiv=refresh
content="0; url=https://iceberg.apache.org/spark-quickstart/"></head></html>
\ No newline at end of file
diff --git a/landingpagesearch.json b/landingpagesearch.json
index 5a684abb..2291a012 100644
--- a/landingpagesearch.json
+++ b/landingpagesearch.json
@@ -1 +1 @@
-[{"categories":null,"content":" Spark and Iceberg Quickstart This guide will
get you up and running with an Iceberg and Spark environment, including sample
code to highlight some powerful features. You can learn more about Iceberg’s
Spark runtime by checking out the Spark section.\nDocker-Compose Creating a
table Writing Data to a Table Reading Data from a Table Adding A Catalog Next
Steps Docker-Compose The fastest way to get started is to use a docker-compose
file that uses the tab [...]
\ No newline at end of file
+[{"categories":null,"content":" Spark and Iceberg Quickstart This guide will
get you up and running with an Iceberg and Spark environment, including sample
code to highlight some powerful features. You can learn more about Iceberg’s
Spark runtime by checking out the Spark section.\nDocker-Compose Creating a
table Writing Data to a Table Reading Data from a Table Adding A Catalog Next
Steps Docker-Compose The fastest way to get started is to use a docker-compose
file that uses the tab [...]
\ No newline at end of file
diff --git a/spark-quickstart/index.html b/spark-quickstart/index.html
index c82fe517..b76e38e6 100644
--- a/spark-quickstart/index.html
+++ b/spark-quickstart/index.html
@@ -163,19 +163,19 @@ the <a
href=../docs/latest/spark-configuration/#catalogs>Catalog</a> page in the
</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.extensions<span
style=color:#f92672>=</span>org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
<span style=color:#ae81ff>\
</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.spark_catalog<span
style=color:#f92672>=</span>org.apache.iceberg.spark.SparkSessionCatalog <span
style=color:#ae81ff>\
</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.spark_catalog.type<span
style=color:#f92672>=</span>hive <span style=color:#ae81ff>\
-</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.demo<span
style=color:#f92672>=</span>org.apache.iceberg.spark.SparkCatalog <span
style=color:#ae81ff>\
-</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.demo.type<span
style=color:#f92672>=</span>hadoop <span style=color:#ae81ff>\
-</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.demo.warehouse<span
style=color:#f92672>=</span>$PWD/warehouse <span style=color:#ae81ff>\
-</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.defaultCatalog<span
style=color:#f92672>=</span>demo
+</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.local<span
style=color:#f92672>=</span>org.apache.iceberg.spark.SparkCatalog <span
style=color:#ae81ff>\
+</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.local.type<span
style=color:#f92672>=</span>hadoop <span style=color:#ae81ff>\
+</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.catalog.local.warehouse<span
style=color:#f92672>=</span>$PWD/warehouse <span style=color:#ae81ff>\
+</span></span></span><span style=display:flex><span><span
style=color:#ae81ff></span> --conf spark.sql.defaultCatalog<span
style=color:#f92672>=</span>local
</span></span></code></pre></div></codeblock><codeblock
class=spark-defaults><div class=highlight><pre tabindex=0
style=color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4><code
class=language-sh data-lang=sh><span
style=display:flex><span>spark.jars.packages
org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:1.1.0
</span></span><span style=display:flex><span>spark.sql.extensions
org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
</span></span><span style=display:flex><span>spark.sql.catalog.spark_catalog
org.apache.iceberg.spark.SparkSessionCatalog
</span></span><span
style=display:flex><span>spark.sql.catalog.spark_catalog.type
hive
-</span></span><span style=display:flex><span>spark.sql.catalog.demo
org.apache.iceberg.spark.SparkCatalog
-</span></span><span style=display:flex><span>spark.sql.catalog.demo.type
hadoop
-</span></span><span style=display:flex><span>spark.sql.catalog.demo.warehouse
$PWD/warehouse
-</span></span><span style=display:flex><span>spark.sql.defaultCatalog
demo
-</span></span></code></pre></div></codeblock></div><div class=info>If your
Iceberg catalog is not set as the default catalog, you will have to switch to
it by executing <code>USE demo;</code></div><h3 id=next-steps>Next
steps</h3><h4 id=adding-iceberg-to-spark>Adding Iceberg to Spark</h4><p>If you
already have a Spark environment, you can add Iceberg, using the
<code>--packages</code> option.</p><div class=codetabs><input id=spark-sql
type=radio name=AddIcebergToSpark onclick='selectExam [...]
+</span></span><span style=display:flex><span>spark.sql.catalog.local
org.apache.iceberg.spark.SparkCatalog
+</span></span><span style=display:flex><span>spark.sql.catalog.local.type
hadoop
+</span></span><span style=display:flex><span>spark.sql.catalog.local.warehouse
$PWD/warehouse
+</span></span><span style=display:flex><span>spark.sql.defaultCatalog
local
+</span></span></code></pre></div></codeblock></div><div class=info>If your
Iceberg catalog is not set as the default catalog, you will have to switch to
it by executing <code>USE local;</code></div><h3 id=next-steps>Next
steps</h3><h4 id=adding-iceberg-to-spark>Adding Iceberg to Spark</h4><p>If you
already have a Spark environment, you can add Iceberg, using the
<code>--packages</code> option.</p><div class=codetabs><input id=spark-sql
type=radio name=AddIcebergToSpark onclick='selectExa [...]
<label for=spark-sql>SparkSQL</label>
<input id=spark-shell type=radio name=AddIcebergToSpark
onclick='selectExampleLanguage("spark-queries","spark-shell")'>
<label for=spark-shell>Spark-Shell</label>
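The highlight-span markup in the last hunk is hard to read in its minified form. Reassembled from the `+` lines of that hunk, the spark-defaults configuration the updated page renders is the following (the runtime coordinate `iceberg-spark-runtime-3.2_2.12:1.1.0` and the `local` catalog name are taken verbatim from the diff; column alignment is editorial):

```properties
# Post-change spark-defaults fragment from spark-quickstart/index.html,
# with the catalog renamed from "demo" to "local"
spark.jars.packages                   org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:1.1.0
spark.sql.extensions                  org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql.catalog.spark_catalog       org.apache.iceberg.spark.SparkSessionCatalog
spark.sql.catalog.spark_catalog.type  hive
spark.sql.catalog.local               org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.local.type          hadoop
spark.sql.catalog.local.warehouse     $PWD/warehouse
spark.sql.defaultCatalog              local
```

The matching `spark-sql --conf` invocation in the same hunk carries the identical key/value pairs, and the info box's `USE demo;` hint is likewise updated to `USE local;`.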