This is an automated email from the ASF dual-hosted git repository.
myui pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-hivemall-site.git
The following commit(s) were added to refs/heads/asf-site by this push:
new e025b11 Updated installation guide for spark
e025b11 is described below
commit e025b11a4f519932633778e6ab013a5688fdbd70
Author: Makoto Yui <[email protected]>
AuthorDate: Sat May 15 02:40:43 2021 +0900
Updated installation guide for spark
---
userguide/spark/getting_started/installation.html | 17 +++-----
userguide/spark/getting_started/installation.md | 49 -----------------------
2 files changed, 6 insertions(+), 60 deletions(-)
diff --git a/userguide/spark/getting_started/installation.html b/userguide/spark/getting_started/installation.html
index 98ab6bd..458c8a3 100644
--- a/userguide/spark/getting_started/installation.html
+++ b/userguide/spark/getting_started/installation.html
@@ -2388,9 +2388,9 @@
-->
<h1 id="prerequisites">Prerequisites</h1>
<ul>
-<li>Spark v2.1 or later</li>
-<li>Java 7 or later</li>
-<li><code>hivemall-spark-xxx-with-dependencies.jar</code> that can be found in <a href="https://www.apache.org/dyn/closer.cgi/incubator/hivemall/" target="_blank">the ASF distribution mirror</a>.</li>
+<li>Spark v2.2 or later</li>
+<li>Java 8 or later</li>
+<li><code>hivemall-all-<version>.jar</code> that can be found in <a href="https://search.maven.org/search?q=a:hivemall-all" target="_blank">Maven central</a> (or use packages built by <code>bin/build.sh</code>).</li>
<li><a href="https://github.com/apache/incubator-hivemall/blob/master/resources/ddl/define-all.spark" target="_blank">define-all.spark</a></li>
</ul>
<h1 id="installation">Installation</h1>
@@ -2399,14 +2399,9 @@
</code></pre><h1 id="installation-via-spark-packages">Installation via <a href="https://spark-packages.org/package/apache-hivemall/apache-hivemall" target="_blank">Spark Packages</a></h1>
<p>Alternatively, you can install Hivemall via the <code>--packages</code> option.</p>
<pre><code>$ spark-shell --packages org.apache.hivemall:hivemall-all:<version>
-</code></pre><p>You find available Hivemall versions on <a href="https://mvnrepository.com/artifact/org.apache.hivemall/hivemall-all/0.5.2-incubating" target="_blank">Maven repository</a>.</p>
-<blockquote>
-<h4 id="notice">Notice</h4>
-<p>If you would like to try Hivemall functions on the latest release of Spark, you just say <code>bin/spark-shell</code> in a Hivemall package.
-This command automatically downloads the latest Spark version, compiles Hivemall for the version, and invokes spark-shell with the compiled Hivemall binary.</p>
-</blockquote>
+</code></pre><p>You can find the available Hivemall versions in the <a href="https://mvnrepository.com/artifact/org.apache.hivemall/hivemall-all/" target="_blank">Maven repository</a>.</p>
<p>Then, you load scripts for Hivemall functions.</p>
-<pre><code>scala> :load resources/ddl/define-all.spark
+<pre><code>scala> :load ~/workspace/incubator-hivemall/resources/ddl/define-all.spark
</code></pre><p><div id="page-footer" class="localized-footer"><hr><!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
@@ -2462,7 +2457,7 @@ Apache Hivemall is an effort undergoing incubation at The Apache Software Founda
<script>
var gitbook = gitbook || [];
gitbook.push(function() {
-            gitbook.page.hasChanged({"page":{"title":"Installation","level":"13.1.1","depth":2,"next":{"title":"Binary Classification","level":"13.2","depth":1,"path":"spark/binaryclass/index.md","ref":"spark/binaryclass/index.md","articles":[{"title":"a9a Tutorial for SQL","level":"13.2.1","depth":2,"path":"spark/binaryclass/a9a_sql.md","ref":"spark/binaryclass/a9a_sql.md","articles":[]}]},"previous":{"title":"Getting Started","level":"13.1","depth":1,"path":"spark/getting_started/READM [...]
+            gitbook.page.hasChanged({"page":{"title":"Installation","level":"13.1.1","depth":2,"next":{"title":"Binary Classification","level":"13.2","depth":1,"path":"spark/binaryclass/index.md","ref":"spark/binaryclass/index.md","articles":[{"title":"a9a Tutorial for SQL","level":"13.2.1","depth":2,"path":"spark/binaryclass/a9a_sql.md","ref":"spark/binaryclass/a9a_sql.md","articles":[]}]},"previous":{"title":"Getting Started","level":"13.1","depth":1,"path":"spark/getting_started/READM [...]
});
</script>
</div>
diff --git a/userguide/spark/getting_started/installation.md b/userguide/spark/getting_started/installation.md
deleted file mode 100644
index 74fc568..0000000
--- a/userguide/spark/getting_started/installation.md
+++ /dev/null
@@ -1,49 +0,0 @@
-<!--
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements. See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership. The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License. You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing,
- software distributed under the License is distributed on an
- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- KIND, either express or implied. See the License for the
- specific language governing permissions and limitations
- under the License.
--->
-
-Prerequisites
-============
-
-* Spark v2.0 or later
-* Java 7 or later
-* hivemall-spark-xxx-with-dependencies.jar
-* [define-all.spark](https://github.com/apache/incubator-hivemall/blob/master/resources/ddl/define-all.spark)
-* [import-packages.spark](https://github.com/apache/incubator-hivemall/blob/master/resources/ddl/import-packages.spark)
-
-Installation
-============
-
-First, you download a compiled Spark package from [the Spark official web page](http://spark.apache.org/downloads.html) and
-invoke spark-shell with a compiled Hivemall binary.
-
-```
-$ ./bin/spark-shell --jars hivemall-spark-xxx-with-dependencies.jar
-```
-
-> #### Notice
-> If you would like to try Hivemall functions on the latest release of Spark, you just say `bin/spark-shell` in a Hivemall package.
-> This command automatically downloads the latest Spark version, compiles Hivemall for the version, and invokes spark-shell with the compiled Hivemall binary.
-
-Then, you load scripts for Hivemall functions.
-
-```
-scala> :load define-all.spark
-scala> :load import-packages.spark
-```
-
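Taken together, the updated guide boils down to two steps: launch spark-shell with the `hivemall-all` artifact from Maven Central, then load the function-definition DDL script. A minimal session sketch follows; the `<version>` string and `$HIVEMALL_HOME` path are placeholders, not values taken from this commit:

```
# Pull Hivemall from Maven Central and register its functions at startup.
# <version> and $HIVEMALL_HOME are placeholders you must fill in;
# spark-shell's -i flag runs the given script before the prompt appears.
spark-shell \
  --packages org.apache.hivemall:hivemall-all:<version> \
  -i "$HIVEMALL_HOME/resources/ddl/define-all.spark"
```

Using `-i` is equivalent to typing `:load $HIVEMALL_HOME/resources/ddl/define-all.spark` at the `scala>` prompt, as the updated HTML page shows.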