Repository: incubator-hivemall
Updated Branches:
  refs/heads/master e8a182ce4 -> 10e7d450f (forced update)


Fixed userguide for Docker/Spark entry


Project: http://git-wip-us.apache.org/repos/asf/incubator-hivemall/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-hivemall/commit/10e7d450
Tree: http://git-wip-us.apache.org/repos/asf/incubator-hivemall/tree/10e7d450
Diff: http://git-wip-us.apache.org/repos/asf/incubator-hivemall/diff/10e7d450

Branch: refs/heads/master
Commit: 10e7d450fa8257efc5d614957fda514b2b91fdee
Parents: 68f6b46
Author: myui <[email protected]>
Authored: Wed May 17 11:58:19 2017 -0400
Committer: myui <[email protected]>
Committed: Thu May 18 05:02:10 2017 -0400

----------------------------------------------------------------------
 docs/gitbook/docker/getting_started.md           | 19 +++++++++++++++++++
 docs/gitbook/spark/binaryclass/a9a_df.md         |  2 +-
 .../spark/getting_started/installation.md        |  4 ++--
 3 files changed, 22 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-hivemall/blob/10e7d450/docs/gitbook/docker/getting_started.md
----------------------------------------------------------------------
diff --git a/docs/gitbook/docker/getting_started.md b/docs/gitbook/docker/getting_started.md
index 0944753..810e5d8 100644
--- a/docs/gitbook/docker/getting_started.md
+++ b/docs/gitbook/docker/getting_started.md
@@ -39,6 +39,9 @@ This page introduces how to run Hivemall on Docker.
   
   `docker build -f resources/docker/Dockerfile .`
 
+> #### Note
+> You can [skip](./getting_started.html#running-pre-built-docker-image-in-dockerhub) building images by using existing Docker images.
+
 # 2. Run container
 
 ## Run by docker-compose
@@ -52,11 +55,27 @@ This page introduces how to run Hivemall on Docker.
  2. Run `docker run -it ${docker_image_id}`. 
     Refer to the [Docker reference](https://docs.docker.com/engine/reference/run/) for command details.
 
+## Running pre-built Docker image in Dockerhub
+
+  1. Check [the latest tag](https://hub.docker.com/r/hivemall/latest/tags/) first.
+  2. Pull the pre-built Docker image from Docker Hub: `docker pull hivemall/latest:20170517`
+  3. `docker run -p 8088:8088 -p 50070:50070 -p 19888:19888 -it hivemall/latest:20170517`
+
+You can find pre-built Hivemall Docker images in [this repository](https://hub.docker.com/r/hivemall/latest/).
+
 # 3. Run Hivemall on Docker
 
   1. Type `hive` to run (`.hiverc` automatically loads Hivemall functions)
   2. Try your Hivemall queries!
 
+## Accessing Hadoop management GUIs
+
+* YARN http://localhost:8088/
+* HDFS http://localhost:50070/
+* MR jobhistory server http://localhost:19888/
+
+Note that you need to expose the local ports, e.g., by `-p 8088:8088 -p 50070:50070 -p 19888:19888`, when running the Docker image.
+
 ## Load data into HDFS (optional)
 
  You can find an example script to load data into HDFS in `./bin/prepare_iris.sh`.

http://git-wip-us.apache.org/repos/asf/incubator-hivemall/blob/10e7d450/docs/gitbook/spark/binaryclass/a9a_df.md
----------------------------------------------------------------------
diff --git a/docs/gitbook/spark/binaryclass/a9a_df.md b/docs/gitbook/spark/binaryclass/a9a_df.md
index 7c3de67..74f2705 100644
--- a/docs/gitbook/spark/binaryclass/a9a_df.md
+++ b/docs/gitbook/spark/binaryclass/a9a_df.md
@@ -50,7 +50,7 @@ val testDf = spark.read.format("libsvm").load("a9a.t")
   .select($"rowid", $"label".as("target"), $"feature", $"weight".as("value"))
   .cache
 
-scala> df.printSchema
+scala> testDf.printSchema
 root
  |-- rowid: string (nullable = true)
  |-- target: float (nullable = true)

http://git-wip-us.apache.org/repos/asf/incubator-hivemall/blob/10e7d450/docs/gitbook/spark/getting_started/installation.md
----------------------------------------------------------------------
diff --git a/docs/gitbook/spark/getting_started/installation.md b/docs/gitbook/spark/getting_started/installation.md
index 74fc568..2eb6cde 100644
--- a/docs/gitbook/spark/getting_started/installation.md
+++ b/docs/gitbook/spark/getting_started/installation.md
@@ -43,7 +43,7 @@ $ ./bin/spark-shell --jars hivemall-spark-xxx-with-dependencies.jar
 Then, load the scripts that define Hivemall functions.
 
 ```
-scala> :load define-all.spark
-scala> :load import-packages.spark
+scala> :load resources/ddl/define-all.spark
+scala> :load resources/ddl/import-packages.spark
 ```
 
