Author: lidong
Date: Fri Aug 10 14:09:45 2018
New Revision: 1837805

URL: http://svn.apache.org/viewvc?rev=1837805&view=rev
Log:
update superset tutorial

Modified:
    kylin/site/cn/docs/tutorial/cube_spark.html
    kylin/site/cn/docs/tutorial/cube_streaming.html
    kylin/site/cn/docs/tutorial/superset.html
    kylin/site/cn/docs23/tutorial/cube_spark.html
    kylin/site/cn/docs23/tutorial/cube_streaming.html
    kylin/site/docs/tutorial/cube_streaming.html
    kylin/site/docs/tutorial/superset.html
    kylin/site/docs16/tutorial/cube_streaming.html
    kylin/site/docs20/tutorial/cube_streaming.html
    kylin/site/docs21/tutorial/cube_streaming.html
    kylin/site/docs23/tutorial/cube_streaming.html
    kylin/site/feed.xml

Modified: kylin/site/cn/docs/tutorial/cube_spark.html
URL: 
http://svn.apache.org/viewvc/kylin/site/cn/docs/tutorial/cube_spark.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/cn/docs/tutorial/cube_spark.html (original)
+++ kylin/site/cn/docs/tutorial/cube_spark.html Fri Aug 10 14:09:45 2018
@@ -276,7 +276,7 @@ $KYLIN_HOME/bin/kylin.sh start</code></p
 
 <p>When an error occurs, you can check “logs/kylin.log” first. It contains all the Spark commands that Kylin executes, for example:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">2017-03-06 14:44:38,574 INFO  [Job 
2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : 
cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf 
&amp;&amp; /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit 
--class org.apache.kylin.common.util.SparkEntry  --conf 
spark.executor.instances=1  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml --jars /usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,/usr/local/apache-kylin-2.1.0-bin-hbase1x/lib/kylin-job-2.1.0.jar
 -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.1.0-bin-hbase1x/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">2017-03-06 14:44:38,574 INFO  [Job 
2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : 
cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf 
&amp;&amp; /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit 
--class org.apache.kylin.common.util.SparkEntry  --conf 
spark.executor.instances=1  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml --jars /usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,/usr/local/apache-kylin-2.1.0-bin-hbase1x/lib/kylin-job-2.1.0.jar
 -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.1.0-bin-hbase1x/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube</code></pre></div>
 
 <p>You can copy the cmd and run it manually in a shell to tune the parameters quickly; while it is running, you can check the Yarn Resource Manager for more information. If the job has already finished, you can check the history in the Spark history server.</p>
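
For example, a minimal sketch of that copy-and-tune loop, assuming $KYLIN_HOME points at the Kylin install and that kylin.log contains a “cmd:export …” line like the one above (the memory values are only illustrative):

    # Pull the most recent spark-submit command that Kylin logged and strip the "cmd:" prefix
    grep -o 'cmd:export .*' $KYLIN_HOME/logs/kylin.log | tail -1 | sed 's/^cmd://' > /tmp/kylin-spark-cmd.sh
    # Tune one parameter by hand, e.g. give each executor more memory, then re-run it from the shell
    sed -i 's/spark.executor.memory=1G/spark.executor.memory=4G/' /tmp/kylin-spark-cmd.sh
    sh /tmp/kylin-spark-cmd.sh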
 

Modified: kylin/site/cn/docs/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/cn/docs/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/cn/docs/tutorial/cube_streaming.html (original)
+++ kylin/site/cn/docs/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -176,7 +176,7 @@ var _hmt = _hmt || [];
 <h2 id="kafka-01000--kylin">安装 Kafka 0.10.0.0 和 Kylin</h2>
 <p>不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果å…
¶è¿è¡Œç€è¯·å…ˆåœæŽ‰ã€‚</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -282,7 +282,7 @@ bin/kafka-console-consumer.sh --bootstra
 
 <p>You can trigger a build from the web GUI by clicking “Actions” -&gt; “Build”, or send a request to the Kylin RESTful API with a ‘curl’ command:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
 
 <p>Please note that the API endpoint is different from that of a normal cube (this URL ends with “build2”).</p>
 
@@ -292,7 +292,7 @@ bin/kafka-console-consumer.sh --bootstra
 
 <h2 id="insight--sql-">点击 “Insight” 标签,编写 SQL 
运行,例如:</h2>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">select minute_start,count(*),sum(amount),sum(qty) from 
streaming_sales_table group by minute_start order by 
minute_start</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">select minute_start, count(*), sum(amount), sum(qty) from 
streaming_sales_table group by minute_start order by 
minute_start</code></pre></div>
 
 <p>The result is as follows.<br />
 <img src="/images/tutorial/1.6/Kylin-Cube-Streaming-Tutorial/13_Query_result.png" alt="" /></p>
@@ -302,7 +302,7 @@ bin/kafka-console-consumer.sh --bootstra
 <p>Once the first build and query succeed, you can schedule incremental builds at a fixed frequency. Kylin records the offsets of each build; when it receives a build request, it starts from the end position of the last build and fetches the latest offsets from Kafka. With the REST API you can trigger it with any scheduling tool, such as Linux cron:</p>
 
 <div class="highlight"><pre><code class="language-groff" 
data-lang="groff">crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
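
The crontab entry above can also point at a small wrapper script so each trigger’s output gets logged; a minimal sketch, assuming the same ADMIN:KYLIN credentials and a placeholder cube name (the script and log paths are only illustrative):

    #!/bin/sh
    # Save as /usr/local/bin/kylin-build-trigger.sh and chmod +x it.
    # Asks Kylin to build the next streaming segment; the offsets mirror the curl call above.
    curl -s -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" \
      -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' \
      http://localhost:7070/kylin/api/cubes/your_cube_name/build2 >> /tmp/kylin-build-trigger.log 2>&1

The crontab line then becomes: */5 * * * * /usr/local/bin/kylin-build-trigger.sh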
 
 <p>Now you can watch the cube being built automatically from the stream. As the cube segments accumulate over a larger time range, Kylin will automatically merge them into a bigger segment.</p>
 
@@ -353,7 +353,7 @@ Caused by: java.lang.ClassNotFoundExcept
   <li>If Kafka already holds a set of historical messages and you don’t want to build from the very beginning, you can trigger a call that sets the current end position as the start position of the cube:</li>
 </ul>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets</code></pre></div>
 
 <ul>
   <li>If some build job failed and you discarded it, a hole (or gap) is left in the cube. Because Kylin always builds from the last end position, you cannot expect the hole to be filled by normal builds. Kylin provides an API to check for and fill the holes.</li>
@@ -361,11 +361,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check the holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
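
Since the check and the fill are usually run together, here is a minimal sketch of that check-then-fill logic, assuming the jq utility is installed and reusing the ADMIN:KYLIN credentials with a placeholder cube name:

    #!/bin/sh
    CUBE=your_cube_name
    # Ask Kylin for the list of holes in this cube's segments
    HOLES=$(curl -s -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" \
      http://localhost:7070/kylin/api/cubes/$CUBE/holes)
    # An empty JSON array means there is no hole; otherwise ask Kylin to fill them
    if [ "$(echo "$HOLES" | jq 'length')" -gt 0 ]; then
      curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" \
        http://localhost:7070/kylin/api/cubes/$CUBE/holes
    fi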
 
 
                                                        </article>

Modified: kylin/site/cn/docs/tutorial/superset.html
URL: 
http://svn.apache.org/viewvc/kylin/site/cn/docs/tutorial/superset.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/cn/docs/tutorial/superset.html (original)
+++ kylin/site/cn/docs/tutorial/superset.html Fri Aug 10 14:09:45 2018
@@ -200,18 +200,6 @@ var _hmt = _hmt || [];
 <h5 id="section-3">其它功能</h5>
 <p>Apache Superset 也支持导出 CSV, 共享, 以及查看 SQL 查询。</p>
 
-<h3 id="kyligence-insight-for-superset">Kyligence Insight for Superset</h3>
-<p>定制版的 Superset:Kyligence Insight for Superset,使得 Kylin 
的用户多了一种选择。具体的安装步骤请在 github 上查看 <a 
href="https://github.com/Kyligence/Insight-for-Superset";>这个项目</a>。</p>
-
-<h5 id="superset-">相比原生 Superset, 提供了如下增强功能:</h5>
-<ol>
-  <li>统一用户管理,用户无需在 “Superset” 
上额外创建用户和赋予权限,统一在 Kylin 
后端管理用户访问权限,直接使用 Kylin 账户登录 
Superset。</li>
-  <li>一键同步 Kylin Cube,无需在 Superset 
端重新定义数据模型,直接查询 Cube.</li>
-  <li>支持多表连接模型,支持 inner join 和 outer join.</li>
-  <li>Docker 容器化部署 
Superset,一键启动,降低部署和升级门槛。</li>
-  <li>自动适配 Kylin 查询语法。</li>
-</ol>
-
                                                        </article>
                                                </div>
                                        </div>

Modified: kylin/site/cn/docs23/tutorial/cube_spark.html
URL: 
http://svn.apache.org/viewvc/kylin/site/cn/docs23/tutorial/cube_spark.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/cn/docs23/tutorial/cube_spark.html (original)
+++ kylin/site/cn/docs23/tutorial/cube_spark.html Fri Aug 10 14:09:45 2018
@@ -276,7 +276,7 @@ $KYLIN_HOME/bin/kylin.sh start</code></p
 
 <p>When an error occurs, you can check “logs/kylin.log” first. It contains all the Spark commands that Kylin executes, for example:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">2017-03-06 14:44:38,574 INFO  [Job 
2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : 
cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf 
&amp;&amp; /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit 
--class org.apache.kylin.common.util.SparkEntry  --conf 
spark.executor.instances=1  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml --jars /usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,/usr/local/apache-kylin-2.1.0-bin-hbase1x/lib/kylin-job-2.1.0.jar
 -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.1.0-bin-hbase1x/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">2017-03-06 14:44:38,574 INFO  [Job 
2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : 
cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf 
&amp;&amp; /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit 
--class org.apache.kylin.common.util.SparkEntry  --conf 
spark.executor.instances=1  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml --jars /usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,/usr/local/apache-kylin-2.1.0-bin-hbase1x/lib/kylin-job-2.1.0.jar
 -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.1.0-bin-hbase1x/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube</code></pre></div>
 
 <p>You can copy the cmd and run it manually in a shell to tune the parameters quickly; while it is running, you can check the Yarn Resource Manager for more information. If the job has already finished, you can check the history in the Spark history server.</p>
 

Modified: kylin/site/cn/docs23/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/cn/docs23/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/cn/docs23/tutorial/cube_streaming.html (original)
+++ kylin/site/cn/docs23/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -176,7 +176,7 @@ var _hmt = _hmt || [];
 <h2 id="kafka-01000--kylin">安装 Kafka 0.10.0.0 和 Kylin</h2>
 <p>不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果å…
¶è¿è¡Œç€è¯·å…ˆåœæŽ‰ã€‚</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -282,7 +282,7 @@ bin/kafka-console-consumer.sh --bootstra
 
 <p>You can trigger a build from the web GUI by clicking “Actions” -&gt; “Build”, or send a request to the Kylin RESTful API with a ‘curl’ command:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
 
 <p>Please note that the API endpoint is different from that of a normal cube (this URL ends with “build2”).</p>
 
@@ -292,7 +292,7 @@ bin/kafka-console-consumer.sh --bootstra
 
 <h2 id="insight--sql-">点击 “Insight” 标签,编写 SQL 
运行,例如:</h2>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">select minute_start,count(*),sum(amount),sum(qty) from 
streaming_sales_table group by minute_start order by 
minute_start</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">select minute_start, count(*), sum(amount), sum(qty) from 
streaming_sales_table group by minute_start order by 
minute_start</code></pre></div>
 
 <p>The result is as follows.<br />
 <img src="/images/tutorial/1.6/Kylin-Cube-Streaming-Tutorial/13_Query_result.png" alt="" /></p>
@@ -302,7 +302,7 @@ bin/kafka-console-consumer.sh --bootstra
 <p>Once the first build and query succeed, you can schedule incremental builds at a fixed frequency. Kylin records the offsets of each build; when it receives a build request, it starts from the end position of the last build and fetches the latest offsets from Kafka. With the REST API you can trigger it with any scheduling tool, such as Linux cron:</p>
 
 <div class="highlight"><pre><code class="language-groff" 
data-lang="groff">crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2</code></pre></div>
 
 <p>Now you can watch the cube being built automatically from the stream. As the cube segments accumulate over a larger time range, Kylin will automatically merge them into a bigger segment.</p>
 
@@ -353,7 +353,7 @@ Caused by: java.lang.ClassNotFoundExcept
   <li>If Kafka already holds a set of historical messages and you don’t want to build from the very beginning, you can trigger a call that sets the current end position as the start position of the cube:</li>
 </ul>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 
0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, 
"sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets</code></pre></div>
 
 <ul>
   <li>If some build job failed and you discarded it, a hole (or gap) is left in the cube. Because Kylin always builds from the last end position, you cannot expect the hole to be filled by normal builds. Kylin provides an API to check for and fill the holes.</li>
@@ -361,11 +361,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check the holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/docs/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs/tutorial/cube_streaming.html (original)
+++ kylin/site/docs/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -5611,7 +5611,7 @@ var _hmt = _hmt || [];
 <h2 id="install-kafka-01000-and-kylin">Install Kafka 0.10.0.0 and Kylin</h2>
 <p>Don’t use HDP 2.2.4’s built-in Kafka as it is too old; stop it first if it is running.</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -5796,11 +5796,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/docs/tutorial/superset.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs/tutorial/superset.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs/tutorial/superset.html (original)
+++ kylin/site/docs/tutorial/superset.html Fri Aug 10 14:09:45 2018
@@ -5635,17 +5635,6 @@ var _hmt = _hmt || [];
 <h5 id="other-functionalities">Other functionalities</h5>
 <p>Apache Superset also supports exporting to CSV, sharing, and viewing the SQL query.</p>
 
-<h3 id="kyligence-insight-for-superset">Kyligence Insight for Superset</h3>
-<p>A customized version of Superset: Kyligence Insight for Superset gives 
Kylin users a choice. Please check <a 
href="https://github.com/Kyligence/Insight-for-Superset";>this project</a> on 
github for specific installation steps.</p>
-
-<h5 
id="compared-to-the-native-superset-it-offers-the-following-enhancements">Compared
 to the native Superset, it offers the following enhancements:</h5>
-<ol>
-  <li>Unified user management, users do not need to create additional users 
and permissions on “Superset”, manage user access rights on the Kylin 
backend, and log in to Superset directly using Kylin account.</li>
-  <li>One-click synchronization Kylin Cube, no need to redefine the data model 
on the Superset side, directly query Cube.</li>
-  <li>Support multi-table join model, support inner join and outer join.</li>
-  <li>Docker containerized deployment Superset, one-click startup, reducing 
deployment and upgrade thresholds.</li>
-  <li>Automatically adapt Kylin query syntax.</li>
-</ol>
 
                                                        </article>
                                                </div>

Modified: kylin/site/docs16/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs16/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs16/tutorial/cube_streaming.html (original)
+++ kylin/site/docs16/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -7807,7 +7807,7 @@ var _hmt = _hmt || [];
 <h2 id="install-kafka-01000-and-kylin">Install Kafka 0.10.0.0 and Kylin</h2>
 <p>Don’t use HDP 2.2.4’s built-in Kafka as it is too old; stop it first if it is running.</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -7992,11 +7992,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/docs20/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs20/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs20/tutorial/cube_streaming.html (original)
+++ kylin/site/docs20/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -8121,7 +8121,7 @@ var _hmt = _hmt || [];
 <h2 id="install-kafka-01000-and-kylin">Install Kafka 0.10.0.0 and Kylin</h2>
 <p>Don’t use HDP 2.2.4’s built-in Kafka as it is too old; stop it first if it is running.</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -8306,11 +8306,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/docs21/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs21/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs21/tutorial/cube_streaming.html (original)
+++ kylin/site/docs21/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -9475,7 +9475,7 @@ var _hmt = _hmt || [];
 <h2 id="install-kafka-01000-and-kylin">Install Kafka 0.10.0.0 and Kylin</h2>
 <p>Don’t use HDP 2.2.4’s built-in Kafka as it is too old; stop it first if it is running.</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -9660,11 +9660,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/docs23/tutorial/cube_streaming.html
URL: 
http://svn.apache.org/viewvc/kylin/site/docs23/tutorial/cube_streaming.html?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/docs23/tutorial/cube_streaming.html (original)
+++ kylin/site/docs23/tutorial/cube_streaming.html Fri Aug 10 14:09:45 2018
@@ -5612,7 +5612,7 @@ var _hmt = _hmt || [];
 <h2 id="install-kafka-01000-and-kylin">Install Kafka 0.10.0.0 and Kylin</h2>
 <p>Don’t use HDP 2.2.4’s built-in Kafka as it is too old; stop it first if it is running.</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
 | tar -xz -C /usr/local/
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -s 
https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar 
-xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -5797,11 +5797,11 @@ Caused by: java.lang.ClassNotFoundExcept
 
 <p>Check holes:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X GET --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 <p>If the result is an empty array, it means there is no hole; otherwise, trigger Kylin to fill them:</p>
 
-<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMINN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" 
data-lang="groff">curl -X PUT --user ADMIN:KYLIN -H "Content-Type: 
application/json;charset=utf-8" 
http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes</code></pre></div>
 
 
                                                        </article>

Modified: kylin/site/feed.xml
URL: 
http://svn.apache.org/viewvc/kylin/site/feed.xml?rev=1837805&r1=1837804&r2=1837805&view=diff
==============================================================================
--- kylin/site/feed.xml (original)
+++ kylin/site/feed.xml Fri Aug 10 14:09:45 2018
@@ -19,8 +19,8 @@
     <description>Apache Kylin Home</description>
     <link>http://kylin.apache.org/</link>
     <atom:link href="http://kylin.apache.org/feed.xml"; rel="self" 
type="application/rss+xml"/>
-    <pubDate>Tue, 07 Aug 2018 06:59:26 -0700</pubDate>
-    <lastBuildDate>Tue, 07 Aug 2018 06:59:26 -0700</lastBuildDate>
+    <pubDate>Fri, 10 Aug 2018 06:59:23 -0700</pubDate>
+    <lastBuildDate>Fri, 10 Aug 2018 06:59:23 -0700</lastBuildDate>
     <generator>Jekyll v2.5.3</generator>
     
       <item>

