Author: more
Date: Wed Nov  1 19:34:18 2017
New Revision: 1813985

URL: http://svn.apache.org/viewvc?rev=1813985&view=rev
Log:
KNOX-1095 - Fix Sample values in NameNodeUI service topology

Modified:
    knox/site/books/knox-0-14-0/user-guide.html
    knox/trunk/books/0.14.0/book_service-details.md
    knox/trunk/books/0.14.0/book_ui_service_details.md
    knox/trunk/books/0.14.0/service_webhdfs.md

Modified: knox/site/books/knox-0-14-0/user-guide.html
URL: 
http://svn.apache.org/viewvc/knox/site/books/knox-0-14-0/user-guide.html?rev=1813985&r1=1813984&r2=1813985&view=diff
==============================================================================
--- knox/site/books/knox-0-14-0/user-guide.html (original)
+++ knox/site/books/knox-0-14-0/user-guide.html Wed Nov  1 19:34:18 2017
@@ -3641,7 +3641,23 @@ dep/commons-codec-1.7.jar
    <li>JSON Path <a href="https://code.google.com/p/json-path/">API</a></li>
    <li>GPath <a href="http://groovy.codehaus.org/GPath">Overview</a></li>
   </ul></li>
-</ul><h2><a id="Service+Details">Service Details</a> <a 
href="#Service+Details"><img src="markbook-section-link.png"/></a></h2><p>In 
the sections that follow, the integrations currently available out of the box 
with the gateway will be described. In general these sections will include 
examples that demonstrate how to access each of these services via the gateway. 
In many cases this will include both the use of <a 
href="http://curl.haxx.se/">cURL</a> as a REST API client as well as the use of 
the Knox Client DSL. You may notice that there are some minor differences when 
using the REST API of a given service via the gateway. In general this is 
necessary in order to achieve the goal of not leaking internal Hadoop cluster 
details to the client.</p><p>Keep in mind that the gateway uses a plugin model 
for supporting Hadoop services. Check back with the <a 
href="http://knox.apache.org">Apache Knox</a> site for the latest news on 
plugin availability. You can also create your own custom plugin to extend the 
capabilities of the gateway.</p><p>These are the current Hadoop services with 
built-in support.</p>
+</ul>
+<!---
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+--><h2><a id="Service+Details">Service Details</a> <a 
href="#Service+Details"><img src="markbook-section-link.png"/></a></h2><p>In 
the sections that follow, the integrations currently available out of the box 
with the gateway will be described. In general these sections will include 
examples that demonstrate how to access each of these services via the gateway. 
In many cases this will include both the use of <a 
href="http://curl.haxx.se/">cURL</a> as a REST API client as well as the use of 
the Knox Client DSL. You may notice that there are some minor differences when 
using the REST API of a given service via the gateway. In general this is 
necessary in order to achieve the goal of not leaking internal Hadoop cluster 
details to the client.</p><p>Keep in mind that the gateway uses a plugin model 
for supporting Hadoop services. Check back with the <a 
href="http://knox.apache.org">Apache Knox</a> site for the latest news on 
plugin availability. You can also create your own custom plugin to extend the 
capabilities of the gateway.</p><p>These are the current Hadoop services with 
built-in support.</p>
 <ul>
   <li><a href="#WebHDFS">WebHDFS</a></li>
   <li><a href="#WebHCat">WebHCat</a></li>
@@ -3663,7 +3679,23 @@ dep/commons-codec-1.7.jar
  <li>The default configuration for all of the samples is set up for use with 
Hortonworks&rsquo; <a 
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version 
2.</li>
 </ul><h3><a id="Customization">Customization</a> <a href="#Customization"><img 
src="markbook-section-link.png"/></a></h3><p>Using these samples with other 
Hadoop installations will require changes to the steps described here as well 
as changes to referenced sample scripts. This will also likely require changes 
to the gateway&rsquo;s default configuration. In particular, host names, ports, 
user names, and passwords may need to be changed to match your environment. 
These changes may need to be made to the gateway configuration and to the 
Groovy sample script files in the distribution. All of the values that may need 
to be 
customized in the sample scripts can be found together at the top of each of 
these files.</p><h3><a id="cURL">cURL</a> <a href="#cURL"><img 
src="markbook-section-link.png"/></a></h3><p>The cURL HTTP client command line 
utility is used extensively in the examples for each service. In particular 
this form of the cURL command line is used repeatedly.</p>
 <pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response 
header information. This will be important when the content of the HTTP 
Location header is required for subsequent requests.</p><p>The option -k (aka 
--insecure) is used to avoid any issues resulting from the use of demonstration 
SSL certificates.</p><p>The option -u (aka --user) is used to provide the 
credentials to be used when the client is challenged by the 
gateway.</p><p>Keep in mind that the samples do not use the cookie features of 
cURL for the sake of simplicity. Therefore each request via cURL will result in 
a separate authentication.</p><h3><a id="WebHDFS">WebHDFS</a> <a 
href="#WebHDFS"><img src="markbook-section-link.png"/></a></h3><p>REST API 
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a 
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
 REST API</a> documentation is available online. WebHDFS must be enabled in the 
hdfs-site.xml configuration file. In the sandbox this configuration file is 
located at <code>/etc/hadoop/conf/hdfs-site.xml</code>. Note the properties 
shown below as they are related to configuration required by the gateway. Some 
of these represent the default values and may not actually be present in 
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response 
header information. This will be important when the content of the HTTP 
Location header is required for subsequent requests.</p><p>The option -k (aka 
--insecure) is used to avoid any issues resulting from the use of demonstration 
SSL certificates.</p><p>The option -u (aka --user) is used to provide the 
credentials to be used when the client is challenged by the 
gateway.</p><p>Keep in mind that the samples do not use the cookie features of 
cURL for the sake of simplicity. Therefore each request via cURL will result in 
a separate authentication.</p>
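
Putting these options together, a typical complete request through the gateway 
looks like the following. This is a minimal sketch assuming the default 
gateway address of https://localhost:8443 and the sandbox topology; the host, 
port, topology name, and credentials will differ in your deployment:

    curl -i -k -u guest:guest-password -X GET \
        'https://localhost:8443/gateway/sandbox/webhdfs/v1/?op=LISTSTATUS'

The -i option matters for operations such as file create, which reply with an 
HTTP Location header that must be used in a follow-up request.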
+<!---
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+--><h3><a id="WebHDFS">WebHDFS</a> <a href="#WebHDFS"><img 
src="markbook-section-link.png"/></a></h3><p>REST API access to HDFS in a 
Hadoop cluster is provided by WebHDFS. The <a 
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html";>WebHDFS
 REST API</a> documentation is available online. WebHDFS must be enabled in the 
hdfs-site.xml configuration file. In the sandbox this configuration file is 
located at <code>/etc/hadoop/conf/hdfs-site.xml</code>. Note the properties 
shown below as they are related to configuration required by the gateway. Some 
of these represent the default values and may not actually be present in 
hdfs-site.xml.</p>
 <pre><code>&lt;property&gt;
     &lt;name&gt;dfs.webhdfs.enabled&lt;/name&gt;
     &lt;value&gt;true&lt;/value&gt;
@@ -5599,7 +5631,7 @@ DriverManager.getConnection(url, props);
 </code></pre><p>The values above need to be reflected in each topology 
descriptor file deployed to the gateway. The gateway by default includes a 
sample topology descriptor file 
<code>{GATEWAY_HOME}/deployments/sandbox.xml</code>. The values in this sample 
are configured to work with an installed Sandbox VM.</p>
 <pre><code>&lt;service&gt;
     &lt;role&gt;HDFSUI&lt;/role&gt;
-    &lt;url&gt;http://sandbox.hortonworks.com:50070/webhdfs&lt;/url&gt;
+    &lt;url&gt;http://sandbox.hortonworks.com:50070&lt;/url&gt;
 &lt;/service&gt;
 </code></pre><p>In addition to the service configuration for HDFSUI, the REST 
service configuration for WEBHDFS is also required.</p>
 <pre><code>&lt;service&gt;

Modified: knox/trunk/books/0.14.0/book_service-details.md
URL: 
http://svn.apache.org/viewvc/knox/trunk/books/0.14.0/book_service-details.md?rev=1813985&r1=1813984&r2=1813985&view=diff
==============================================================================
--- knox/trunk/books/0.14.0/book_service-details.md (original)
+++ knox/trunk/books/0.14.0/book_service-details.md Wed Nov  1 19:34:18 2017
@@ -13,7 +13,7 @@
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
---->
+-->
 
 ## Service Details ##
 

Modified: knox/trunk/books/0.14.0/book_ui_service_details.md
URL: 
http://svn.apache.org/viewvc/knox/trunk/books/0.14.0/book_ui_service_details.md?rev=1813985&r1=1813984&r2=1813985&view=diff
==============================================================================
--- knox/trunk/books/0.14.0/book_ui_service_details.md (original)
+++ knox/trunk/books/0.14.0/book_ui_service_details.md Wed Nov  1 19:34:18 2017
@@ -59,7 +59,7 @@ The values in this sample are configured
 
     <service>
         <role>HDFSUI</role>
-        <url>http://sandbox.hortonworks.com:50070/webhdfs</url>
+        <url>http://sandbox.hortonworks.com:50070</url>
     </service>
 
 In addition to the service configuration for HDFSUI, the REST service 
configuration for WEBHDFS is also required.
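
For reference, the WEBHDFS entry is the one that retains the /webhdfs path on 
the URL. A minimal sketch, assuming the same Sandbox host and the default 
NameNode HTTP port:

    <service>
        <role>WEBHDFS</role>
        <url>http://sandbox.hortonworks.com:50070/webhdfs</url>
    </service>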

Modified: knox/trunk/books/0.14.0/service_webhdfs.md
URL: 
http://svn.apache.org/viewvc/knox/trunk/books/0.14.0/service_webhdfs.md?rev=1813985&r1=1813984&r2=1813985&view=diff
==============================================================================
--- knox/trunk/books/0.14.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.14.0/service_webhdfs.md Wed Nov  1 19:34:18 2017
@@ -13,7 +13,7 @@
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
---->
+-->
 
 ### WebHDFS ###
 

