Author: kminder
Date: Wed Jul 22 14:01:36 2015
New Revision: 1692277
URL: http://svn.apache.org/r1692277
Log:
KNOX-572: Broken WebHDFS REST API link in online documentation.
Modified:
knox/site/books/knox-0-3-0/knox-0-3-0.html
knox/site/books/knox-0-4-0/knox-0-4-0.html
knox/site/books/knox-0-5-0/knox-0-5-0.html
knox/site/books/knox-0-6-0/user-guide.html
knox/site/books/knox-0-7-0/user-guide.html
knox/site/css/site.css
knox/site/images/application-certificate.png
knox/site/index.html
knox/site/issue-tracking.html
knox/site/license.html
knox/site/mail-lists.html
knox/site/project-info.html
knox/site/team-list.html
knox/trunk/books/0.3.0/service_webhdfs.md
knox/trunk/books/0.4.0/service_webhdfs.md
knox/trunk/books/0.5.0/service_webhdfs.md
knox/trunk/books/0.6.0/service_webhdfs.md
knox/trunk/books/0.7.0/service_webhdfs.md
Modified: knox/site/books/knox-0-3-0/knox-0-3-0.html
URL:
http://svn.apache.org/viewvc/knox/site/books/knox-0-3-0/knox-0-3-0.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/books/knox-0-3-0/knox-0-3-0.html (original)
+++ knox/site/books/knox-0-3-0/knox-0-3-0.html Wed Jul 22 14:01:36 2015
@@ -1121,7 +1121,7 @@ dep/commons-codec-1.7.jar
<li>The default configuration for all of the samples is set up for use with
Hortonworks' <a
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version
2.</li>
</ul><h3><a id="Customization"></a>Customization</h3><p>Using these samples
with other Hadoop installations will require changes to the steps described
here as well as changes to the referenced sample scripts. This will also
likely require changes to the gateway's default configuration. In particular,
host names, ports, user names, and passwords may need to be changed to match
your environment. These changes may need to be made to the gateway
configuration and also to the Groovy sample script files in the distribution.
All of the values that may need to be customized in the sample scripts can be
found together at the top of each of these files.</p><h3><a
id="cURL"></a>cURL</h3><p>The cURL HTTP client command line utility is used
extensively in the examples for each service. In particular, this form of the
cURL command line is used repeatedly.</p>
<pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/webhdfs.html">WebHDFS REST API</a>
documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
REST API</a> documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
<pre><code><property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
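The curl form documented above can be sketched as a small script. The gateway host, port, and topology name ("sandbox"), and the LISTSTATUS operation, are illustrative assumptions rather than values taken from this commit; the script only prints the command it would run, since executing it requires a live gateway.

```shell
# Build the curl invocation the docs describe. Host, port, and topology
# ("sandbox") are assumed defaults, not values from this commit.
GATEWAY="https://localhost:8443/gateway/sandbox"

# -i  include response headers (the Location header drives follow-up requests)
# -k  tolerate the demonstration SSL certificate
# -u  supply credentials when the gateway challenges the client
CMD="curl -i -k -u guest:guest-password $GATEWAY/webhdfs/v1/?op=LISTSTATUS"

# Print rather than execute: running it needs a reachable gateway.
echo "$CMD"
```

Because the samples skip cURL's cookie handling, each such request triggers a fresh authentication challenge, which is why -u appears on every invocation.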
Modified: knox/site/books/knox-0-4-0/knox-0-4-0.html
URL:
http://svn.apache.org/viewvc/knox/site/books/knox-0-4-0/knox-0-4-0.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/books/knox-0-4-0/knox-0-4-0.html (original)
+++ knox/site/books/knox-0-4-0/knox-0-4-0.html Wed Jul 22 14:01:36 2015
@@ -1510,7 +1510,7 @@ dep/commons-codec-1.7.jar
<li>The default configuration for all of the samples is set up for use with
Hortonworks' <a
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version
2.</li>
</ul><h3><a id="Customization"></a>Customization</h3><p>Using these samples
with other Hadoop installations will require changes to the steps described
here as well as changes to the referenced sample scripts. This will also
likely require changes to the gateway's default configuration. In particular,
host names, ports, user names, and passwords may need to be changed to match
your environment. These changes may need to be made to the gateway
configuration and also to the Groovy sample script files in the distribution.
All of the values that may need to be customized in the sample scripts can be
found together at the top of each of these files.</p><h3><a
id="cURL"></a>cURL</h3><p>The cURL HTTP client command line utility is used
extensively in the examples for each service. In particular, this form of the
cURL command line is used repeatedly.</p>
<pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/webhdfs.html">WebHDFS REST API</a>
documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
REST API</a> documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
<pre><code><property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
Modified: knox/site/books/knox-0-5-0/knox-0-5-0.html
URL:
http://svn.apache.org/viewvc/knox/site/books/knox-0-5-0/knox-0-5-0.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/books/knox-0-5-0/knox-0-5-0.html (original)
+++ knox/site/books/knox-0-5-0/knox-0-5-0.html Wed Jul 22 14:01:36 2015
@@ -1811,7 +1811,7 @@ dep/commons-codec-1.7.jar
<li>The default configuration for all of the samples is set up for use with
Hortonworks' <a
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version
2.</li>
</ul><h3><a id="Customization"></a>Customization</h3><p>Using these samples
with other Hadoop installations will require changes to the steps described
here as well as changes to the referenced sample scripts. This will also
likely require changes to the gateway's default configuration. In particular,
host names, ports, user names, and passwords may need to be changed to match
your environment. These changes may need to be made to the gateway
configuration and also to the Groovy sample script files in the distribution.
All of the values that may need to be customized in the sample scripts can be
found together at the top of each of these files.</p><h3><a
id="cURL"></a>cURL</h3><p>The cURL HTTP client command line utility is used
extensively in the examples for each service. In particular, this form of the
cURL command line is used repeatedly.</p>
<pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/webhdfs.html">WebHDFS REST API</a>
documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
REST API</a> documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
<pre><code><property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
Modified: knox/site/books/knox-0-6-0/user-guide.html
URL:
http://svn.apache.org/viewvc/knox/site/books/knox-0-6-0/user-guide.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/books/knox-0-6-0/user-guide.html (original)
+++ knox/site/books/knox-0-6-0/user-guide.html Wed Jul 22 14:01:36 2015
@@ -2043,7 +2043,7 @@ dep/commons-codec-1.7.jar
<li>The default configuration for all of the samples is set up for use with
Hortonworks' <a
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version
2.</li>
</ul><h3><a id="Customization"></a>Customization</h3><p>Using these samples
with other Hadoop installations will require changes to the steps described
here as well as changes to the referenced sample scripts. This will also
likely require changes to the gateway's default configuration. In particular,
host names, ports, user names, and passwords may need to be changed to match
your environment. These changes may need to be made to the gateway
configuration and also to the Groovy sample script files in the distribution.
All of the values that may need to be customized in the sample scripts can be
found together at the top of each of these files.</p><h3><a
id="cURL"></a>cURL</h3><p>The cURL HTTP client command line utility is used
extensively in the examples for each service. In particular, this form of the
cURL command line is used repeatedly.</p>
<pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/webhdfs.html">WebHDFS REST API</a>
documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
REST API</a> documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
<pre><code><property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
Modified: knox/site/books/knox-0-7-0/user-guide.html
URL:
http://svn.apache.org/viewvc/knox/site/books/knox-0-7-0/user-guide.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/books/knox-0-7-0/user-guide.html (original)
+++ knox/site/books/knox-0-7-0/user-guide.html Wed Jul 22 14:01:36 2015
@@ -2122,7 +2122,7 @@ dep/commons-codec-1.7.jar
<li>The default configuration for all of the samples is set up for use with
Hortonworks' <a
href="http://hortonworks.com/products/hortonworks-sandbox">Sandbox</a> version
2.</li>
</ul><h3><a id="Customization"></a>Customization</h3><p>Using these samples
with other Hadoop installations will require changes to the steps described
here as well as changes to the referenced sample scripts. This will also
likely require changes to the gateway's default configuration. In particular,
host names, ports, user names, and passwords may need to be changed to match
your environment. These changes may need to be made to the gateway
configuration and also to the Groovy sample script files in the distribution.
All of the values that may need to be customized in the sample scripts can be
found together at the top of each of these files.</p><h3><a
id="cURL"></a>cURL</h3><p>The cURL HTTP client command line utility is used
extensively in the examples for each service. In particular, this form of the
cURL command line is used repeatedly.</p>
<pre><code>curl -i -k -u guest:guest-password ...
-</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/webhdfs.html">WebHDFS REST API</a>
documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
+</code></pre><p>The option -i (aka --include) is used to output HTTP response
header information. This will be important when the content of the HTTP
Location header is required for subsequent requests.</p><p>The option -k (aka
--insecure) is used to avoid any issues resulting from the use of
demonstration SSL certificates.</p><p>The option -u (aka --user) is used to
provide the credentials to be used when the client is challenged by the
gateway.</p><p>Keep in mind that the samples do not use the cookie features of
cURL for the sake of simplicity. Therefore each request via cURL will result
in an authentication.</p><h3><a id="WebHDFS"></a>WebHDFS</h3><p>REST API
access to HDFS in a Hadoop cluster is provided by WebHDFS. The <a
href="http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html">WebHDFS
REST API</a> documentation is available online. WebHDFS must be enabled in the
hdfs-site.xml configuration file. In the sandbox this configuration file is
located at /etc/hadoop/conf/hdfs-site.xml. Note the properties shown below,
as they are related to the configuration required by the gateway. Some of
these represent the default values and may not actually be present in
hdfs-site.xml.</p>
<pre><code><property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
Modified: knox/site/css/site.css
URL:
http://svn.apache.org/viewvc/knox/site/css/site.css?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/css/site.css (original)
+++ knox/site/css/site.css Wed Jul 22 14:01:36 2015
@@ -1 +1,2 @@
-/* You can override this file with your own styles */
\ No newline at end of file
+a.externalLink[href^=http]{background:url('../images/external.png') right center no-repeat;padding-right:18px}
+a.externalLink[href^=https]{background:url('../images/external.png') right center no-repeat;padding-right:18px}
Modified: knox/site/images/application-certificate.png
URL:
http://svn.apache.org/viewvc/knox/site/images/application-certificate.png?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
Binary files - no diff available.
Modified: knox/site/index.html
URL:
http://svn.apache.org/viewvc/knox/site/index.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/index.html (original)
+++ knox/site/index.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – REST API Gateway for the Hadoop Ecosystem</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/site/issue-tracking.html
URL:
http://svn.apache.org/viewvc/knox/site/issue-tracking.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/issue-tracking.html (original)
+++ knox/site/issue-tracking.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – Issue Tracking</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/site/license.html
URL:
http://svn.apache.org/viewvc/knox/site/license.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/license.html (original)
+++ knox/site/license.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – Project License</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/site/mail-lists.html
URL:
http://svn.apache.org/viewvc/knox/site/mail-lists.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/mail-lists.html (original)
+++ knox/site/mail-lists.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – Project Mailing Lists</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/site/project-info.html
URL:
http://svn.apache.org/viewvc/knox/site/project-info.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/project-info.html (original)
+++ knox/site/project-info.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – Project Information</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/site/team-list.html
URL:
http://svn.apache.org/viewvc/knox/site/team-list.html?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/site/team-list.html (original)
+++ knox/site/team-list.html Wed Jul 22 14:01:36 2015
@@ -1,13 +1,13 @@
<!DOCTYPE html>
<!--
- | Generated by Apache Maven Doxia at 2015-06-23
+ | Generated by Apache Maven Doxia at 2015-07-22
| Rendered using Apache Maven Fluido Skin 1.3.0
-->
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
- <meta name="Date-Revision-yyyymmdd" content="20150623" />
+ <meta name="Date-Revision-yyyymmdd" content="20150722" />
<meta http-equiv="Content-Language" content="en" />
<title>Knox Gateway – Team list</title>
<link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
@@ -58,7 +58,7 @@
- <li id="publishDate" class="pull-right">Last Published:
2015-06-23</li>
+ <li id="publishDate" class="pull-right">Last Published:
2015-07-22</li>
</ul>
</div>
Modified: knox/trunk/books/0.3.0/service_webhdfs.md
URL:
http://svn.apache.org/viewvc/knox/trunk/books/0.3.0/service_webhdfs.md?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/trunk/books/0.3.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.3.0/service_webhdfs.md Wed Jul 22 14:01:36 2015
@@ -18,7 +18,7 @@
### WebHDFS ###
REST API access to HDFS in a Hadoop cluster is provided by WebHDFS.
-The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/webhdfs.html) documentation is available online.
+The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html) documentation is available online.
WebHDFS must be enabled in the hdfs-site.xml configuration file.
In sandbox this configuration file is located at
/etc/hadoop/conf/hdfs-site.xml.
Note the properties shown below as they are related to configuration required
by the gateway.
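The enablement step described above can be sanity-checked by searching hdfs-site.xml for the property. A minimal sketch, assuming the sandbox path the docs cite; the HDFS_SITE override variable is invented here for illustration:

```shell
# Look for dfs.webhdfs.enabled in hdfs-site.xml. The default path is the
# sandbox location the docs cite; HDFS_SITE is a hypothetical override for
# other installations. Absence of the property does not mean WebHDFS is off,
# since the value may simply be left at its default.
CONF="${HDFS_SITE:-/etc/hadoop/conf/hdfs-site.xml}"
if grep -q "dfs.webhdfs.enabled" "$CONF" 2>/dev/null; then
  echo "dfs.webhdfs.enabled is set explicitly in $CONF"
else
  echo "property not found; the default value may still apply"
fi
```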
Modified: knox/trunk/books/0.4.0/service_webhdfs.md
URL:
http://svn.apache.org/viewvc/knox/trunk/books/0.4.0/service_webhdfs.md?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/trunk/books/0.4.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.4.0/service_webhdfs.md Wed Jul 22 14:01:36 2015
@@ -18,7 +18,7 @@
### WebHDFS ###
REST API access to HDFS in a Hadoop cluster is provided by WebHDFS.
-The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/webhdfs.html) documentation is available online.
+The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html) documentation is available online.
WebHDFS must be enabled in the hdfs-site.xml configuration file.
In sandbox this configuration file is located at
/etc/hadoop/conf/hdfs-site.xml.
Note the properties shown below as they are related to configuration required
by the gateway.
Modified: knox/trunk/books/0.5.0/service_webhdfs.md
URL:
http://svn.apache.org/viewvc/knox/trunk/books/0.5.0/service_webhdfs.md?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/trunk/books/0.5.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.5.0/service_webhdfs.md Wed Jul 22 14:01:36 2015
@@ -18,7 +18,7 @@
### WebHDFS ###
REST API access to HDFS in a Hadoop cluster is provided by WebHDFS.
-The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/webhdfs.html) documentation is available online.
+The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html) documentation is available online.
WebHDFS must be enabled in the hdfs-site.xml configuration file.
In sandbox this configuration file is located at
/etc/hadoop/conf/hdfs-site.xml.
Note the properties shown below as they are related to configuration required
by the gateway.
Modified: knox/trunk/books/0.6.0/service_webhdfs.md
URL:
http://svn.apache.org/viewvc/knox/trunk/books/0.6.0/service_webhdfs.md?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/trunk/books/0.6.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.6.0/service_webhdfs.md Wed Jul 22 14:01:36 2015
@@ -18,7 +18,7 @@
### WebHDFS ###
REST API access to HDFS in a Hadoop cluster is provided by WebHDFS.
-The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/webhdfs.html) documentation is available online.
+The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html) documentation is available online.
WebHDFS must be enabled in the hdfs-site.xml configuration file.
In sandbox this configuration file is located at
/etc/hadoop/conf/hdfs-site.xml.
Note the properties shown below as they are related to configuration required
by the gateway.
Modified: knox/trunk/books/0.7.0/service_webhdfs.md
URL:
http://svn.apache.org/viewvc/knox/trunk/books/0.7.0/service_webhdfs.md?rev=1692277&r1=1692276&r2=1692277&view=diff
==============================================================================
--- knox/trunk/books/0.7.0/service_webhdfs.md (original)
+++ knox/trunk/books/0.7.0/service_webhdfs.md Wed Jul 22 14:01:36 2015
@@ -18,7 +18,7 @@
### WebHDFS ###
REST API access to HDFS in a Hadoop cluster is provided by WebHDFS.
-The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/webhdfs.html) documentation is available online.
+The [WebHDFS REST API](http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html) documentation is available online.
WebHDFS must be enabled in the hdfs-site.xml configuration file.
In sandbox this configuration file is located at
/etc/hadoop/conf/hdfs-site.xml.
Note the properties shown below as they are related to configuration required
by the gateway.