svn commit: r993940 [16/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/examples/S3Import.html
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/examples/S3Import.html 
(added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/examples/S3Import.html 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,272 @@
+  2.3.1. S3 Import to HDFS  Apache Sqoop  documentation
+2.3.1. S3 Import to HDFS
+This section describes an example use case: transferring data from S3 to HDFS.
+
+2.3.1.1. Use case
+You have a directory on S3 where some external process creates new text files. New files are added to this directory, but existing files are never altered; they are only removed after some period of time. Data from all new files needs to be transferred to a single HDFS directory. Preserving file names is not required, and multiple source files can be merged into a single file on HDFS.
+
+
+2.3.1.2. Configuration
+We will use the HDFS connector for both the From and To sides of the data transfer. To create a link for S3 you need the S3 bucket name and the S3 access and secret keys. Please follow the S3 documentation to retrieve your S3 credentials if you do not have them already.
+sqoop:000 create link -c hdfs-connector
+
+
+
+Our example uses s3link for the link name
+Specify the HDFS URI in the form s3a://$BUCKET_NAME, where $BUCKET_NAME is the name of the S3 bucket
+Use the Override configuration option to specify fs.s3a.access.key and fs.s3a.secret.key with your S3 access and secret key respectively
+
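What this link configuration amounts to can be pictured with a small sketch. The helper and names below are illustrative only, not the Sqoop client API; link creation actually happens through the interactive shell prompts above.

```python
def s3_link_config(bucket, access_key, secret_key):
    """Assemble the S3-side link settings described above.

    Hypothetical helper for illustration only.
    """
    return {
        "name": "s3link",                  # link name used in the example
        "uri": "s3a://" + bucket,          # HDFS URI in s3a:// form
        "overrides": {                     # the Override configuration option
            "fs.s3a.access.key": access_key,
            "fs.s3a.secret.key": secret_key,
        },
    }

cfg = s3_link_config("example-bucket", "MY_ACCESS_KEY", "MY_SECRET_KEY")
print(cfg["uri"])  # s3a://example-bucket
```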
+The next step is to create a link for HDFS:
+sqoop:000 create link -c hdfs-connector
+
+
+Our example uses hdfslink for the link name
+If your Sqoop server is running on a node that has the HDFS and MapReduce client configuration deployed, you can safely leave all options blank and use their defaults
+With links for both HDFS and S3 in place, you can create a job that will transfer data from S3 to HDFS:
+sqoop:000 create job -f s3link -t hdfslink
+
+
+
+Our example uses s3import for the job name
+Input directory should point to a directory inside your S3 bucket where new files are generated
+Make sure to choose mode NEW_FILES for Incremental type
+The final destination for the imported files can be specified in Output directory
+Make sure to enable Append mode, so that Sqoop can upload newly created files to the same directory on HDFS
+Configure the remaining options as you see fit
+
+Finally, you can start the job by issuing the following command:
+sqoop:000 start job -j s3import
+
+
+You can run the s3import job periodically; only newly created files will be transferred.
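The NEW_FILES incremental behaviour can be pictured with a short sketch. This is illustrative only; Sqoop tracks the imported-file state internally.

```python
def select_new_files(already_imported, current_listing):
    """Return only files not transferred yet, mimicking the
    NEW_FILES incremental mode described above."""
    return sorted(set(current_listing) - set(already_imported))

# First run: everything is new.
imported = set()
run1 = select_new_files(imported, ["a.txt", "b.txt"])
imported.update(run1)

# Second run: only the newly created file is picked up.
run2 = select_new_files(imported, ["a.txt", "b.txt", "c.txt"])
print(run2)  # ['c.txt']
```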
+ Copyright 2009-2016 The Apache Software Foundation.
\ No newline at end of file

Modified: websites/staging/sqoop/trunk/content/index.html
==
--- websites/staging/sqoop/trunk/content/index.html (original)
+++ websites/staging/sqoop/trunk/content/index.html Thu Jul 28 01:17:26 2016
@@ -1,13 +1,13 @@
 
 
 

svn commit: r993940 [6/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/_static/jquery.js
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_static/jquery.js (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_static/jquery.js Thu Jul 
28 01:17:26 2016
@@ -0,0 +1,154 @@
+/*!
+ * jQuery JavaScript Library v1.4.2
+ * http://jquery.com/
+ *
+ * Copyright 2010, John Resig
+ * Dual licensed under the MIT or GPL Version 2 licenses.
+ * http://jquery.org/license
+ *
+ * Includes Sizzle.js
+ * http://sizzlejs.com/
+ * Copyright 2010, The Dojo Foundation
+ * Released under the MIT, BSD, and GPL Licenses.
+ *
+ * Date: Sat Feb 13 22:33:48 2010 -0500
+ */

svn commit: r993940 [5/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,110 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+==============
+Kite Connector
+==============
+
+.. contents::
+   :depth: 3
+
+-----
+Usage
+-----
+
+To use the Kite Connector, create a link for the connector and a job that uses the link. For more information on Kite, check out the Kite documentation: http://kitesdk.org/docs/1.0.0/Kite-SDK-Guide.html.
+
+**Link Configuration**
+++++++++++++++++++++++
+
+Inputs associated with the link configuration include:
+
++-------------+--------+------------------------------------+--------------------------+
+| Input       | Type   | Description                        | Example                  |
++=============+========+====================================+==========================+
+| authority   | String | The authority of the kite dataset. | hdfs://example.com:8020/ |
+|             |        | *Optional*. See note below.        |                          |
++-------------+--------+------------------------------------+--------------------------+
+
+**Notes**
+=========
+
+1. The authority is useful for specifying Hive metastore or HDFS URI.
+
+**FROM Job Configuration**
+++++++++++++++++++++++++++
+
+Inputs associated with the Job configuration for the FROM direction include:
+
++-------------+--------+------------------------------------+--------------------------+
+| Input       | Type   | Description                        | Example                  |
++=============+========+====================================+==========================+
+| URI         | String | The Kite dataset URI to use.       | dataset:hdfs:/tmp/ns/ds  |
+|             |        | *Required*. See notes below.       |                          |
++-------------+--------+------------------------------------+--------------------------+
+
+**Notes**
+=========
+
+1. The URI and the authority from the link configuration will be merged internally to create a complete dataset URI. If the given dataset URI already contains an authority, the authority from the link configuration is ignored.
+2. Only *hdfs* and *hive* are currently supported.
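The merge rule in note 1 can be sketched as follows. This is a simplified illustration (authority given as host:port, hdfs scheme only), not the connector's actual code.

```python
def merge_dataset_uri(dataset_uri, link_authority):
    """Merge the link-configured authority into a dataset URI,
    unless the URI already carries its own authority."""
    prefix = "dataset:hdfs:"
    if dataset_uri.startswith("dataset:hdfs://"):
        return dataset_uri  # URI has an authority; link authority is ignored
    if link_authority and dataset_uri.startswith(prefix):
        path = dataset_uri[len(prefix):]
        return "dataset:hdfs://" + link_authority + path
    return dataset_uri

print(merge_dataset_uri("dataset:hdfs:/tmp/ns/ds", "example.com:8020"))
# dataset:hdfs://example.com:8020/tmp/ns/ds
```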
+
+**TO Job Configuration**
+++++++++++++++++++++++++
+
+Inputs associated with the Job configuration for the TO direction include:
+
++-------------+--------+------------------------------------+--------------------------+
+| Input       | Type   | Description                        | Example                  |
++=============+========+====================================+==========================+
+| URI         | String | The Kite dataset URI to use.       | dataset:hdfs:/tmp/ns/ds  |
+|             |        | *Required*. See note below.        |                          |
++-------------+--------+------------------------------------+--------------------------+
+| File format | Enum| The format of the data 

svn commit: r993940 [10/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/dev/DevEnv.html
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/dev/DevEnv.html (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/dev/DevEnv.html Thu Jul 28 
01:17:26 2016
@@ -0,0 +1,252 @@
+  3.4. Sqoop 2 Development Environment Setup  Apache Sqoop  
documentation
+3.4. Sqoop 2 Development Environment Setup
+This document describes how to set up a development environment for Sqoop 2.
+
+3.4.1. System Requirement
+
+3.4.1.1. Java
+Sqoop has been developed and tested only with the JDK from Oracle (http://www.oracle.com/technetwork/java/javase/downloads/index.html), and we require at least version 7 (we are not supporting JDK 1.6 and older releases).
+
+
+3.4.1.2. Maven
+Sqoop uses Maven 3 for building the project. Download Maven from http://maven.apache.org/download.cgi and follow the installation instructions linked from that page.
+
+
+
+3.4.2. Eclipse Setup
+Steps for downloading the source code are given in Building Sqoop2 from source code.
+The Sqoop 2 project has multiple modules, and some modules depend on others; for example, the Sqoop 2 client module depends on the Sqoop 2 common module. Follow the steps below to create the Eclipse project and classpath for each module.
+// Install all packages into the local maven repository
+mvn clean install -DskipTests
+
+// Adding the M2_REPO variable to the eclipse workspace
+mvn eclipse:configure-workspace -Declipse.workspace=path-to-eclipse-workspace-dir-for-sqoop-2
+
+// Eclipse project creation with optional parameters
+mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
+
+
+Alternatively, to manually add the M2_REPO classpath variable as the maven repository path in Eclipse: Window -> Preferences -> Java -> Build Path -> Classpath Variables -> click New -> in the dialog box, enter Name as M2_REPO and Path as $HOME/.m2/repository -> click OK.
+After the above maven commands execute successfully, import the Sqoop project modules into Eclipse: File -> Import -> General -> Existing Projects into Workspace -> click Next -> browse to the Sqoop 2 directory ($HOME/git/sqoop2) -> click OK. The Import dialog shows multiple projects (sqoop-client, sqoop-common, etc.) -> select all modules -> click Finish.
\ No newline at end of file




svn commit: r993940 [2/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,634 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+=============================
+Sqoop 2 Connector Development
+=============================
+
+This document describes how to implement a connector in Sqoop 2, using the code of one of the built-in connectors ( ``GenericJdbcConnector`` ) as a reference. Sqoop 2 jobs support extraction from and/or loading to different data sources. Sqoop 2 connectors encapsulate the job lifecycle operations for extracting and/or loading data from and/or to different data sources. Each connector primarily focuses on a particular data source and its custom implementation for optimally reading and/or writing data in a distributed environment.
+
+.. contents::
+
+What is a Sqoop Connector?
+++++++++++++++++++++++++++
+
+Connectors provide the facility to interact with many data sources and thus can be used as a means to transfer data between them in Sqoop. A connector implementation provides the logic to read from and/or write to the data source that it represents. For instance, the ``GenericJdbcConnector`` encapsulates the logic to read from and/or write to JDBC-enabled relational data sources. The connector part that enables reading from a data source and transferring this data to the internal Sqoop format is called FROM, and the part that enables writing data to a data source by transferring data from the Sqoop format is called TO. In order to interact with these data sources, the connector provides one or many config classes and input fields within them.
+
+Broadly, we support two main config types for connectors: link type, represented by the enum ``ConfigType.LINK``, and job type, represented by the enum ``ConfigType.JOB``. Link configs represent the properties needed to physically connect to the data source. Job configs represent the properties required to read from and/or write to a particular dataset in the data source the link connects to. If a connector supports both reading and writing, it provides both ``FromJobConfig`` and ``ToJobConfig`` objects. Each of these config objects is custom to each connector and can have one or more inputs associated with each of the Link, FromJob and ToJob config types. Hence we call the connectors configurables, i.e. entities that can provide configs for interacting with the data source they represent. As the connectors evolve over time to support new features in their data sources, the configs and inputs will change as well. Thus the connector API also provides methods for upgrading the config and input names and data related to these data sources across different versions.
+
+The connectors implement logic for the various stages of the extract/load process using the connector API described below. While extracting/reading data from the data source, the main stages are ``Initializer``, ``Partitioner``, ``Extractor`` and ``Destroyer``. While loading/writing data to the data source, the main stages currently supported are ``Initializer``, ``Loader`` and ``Destroyer``. Each stage has its unique set of responsibilities, explained in detail below. Since connectors understand the internals of the data source they represent, they work in tandem with the Sqoop-supported execution engines such as MapReduce or Spark (in the future) to accomplish this process in the most optimal way.
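As a toy illustration of the FROM-side stage flow, the sketch below uses plain Python functions as stand-ins; it is not the real connector API, which is described in the rest of this document.

```python
def run_from_side(records, num_partitions=2):
    """Walk through Initializer -> Partitioner -> Extractor -> Destroyer
    with simple stand-ins for the connector API stages."""
    # Initializer: prepare any shared state for the transfer
    state = {"ready": True}

    # Partitioner: split the source into independent slices so that the
    # execution engine (e.g. MapReduce) can extract them in parallel
    partitions = [records[i::num_partitions] for i in range(num_partitions)]

    # Extractor: read each partition and emit rows to Sqoop's internal format
    extracted = [row for part in partitions for row in part]

    # Destroyer: clean up once all partitions are done
    state["ready"] = False
    return extracted

print(sorted(run_from_side([1, 2, 3, 4])))  # [1, 2, 3, 4]
```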
+
+When do we add a new connector?
+===============================
+You add a new connector when you need to extract/read data from a new data source, or load/write
+data into a new data source that is not yet supported in Sqoop 2.
+In addition to the connector API, Sqoop 2 also has a submission and execution engine interface.
+At the moment the only supported engine is MapReduce, but we may support additional engines in the future, such as Spark.

svn commit: r993940 [1/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Author: buildbot
Date: Thu Jul 28 01:17:26 2016
New Revision: 993940

Log:
Staging update by buildbot for sqoop

Added:
websites/staging/sqoop/trunk/content/docs/1.99.7/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Installation.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Tools.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Upgrade.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ClientAPI.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/DevEnv.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/RESTAPI.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/Repository.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/index.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/API 
TLS-SSL.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/RepositoryEncryption.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/CommandLineClient.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Connectors.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Examples.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Sqoop5MinutesDemo.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-FTP.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-GenericJDBC.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-HDFS.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-SFTP.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/examples/

websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/examples/S3Import.txt
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/ajax-loader.gif   
(with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/basic.css
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment-bright.png 
  (with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment-close.png  
 (with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment.png   
(with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/badge_only.css
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/theme.css
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/doctools.js
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/down-pressed.png   
(with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/down.png   (with 
props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/file.png   (with 
props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/fonts/

websites/staging/sqoop/trunk/content/docs/1.99.7/_static/fonts/fontawesome-webfont.svg
   (with props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/jquery.js
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/modernizr.min.js
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/theme.js
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/minus.png   (with 
props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/plus.png   (with 
props)
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/pygments.css
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/searchtools.js

svn commit: r993940 [11/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/dev/RESTAPI.html
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/dev/RESTAPI.html (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/dev/RESTAPI.html Thu Jul 
28 01:17:26 2016
@@ -0,0 +1,2073 @@
+  3.5. Sqoop REST API Guide  Apache Sqoop  documentation
+1. 
Admin Guide
+2. 
User Guide
+3. Developer Guide
+3.1. Building Sqoop2 from source code
+3.2. Sqoop Java Client API Guide
+3.3. Sqoop 2 Connector Development
+3.4. 
Sqoop 2 Development Environment Setup
+3.5. Sqoop REST API Guide
+3.5.1. Initialization
+3.5.2. Understand Connector, 
Driver, Link and Job
+3.5.3. 
Objects
+3.5.3.1. Configs and Inputs
+3.5.3.2. Exception Response
+3.5.3.3. Config and Input 
Validation Status Response
+3.5.3.4. Job Submission Status 
Response
+
+
+3.5.4. Header Parameters
+3.5.5. 
REST APIs
+3.5.5.1. /version - [GET] - Get Sqoop 
Version
+3.5.5.2. /v1/connectors - [GET]  
Get all Connectors
+3.5.5.3. /v1/connector/[cname] - 
[GET] - Get Connector
+3.5.5.4. /v1/driver - [GET]- Get Sqoop 
Driver
+3.5.5.5. /v1/links/ - [GET]  Get all 
links
+3.5.5.6. 
/v1/links?cname=[cname] - [GET]  Get all links by Connector
+3.5.5.7. /v1/link/[lname]  - [GET] - Get 
Link
+3.5.5.8. /v1/link - [POST] - Create 
Link
+3.5.5.9. /v1/link/[lname] - [PUT] - 
Update Link
+3.5.5.10. /v1/link/[lname]  - [DELETE] 
- Delete Link
+3.5.5.11. /v1/link/[lname]/enable  
- [PUT] - Enable Link
+3.5.5.12. 
/v1/link/[lname]/disable - [PUT] - Disable Link
+3.5.5.13. /v1/jobs/ - [GET]  Get all 
jobs
+3.5.5.14. 
/v1/jobs?cname=[cname] - [GET]  Get all jobs by connector
+3.5.5.15. /v1/job/[jname] - [GET] - Get 
Job
+3.5.5.16. /v1/job - [POST] - Create Job
+3.5.5.17. /v1/job/[jname] - [PUT] - Update 
Job
+3.5.5.18. /v1/job/[jname] - [DELETE] - 
Delete Job
+3.5.5.19. /v1/job/[jname]/enable - 
[PUT] - Enable Job
+3.5.5.20. /v1/job/[jname]/disable 
- [PUT] - Disable Job
+3.5.5.21. /v1/job/[jname]/start - 
[PUT]- Start Job
+3.5.5.22. /v1/job/[jname]/stop  - [PUT]- 
Stop Job
+3.5.5.23. /v1/job/[jname]/status 
 - [GET]- Get Job Status
+3.5.5.24. /v1/submissions? - 
[GET] - Get all job Submissions
+3.5.5.25. 
/v1/submissions?jname=[jname] - [GET] - Get Submissions by Job
+3.5.5.26. 
/v1/authorization/roles/create - [POST] - Create Role
+3.5.5.27. 
/v1/authorization/role/[role-name]  - [DELETE] - Delete Role
+3.5.5.28.
 
/v1/authorization/roles?principal_type=[principal-type]principal_name=[principal-name]
 - [GET]  Get all Roles by Principal
+3.5.5.29.
 /v1/authorization/principals?role_name=[rname] - [GET]  Get all Principals by 
Role
+3.5.5.30. 
/v1/authorization/roles/grant - [PUT] - Grant a Role to a Principal
+3.5.5.31.
 /v1/authorization/roles/revoke - [PUT] - Revoke a Role from a 
Principal
+3.5.5.32.
 /v1/authorization/privileges/grant - [PUT] - Grant a Privilege to a 
Principal
+3.5.5.33.
 /v1/authorization/privileges/revoke - [PUT] - Revoke a Privilege to a 
Principal
+3.5.5.34.
 
/v1/authorization/privilieges?principal_type=[principal-type]principal_name=[principal-name]resource_type=[resource-type]resource_name=[resource-name]
 - [GET]  Get all Roles by Principal (and Resource)
+
+
+
+
+3.6. Repository
+
+
+4. Security Guide
+
+
+3.5. Sqoop REST API Guide
+This document explains how you can use the Sqoop REST API to build applications that interact with the Sqoop server.
+The REST API covers all aspects of managing Sqoop jobs and allows you to build an app in any programming language, using HTTP with JSON payloads.
+Table of Contents
+
+Sqoop 
REST API Guide
+Initialization
+Understand Connector, 
Driver, Link and Job
+Objects
+Configs 
and Inputs
+Exception Response
+Config and Input 
Validation Status Response
+Job Submission Status Response
+
+
+Header 
Parameters
+REST APIs
+/version - [GET] - Get Sqoop Version
+/v1/connectors - [GET]  Get all Connectors
+/v1/connector/[cname] - [GET] - Get Connector
+/v1/driver - [GET]- Get Sqoop Driver
+/v1/links/ - [GET]  Get all links

svn commit: r993940 [7/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/theme.js
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/theme.js (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/theme.js Thu 
Jul 28 01:17:26 2016
@@ -0,0 +1,153 @@

svn commit: r993940 [14/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/CommandLineClient.html
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/CommandLineClient.html 
(added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/CommandLineClient.html 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,1020 @@
+  2.1. Command Line Shell  Apache Sqoop  documentation
+2.1. Command Line Shell
+Sqoop 2 provides a command line shell that is capable of communicating with the Sqoop 2 server over its REST interface. The client is able to run in two modes - interactive and batch mode. The create, update and clone commands are not currently supported in batch mode; interactive mode supports all available commands.
+You can start the Sqoop 2 client in interactive mode using the command sqoop2-shell:
+sqoop2-shell
+
+
+Batch mode can be started by adding an additional argument representing the path to your Sqoop client script:
+sqoop2-shell 
/path/to/your/script.sqoop
+
+
+The Sqoop client script is expected to contain valid Sqoop client commands, empty lines, and lines starting with #, which denote comments. Comments and empty lines are ignored; all other lines are interpreted. Example script:
+# Specify company 
server
+set server --host sqoop2.company.net
+
+# Executing given job
+start job --name 1
+
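The comment and empty-line handling described above can be sketched in Python. This is an illustrative parser, not part of Sqoop; the function name `parse_script` is hypothetical:

```python
def parse_script(text):
    """Return the executable lines of a Sqoop client batch script.

    Lines starting with '#' denote comments; they and empty lines are
    ignored, all other lines are interpreted.
    """
    commands = []
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip comments and empty lines
        commands.append(stripped)
    return commands

script = """# Specify company server
set server --host sqoop2.company.net

# Executing given job
start job --name 1
"""
commands = parse_script(script)
print(commands)
```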
+
+
+Table of Contents
+
+Command 
Line Shell
+Resource 
file
+Commands
+Auxiliary Commands
+Set 
Command
+Set 
Server Function
+Set 
Option Function
+
+
+Show 
Command
+Show 
Server Function
+Show 
Option Function
+Show 
Version Function
+Show Connector Function
+Show 
Driver Function
+Show 
Link Function
+Show Job 
Function
+Show Submission Function
+
+
+Create 
Command
+Create Link Function
+Create 
Job Function
+
+
+Update 
Command
+Update Link Function
+Update 
Job Function
+
+
+Delete 
Command
+Delete Link Function
+Delete 
Job Function
+
+
+Clone 
Command
+Clone 
Link Function
+Clone 
Job Function
+
+
+Start 
Command
+Start 
Job Function
+
+
+Stop 
Command
+Stop Job 
Function
+
+
+Status 
Command
+Status 
Job Function
+
+
+
+
+
+
+
+
+
+2.1.1. Resource file¶
+The Sqoop 2 client has the ability to load resource files, similarly to other command line tools. At the beginning of execution, the Sqoop client will check for the existence of the file .sqoop2rc in the home directory of the currently logged-in user. If such a file exists, it will be interpreted before any additional actions. This file is loaded in both interactive and batch mode. It can be used to execute any batch-compatible commands.
+Example resource file:
+# Configure our Sqoop 
2 server automatically
+set server --host sqoop2.company.net
+
+# Run in verbose mode by default
+set option --name verbose --value true
+
+
+
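The resource-file lookup described above can be sketched as follows. This is an illustrative approximation of the client behaviour, not Sqoop code; the function name `load_resource_file` is hypothetical:

```python
import os

def load_resource_file(home=None):
    """Return the commands from .sqoop2rc in the given home directory.

    Mirrors the documented behaviour: the file is looked up in the home
    directory of the current user and, if present, interpreted before any
    additional actions; if absent, nothing happens.
    """
    home = home or os.path.expanduser("~")
    path = os.path.join(home, ".sqoop2rc")
    if not os.path.exists(path):
        return []
    with open(path) as f:
        # same comment/empty-line rules as batch scripts
        return [ln.strip() for ln in f
                if ln.strip() and not ln.lstrip().startswith("#")]
```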
+
+2.1.2. Commands¶
+Sqoop 2 contains several commands that will be documented in this section. Each command has one or more functions that accept various arguments. Not all commands are supported in both interactive and batch mode.
+
+2.1.2.1. Auxiliary Commands¶
+Auxiliary commands are commands that improve the user experience and run purely on the client side. Thus they do not need a working connection to the server.
+
+exit Exit the client immediately. This command can also be executed by sending an EOT (end of transmission) character. It's CTRL+D on most common Linux shells like Bash or Zsh.
+history Print out the command history. Please note that the Sqoop client saves history from previous executions, and thus you might see commands that you've executed in previous runs.
+help Show all 
available commands with short 

svn commit: r993940 [12/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/dev/Repository.html
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/dev/Repository.html (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/dev/Repository.html Thu 
Jul 28 01:17:26 2016
@@ -0,0 +1,725 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  3.6. Repository  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. 
Admin Guide
+2. 
User Guide
+3. Developer Guide
+3.1. Building Sqoop2 from source code
+3.2. Sqoop Java Client API Guide
+3.3. Sqoop 2 Connector Development
+3.4. 
Sqoop 2 Development Environment Setup
+3.5. 
Sqoop REST API Guide
+3.6. Repository
+3.6.1. Sqoop Schema
+3.6.1.1. SQ_SYSTEM
+3.6.1.2. SQ_DIRECTION
+3.6.1.3. SQ_CONFIGURABLE
+3.6.1.4. SQ_CONNECTOR_DIRECTIONS
+3.6.1.5. SQ_CONFIG
+3.6.1.6. SQ_CONFIG_DIRECTIONS
+3.6.1.7. 
SQ_INPUT
+3.6.1.8. 
SQ_LINK
+3.6.1.9. 
SQ_JOB
+3.6.1.10. SQ_LINK_INPUT
+3.6.1.11. SQ_JOB_INPUT
+3.6.1.12. SQ_SUBMISSION
+3.6.1.13. SQ_COUNTER_GROUP
+3.6.1.14. SQ_COUNTER
+3.6.1.15. SQ_COUNTER_SUBMISSION
+
+
+
+
+
+
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+3.6. Repository¶
+This repository contains additional information regarding Sqoop.
+
+3.6.1. Sqoop Schema¶
+The DDL queries that create the Sqoop repository schema in Derby database 
create the following tables:
+
+3.6.1.1. SQ_SYSTEM¶
+Store for various state information
+
+
+
+
+
+
+SQ_SYSTEM
+
+
+
+SQM_ID: BIGINT PK
+
+SQM_KEY: VARCHAR(64)
+
+SQM_VALUE: VARCHAR(64)
+
+
+
+
+
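The SQ_SYSTEM layout above can be approximated with a quick sqlite3 sketch. Note this is a loose type mapping for illustration; the actual repository schema is created with Derby DDL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Approximation of SQ_SYSTEM: a key/value store for state information
conn.execute("""
    CREATE TABLE SQ_SYSTEM (
        SQM_ID    INTEGER PRIMARY KEY,   -- BIGINT PK in Derby
        SQM_KEY   VARCHAR(64),
        SQM_VALUE VARCHAR(64)
    )
""")
conn.execute("INSERT INTO SQ_SYSTEM (SQM_KEY, SQM_VALUE) VALUES (?, ?)",
             ("version", "7"))
row = conn.execute("SELECT SQM_KEY, SQM_VALUE FROM SQ_SYSTEM").fetchone()
print(row)
```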
+
+3.6.1.2. SQ_DIRECTION¶
+Directions
+
+
+
+
+
+
+
+SQ_DIRECTION
+
+
+
+
+SQD_ID: BIGINT PK AUTO-GEN
+
+
+SQD_NAME: VARCHAR(64)
+FROM|TO
+
+
+
+
+
+
+3.6.1.3. SQ_CONFIGURABLE¶
+Configurable registration
+
+
+
+
+
+
+
+SQ_CONFIGURABLE
+
+
+
+
+SQC_ID: BIGINT PK AUTO-GEN
+
+
+SQC_NAME: VARCHAR(64)
+
+
+SQC_CLASS: VARCHAR(255)
+
+
+SQC_TYPE: VARCHAR(32)
+CONNECTOR|DRIVER
+
+SQC_VERSION: VARCHAR(64)
+
+
+
+
+
+
+
+3.6.1.4. SQ_CONNECTOR_DIRECTIONS¶
+Connector directions
+
+
+
+
+
+
+
+SQ_CONNECTOR_DIRECTIONS
+
+
+
+
+SQCD_ID: BIGINT PK AUTO-GEN
+
+
+SQCD_CONNECTOR: BIGINT
+FK SQCD_CONNECTOR(SQC_ID)
+
+SQCD_DIRECTION: BIGINT
+FK SQCD_DIRECTION(SQD_ID)
+
+
+
+
+
+
+3.6.1.5. SQ_CONFIG¶
+Config details
+
+
+
+
+
+
+
+SQ_CONFIG
+
+
+
+
+SQ_CFG_ID: BIGINT PK AUTO-GEN
+
+
+SQ_CFG_CONNECTOR: BIGINT
+FK SQ_CFG_CONNECTOR(SQC_ID), NULL for driver
+
+SQ_CFG_NAME: VARCHAR(64)
+
+
+SQ_CFG_TYPE: VARCHAR(32)
+LINK|JOB
+
+SQ_CFG_INDEX: SMALLINT
+
+
+
+
+
+
+
+3.6.1.6. SQ_CONFIG_DIRECTIONS¶
+Connector directions
+
+
+
+
+
+
+
+SQ_CONNECTOR_DIRECTIONS
+
+
+
+
+SQCD_ID: BIGINT PK AUTO-GEN
+
+
+SQCD_CONFIG: BIGINT
+FK SQCD_CONFIG(SQ_CFG_ID)
+
+SQCD_DIRECTION: BIGINT
+FK SQCD_DIRECTION(SQD_ID)
+
+
+
+
+
+
+3.6.1.7. SQ_INPUT¶
+Input details
+
+
+
+
+
+
+
+SQ_INPUT
+
+
+
+
+SQI_ID: BIGINT PK AUTO-GEN
+
+
+SQI_NAME: VARCHAR(64)
+
+
+SQI_CONFIG: BIGINT
+FK SQ_CONFIG(SQ_CFG_ID)
+
+SQI_INDEX: SMALLINT
+
+
+SQI_TYPE: VARCHAR(32)
+STRING|MAP
+
+SQI_STRMASK: BOOLEAN
+
+
+SQI_STRLENGTH: SMALLINT
+
+
+SQI_ENUMVALS: VARCHAR(100)
+
+
+
+
+
+
+
+3.6.1.8. SQ_LINK¶
+Stored links
+
+
+
+
+
+
+
+SQ_LINK
+
+
+
+
+SQ_LNK_ID: BIGINT PK AUTO-GEN
+
+
+SQ_LNK_NAME: VARCHAR(64)
+
+
+SQ_LNK_CONNECTOR: BIGINT
+FK SQ_CONNECTOR(SQC_ID)
+
+SQ_LNK_CREATION_USER: VARCHAR(32)
+
+
+SQ_LNK_CREATION_DATE: TIMESTAMP
+
+
+SQ_LNK_UPDATE_USER: VARCHAR(32)
+
+
+SQ_LNK_UPDATE_DATE: TIMESTAMP
+
+
+SQ_LNK_ENABLED: BOOLEAN
+
+
+
+
+
+
+
+3.6.1.9. SQ_JOB¶
+Stored jobs
+
+
+
+
+
+
+
+SQ_JOB
+
+
+
+
+SQB_ID: BIGINT PK AUTO-GEN
+
+
+SQB_NAME: VARCHAR(64)
+
+
+SQB_FROM_LINK: BIGINT
+FK SQ_LINK(SQ_LNK_ID)
+
+SQB_TO_LINK: BIGINT
+FK SQ_LINK(SQ_LNK_ID)
+
+SQB_CREATION_USER: VARCHAR(32)
+
+
+SQB_CREATION_DATE: TIMESTAMP
+
+
+SQB_UPDATE_USER: VARCHAR(32)
+
+
+SQB_UPDATE_DATE: TIMESTAMP
+
+
+SQB_ENABLED: BOOLEAN
+
+
+
+
+
+
+
+3.6.1.10. SQ_LINK_INPUT¶
+N:M relationship link and input
+
+
+
+
+
+
+
+SQ_LINK_INPUT
+
+
+
+
+SQ_LNKI_LINK: BIGINT PK
+FK SQ_LINK(SQ_LNK_ID)
+
+SQ_LNKI_INPUT: BIGINT PK
+FK SQ_INPUT(SQI_ID)
+
+SQ_LNKI_VALUE: LONG VARCHAR
+
+
+
+
+
+
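The N:M relationship between links and inputs can likewise be sketched with sqlite3 (loose type mapping, for illustration only; Derby is the actual repository database). SQ_LINK_INPUT uses a composite primary key over its two foreign keys:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE SQ_LINK  (SQ_LNK_ID INTEGER PRIMARY KEY, SQ_LNK_NAME VARCHAR(64));
    CREATE TABLE SQ_INPUT (SQI_ID    INTEGER PRIMARY KEY, SQI_NAME    VARCHAR(64));
    -- SQ_LINK_INPUT: composite PK over the two foreign keys
    CREATE TABLE SQ_LINK_INPUT (
        SQ_LNKI_LINK  INTEGER REFERENCES SQ_LINK(SQ_LNK_ID),
        SQ_LNKI_INPUT INTEGER REFERENCES SQ_INPUT(SQI_ID),
        SQ_LNKI_VALUE TEXT,               -- LONG VARCHAR in Derby
        PRIMARY KEY (SQ_LNKI_LINK, SQ_LNKI_INPUT)
    );
""")
conn.execute("INSERT INTO SQ_LINK VALUES (1, 'mysql-link')")
conn.execute("INSERT INTO SQ_INPUT VALUES (1, 'connectionString')")
conn.execute("INSERT INTO SQ_LINK_INPUT VALUES (1, 1, 'jdbc:mysql://localhost/my')")
value = conn.execute(
    "SELECT SQ_LNKI_VALUE FROM SQ_LINK_INPUT WHERE SQ_LNKI_LINK = 1").fetchone()[0]
print(value)
```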
+
+3.6.1.11. SQ_JOB_INPUT¶
+N:M relationship job and input

svn commit: r993940 [9/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/dev/ClientAPI.html
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/dev/ClientAPI.html (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/dev/ClientAPI.html Thu Jul 
28 01:17:26 2016
@@ -0,0 +1,534 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  3.2. Sqoop Java Client API Guide  Apache Sqoop  
documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. 
Admin Guide
+2. 
User Guide
+3. Developer Guide
+3.1. Building Sqoop2 from source code
+3.2. Sqoop Java Client API Guide
+3.2.1. 
Workflow
+3.2.2. Project Dependencies
+3.2.3. Initialization
+3.2.4. 
Link
+3.2.4.1. Save Link
+
+
+3.2.5. 
Job
+3.2.5.1. 
Save Job
+3.2.5.2. List of status codes
+3.2.5.3. View Error or Warning 
valdiation message
+3.2.5.4. Updating link and job
+
+
+3.2.6. 
Job Start
+3.2.7. Display Config and 
Input Names For Connector
+
+
+3.3. Sqoop 2 Connector Development
+3.4. 
Sqoop 2 Development Environment Setup
+3.5. 
Sqoop REST API Guide
+3.6. Repository
+
+
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+3.2. Sqoop Java Client API Guide¶
+This document will explain how to use the Sqoop Java Client API with an external application. The Client API allows you to execute the functions of sqoop commands. It requires the Sqoop Client JAR and its dependencies.
+The main class that provides wrapper methods for all the supported 
operations is the
+public class 
SqoopClient {
+  ...
+}
+
+
+The Java Client API is explained using the Generic JDBC Connector as an example. Before executing the application using the Sqoop client API, check whether the Sqoop server is running.
+
+3.2.1. Workflow¶
+The following workflow has to be followed for executing a sqoop job on the Sqoop server.
+
+
+Create a LINK object for a given connector name - creates a Link object and returns it
+Create a JOB for a given from and to link name - creates a Job object and returns it
+Start the JOB for a given job name - starts the Job on the server and creates a submission record
+
+
+
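The same three workflow steps also map onto REST endpoints documented in the Sqoop REST API Guide (POST /v1/link, POST /v1/job, PUT /v1/job/[jname]/start). A hedged Python sketch building those requests without sending them - a live Sqoop server would be needed for a real run, and the empty JSON bodies here are placeholders:

```python
from urllib.request import Request

BASE = "http://localhost:12000/sqoop"  # assumed default host and port

def create_link_request():
    # POST /v1/link - create a link
    return Request(BASE + "/v1/link", data=b"{}", method="POST")

def create_job_request():
    # POST /v1/job - create a job
    return Request(BASE + "/v1/job", data=b"{}", method="POST")

def start_job_request(jname):
    # PUT /v1/job/[jname]/start - start a job
    return Request(f"{BASE}/v1/job/{jname}/start", method="PUT")

req = start_job_request("demo-job")
print(req.get_method(), req.full_url)
```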
+
+3.2.2. Project Dependencies¶
+The required Maven dependency is given below:
+<dependency>
+  <groupId>org.apache.sqoop</groupId>
+  <artifactId>sqoop-client</artifactId>
+  <version>${requestedVersion}</version>
+</dependency>
+
+
+
+
+3.2.3. Initialization¶
+First, initialize the SqoopClient class with the server URL as an argument.
+String url = "http://localhost:12000/sqoop/";
+SqoopClient client = new SqoopClient(url);
+
+
+The server URL value can be modified by passing the new value to the setServerUrl(String) method:
+client.setServerUrl(newUrl);
+
+
+
+
+3.2.4. Link¶
+Connectors provide the facility to interact with many data sources and thus 
can be used as a means to transfer data between them in Sqoop. The registered 
connector implementation will provide logic to read from and/or write to a data 
source that it represents. A connector can have one or more links associated 
with it. The java client API allows you to create, update and delete a link for 
any registered connector. Creating or updating a link requires you to populate 
the Link Config for that particular connector. Hence the first thing to do is 
get the list of registered connectors and select the connector for which you 
would like to create a link. Then
+you can get the list of all the config/inputs using Display Config 
and Input Names For Connector for that connector.
+
+3.2.4.1. Save Link¶
+First create a new link by invoking createLink(connectorName) method with connector name 
and it returns a MLink object with dummy id and the unfilled link config inputs 
for that connector. Then fill the config inputs with relevant values. Invoke 
saveLink passing it 
the filled MLink object.
+// create a placeholder for link
+MLink link = client.createLink(connectorName);
+link.setName("Vampire");
+link.setCreationUser("Buffy");
+MLinkConfig linkConfig = link.getConnectorLinkConfig();
+// fill in the link config values
+linkConfig.getStringInput("linkConfig.connectionString").setValue("jdbc:mysql://localhost/my");

svn commit: r993940 [8/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/1

2016-07-27 Thread buildbot
Added: websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Installation.html
==
--- websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Installation.html 
(added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Installation.html 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,358 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  1.1. Installation  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. Admin Guide
+1.1. Installation
+1.1.1. Server installation
+1.1.1.1. Hadoop dependencies
+1.1.1.2. Hadoop configuration
+1.1.1.3. Third party jars
+1.1.1.4. Configuring PATH
+1.1.1.5. Configuring Server
+1.1.1.6. Repository Initialization
+1.1.1.7. Server Life Cycle
+
+
+1.1.2. Client installation
+
+
+1.2. 
Tools
+1.3. 
Upgrade
+
+
+2. 
User Guide
+3. 
Developer Guide
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+1.1. Installation¶
+Sqoop ships as one binary package that incorporates two separate parts - 
client and server.
+
+Server You need to install the server on a single node in your cluster. This node will then serve as an entry point for all Sqoop clients.
+Client Clients can be installed on any number of machines.
+
+
+1.1.1. Server installation¶
+Copy the Sqoop artifact to the machine where you want to run Sqoop server. 
The Sqoop server acts as a Hadoop client, therefore Hadoop libraries (Yarn, 
Mapreduce, and HDFS jar files) and configuration files (core-site.xml, mapreduce-site.xml, ...) must be 
available on this node. You do not need to run any Hadoop related services - 
running the server on a gateway node is perfectly fine.
+You should be able to list HDFS contents, for example:
+hadoop dfs -ls
+
+
+Sqoop currently supports Hadoop version 2.6.0 or later. To install the Sqoop server, decompress the tarball (in a location of your choosing) and set the newly created folder as your working directory.
+# Decompress Sqoop distribution tarball
+tar -xvf sqoop-<version>-bin-hadoop<hadoop-version>.tar.gz
+
+# Move decompressed content to any location
+mv sqoop-<version>-bin-hadoop<hadoop-version> /usr/lib/sqoop
+
+# Change working directory
+cd /usr/lib/sqoop
+
+
+
+1.1.1.1. Hadoop dependencies¶
+The Sqoop server needs the following environment variables pointing at Hadoop libraries - $HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_MAPRED_HOME and $HADOOP_YARN_HOME. You have to make sure that those variables are defined and point to a valid Hadoop installation. The Sqoop server will not start if the Hadoop libraries can't be found.
+The Sqoop server uses environment variables to find Hadoop libraries. If the environment variable $HADOOP_HOME is set, Sqoop will look for jars in the following locations: $HADOOP_HOME/share/hadoop/common, $HADOOP_HOME/share/hadoop/hdfs, $HADOOP_HOME/share/hadoop/mapreduce and $HADOOP_HOME/share/hadoop/yarn. You can specify where the Sqoop server should look for the common, hdfs, mapreduce, and yarn jars independently with the $HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_MAPRED_HOME and $HADOOP_YARN_HOME environment variables.
+# 
Export HADOOP_HOME variable
+export HADOOP_HOME=/...
+
+# Or alternatively HADOOP_*_HOME variables
+export HADOOP_COMMON_HOME=/...
+export HADOOP_HDFS_HOME=/...
+export HADOOP_MAPRED_HOME=/...
+export HADOOP_YARN_HOME=/...
+
+
+
+Note
+If the environment variable $HADOOP_HOME is set, Sqoop will use the following locations: $HADOOP_HOME/share/hadoop/common, $HADOOP_HOME/share/hadoop/hdfs, $HADOOP_HOME/share/hadoop/mapreduce and $HADOOP_HOME/share/hadoop/yarn.
+
+
+
+1.1.1.2. Hadoop configuration¶
+The Sqoop server will need to impersonate users to access HDFS and other resources in or outside of the cluster as the user who started a given job, rather than the user who is running the server. You need to configure Hadoop to explicitly allow this impersonation via the so-called proxyuser system. You need to create two properties in the core-site.xml file - hadoop.proxyuser.$SERVER_USER.hosts and hadoop.proxyuser.$SERVER_USER.groups, where $SERVER_USER is the user who will be running the Sqoop 2 server. In most scenarios configuring * is
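The two proxyuser property names follow a simple pattern; a small sketch generating them for a given server user (the default values of * shown here are an assumption - the right hosts/groups values are site-specific):

```python
def proxyuser_properties(server_user, hosts="*", groups="*"):
    """Build the two core-site.xml proxyuser property names for $SERVER_USER."""
    return {
        f"hadoop.proxyuser.{server_user}.hosts": hosts,
        f"hadoop.proxyuser.{server_user}.groups": groups,
    }

props = proxyuser_properties("sqoop2")
for name, value in sorted(props.items()):
    print(name, "=", value)
```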

svn commit: r993940 [15/16] - in /websites/staging/sqoop/trunk/content: ./ docs/1.99.7/ docs/1.99.7/_sources/ docs/1.99.7/_sources/admin/ docs/1.99.7/_sources/dev/ docs/1.99.7/_sources/security/ docs/

2016-07-27 Thread buildbot
Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-FTP.html
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-FTP.html
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-FTP.html
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,335 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  2.2.1. FTP Connector  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. Admin Guide
+2. User Guide
+2.1. Command Line Shell
+2.2. Connectors
+2.2.1. FTP Connector
+2.2.1.1. 
Usage
+2.2.1.2. 
Loader
+
+
+2.2.2. Generic JDBC Connector
+2.2.3. HDFS Connector
+2.2.4. Kafka Connector
+2.2.5. Kite Connector
+2.2.6. SFTP Connector
+
+
+2.3. Examples
+2.4. Sqoop 5 Minutes Demo
+
+
+3. 
Developer Guide
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+2.2.1. FTP Connector¶
+The FTP connector supports moving data between an FTP server and other 
supported Sqoop2 connectors.
+Currently only the TO direction is supported to write records to an FTP 
server. A FROM connector is pending (SQOOP-2127).
+
+Contents
+
+FTP 
Connector
+Usage
+Link Configuration
+TO Job Configuration
+
+
+Loader
+
+
+
+
+
+2.2.1.1. Usage¶
+To use the FTP Connector, create a link for the connector and a job that 
uses the link.
+
+2.2.1.1.1. Link 
Configuration¶
+Inputs associated with the link configuration include:
+
+
+
+
+
+
+
+
+Input
+Type
+Description
+Example
+
+
+
+FTP server hostname
+String
+Hostname for the FTP server.
+Required.
+ftp.example.com
+
+FTP server port
+Integer
+Port number for the FTP server. Defaults to 21.
+Optional.
+2100
+
+Username
+String
+The username to provide when connecting to the FTP server.
+Required.
+sqoop
+
+Password
+String
+The password to provide when connecting to the FTP server.
+Required
+sqoop
+
+
+
+
+2.2.1.1.1.1. Notes¶
+
+The FTP connector will attempt to connect to the FTP server as part of the link validation process. If for some reason a connection cannot be established, you'll see a corresponding warning message.
+
+
+
+
+2.2.1.1.2. TO Job 
Configuration¶
+Inputs associated with the Job configuration for the TO direction 
include:
+
+
+
+
+
+
+
+
+Input
+Type
+Description
+Example
+
+
+
+Output directory
+String
+The location on the FTP server that the connector will write files to.
+Required
+uploads
+
+
+
+
+2.2.1.1.2.1. Notes¶
+
+The output directory value needs to be an existing directory on 
the FTP server.
+
+
+
+
+
+2.2.1.2. Loader¶
+During the loading phase, the connector will create uniquely named 
files in the output directory for each partition of data received from 
the FROM connector.
+
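The document does not specify the naming scheme; one way to guarantee uniquely named files per partition is a UUID-based sketch. This is illustrative only, not the connector's actual implementation:

```python
import uuid

def partition_filename(prefix="part"):
    # One uniquely named output file per partition of incoming data
    return f"{prefix}-{uuid.uuid4().hex}.txt"

# Simulate five partitions arriving from the FROM connector
names = {partition_filename() for _ in range(5)}
print(len(names))
```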
+
+
+
+ Copyright 2009-2016 The Apache Software Foundation.
+
\ No newline at end of file

Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
==
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,500 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  2.2.2. Generic JDBC Connector  Apache Sqoop  
documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+

svn commit: r1754350 [12/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/dev/RESTAPI.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/dev/RESTAPI.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/dev/RESTAPI.html (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/dev/RESTAPI.html Thu Jul 28 
01:16:05 2016
@@ -0,0 +1,2073 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  3.5. Sqoop REST API Guide  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. 
Admin Guide
+2. 
User Guide
+3. Developer Guide
+3.1. Building Sqoop2 from source code
+3.2. Sqoop Java Client API Guide
+3.3. Sqoop 2 Connector Development
+3.4. 
Sqoop 2 Development Environment Setup
+3.5. Sqoop REST API Guide
+3.5.1. Initialization
+3.5.2. Understand Connector, 
Driver, Link and Job
+3.5.3. 
Objects
+3.5.3.1. Configs and Inputs
+3.5.3.2. Exception Response
+3.5.3.3. Config and Input 
Validation Status Response
+3.5.3.4. Job Submission Status 
Response
+
+
+3.5.4. Header Parameters
+3.5.5. 
REST APIs
+3.5.5.1. /version - [GET] - Get Sqoop 
Version
+3.5.5.2. /v1/connectors - [GET]  
Get all Connectors
+3.5.5.3. /v1/connector/[cname] - 
[GET] - Get Connector
+3.5.5.4. /v1/driver - [GET]- Get Sqoop 
Driver
+3.5.5.5. /v1/links/ - [GET]  Get all 
links
+3.5.5.6. 
/v1/links?cname=[cname] - [GET]  Get all links by Connector
+3.5.5.7. /v1/link/[lname]  - [GET] - Get 
Link
+3.5.5.8. /v1/link - [POST] - Create 
Link
+3.5.5.9. /v1/link/[lname] - [PUT] - 
Update Link
+3.5.5.10. /v1/link/[lname]  - [DELETE] 
- Delete Link
+3.5.5.11. /v1/link/[lname]/enable  
- [PUT] - Enable Link
+3.5.5.12. 
/v1/link/[lname]/disable - [PUT] - Disable Link
+3.5.5.13. /v1/jobs/ - [GET]  Get all 
jobs
+3.5.5.14. 
/v1/jobs?cname=[cname] - [GET]  Get all jobs by connector
+3.5.5.15. /v1/job/[jname] - [GET] - Get 
Job
+3.5.5.16. /v1/job - [POST] - Create Job
+3.5.5.17. /v1/job/[jname] - [PUT] - Update 
Job
+3.5.5.18. /v1/job/[jname] - [DELETE] - 
Delete Job
+3.5.5.19. /v1/job/[jname]/enable - 
[PUT] - Enable Job
+3.5.5.20. /v1/job/[jname]/disable 
- [PUT] - Disable Job
+3.5.5.21. /v1/job/[jname]/start - 
[PUT]- Start Job
+3.5.5.22. /v1/job/[jname]/stop  - [PUT]- 
Stop Job
+3.5.5.23. /v1/job/[jname]/status 
 - [GET]- Get Job Status
+3.5.5.24. /v1/submissions? - 
[GET] - Get all job Submissions
+3.5.5.25. 
/v1/submissions?jname=[jname] - [GET] - Get Submissions by Job
+3.5.5.26. 
/v1/authorization/roles/create - [POST] - Create Role
+3.5.5.27. 
/v1/authorization/role/[role-name]  - [DELETE] - Delete Role
+3.5.5.28.
 
/v1/authorization/roles?principal_type=[principal-type]principal_name=[principal-name]
 - [GET]  Get all Roles by Principal
+3.5.5.29.
 /v1/authorization/principals?role_name=[rname] - [GET]  Get all Principals by 
Role
+3.5.5.30. 
/v1/authorization/roles/grant - [PUT] - Grant a Role to a Principal
+3.5.5.31.
 /v1/authorization/roles/revoke - [PUT] - Revoke a Role from a 
Principal
+3.5.5.32.
 /v1/authorization/privileges/grant - [PUT] - Grant a Privilege to a 
Principal
+3.5.5.33.
 /v1/authorization/privileges/revoke - [PUT] - Revoke a Privilege to a 
Principal
+3.5.5.34.
 
/v1/authorization/privilieges?principal_type=[principal-type]principal_name=[principal-name]resource_type=[resource-type]resource_name=[resource-name]
 - [GET]  Get all Roles by Principal (and Resource)
+
+
+
+
+3.6. Repository
+
+
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+3.5. Sqoop REST API Guide¶
+This document will explain how you can use the Sqoop REST API to build applications interacting with the Sqoop server.
+The REST API covers all aspects of managing Sqoop jobs and allows you to build an app in any programming language using HTTP over JSON.
+
+Table of Contents
+
+Sqoop 
REST API Guide
+Initialization
+Understand Connector, 
Driver, Link and Job
+Objects
+Configs 
and Inputs
+Exception Response
+Config and Input 
Validation Status Response
+Job Submission Status Response
+
+
+Header 
Parameters
+REST APIs
+/version - [GET] - Get Sqoop Version
+/v1/connectors - [GET]  Get all Connectors

svn commit: r1754350 [3/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/RESTAPI.txt
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/RESTAPI.txt?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/RESTAPI.txt 
(added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/RESTAPI.txt Thu 
Jul 28 01:16:05 2016
@@ -0,0 +1,1616 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+=
+Sqoop REST API Guide
+=
+
+This document will explain how you can use the Sqoop REST API to build applications interacting with the Sqoop server.
+The REST API covers all aspects of managing Sqoop jobs and allows you to build an app in any programming language using HTTP over JSON.
+
+.. contents:: Table of Contents
+
+Initialization
+=
+
+Before continuing further, make sure that the Sqoop server is running.
+
+The Sqoop 2 server exposes its REST API via Jetty. By default the server is 
accessible over HTTP but it can be configured to use HTTPS, please see: 
:ref:`apitlsssl` for more information. The endpoints are registered under the 
``/sqoop`` path and the port is configurable (the default is 12000). For 
example, if the host running the Sqoop 2 server is ``example.com`` and we are 
using the default port, we can reach the version endpoint by sending a GET 
request to:
+
+ ::
+
+http://example.com:12000/sqoop/v1/version
+
+
+Certain requests might need to contain some additional query parameters and 
post data. These parameters could be given via
+the HTTP headers, request body or both. All the content in the HTTP body is in 
``JSON`` format.
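A minimal sketch of addressing the version endpoint and decoding a JSON body follows. Calling the server would require a live Sqoop instance, so only URL construction and JSON decoding are shown, and the sample payload is illustrative rather than the exact server response:

```python
import json

def endpoint_url(host, path, port=12000):
    # Endpoints are registered under the /sqoop path; default port is 12000
    return f"http://{host}:{port}/sqoop/{path}"

url = endpoint_url("example.com", "v1/version")
print(url)

# Decoding a JSON body as a client would (sample payload, illustrative only)
body = '{"api-versions": ["v1"]}'
payload = json.loads(body)
print(payload["api-versions"])
```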
+
+Understand Connector, Driver, Link and Job
+===
+
+To create and run a Sqoop Job, we need to provide config values for connecting to a data source and then processing the data in that data source. Processing might be either reading from or writing to the data source. Thus we have configurable entities such as the ``From`` and ``To`` parts of the connectors and the driver, each of which exposes configs and one or more inputs within them.
+
+For instance a connector that represents a relational data source such as 
MySQL will expose config classes for connecting to the database. Some of the 
relevant inputs are the connection string, driver class, the username and the 
password to connect to the database. These configs remain the same to read data 
from any of the tables within that database. Hence they are grouped under 
``LinkConfiguration``.
+
+Each connector can support reading from a data source and/or writing to a data source it represents. Reading from and writing to a data source are represented by From and To respectively. Specific configurations are required to perform the job of reading from or writing to the data source. These are grouped in the
+``FromJobConfiguration`` and ``ToJobConfiguration`` objects of the connector.
+
+For instance, a connector that represents a relational data source such as 
MySQL will expose the table name to read from or the SQL query to use while 
reading data as a FromJobConfiguration. Similarly a connector that represents a 
data source such as HDFS, will expose the output directory to write to as a 
ToJobConfiguration.
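The grouping described above - link configs shared per data source, job configs split by direction - can be modelled roughly like this. This is an illustrative Python sketch; the real configuration objects are Java classes on the server, and the field names here are examples drawn from the surrounding text:

```python
from dataclasses import dataclass

@dataclass
class LinkConfiguration:
    # Shared connection settings, the same for any table in the database
    connection_string: str
    driver_class: str
    username: str
    password: str

@dataclass
class FromJobConfiguration:
    table_name: str = ""   # e.g. the table to read from
    sql: str = ""          # or a SQL query to use while reading

@dataclass
class ToJobConfiguration:
    output_directory: str = ""   # e.g. an HDFS directory to write to

link = LinkConfiguration("jdbc:mysql://localhost/my",
                         "com.mysql.jdbc.Driver", "sqoop", "secret")
job_from = FromJobConfiguration(table_name="employees")
print(link.username, job_from.table_name)
```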
+
+
+Objects
+==
+
+This section covers all the objects that might exist in an API request and/or 
API response.
+
+Configs and Inputs
+--
+
+Before creating any link for a connector or a job with associated ``From`` and 
``To`` links, the first thing to do is getting familiar with all the 
configurations that the connector exposes.
+
+Each config consists of the following information
+
++--+-+
+|   Field  | Description |
++==+=+
+| ``id``   | The id of this config   |
++--+-+
+| 

svn commit: r1754350 [16/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-FTP.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-FTP.html?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-FTP.html
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-FTP.html
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,335 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  2.2.1. FTP Connector  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. Admin Guide
+2. User Guide
+2.1. Command Line Shell
+2.2. Connectors
+2.2.1. FTP Connector
+2.2.1.1. 
Usage
+2.2.1.2. 
Loader
+
+
+2.2.2. Generic JDBC Connector
+2.2.3. HDFS Connector
+2.2.4. Kafka Connector
+2.2.5. Kite Connector
+2.2.6. SFTP Connector
+
+
+2.3. Examples
+2.4. Sqoop 5 Minutes Demo
+
+
+3. 
Developer Guide
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+Docs 
+  
+  2. User Guide 
+  
+  2.2. Connectors 
+  
+2.2.1. FTP Connector
+  
+
+  
+ View page source
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+2.2.1. FTP Connector¶
+The FTP connector supports moving data between an FTP server and other 
supported Sqoop2 connectors.
+Currently only the TO direction is supported to write records to an FTP 
server. A FROM connector is pending (SQOOP-2127).
+
+Contents
+
+FTP 
Connector
+Usage
+Link Configuration
+TO Job Configuration
+
+
+Loader
+
+
+
+
+
+2.2.1.1. Usage¶
+To use the FTP Connector, create a link for the connector and a job that 
uses the link.
+
+2.2.1.1.1. Link 
Configuration¶
+Inputs associated with the link configuration include:
+
+
++----------------------+---------+------------------------------------------------------------+-----------------+
+| Input                | Type    | Description                                                | Example         |
+======================+=========+============================================================+=================+
+| FTP server hostname  | String  | Hostname of the FTP server. Required.                      | ftp.example.com |
++----------------------+---------+------------------------------------------------------------+-----------------+
+| FTP server port      | Integer | Port number of the FTP server. Defaults to 21. Optional.   | 2100            |
++----------------------+---------+------------------------------------------------------------+-----------------+
+| Username             | String  | The username to provide when connecting to the FTP server. | sqoop           |
+|                      |         | Required.                                                  |                 |
++----------------------+---------+------------------------------------------------------------+-----------------+
+| Password             | String  | The password to provide when connecting to the FTP server. | sqoop           |
+|                      |         | Required.                                                  |                 |
++----------------------+---------+------------------------------------------------------------+-----------------+
+
+
+
+2.2.1.1.1.1. Notes¶
+
+The FTP connector will attempt to connect to the FTP server as part of the link validation process. If for some reason a connection cannot be established, you'll see a corresponding warning message.
+
+
+
+
+2.2.1.1.2. TO Job 
Configuration¶
+Inputs associated with the Job configuration for the TO direction 
include:
+
+
++------------------+--------+----------------------------------------------------------------------------------+---------+
+| Input            | Type   | Description                                                                      | Example |
+==================+========+==================================================================================+=========+
+| Output directory | String | The location on the FTP server that the connector will write files to. Required. | uploads |
++------------------+--------+----------------------------------------------------------------------------------+---------+
+
+
+
+
+
+2.2.1.1.2.1. Notes¶
+
+The output directory value needs to be an existing directory on 
the FTP server.
+
+
+
+
+
+2.2.1.2. Loader¶
+During the loading phase, the connector will create uniquely named 
files in the output directory for each partition of data received from 
the FROM connector.
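The per-partition loading behavior described above can be sketched as follows. This is an illustrative Python sketch of the idea only, not the connector's actual code; the UUID-based naming scheme shown here is an assumption, the real connector's naming details may differ.

```python
import uuid

def partition_file_names(output_dir, num_partitions, extension=".txt"):
    """One unique file name per partition: each partition of incoming
    records gets its own file in the output directory, so parallel
    writers never collide with one another."""
    return [
        "%s/%s%s" % (output_dir, uuid.uuid4().hex, extension)
        for _ in range(num_partitions)
    ]

# Four partitions of data -> four distinct files under 'uploads'
names = partition_file_names("uploads", 4)
```

Because every name is generated independently, no coordination between parallel loader tasks is needed.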
+
+
+
+
+   
+  
+  
+  
+
+  
+Next 
+  
+  
+ 
Previous
+  
+
+  
+
+  
+
+  
+
+ Copyright 2009-2016 The Apache Software Foundation.
+
+
+   
+
+
+
+
+  
+
+
+
+  
+  
+
+
+  
+
+
+var DOCUMENTATION_OPTIONS = {
+URL_ROOT:'../../',
+VERSION:'',
+COLLAPSE_INDEX:false,
+FILE_SUFFIX:'.html',
+HAS_SOURCE:  true
+};
+
+  
+  
+  
+
+  
+
+  
+  
+
+  
+
+  
+  
+  
+  jQuery(function () {
+  SphinxRtdTheme.StickyNav.enable();
+  });
+  
+   
+
+
+
\ No newline at end of file

Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-GenericJDBC.html?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,500 @@
+
+
+
+
+
+  

svn commit: r1754350 [10/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/dev/ClientAPI.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/dev/ClientAPI.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/dev/ClientAPI.html (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/dev/ClientAPI.html Thu Jul 
28 01:16:05 2016
@@ -0,0 +1,534 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  3.2. Sqoop Java Client API Guide  Apache Sqoop  
documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. 
Admin Guide
+2. 
User Guide
+3. Developer Guide
+3.1. Building Sqoop2 from source code
+3.2. Sqoop Java Client API Guide
+3.2.1. 
Workflow
+3.2.2. Project Dependencies
+3.2.3. Initialization
+3.2.4. 
Link
+3.2.4.1. Save Link
+
+
+3.2.5. 
Job
+3.2.5.1. 
Save Job
+3.2.5.2. List of status codes
+3.2.5.3. View Error or Warning validation message
+3.2.5.4. Updating link and job
+
+
+3.2.6. 
Job Start
+3.2.7. Display Config and 
Input Names For Connector
+
+
+3.3. Sqoop 2 Connector Development
+3.4. 
Sqoop 2 Development Environment Setup
+3.5. 
Sqoop REST API Guide
+3.6. Repository
+
+
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+Docs 
+  
+  3. Developer Guide 
+  
+3.2. Sqoop Java Client API Guide
+  
+
+  
+ View page 
source
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+3.2. Sqoop Java Client API Guide¶
+This document explains how to use the Sqoop Java Client API from an external application. The Client API allows you to execute the functions of sqoop commands programmatically. It requires the Sqoop Client JAR and its dependencies.
+The main class that provides wrapper methods for all the supported 
operations is the
+public class SqoopClient {
+  ...
+}
+
+
+The Java Client API is explained using the Generic JDBC Connector as an example. Before executing an application that uses the Sqoop Client API, check that the Sqoop server is running.
+
+3.2.1. Workflow¶
+The following workflow has to be followed for executing a sqoop job on the Sqoop server.
+
+
+Create a LINK object for a given connector name - creates a Link object and returns it
+Create a JOB for given from and to link names - creates a Job object and returns it
+Start the JOB for a given job name - starts the Job on the server and creates a submission record
+
+
+
+
+3.2.2. Project Dependencies¶
+The required Maven dependency:
+<dependency>
+  <groupId>org.apache.sqoop</groupId>
+  <artifactId>sqoop-client</artifactId>
+  <version>${requestedVersion}</version>
+</dependency>
+
+
+
+
+3.2.3. Initialization¶
+First, initialize the SqoopClient class with the server URL as its argument.
+String url = "http://localhost:12000/sqoop/";
+SqoopClient client = new SqoopClient(url);
+
+The server URL can be modified later by passing a new value to the setServerUrl(String) method:
+client.setServerUrl(newUrl);
+
+
+
+
+3.2.4. Link¶
+Connectors provide the facility to interact with many data sources and thus 
can be used as a means to transfer data between them in Sqoop. The registered 
connector implementation will provide logic to read from and/or write to a data 
source that it represents. A connector can have one or more links associated 
with it. The java client API allows you to create, update and delete a link for 
any registered connector. Creating or updating a link requires you to populate 
the Link Config for that particular connector. Hence the first thing to do is 
get the list of registered connectors and select the connector for which you 
would like to create a link. Then
+you can get the list of all the config/inputs using Display Config 
and Input Names For Connector for that connector.
+
+3.2.4.1. Save Link¶
+First, create a new link by invoking the createLink(connectorName) method with the connector name; it returns an MLink object with a dummy id and the unfilled link config inputs for that connector. Then fill the config inputs with relevant values, and invoke saveLink passing it the filled MLink object.
+// create a placeholder for link
+MLink link = client.createLink(connectorName);
+link.setName("Vampire");
+link.setCreationUser("Buffy");
+MLinkConfig linkConfig = link.getConnectorLinkConfig();
+// fill in the link config values

svn commit: r1754350 [1/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Author: afine
Date: Thu Jul 28 01:16:05 2016
New Revision: 1754350

URL: http://svn.apache.org/viewvc?rev=1754350=rev
Log:
Adding 1.99.7 release

Added:
sqoop/site/trunk/content/resources/docs/1.99.7/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/admin/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/admin.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/admin/Installation.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/admin/Tools.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/admin/Upgrade.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ClientAPI.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/DevEnv.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/RESTAPI.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/Repository.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/index.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/API 
TLS-SSL.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/RepositoryEncryption.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/CommandLineClient.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/Connectors.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/Examples.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/Sqoop5MinutesDemo.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-FTP.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-GenericJDBC.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-HDFS.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-SFTP.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/examples/

sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/examples/S3Import.txt
sqoop/site/trunk/content/resources/docs/1.99.7/_static/
sqoop/site/trunk/content/resources/docs/1.99.7/_static/ajax-loader.gif   
(with props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/basic.css
sqoop/site/trunk/content/resources/docs/1.99.7/_static/comment-bright.png   
(with props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/comment-close.png   
(with props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/comment.png   (with 
props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/css/
sqoop/site/trunk/content/resources/docs/1.99.7/_static/css/badge_only.css
sqoop/site/trunk/content/resources/docs/1.99.7/_static/css/theme.css
sqoop/site/trunk/content/resources/docs/1.99.7/_static/doctools.js
sqoop/site/trunk/content/resources/docs/1.99.7/_static/down-pressed.png   
(with props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/down.png   (with 
props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/file.png   (with 
props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/fonts/

sqoop/site/trunk/content/resources/docs/1.99.7/_static/fonts/fontawesome-webfont.svg
sqoop/site/trunk/content/resources/docs/1.99.7/_static/jquery.js
sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/
sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/modernizr.min.js
sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/theme.js
sqoop/site/trunk/content/resources/docs/1.99.7/_static/minus.png   (with 
props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/plus.png   (with 
props)
sqoop/site/trunk/content/resources/docs/1.99.7/_static/pygments.css
sqoop/site/trunk/content/resources/docs/1.99.7/_static/searchtools.js
sqoop/site/trunk/content/resources/docs/1.99.7/_static/sqoop-logo.png   
(with props)

svn commit: r1754350 [4/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,239 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+
+Authentication and Authorization
+
+
+Most Hadoop components, such as HDFS, Yarn, Hive, etc., have security frameworks which support Simple, Kerberos and LDAP authentication. Currently Sqoop 2 provides two types of authentication: simple and kerberos. The authentication module is pluggable, so more authentication types can be added. Additionally, role based access control was introduced in Sqoop 1.99.6. We recommend using this capability in multi-tenant environments, so that malicious users cannot easily abuse the link and job objects you have created.
+
+Simple Authentication
+=
+
+Configuration
+-
+Modify Sqoop configuration file, normally in /conf/sqoop.properties.
+
+::
+
+  org.apache.sqoop.authentication.type=SIMPLE
+  
org.apache.sqoop.authentication.handler=org.apache.sqoop.security.authentication.SimpleAuthenticationHandler
+  org.apache.sqoop.anonymous=true
+
+-  Simple authentication is used by default. Commenting out authentication 
configuration will yield the use of simple authentication.
+
+Run command
+---
+Start Sqoop server as usual.
+
+::
+
+  /bin/sqoop.sh server start
+
+Start Sqoop client as usual.
+
+::
+
+  /bin/sqoop.sh client
+
+Kerberos Authentication
+===
+
+Kerberos is a computer network authentication protocol which works on the 
basis of 'tickets' to allow nodes communicating over a non-secure network to 
prove their identity to one another in a secure manner. Its designers aimed it 
primarily at a client–server model and it provides mutual 
authentication—both the user and the server verify each other's identity. 
Kerberos protocol messages are protected against eavesdropping and replay 
attacks.
+
+Dependency
+--
+Set up a KDC server. Skip this step if a KDC server already exists. It is difficult to cover every way Kerberos can be set up (e.g., there are cross-realm setups and multi-trust environments). This section describes how to set up the Sqoop principals with a local deployment of MIT Kerberos.
+
+-  All components which are Kerberos authenticated need one KDC server. If the current Hadoop cluster uses Kerberos authentication, there should already be a KDC server.
+-  If there is no KDC server, follow 
http://web.mit.edu/kerberos/krb5-devel/doc/admin/install_kdc.html to set up one.
+
+Configure Hadoop cluster to use Kerberos authentication.
+
+-  Authentication type should be set at the cluster level: all components must use the same authentication type, Kerberos or not. In other words, Sqoop with Kerberos authentication cannot communicate with other Hadoop components, such as HDFS, Yarn, Hive, etc., that are running without Kerberos authentication, and vice versa.
+-  How to set up a Hadoop cluster with Kerberos authentication is out of 
the scope of this document. Follow the related links like 
https://hadoop.apache.org/docs/r2.5.0/hadoop-project-dist/hadoop-common/SecureMode.html
+
+Create the keytab and principals for Sqoop 2 via kadmin on the command line.
+
+::
+
+  addprinc -randkey HTTP/<FQDN>@<REALM>
+  addprinc -randkey sqoop/<FQDN>@<REALM>
+  xst -k /home/kerberos/sqoop.keytab HTTP/<FQDN>@<REALM>
+  xst -k /home/kerberos/sqoop.keytab sqoop/<FQDN>@<REALM>
+
+-  The <FQDN> should be replaced by the FQDN of the server, which can be found via "hostname -f" on the command line.
+-  The <REALM> should be replaced by the realm name in the krb5.conf file generated when installing the KDC server in the former step.
+-  The principal HTTP/<FQDN>@<REALM> is used in communication between the Sqoop client and Sqoop server. Since Sqoop 

svn commit: r1754350 [17/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/user/examples/S3Import.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/user/examples/S3Import.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/user/examples/S3Import.html 
(added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/user/examples/S3Import.html 
Thu Jul 28 01:16:05 2016
@@ -0,0 +1,272 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  2.3.1. S3 Import to HDFS  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. Admin Guide
+2. User Guide
+2.1. Command Line Shell
+2.2. Connectors
+2.3. Examples
+2.3.1. S3 Import to HDFS
+2.3.1.1. 
Use case
+2.3.1.2. Configuration
+
+
+
+
+2.4. Sqoop 5 Minutes Demo
+
+
+3. 
Developer Guide
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+Docs 
+  
+  2. User Guide 
+  
+  2.3. Examples 
+  
+2.3.1. S3 Import to HDFS
+  
+
+  
+ View page source
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+2.3.1. S3 Import to HDFS¶
+
+Contents
+
+S3 Import 
to HDFS
+Use case
+Configuration
+
+
+
+
+This section contains a detailed description of an example use case: transferring data from S3 to HDFS.
+
+2.3.1.1. Use case¶
+You have a directory on S3 where some external process creates new text files. New files are added to this directory, but existing files are never altered; they are only removed after some period of time. Data from all new files needs to be transferred to a single HDFS directory. Preserving file names is not required, and multiple source files can be merged into a single file on HDFS.
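The incremental pattern behind this use case can be illustrated with a small sketch. This is plain Python, not Sqoop code: it only shows the idea behind the NEW_FILES mode used below, where a file seen in a previous run never needs to be transferred again because existing files are never altered.

```python
def select_new_files(current_listing, already_transferred):
    """Return the files that appeared since the last run, sorted.

    Because files in the source directory are append-only (never
    modified in place), comparing names against the set of files
    already transferred is enough to find what still needs copying.
    """
    return sorted(set(current_listing) - set(already_transferred))

seen = {"part-0001.txt"}
listing = ["part-0001.txt", "part-0002.txt", "part-0003.txt"]
to_transfer = select_new_files(listing, seen)  # only the two new files
```

On each periodic run, only the names not yet in the transferred set are picked up.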
+
+
+2.3.1.2. Configuration¶
+We will use the HDFS connector for both the From and To sides of the data transfer. In order to create a link for S3 you need the S3 bucket name and the S3 access and secret keys. Please follow the S3 documentation to retrieve the S3 credentials if you don't have them already.
+sqoop:000> create link -c hdfs-connector
+
+
+
+Our example uses s3link for the link name
+Specify the HDFS URI in the form s3a://$BUCKET_NAME, where $BUCKET_NAME is the name of the S3 bucket
+Use the Override configuration option and specify fs.s3a.access.key and fs.s3a.secret.key with your S3 access and secret key, respectively.
+
+The next step is to create a link for HDFS
+sqoop:000> create link -c hdfs-connector
+
+
+Our example uses hdfslink for the link name
+If your Sqoop server is running on a node that has the HDFS and MapReduce client configuration deployed, you can safely keep all options blank and use their defaults.
+With links for both HDFS and S3 in place, you can create a job that will transfer data from S3 to HDFS:
+sqoop:000> create job -f s3link -t hdfslink
+
+
+
+Our example uses s3import for the job name
+The Input directory should point to a directory inside your S3 bucket where new files are generated
+Make sure to choose the NEW_FILES mode for Incremental type
+The final destination for the imported files can be specified in Output directory
+Make sure to enable Append mode, so that Sqoop can upload newly created files to the same directory on HDFS
+Configure the remaining options as you see fit
+
+Finally, you can start the job by issuing the following command:
+sqoop:000> start job -j s3import
+
+
+You can run the s3import job periodically; only newly created files will be transferred.
+
+
+
+
+   
+  
+  
+  
+
+  
+Next 
+  
+  
+ 
Previous
+  
+
+  
+
+  
+
+  
+
+ Copyright 2009-2016 The Apache Software Foundation.
+
+
+   
+
+
+
+
+  
+
+
+
+  
+  
+
+
+  
+
+
+
+  
+  
+  
+
+  
+
+  
+  
+
+  
+
+  
+  
+  
+  jQuery(function () {
+  SphinxRtdTheme.StickyNav.enable();
+  });
+  
+   
+
+
+
\ No newline at end of file

Modified: sqoop/site/trunk/content/site.xml
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/site.xml?rev=1754350=1754349=1754350=diff

svn commit: r1754350 [2/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,634 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+=
+Sqoop 2 Connector Development
+=
+
+This document describes how to implement a connector in Sqoop 2 using the code sample from one of the built-in connectors ( ``GenericJdbcConnector`` ) as a reference. Sqoop 2 jobs support extraction from and/or loading to different data sources. Sqoop 2 connectors encapsulate the job lifecycle operations for extracting and/or loading data from and/or to different data sources. Each connector will primarily focus on a particular data source and its custom implementation for optimally reading and/or writing data in a distributed environment.
+
+.. contents::
+
+What is a Sqoop Connector?
+++
+
+Connectors provide the facility to interact with many data sources and thus can be used as a means to transfer data between them in Sqoop. The connector implementation will provide logic to read from and/or write to a data source that it represents. For instance, the ``GenericJdbcConnector`` encapsulates the logic to read from and/or write to JDBC-enabled relational data sources. The connector part that enables reading from a data source and transferring this data to the internal Sqoop format is called FROM, and the part that enables writing data to a data source by transferring data from the Sqoop format is called TO. In order to interact with these data sources, the connector will provide one or many config classes and input fields within them.
+
+Broadly we support two main config types for connectors: the link type, represented by the enum ``ConfigType.LINK``, and the job type, represented by the enum ``ConfigType.JOB``. Link configs represent the properties needed to physically connect to the data source. Job configs represent the properties required to invoke reading from and/or writing to a particular dataset in the data source it connects to. If a connector supports both reading from and writing to, it will provide the ``FromJobConfig`` and ``ToJobConfig`` objects. Each of these config objects is custom to each connector and can have one or more inputs associated with each of the Link, FromJob and ToJob config types. Hence we refer to connectors as configurables, i.e. entities that can provide configs for interacting with the data source they represent. As connectors evolve over time to support new features in their data sources, the configs and inputs will change as well. Thus the connector API also provides methods for upgrading the config and input names and data related to these data sources across different versions.
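As a loose illustration of the configurable and upgrade ideas just described, here is a toy sketch in plain Python. It is not the Java connector API; every class, config, and input name below is made up purely to show the shape: a connector exposes named configs with inputs, and an upgrade step carries old values forward when input names change between versions.

```python
LINK, JOB = "LINK", "JOB"

class ToyConnector:
    """Toy 'configurable': exposes LINK and JOB configs with inputs."""

    def __init__(self):
        self.configs = {
            LINK: {"host": None, "port": 21},       # link: how to connect
            JOB: {"outputDirectory": None},          # job: what to transfer
        }

    def upgrade(self, old_values, renames):
        """Carry values from an older config version into this one,
        applying renames (old input name -> new input name), which is
        conceptually what a connector's upgrade hook does."""
        for old_name, value in old_values.items():
            new_name = renames.get(old_name, old_name)
            for config in self.configs.values():
                if new_name in config:
                    config[new_name] = value

# An older version called the input 'outputDir'; the new one renamed it.
c = ToyConnector()
c.upgrade({"outputDir": "uploads"}, {"outputDir": "outputDirectory"})
```

The point is only the separation of concerns: link inputs describe the connection, job inputs describe the dataset, and upgrades keep user-entered values across renames.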
+
+The connectors implement logic for various stages of the extract/load process 
using the connector API described below. While extracting/reading data from the 
data-source the main stages are ``Initializer``, ``Partitioner``, ``Extractor`` 
and ``Destroyer``. While loading/writitng data to the data source the main 
stages currently supported are ``Initializer``, ``Loader`` and ``Destroyer``. 
Each stage has its unique set of responsibilities that are explained in detail 
below. Since connectors understand the internals of the data source they 
represent, they work in tandem with the sqoop supported execution engines such 
as MapReduce or Spark (in future) to accomplish this process in a most optimal 
way.
+
+When do we add a new connector?
+===
+You add a new connector when you need to extract/read data from a new data 
source, or load/write
+data into a new data source that is not supported yet in Sqoop 2.
+In addition to the connector API, Sqoop 2 also has a submission and execution engine 

svn commit: r1754350 [9/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/admin/Installation.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/admin/Installation.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/admin/Installation.html 
(added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/admin/Installation.html Thu 
Jul 28 01:16:05 2016
@@ -0,0 +1,358 @@
+
+
+
+
+
+  
+
+  
+  
+  
+  
+  1.1. Installation  Apache Sqoop  documentation
+  
+
+  
+  
+
+  
+
+  
+  
+
+
+  
+
+  
+  
+
+  
+
+  
+
+  
+
+
+
+ 
+
+  
+  
+
+
+
+
+
+  
+
+
+
+  
+
+  
+
+  
+ Apache Sqoop
+  
+
+  
+
+
+  
+  
+
+  
+
+
+  
+
+  
+
+  
+
+
+
+  
+
+
+  
+
+
+
+  
+
+
+
+1. Admin Guide
+1.1. Installation
+1.1.1. Server installation
+1.1.1.1. Hadoop dependencies
+1.1.1.2. Hadoop configuration
+1.1.1.3. Third party jars
+1.1.1.4. Configuring PATH
+1.1.1.5. Configuring Server
+1.1.1.6. Repository Initialization
+1.1.1.7. Server Life Cycle
+
+
+1.1.2. Client installation
+
+
+1.2. 
Tools
+1.3. 
Upgrade
+
+
+2. 
User Guide
+3. 
Developer Guide
+4. Security Guide
+
+
+
+  
+
+  
+
+
+
+
+  
+  
+
+Apache Sqoop
+  
+
+
+  
+  
+
+  
+
+
+
+
+
+
+  
+Docs 
+  
+  1. Admin Guide 
+  
+1.1. Installation
+  
+
+  
+ View 
page source
+  
+
+  
+  
+  
+
+  http://schema.org/Article;>
+   
+
+  
+1.1. Installation¶
+Sqoop ships as one binary package that incorporates two separate parts - 
client and server.
+
+Server: You need to install the server on a single node in your cluster. This node will then serve as an entry point for all Sqoop clients.
+Client: Clients can be installed on any number of machines.
+
+
+1.1.1. Server installation¶
+Copy the Sqoop artifact to the machine where you want to run Sqoop server. 
The Sqoop server acts as a Hadoop client, therefore Hadoop libraries (Yarn, 
Mapreduce, and HDFS jar files) and configuration files (core-site.xml, mapreduce-site.xml, ...) must be 
available on this node. You do not need to run any Hadoop related services - 
running the server on a gateway node is perfectly fine.
+For example, you should be able to list the contents of HDFS:
+hadoop dfs -ls
+
+
+Sqoop currently supports Hadoop version 2.6.0 or later. To install the Sqoop server, decompress the tarball (in a location of your choosing) and set the newly created folder as your working directory.
+# Decompress Sqoop distribution tarball
+tar -xvf sqoop-<version>-bin-hadoop<hadoop-version>.tar.gz
+
+# Move decompressed content to any location
+mv sqoop-<version>-bin-hadoop<hadoop-version> /usr/lib/sqoop
+
+# Change working directory
+cd /usr/lib/sqoop
+
+
+
+1.1.1.1. Hadoop dependencies¶
+The Sqoop server needs the following environment variables pointing at Hadoop libraries: $HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_MAPRED_HOME and $HADOOP_YARN_HOME. You have to make sure that those variables are defined and pointing to a valid Hadoop installation. The Sqoop server will not start if the Hadoop libraries cannot be found.
+The Sqoop server uses environment variables to find Hadoop libraries. If the environment variable $HADOOP_HOME is set, Sqoop will look for jars in the following locations: $HADOOP_HOME/share/hadoop/common, $HADOOP_HOME/share/hadoop/hdfs, $HADOOP_HOME/share/hadoop/mapreduce and $HADOOP_HOME/share/hadoop/yarn. You can specify where the Sqoop server should look for the common, hdfs, mapreduce, and yarn jars independently with the $HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_MAPRED_HOME and $HADOOP_YARN_HOME environment variables.
+# Export HADOOP_HOME variable
+export HADOOP_HOME=/...
+
+# Or alternatively HADOOP_*_HOME variables
+export HADOOP_COMMON_HOME=/...
+export HADOOP_HDFS_HOME=/...
+export HADOOP_MAPRED_HOME=/...
+export HADOOP_YARN_HOME=/...
+
+
+
+Note
+If the environment variable $HADOOP_HOME is set, Sqoop will use the following locations: $HADOOP_HOME/share/hadoop/common, $HADOOP_HOME/share/hadoop/hdfs, $HADOOP_HOME/share/hadoop/mapreduce and $HADOOP_HOME/share/hadoop/yarn.
+
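The jar lookup order described above can be sketched in a few lines. This is an illustrative Python sketch of the documented fallback behavior, not the server's actual startup code.

```python
def hadoop_lib_dirs(env):
    """Resolve the four Hadoop library directories the way the text
    describes: an explicitly set per-component variable wins, otherwise
    fall back to the matching subdirectory of $HADOOP_HOME, otherwise
    the component cannot be located at all."""
    components = {
        "HADOOP_COMMON_HOME": "share/hadoop/common",
        "HADOOP_HDFS_HOME": "share/hadoop/hdfs",
        "HADOOP_MAPRED_HOME": "share/hadoop/mapreduce",
        "HADOOP_YARN_HOME": "share/hadoop/yarn",
    }
    home = env.get("HADOOP_HOME")
    dirs = {}
    for var, subdir in components.items():
        if var in env:
            dirs[var] = env[var]          # explicit variable wins
        elif home:
            dirs[var] = home + "/" + subdir  # fall back to HADOOP_HOME
    return dirs
```

With only $HADOOP_HOME set, all four directories are derived from it; setting any per-component variable overrides just that component.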
+
+
+1.1.1.2. Hadoop configuration¶
+The Sqoop server will need to impersonate users to access HDFS and other resources in or outside of the cluster as the user who started a given job, rather than the user who is running the server. You need to configure Hadoop to explicitly allow this impersonation via the so-called proxyuser system. You need to create two properties in the core-site.xml file - hadoop.proxyuser.$SERVER_USER.hosts and 
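For reference, proxyuser entries in core-site.xml typically have this shape. The user name sqoop2 and the wildcard values below are placeholders only; substitute the user running your Sqoop server and tighten the values as appropriate. Per standard Hadoop proxyuser configuration, the hosts property is paired with a groups property:

```xml
<property>
  <name>hadoop.proxyuser.sqoop2.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.sqoop2.groups</name>
  <value>*</value>
</property>
```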

svn commit: r1754350 [7/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/_static/jquery.js
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_static/jquery.js?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/_static/jquery.js (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/_static/jquery.js Thu Jul 28 
01:16:05 2016
@@ -0,0 +1,154 @@
+/*!
+ * jQuery JavaScript Library v1.4.2
+ * http://jquery.com/
+ *
+ * Copyright 2010, John Resig
+ * Dual licensed under the MIT or GPL Version 2 licenses.
+ * http://jquery.org/license
+ *
+ * Includes Sizzle.js
+ * http://sizzlejs.com/
+ * Copyright 2010, The Dojo Foundation
+ * Released under the MIT, BSD, and GPL Licenses.
+ *
+ * Date: Sat Feb 13 22:33:48 2010 -0500
+ */

svn commit: r1754350 [13/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/dev/Repository.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/dev/Repository.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/dev/Repository.html (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/dev/Repository.html Thu Jul 
28 01:16:05 2016
@@ -0,0 +1,725 @@
+3.6. Repository¶
+This repository contains additional information regarding Sqoop.
+
+3.6.1. Sqoop Schema¶
+The DDL queries that create the Sqoop repository schema in the Derby database 
create the following tables:
+
+3.6.1.1. SQ_SYSTEM¶
+Store for various state information
+
+
+
+
+
+
+SQ_SYSTEM
+
+
+
+SQM_ID: BIGINT PK
+
+SQM_KEY: VARCHAR(64)
+
+SQM_VALUE: VARCHAR(64)
+
+
+
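As an illustration, the SQ_SYSTEM column list above corresponds to DDL along these lines (a sketch derived from the table, not the exact statement Sqoop's Derby scripts use):

```sql
-- Sketch of SQ_SYSTEM, from the column list above
CREATE TABLE SQ_SYSTEM (
  SQM_ID    BIGINT PRIMARY KEY,
  SQM_KEY   VARCHAR(64),
  SQM_VALUE VARCHAR(64)
);
```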
+
+
+
+3.6.1.2. SQ_DIRECTION¶
+Directions
+
+
+
+
+
+
+
+SQ_DIRECTION
+
+
+
+
+SQD_ID: BIGINT PK AUTO-GEN
+
+
+SQD_NAME: VARCHAR(64)
+FROM|TO
+
+
+
+
+
+
+3.6.1.3. SQ_CONFIGURABLE¶
+Configurable registration
+
+
+
+
+
+
+
+SQ_CONFIGURABLE
+
+
+
+
+SQC_ID: BIGINT PK AUTO-GEN
+
+
+SQC_NAME: VARCHAR(64)
+
+
+SQC_CLASS: VARCHAR(255)
+
+
+SQC_TYPE: VARCHAR(32)
+CONNECTOR|DRIVER
+
+SQC_VERSION: VARCHAR(64)
+
+
+
+
+
+
+
+3.6.1.4. SQ_CONNECTOR_DIRECTIONS¶
+Connector directions
+
+
+
+
+
+
+
+SQ_CONNECTOR_DIRECTIONS
+
+
+
+
+SQCD_ID: BIGINT PK AUTO-GEN
+
+
+SQCD_CONNECTOR: BIGINT
+FK SQCD_CONNECTOR(SQC_ID)
+
+SQCD_DIRECTION: BIGINT
+FK SQCD_DIRECTION(SQD_ID)
+
+
+
+
+
+
+3.6.1.5. SQ_CONFIG¶
+Config details
+
+
+
+
+
+
+
+SQ_CONFIG
+
+
+
+
+SQ_CFG_ID: BIGINT PK AUTO-GEN
+
+
+SQ_CFG_CONNECTOR: BIGINT
+FK SQ_CFG_CONNECTOR(SQC_ID), NULL for driver
+
+SQ_CFG_NAME: VARCHAR(64)
+
+
+SQ_CFG_TYPE: VARCHAR(32)
+LINK|JOB
+
+SQ_CFG_INDEX: SMALLINT
+
+
+
+
+
+
+
+3.6.1.6. SQ_CONFIG_DIRECTIONS¶
+Config directions
+
+
+
+
+
+
+
+SQ_CONFIG_DIRECTIONS
+
+
+
+
+SQCD_ID: BIGINT PK AUTO-GEN
+
+
+SQCD_CONFIG: BIGINT
+FK SQCD_CONFIG(SQ_CFG_ID)
+
+SQCD_DIRECTION: BIGINT
+FK SQCD_DIRECTION(SQD_ID)
+
+
+
+
+
+
+3.6.1.7. SQ_INPUT¶
+Input details
+
+
+
+
+
+
+
+SQ_INPUT
+
+
+
+
+SQI_ID: BIGINT PK AUTO-GEN
+
+
+SQI_NAME: VARCHAR(64)
+
+
+SQI_CONFIG: BIGINT
+FK SQ_CONFIG(SQ_CFG_ID)
+
+SQI_INDEX: SMALLINT
+
+
+SQI_TYPE: VARCHAR(32)
+STRING|MAP
+
+SQI_STRMASK: BOOLEAN
+
+
+SQI_STRLENGTH: SMALLINT
+
+
+SQI_ENUMVALS: VARCHAR(100)
+
+
+
+
+
+
+
+3.6.1.8. SQ_LINK¶
+Stored links
+
+
+
+
+
+
+
+SQ_LINK
+
+
+
+
+SQ_LNK_ID: BIGINT PK AUTO-GEN
+
+
+SQ_LNK_NAME: VARCHAR(64)
+
+
+SQ_LNK_CONNECTOR: BIGINT
+FK SQ_CONNECTOR(SQC_ID)
+
+SQ_LNK_CREATION_USER: VARCHAR(32)
+
+
+SQ_LNK_CREATION_DATE: TIMESTAMP
+
+
+SQ_LNK_UPDATE_USER: VARCHAR(32)
+
+
+SQ_LNK_UPDATE_DATE: TIMESTAMP
+
+
+SQ_LNK_ENABLED: BOOLEAN
+
+
+
+
+
+
+
+3.6.1.9. SQ_JOB¶
+Stored jobs
+
+
+
+
+
+
+
+SQ_JOB
+
+
+
+
+SQB_ID: BIGINT PK AUTO-GEN
+
+
+SQB_NAME: VARCHAR(64)
+
+
+SQB_FROM_LINK: BIGINT
+FK SQ_LINK(SQ_LNK_ID)
+
+SQB_TO_LINK: BIGINT
+FK SQ_LINK(SQ_LNK_ID)
+
+SQB_CREATION_USER: VARCHAR(32)
+
+
+SQB_CREATION_DATE: TIMESTAMP
+
+
+SQB_UPDATE_USER: VARCHAR(32)
+
+
+SQB_UPDATE_DATE: TIMESTAMP
+
+
+SQB_ENABLED: BOOLEAN
+
+
+
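The FROM/TO link references above can be sketched as DDL like this (illustrative only; Derby identity and constraint syntax may differ from Sqoop's actual scripts):

```sql
-- Sketch of SQ_JOB; both link columns reference SQ_LINK(SQ_LNK_ID)
CREATE TABLE SQ_JOB (
  SQB_ID            BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  SQB_NAME          VARCHAR(64),
  SQB_FROM_LINK     BIGINT REFERENCES SQ_LINK (SQ_LNK_ID),
  SQB_TO_LINK       BIGINT REFERENCES SQ_LINK (SQ_LNK_ID),
  SQB_CREATION_USER VARCHAR(32),
  SQB_CREATION_DATE TIMESTAMP,
  SQB_UPDATE_USER   VARCHAR(32),
  SQB_UPDATE_DATE   TIMESTAMP,
  SQB_ENABLED       BOOLEAN
);
```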
+
+
+
+
+3.6.1.10. SQ_LINK_INPUT¶
+N:M relationship link and input
+
+
+
+
+
+
+
+SQ_LINK_INPUT
+
+
+
+
+SQ_LNKI_LINK: BIGINT PK
+FK SQ_LINK(SQ_LNK_ID)
+
+SQ_LNKI_INPUT: BIGINT PK
+FK 

svn commit: r1754350 [15/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/user/CommandLineClient.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/user/CommandLineClient.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/user/CommandLineClient.html 
(added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/user/CommandLineClient.html 
Thu Jul 28 01:16:05 2016
@@ -0,0 +1,1020 @@
+2.1. Command Line Shell¶
+Sqoop 2 provides a command line shell that is capable of communicating with the 
Sqoop 2 server using its REST interface. The client can run in two modes: 
interactive and batch. The create, update and clone commands are not currently supported in batch mode; 
interactive mode supports all available commands.
+You can start the Sqoop 2 client in interactive mode using the command sqoop2-shell:
+sqoop2-shell
+
+
+Batch mode can be started by adding an additional argument: the path 
to your Sqoop client script:
+sqoop2-shell /path/to/your/script.sqoop
+
+
+A Sqoop client script is expected to contain valid Sqoop client commands, 
empty lines, and comment lines starting with #. Comments and empty 
lines are ignored; all other lines are interpreted. Example script:
+# Specify company server
+set server --host sqoop2.company.net
+
+# Executing given job
+start job --name 1
+
+
+
+
+2.1.1. Resource file¶
+The Sqoop 2 client can load resource files, similarly to other 
command line tools. At the beginning of execution, the Sqoop client checks for 
the file .sqoop2rc in the home directory of the currently logged-in user. 
If such a file exists, it is interpreted before any additional actions. This 
file is loaded in both interactive and batch mode, and it can be used to execute 
any batch-compatible commands.
+Example resource file:
+# Configure our Sqoop 2 server automatically
+set server --host sqoop2.company.net
+
+# Run in verbose mode by default
+set option --name verbose --value true
+
+
+
+
+2.1.2. Commands¶
+Sqoop 2 contains several commands that will be documented in this section. 
Each command has one or more functions that accept various arguments. Not 
all commands are supported in both interactive and batch mode.
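A small batch script combining a few of these commands might look like the following (a sketch; the show functions are taken from the contents listing above, and the host name is the example used earlier in this page):

```
# batch-compatible commands only
set server --host sqoop2.company.net
show version --all
show connector --all
```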
+
+2.1.2.1. Auxiliary Commands¶
+Auxiliary commands are commands that improve the user experience and run 
purely on the client side. Thus they do not need a working connection to the 
server.
+
+exit Exit the client immediately. This command can also be executed by sending the EOT (end of 
transmission) character. It's CTRL+D on most common Linux shells like 
Bash or Zsh.
+history Print out the command history. Please note that the Sqoop client saves history from 
previous executions and 

svn commit: r1754350 [14/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/security/AuthenticationAndAuthorization.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/security/AuthenticationAndAuthorization.html?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/security/AuthenticationAndAuthorization.html
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/security/AuthenticationAndAuthorization.html
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,438 @@
+4.2. Authentication and Authorization¶
+Most Hadoop components, such as HDFS, YARN, Hive, etc., have security 
frameworks which support Simple, Kerberos and LDAP authentication. Currently 
Sqoop 2 provides two types of authentication: simple and Kerberos. The 
authentication module is pluggable, so more authentication types can be added. 
Additionally, a new role-based access control was introduced in Sqoop 1.99.6. We 
recommend using this capability in multi-tenant environments, so that 
malicious users can't easily abuse your created link and job objects.
+
+4.2.1. Simple Authentication¶
+
+4.2.1.1. Configuration¶
+Modify the Sqoop configuration file, normally in Sqoop Folder/conf/sqoop.properties.
+org.apache.sqoop.authentication.type=SIMPLE
+org.apache.sqoop.authentication.handler=org.apache.sqoop.security.authentication.SimpleAuthenticationHandler
+org.apache.sqoop.anonymous=true
+
+
+
+Simple authentication is used by default. Commenting out the authentication 
configuration will result in simple authentication being used.
+
+
+
+4.2.1.2. Run command¶
+Start the Sqoop server as usual.
+Sqoop Folder/bin/sqoop.sh server start
+
+
+Start the Sqoop client as usual.
+Sqoop Folder/bin/sqoop.sh client
+
+
+
+
+
+4.2.2. Kerberos Authentication¶
+Kerberos is a computer network authentication protocol which works on the 
basis of tickets to allow nodes communicating over a non-secure 
network to prove their identity to one another in a secure manner. Its 
designers aimed it primarily at a client-server model, and it provides mutual 
authentication: both the user and the server verify each other's 
identity. Kerberos protocol messages are protected against eavesdropping and 
replay attacks.
+
+4.2.2.1. Dependency¶
+Set up a KDC server. Skip this step if a KDC server already exists. It's 
difficult to cover every way Kerberos can be set up (i.e. there are cross-realm 
setups and multi-trust environments). This section will describe how to set up 
the Sqoop principals with a local deployment of MIT Kerberos.
+
+All components which are Kerberos authenticated need one KDC server. If 
the current Hadoop cluster uses Kerberos authentication, there should already be a KDC 
server.
+If there is no KDC server, follow http://web.mit.edu/kerberos/krb5-devel/doc/admin/install_kdc.html to set up one.
+
+Configure the Hadoop cluster to use Kerberos authentication.
+
+The authentication type should be set at the cluster level: all components must have the 
same authentication type, i.e. either all use Kerberos or none do. In other words, Sqoop with 
Kerberos authentication cannot communicate with other Hadoop components, 
such as HDFS, YARN, Hive, etc., that do not use Kerberos authentication, and vice 
versa.
+How to set up a Hadoop cluster with Kerberos 
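By analogy with the simple-authentication properties shown earlier, a Kerberos setup changes the same keys in sqoop.properties. The handler class name and the principal/keytab keys below are assumptions inferred from that pattern, so verify them against your Sqoop release:

```properties
# Assumed Kerberos variant of the authentication properties (verify key names)
org.apache.sqoop.authentication.type=KERBEROS
org.apache.sqoop.authentication.handler=org.apache.sqoop.security.authentication.KerberosAuthenticationHandler
org.apache.sqoop.authentication.kerberos.principal=sqoop/_HOST@YOUR-REALM.COM
org.apache.sqoop.authentication.kerberos.keytab=/path/to/sqoop.keytab
```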

svn commit: r1754350 [5/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,63 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+===============
+Kafka Connector
+===============
+
+Currently, only the TO direction is supported.
+
+.. contents::
+   :depth: 3
+
+-----
+Usage
+-----
+
+To use the Kafka Connector, create a link for the connector and a job that 
uses the link.
+
+**Link Configuration**
+++++++++++++++++++++++
+
+Inputs associated with the link configuration include:
+
++----------------------+---------+------------------------------------------------------------+---------------------------------+
+| Input                | Type    | Description                                                | Example                         |
++======================+=========+============================================================+=================================+
+| Broker list          | String  | Comma separated list of kafka brokers.                     | example.com:1,example.com:11000 |
+|                      |         | *Required*.                                                |                                 |
++----------------------+---------+------------------------------------------------------------+---------------------------------+
+| Zookeeper connection | String  | Comma separated list of zookeeper servers in your quorum.  | /etc/conf/hadoop                |
+|                      |         | *Required*.                                                |                                 |
++----------------------+---------+------------------------------------------------------------+---------------------------------+
+
+**TO Job Configuration**
+++++++++++++++++++++++++
+
+Inputs associated with the Job configuration for the TO direction include:
+
++-------+---------+---------------------------------+----------+
+| Input | Type    | Description                     | Example  |
++=======+=========+=================================+==========+
+| topic | String  | The Kafka topic to transfer to. | my topic |
+|       |         | *Required*.                     |          |
++-------+---------+---------------------------------+----------+
+
+------
+Loader
+------
+
+During the *loading* phase, Kafka is written to directly from each loader. The 
order in which data is loaded into Kafka is not guaranteed.

Added: 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt?rev=1754350=auto
==
--- 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
 (added)
+++ 
sqoop/site/trunk/content/resources/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
 Thu Jul 28 01:16:05 2016
@@ -0,0 +1,110 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   

svn commit: r1754350 [8/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resource

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/theme.js
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/theme.js?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/theme.js (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/_static/js/theme.js Thu Jul 
28 01:16:05 2016
@@ -0,0 +1,153 @@

svn commit: r1754350 [11/17] - in /sqoop/site/trunk/content: ./ resources/docs/1.99.7/ resources/docs/1.99.7/_sources/ resources/docs/1.99.7/_sources/admin/ resources/docs/1.99.7/_sources/dev/ resourc

2016-07-27 Thread afine
Added: sqoop/site/trunk/content/resources/docs/1.99.7/dev/DevEnv.html
URL: 
http://svn.apache.org/viewvc/sqoop/site/trunk/content/resources/docs/1.99.7/dev/DevEnv.html?rev=1754350=auto
==
--- sqoop/site/trunk/content/resources/docs/1.99.7/dev/DevEnv.html (added)
+++ sqoop/site/trunk/content/resources/docs/1.99.7/dev/DevEnv.html Thu Jul 28 
01:16:05 2016
@@ -0,0 +1,252 @@
+3.4. Sqoop 2 Development Environment Setup¶
+This document describes how to set up a development environment for Sqoop 2.
+
+3.4.1. System Requirement¶
+
+3.4.1.1. Java¶
+Sqoop has been developed and tested only with the JDK from Oracle 
(http://www.oracle.com/technetwork/java/javase/downloads/index.html), and we require 
at least version 7 (we're not supporting JDK 1.6 and older releases).
+
+
+3.4.1.2. Maven¶
+Sqoop uses Maven 3 for building the project. Download Maven from http://maven.apache.org/download.cgi and 
follow the installation instructions at http://maven.apache.org/download.cgi#Maven_Documentation.
+
+
+
+3.4.2. Eclipse Setup¶
+Steps for downloading the source code are given in Building Sqoop2 from source 
code.
+The Sqoop 2 project has multiple modules, where one module depends on another; 
for example, the sqoop 2 client module depends on the sqoop 2 common module. 
Follow the steps below to create an Eclipse project and classpath for each 
module.
+# Install all packages into the local maven repository
+mvn clean install -DskipTests
+
+# Add the M2_REPO variable to the eclipse workspace
+mvn eclipse:configure-workspace -Declipse.workspace=path-to-eclipse-workspace-dir-for-sqoop-2
+
+# Eclipse project creation with optional parameters
+mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
+
+
+Alternatively, to manually add the M2_REPO classpath variable as the maven 
repository path in Eclipse: Window -> Java -> Classpath Variables 
-> click New -> in the new dialog box, input Name as M2_REPO and 
Path as $HOME/.m2/repository -> click Ok.
+On successful execution of the above maven commands, import the Sqoop 
project modules into Eclipse: File -> Import -> General -> Existing 
Projects into Workspace -> click Next -> browse to the Sqoop 2 directory 
($HOME/git/sqoop2) -> click Ok -> the import dialog shows multiple projects 
(sqoop-client, sqoop-common, etc.) -> select all modules -> click 
Finish.
+
+
+
+
+ Copyright 2009-2016 The Apache Software Foundation.
+
\ No newline at end of file