Author: buildbot
Date: Thu Jul 28 01:17:26 2016
New Revision: 993940

Log:
Staging update by buildbot for sqoop

Added:
    websites/staging/sqoop/trunk/content/docs/1.99.7/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Installation.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Tools.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Upgrade.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ClientAPI.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ConnectorDevelopment.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/DevEnv.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/RESTAPI.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/Repository.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/index.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/API TLS-SSL.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/AuthenticationAndAuthorization.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/security/RepositoryEncryption.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/CommandLineClient.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Connectors.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Examples.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/Sqoop5MinutesDemo.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-FTP.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-GenericJDBC.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-HDFS.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kafka.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-Kite.txt
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/connectors/Connector-SFTP.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/examples/
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/user/examples/S3Import.txt
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/ajax-loader.gif   
(with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/basic.css
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment-bright.png 
  (with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment-close.png  
 (with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/comment.png   
(with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/badge_only.css
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/css/theme.css
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/doctools.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/down-pressed.png   
(with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/down.png   (with 
props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/file.png   (with 
props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/fonts/
    
websites/staging/sqoop/trunk/content/docs/1.99.7/_static/fonts/fontawesome-webfont.svg
   (with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/jquery.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/modernizr.min.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/js/theme.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/minus.png   (with 
props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/plus.png   (with 
props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/pygments.css
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/searchtools.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/sqoop-logo.png   
(with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/underscore.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/up-pressed.png   
(with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/up.png   (with 
props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/_static/websupport.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/admin/
    websites/staging/sqoop/trunk/content/docs/1.99.7/admin.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Installation.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Tools.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/admin/Upgrade.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/BuildingSqoop2.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/ClientAPI.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/dev/ConnectorDevelopment.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/DevEnv.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/RESTAPI.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/dev/Repository.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/genindex.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/index.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/objects.inv   (with props)
    websites/staging/sqoop/trunk/content/docs/1.99.7/search.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/searchindex.js
    websites/staging/sqoop/trunk/content/docs/1.99.7/security/
    websites/staging/sqoop/trunk/content/docs/1.99.7/security.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/security/API TLS-SSL.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/security/AuthenticationAndAuthorization.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/security/RepositoryEncryption.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/
    websites/staging/sqoop/trunk/content/docs/1.99.7/user.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/CommandLineClient.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/Connectors.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/Examples.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/Sqoop5MinutesDemo.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-FTP.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-GenericJDBC.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-HDFS.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-Kafka.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-Kite.html
    
websites/staging/sqoop/trunk/content/docs/1.99.7/user/connectors/Connector-SFTP.html
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/examples/
    websites/staging/sqoop/trunk/content/docs/1.99.7/user/examples/S3Import.html
Modified:
    websites/staging/sqoop/trunk/content/   (props changed)
    websites/staging/sqoop/trunk/content/index.html
    websites/staging/sqoop/trunk/content/issue-tracking.html
    websites/staging/sqoop/trunk/content/license.html
    websites/staging/sqoop/trunk/content/mail-lists.html
    websites/staging/sqoop/trunk/content/project-info.html
    websites/staging/sqoop/trunk/content/source-repository.html
    websites/staging/sqoop/trunk/content/team-list.html

Propchange: websites/staging/sqoop/trunk/content/
------------------------------------------------------------------------------
--- cms:source-revision (original)
+++ cms:source-revision Thu Jul 28 01:17:26 2016
@@ -1 +1 @@
-1733642
+1754350

Added: websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin.txt
==============================================================================
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin.txt (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin.txt Thu Jul 
28 01:17:26 2016
@@ -0,0 +1,24 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+===========
+Admin Guide
+===========
+
+.. toctree::
+   :glob:
+
+   admin/*

Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Installation.txt
==============================================================================
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Installation.txt
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Installation.txt
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,187 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+============
+Installation
+============
+
+Sqoop ships as one binary package that incorporates two separate parts - 
client and server.
+
+* **Server** You need to install the server on a single node in your cluster. This node will then serve as an entry point for all Sqoop clients.
+* **Client** Clients can be installed on any number of machines.
+
+Server installation
+===================
+
+Copy the Sqoop artifact to the machine where you want to run the Sqoop server. The Sqoop server acts as a Hadoop client, therefore Hadoop libraries (Yarn, Mapreduce, and HDFS jar files) and configuration files (``core-site.xml``, ``mapreduce-site.xml``, ...) must be available on this node. You do not need to run any Hadoop related services - running the server on a "gateway" node is perfectly fine.
+
+For example, you should be able to list the contents of HDFS:
+
+.. code-block:: bash
+
+  hadoop dfs -ls
+
+Sqoop currently supports Hadoop version 2.6.0 or later. To install the Sqoop server, decompress the tarball (in a location of your choosing) and set the newly created folder as your working directory.
+
+.. code-block:: bash
+
+  # Decompress Sqoop distribution tarball
+  tar -xvf sqoop-<version>-bin-hadoop<hadoop-version>.tar.gz
+
+  # Move decompressed content to any location
+  mv sqoop-<version>-bin-hadoop<hadoop-version> /usr/lib/sqoop
+
+  # Change working directory
+  cd /usr/lib/sqoop
+
+
+Hadoop dependencies
+-------------------
+
+The Sqoop server needs the following environment variables pointing at Hadoop libraries - ``$HADOOP_COMMON_HOME``, ``$HADOOP_HDFS_HOME``, ``$HADOOP_MAPRED_HOME`` and ``$HADOOP_YARN_HOME``. You have to make sure that these variables are defined and point to a valid Hadoop installation. The Sqoop server will not start if the Hadoop libraries can't be found.
+
+The Sqoop server uses environment variables to find Hadoop libraries. If the environment variable ``$HADOOP_HOME`` is set, Sqoop will look for jars in the following locations: ``$HADOOP_HOME/share/hadoop/common``, ``$HADOOP_HOME/share/hadoop/hdfs``, ``$HADOOP_HOME/share/hadoop/mapreduce`` and ``$HADOOP_HOME/share/hadoop/yarn``. You can specify where the Sqoop server should look for the common, hdfs, mapreduce, and yarn jars independently with the ``$HADOOP_COMMON_HOME``, ``$HADOOP_HDFS_HOME``, ``$HADOOP_MAPRED_HOME`` and ``$HADOOP_YARN_HOME`` environment variables.
+
+
+.. code-block:: bash
+
+  # Export HADOOP_HOME variable
+  export HADOOP_HOME=/...
+
+  # Or alternatively HADOOP_*_HOME variables
+  export HADOOP_COMMON_HOME=/...
+  export HADOOP_HDFS_HOME=/...
+  export HADOOP_MAPRED_HOME=/...
+  export HADOOP_YARN_HOME=/...
+
+.. note::
+
+  If the environment variable ``$HADOOP_HOME`` is set, Sqoop will use the following locations: ``$HADOOP_HOME/share/hadoop/common``, ``$HADOOP_HOME/share/hadoop/hdfs``, ``$HADOOP_HOME/share/hadoop/mapreduce`` and ``$HADOOP_HOME/share/hadoop/yarn``.
+
+Hadoop configuration
+--------------------
+
+The Sqoop server needs to impersonate users to access HDFS and other resources in or outside of the cluster as the user who started a given job, rather than the user who is running the server. You need to configure Hadoop to explicitly allow this impersonation via the so-called proxyuser system. You need to create two properties in the ``core-site.xml`` file - ``hadoop.proxyuser.$SERVER_USER.hosts`` and ``hadoop.proxyuser.$SERVER_USER.groups``, where ``$SERVER_USER`` is the user who will be running the Sqoop 2 server. In most scenarios configuring ``*`` is sufficient. Please refer to the Hadoop documentation for details on how to use these properties.
+
+An example fragment that needs to be present in the ``core-site.xml`` file when the server is running under the ``sqoop2`` user:
+
+.. code-block:: xml
+
+  <property>
+    <name>hadoop.proxyuser.sqoop2.hosts</name>
+    <value>*</value>
+  </property>
+  <property>
+    <name>hadoop.proxyuser.sqoop2.groups</name>
+    <value>*</value>
+  </property>
+
+If you're running the Sqoop 2 server under a so-called system user (a user with ID less than ``min.user.id``, 1000 by default), then YARN will by default refuse to run Sqoop 2 jobs. You will need to add the name of the user who is running the Sqoop 2 server (most likely the user ``sqoop2``) to the ``allowed.system.users`` property of ``container-executor.cfg``. Please refer to the YARN documentation for further details.
+
+An example fragment that needs to be present in the ``container-executor.cfg`` file when the server is running under the ``sqoop2`` user:
+
+.. code-block:: xml
+
+  allowed.system.users=sqoop2
+
+Third party jars
+----------------
+
+To propagate any third party jars to the Sqoop server classpath, create a directory anywhere on the file system and export its location in the ``SQOOP_SERVER_EXTRA_LIB`` variable.
+
+.. code-block:: bash
+
+  # Create directory for extra jars
+  mkdir -p /var/lib/sqoop2/
+
+  # Copy all your JDBC drivers to this directory
+  cp mysql-jdbc*.jar /var/lib/sqoop2/
+  cp postgresql-jdbc*.jar /var/lib/sqoop2/
+
+  # And finally export this directory to SQOOP_SERVER_EXTRA_LIB
+  export SQOOP_SERVER_EXTRA_LIB=/var/lib/sqoop2/
+
+.. note::
+
+  Sqoop doesn't ship with any JDBC drivers due to incompatible licenses. You 
will need to use this mechanism to install all JDBC drivers that are needed.
+
+Configuring ``PATH``
+--------------------
+
+All user and administrator facing shell commands are stored in the ``bin/`` directory. It's recommended to add this directory to your ``$PATH`` for easier execution, for example:
+
+.. code-block:: bash
+
+  PATH=$PATH:`pwd`/bin/
+
+The remainder of the Sqoop 2 documentation assumes that the shell commands are 
in your ``$PATH``.
+
+Configuring Server
+------------------
+
+Server configuration files are stored in the ``conf`` directory. The file ``sqoop_bootstrap.properties`` specifies which configuration provider should be used for loading the configuration for the rest of the Sqoop server. The default value ``PropertiesConfigurationProvider`` should be sufficient.
+
+The second configuration file, ``sqoop.properties``, contains the remaining configuration properties that can affect the Sqoop server. The configuration file is very well documented, so check whether all configuration properties fit your environment. The defaults, or very little tweaking, should be sufficient in most common cases.
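+
+For illustration only, the provider entry in ``sqoop_bootstrap.properties`` usually looks along these lines (the exact key and provider class shown here are assumptions - verify them against the file shipped with your distribution)::
+
+  # illustrative example - check the sqoop_bootstrap.properties shipped with your release
+  sqoop.config.provider=org.apache.sqoop.core.PropertiesConfigurationProvider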
+
+Repository Initialization
+-------------------------
+
+The metadata repository needs to be initialized before starting the Sqoop 2 server for the first time. Use :ref:`tool-upgrade` to initialize the repository:
+
+.. code-block:: bash
+
+  sqoop2-tool upgrade
+
+You can verify that everything has been configured correctly using :ref:`tool-verify`:
+
+.. code-block:: bash
+
+  sqoop2-tool verify
+  ...
+  Verification was successful.
+  Tool class org.apache.sqoop.tools.tool.VerifyTool has finished correctly
+
+Server Life Cycle
+-----------------
+
+After installation and configuration you can start the Sqoop server with the following command:
+
+.. code-block:: bash
+
+  sqoop2-server start
+
+You can stop the server using the following command:
+
+.. code-block:: bash
+
+  sqoop2-server stop
+
+By default the Sqoop server daemon uses port ``12000``. You can set ``org.apache.sqoop.jetty.port`` in the configuration file ``conf/sqoop.properties`` to use a different port.
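+
+For example, to move the server to another port you could add a line like the following to ``conf/sqoop.properties`` (the port value shown is only illustrative)::
+
+  # conf/sqoop.properties - illustrative port override
+  org.apache.sqoop.jetty.port=12001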
+
+Client installation
+===================
+
+Just copy the Sqoop distribution artifact to the target machine and unzip it in the desired location. You can start the client with the following command:
+
+.. code-block:: bash
+
+  sqoop2-shell
+
+You can find more documentation for the Sqoop shell in :doc:`/user/CommandLineClient`.
+
+.. note::
+
+  The client does not act as a Hadoop client, so it does not need to be installed on a node with Hadoop libraries and configuration files.
\ No newline at end of file

Added: websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Tools.txt
==============================================================================
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Tools.txt 
(added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Tools.txt 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,252 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+=====
+Tools
+=====
+
+Tools are server commands that administrators can execute on the Sqoop server 
machine in order to perform various maintenance tasks. The tool execution will 
always perform a given task and finish. There are no long running services 
implemented as tools.
+
+In order to perform the maintenance task each tool is supposed to do, the tools need to be executed in exactly the same environment as the main Sqoop server. The tool binary will take care of setting up the ``CLASSPATH`` and other environment variables that might be required. However, it's up to the administrator to run the tool under the same user as is used for the server. This is usually configured automatically for various Hadoop distributions (such as Apache Bigtop).
+
+
+.. note:: Running tools while the Sqoop Server is also running is not recommended as it might lead to data corruption and service disruption.
+
+List of available tools:
+
+* verify
+* upgrade
+
+To run the desired tool, execute the binary ``sqoop2-tool`` with the desired tool name. For example, to run the ``verify`` tool::
+
+  sqoop2-tool verify
+
+.. note:: Stop the Sqoop Server before running Sqoop tools. Running tools while the Sqoop Server is running can lead to data corruption and service disruption.
+
+.. _tool-verify:
+
+Verify tool
+===========
+
+The verify tool will verify the Sqoop server configuration by starting all subsystems, with the exception of the servlets, and tearing them down.
+
+To run the ``verify`` tool::
+
+  sqoop2-tool verify
+
+If the verification process succeeds, you should see messages like::
+
+  Verification was successful.
+  Tool class org.apache.sqoop.tools.tool.VerifyTool has finished correctly
+
+If the verification process finds any inconsistencies, it will print out the following message instead::
+
+  Verification has failed, please check Server logs for further details.
+  Tool class org.apache.sqoop.tools.tool.VerifyTool has failed.
+
+Further details on why the verification has failed are available in the Sqoop server log - the same file that the Sqoop Server logs into.
+
+.. _tool-upgrade:
+
+Upgrade tool
+============
+
+Upgrades all versionable components inside Sqoop2. This includes structural changes inside the repository and stored metadata.
+Running this tool on a Sqoop deployment that has already been upgraded will have no effect.
+
+To run the ``upgrade`` tool::
+
+  sqoop2-tool upgrade
+
+Upon successful upgrade you should see the following message::
+
+  Tool class org.apache.sqoop.tools.tool.UpgradeTool has finished correctly.
+
+Execution failure will show the following message instead::
+
+  Tool class org.apache.sqoop.tools.tool.UpgradeTool has failed.
+
+Further details on why the upgrade process has failed are available in the Sqoop server log - the same file that the Sqoop Server logs into.
+
+RepositoryDump
+==============
+
+Writes the user-created contents of the Sqoop repository to a file in JSON 
format. This includes connections, jobs and submissions.
+
+To run the ``repositorydump`` tool::
+
+  sqoop2-tool repositorydump -o repository.json
+
+As an option, the administrator can choose to include sensitive information 
such as database connection passwords in the file::
+
+  sqoop2-tool repositorydump -o repository.json --include-sensitive
+
+Upon successful execution, you should see the following message::
+
+  Tool class org.apache.sqoop.tools.tool.RepositoryDumpTool has finished 
correctly.
+
+If repository dump has failed, you will see the following message instead::
+
+  Tool class org.apache.sqoop.tools.tool.RepositoryDumpTool has failed.
+
+Further details on why the repository dump has failed are available in the Sqoop server log - the same file that the Sqoop Server logs into.
+
+RepositoryLoad
+==============
+
+Reads a JSON formatted file created by RepositoryDump and loads it into the current Sqoop repository.
+
+To run the ``repositoryLoad`` tool::
+
+  sqoop2-tool repositoryload -i repository.json
+
+Upon successful execution, you should see the following message::
+
+  Tool class org.apache.sqoop.tools.tool.RepositoryLoadTool has finished 
correctly.
+
+If the repository load has failed, you will see the following message instead::
+
+ Tool class org.apache.sqoop.tools.tool.RepositoryLoadTool has failed.
+
+Or an exception. Further details on why the repository load has failed are available in the Sqoop server log - the same file that the Sqoop Server logs into.
+
+.. note:: If the repository dump was created without passwords (default), the 
connections will not contain a password and the jobs will fail to execute. In 
that case you'll need to manually update the connections and set the password.
+.. note:: The RepositoryLoad tool will always generate new connections, jobs and submissions from the file, even when identical objects already exist in the repository.
+
+.. _repositoryencryption-tool:
+
+RepositoryEncryption
+====================
+
+Please see :ref:`repositoryencryption` for more details on repository 
encryption.
+
+Sometimes we may want to change the password that is used to encrypt our data, 
generate a new key for our existing password,
+encrypt an existing unencrypted repository, or decrypt an existing encrypted repository. Sqoop 2 provides the
+Repository Encryption Tool to allow us to do this.
+
+Before using the tool it is important to shut down the Sqoop 2 server.
+
+All changes that the tool makes occur in a single transaction with the 
repository, which will prevent leaving the
+repository in a bad state.
+
+The Repository Encryption Tool is very simple; it uses the exact same configuration specified above (with the exception
+of ``useConf``). Configuration prefixed with "-F" represents the existing repository state, while configuration prefixed with
+"-T" represents the desired repository state. If one of these configuration sets is left out, that side is treated as unencrypted.
+
+Changing the Password
+---------------------
+
+In order to change the password, we need to specify the current configuration 
with the existing password and the desired
+configuration with the new password. It looks like this:
+
+::
+
+    sqoop.sh tool repositoryencryption \
+        -Forg.apache.sqoop.security.repo_encryption.password=old_password \
+        -Forg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Forg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000 \
+        -Torg.apache.sqoop.security.repo_encryption.password=new_password \
+        -Torg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Torg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000
+
+Generate a New Key for the Existing Password
+--------------------------------------------
+
+Just like in the previous scenario, you could copy the same configuration twice, like this:
+
+::
+
+    sqoop.sh tool repositoryencryption \
+        -Forg.apache.sqoop.security.repo_encryption.password=password \
+        -Forg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Forg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000 \
+        -Torg.apache.sqoop.security.repo_encryption.password=password \
+        -Torg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Torg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000
+
+But we do have a shortcut to make this easier:
+
+::
+
+    sqoop.sh tool repositoryencryption -FuseConf -TuseConf
+
+The ``useConf`` option will read whatever configuration is already in the 
configured sqoop properties file and apply it
+for the specified direction.
+
+Encrypting an Existing Unencrypted Repository
+---------------------------------------------
+
+::
+
+    sqoop.sh tool repositoryencryption \
+        -Torg.apache.sqoop.security.repo_encryption.password=password \
+        -Torg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Torg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Torg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000
+
+If the configuration for the encrypted repository has already been written to 
the sqoop properties file, one can simply
+execute:
+
+::
+
+    sqoop.sh tool repositoryencryption -TuseConf
+
+
+Decrypting an Existing Encrypted Repository
+-------------------------------------------
+
+::
+
+    sqoop.sh tool repositoryencryption \
+        -Forg.apache.sqoop.security.repo_encryption.password=password \
+        -Forg.apache.sqoop.security.repo_encryption.hmac_algorithm=HmacSHA256 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_algorithm=AES \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_key_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.cipher_spec=AES/CBC/PKCS5Padding \
+        -Forg.apache.sqoop.security.repo_encryption.initialization_vector_size=16 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_algorithm=PBKDF2WithHmacSHA1 \
+        -Forg.apache.sqoop.security.repo_encryption.pbkdf2_rounds=4000
+
+If the configuration for the encrypted repository has not yet been removed 
from the sqoop properties file, one can simply
+execute:
+
+::
+
+    sqoop.sh tool repositoryencryption -FuseConf

Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Upgrade.txt
==============================================================================
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Upgrade.txt 
(added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/admin/Upgrade.txt 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,84 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+=======
+Upgrade
+=======
+
+This page describes the procedure you need to follow in order to upgrade Sqoop from one release to a higher release. Upgrading the client and server components is discussed separately.
+
+.. note:: Only updates from one Sqoop 2 release to another are covered, starting with upgrades from version 1.99.2. This guide does not contain general information on how to upgrade from Sqoop 1 to Sqoop 2.
+
+Upgrading Server
+================
+
+As the Sqoop server uses a database repository for persisting Sqoop entities such as connectors, the driver, links and jobs, the repository schema might need to be updated as part of the server upgrade. In addition, the configs and inputs described by the various connectors and the driver may also change with a new server version and might need a data upgrade.
+
+There are two ways to upgrade Sqoop entities in the repository: you can either execute the upgrade tool or configure the Sqoop server to perform all necessary upgrades on start up.
+
+It's strongly advised to back up the repository before moving on to the next steps. Backup instructions will vary depending on the repository implementation. For example, using MySQL as a repository will require a different backup procedure than Apache Derby. Please follow your repository's backup procedure.
+
+Upgrading Server using upgrade tool
+-----------------------------------
+
+The preferred upgrade path is to explicitly run the :ref:`tool-upgrade`. The first step, however, is to shut down the server, as having both the server and the upgrade utility accessing the same repository might corrupt it::
+
+  sqoop2-server stop
+
+When the server has been successfully stopped, you can update the server bits 
and simply run the upgrade tool::
+
+  sqoop2-tool upgrade
+
+You should see that the upgrade process has been successful::
+
+  Tool class org.apache.sqoop.tools.tool.UpgradeTool has finished correctly.
+
+In case of any failure, please take a look at the :ref:`tool-upgrade` documentation page.
+
+Upgrading Server on start-up
+----------------------------
+
+The capability of performing the upgrade has been built into the server; however, it is disabled by default to avoid any unintentional changes to the repository. You can start the repository schema upgrade procedure by stopping the server::
+
+  sqoop2-server stop
+
+Before starting the server again you will need to enable the auto-upgrade 
feature that will perform all necessary changes during Sqoop Server start up.
+
+You need to set the following property in the configuration file ``sqoop.properties`` for the repository schema upgrade.
+::
+
+   org.apache.sqoop.repository.schema.immutable=false
+
+You need to set the following property in the configuration file ``sqoop.properties`` for the connector config data upgrade.
+::
+
+   org.apache.sqoop.connector.autoupgrade=true
+
+You need to set the following property in the configuration file ``sqoop.properties`` for the driver config data upgrade.
+::
+
+   org.apache.sqoop.driver.autoupgrade=true
+
+When all properties are set, start the sqoop server using the following 
command::
+
+  sqoop2-server start
+
+All required actions will be performed automatically during the server bootstrap. It's strongly advised to set all three properties back to their original values once the server has been successfully started and the upgrade has completed.
+
+Upgrading Client
+================
+
+The client does not require any manual steps during upgrade. Replacing the binaries with the updated version is sufficient.

Added: websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev.txt
==============================================================================
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev.txt (added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev.txt Thu Jul 
28 01:17:26 2016
@@ -0,0 +1,24 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+===============
+Developer Guide
+===============
+
+.. toctree::
+   :glob:
+
+   dev/*

Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
==============================================================================
--- 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
 (added)
+++ 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/BuildingSqoop2.txt
 Thu Jul 28 01:17:26 2016
@@ -0,0 +1,76 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+================================
+Building Sqoop2 from source code
+================================
+
+This guide will show you how to build Sqoop2 from source code. Sqoop uses `maven <http://maven.apache.org/>`_ as its build system. You will need to use at least version 3.0, as older versions will not work correctly. All other dependencies will be downloaded by maven automatically, with the exception of special JDBC drivers that are needed only for advanced integration tests.
+
+Downloading source code
+-----------------------
+
+The Sqoop project uses git as its revision control system, hosted at the Apache Software Foundation. You can clone the entire repository using the following command:
+
+::
+
+  git clone https://git-wip-us.apache.org/repos/asf/sqoop.git sqoop2
+
+Sqoop2 is currently developed in the special branch ``sqoop2``, which you need to check out after cloning:
+
+::
+
+  cd sqoop2
+  git checkout sqoop2
+
+Building project
+----------------
+
+You can use the usual maven targets like ``compile`` or ``package`` to build the project. Sqoop supports one major Hadoop revision at the moment - 2.x. As compiled code for one Hadoop major version can't be used on another, you must compile Sqoop against the appropriate Hadoop version.
+
+::
+
+  mvn compile
+
+The maven target ``package`` can be used to create Sqoop packages similar to the ones that are officially available for download. Sqoop will build only the source tarball by default. You need to specify ``-Pbinary`` to build the binary distribution.
+
+::
+
+  mvn package -Pbinary
+
+Running tests
+-------------
+
+Sqoop supports two different sets of tests. The first, smaller and much faster, set is called **unit tests** and will be executed by the maven target ``test``. The second, larger set of **integration tests** will be executed by the maven target ``integration-test``. Please note that integration tests might require manual steps for installing various JDBC drivers into your local maven cache.
+
+Example for running unit tests:
+
+::
+
+  mvn test
+
+Example for running integration tests:
+
+::
+
+  mvn integration-test
+
+For the **unit tests**, there are two helpful profiles: **fast** and **slow**. The **fast** unit tests do not start or use any services. The **slow** unit tests may start services or use an external service (e.g. MySQL).
+
+::
+
+  mvn test -Pfast,hadoop200
+  mvn test -Pslow,hadoop200
\ No newline at end of file

Added: 
websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ClientAPI.txt
==============================================================================
--- websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ClientAPI.txt 
(added)
+++ websites/staging/sqoop/trunk/content/docs/1.99.7/_sources/dev/ClientAPI.txt 
Thu Jul 28 01:17:26 2016
@@ -0,0 +1,300 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+
+===========================
+Sqoop Java Client API Guide
+===========================
+
+This document will explain how to use the Sqoop Java Client API with an external application. The Client API allows you to execute the functions of sqoop commands. It requires the Sqoop Client JAR and its dependencies.
+
+The main class that provides wrapper methods for all the supported operations is
+::
+
+  public class SqoopClient {
+    ...
+  }
+
+The Java Client API is explained using the Generic JDBC Connector as an example. Before executing the application using the sqoop client API, check whether the sqoop server is running.
+
+Workflow
+========
+
+The following workflow has to be followed for executing a sqoop job on the Sqoop server.
+
+  1. Create a LINK object for a given connector name            - Creates a Link object and returns it
+  2. Create a JOB for a given "from" and "to" link name         - Creates a Job object and returns it
+  3. Start the JOB for a given job name                         - Starts the Job on the server and creates a submission record
+
+Project Dependencies
+====================
+The required maven dependency is given below:
+
+::
+
+  <dependency>
+    <groupId>org.apache.sqoop</groupId>
+      <artifactId>sqoop-client</artifactId>
+      <version>${requestedVersion}</version>
+  </dependency>
+
+Initialization
+==============
+
+First, initialize the SqoopClient class with the server URL as an argument.
+
+::
+
+  String url = "http://localhost:12000/sqoop/";
+  SqoopClient client = new SqoopClient(url);
+
+The server URL value can be modified by passing the new value to the ``setServerUrl(String)`` method:
+
+::
+
+  client.setServerUrl(newUrl);
+
+
+Link
+====
+Connectors provide the facility to interact with many data sources and thus 
can be used as a means to transfer data between them in Sqoop. The registered 
connector implementation will provide logic to read from and/or write to a data 
source that it represents. A connector can have one or more links associated 
with it. The java client API allows you to create, update and delete a link for 
any registered connector. Creating or updating a link requires you to populate 
the Link Config for that particular connector. Hence the first thing to do is 
get the list of registered connectors and select the connector for which you 
would like to create a link. Then
+you can get the list of all the config/inputs using `Display Config and Input 
Names For Connector`_ for that connector.
+
+
+Save Link
+---------
+
+First, create a new link by invoking the ``createLink(connectorName)`` method with the connector name; it returns an MLink object with a dummy id and the unfilled link config inputs for that connector. Then fill the config inputs with relevant values and invoke ``saveLink``, passing it the filled MLink object.
+
+::
+
+  // create a placeholder for link
+  MLink link = client.createLink("connectorName");
+  link.setName("Vampire");
+  link.setCreationUser("Buffy");
+  MLinkConfig linkConfig = link.getConnectorLinkConfig();
+  // fill in the link config values
+  
linkConfig.getStringInput("linkConfig.connectionString").setValue("jdbc:mysql://localhost/my");
+  
linkConfig.getStringInput("linkConfig.jdbcDriver").setValue("com.mysql.jdbc.Driver");
+  linkConfig.getStringInput("linkConfig.username").setValue("root");
+  linkConfig.getStringInput("linkConfig.password").setValue("root");
+  // save the link object that was filled
+  Status status = client.saveLink(link);
+  if(status.canProceed()) {
+   System.out.println("Created Link with Link Name : " + link.getName());
+  } else {
+   System.out.println("Something went wrong creating the link");
+  }
+
+``status.canProceed()`` returns true if the status is OK or a WARNING. Before sending the status, the link config values are validated using the corresponding validator associated with the link config inputs.
+
+On successful execution of the ``saveLink`` method, a new link name is assigned to the link object; otherwise an exception is thrown. The ``link.getName()`` method returns the unique name for this object persisted in the sqoop repository.
+
+A user can retrieve a link using the following methods:
+
++----------------------------+--------------------------------------+
+|   Method                   | Description                          |
++============================+======================================+
+| ``getLink(linkName)``      | Returns a link by name               |
++----------------------------+--------------------------------------+
+| ``getLinks()``             | Returns list of links in the sqoop   |
++----------------------------+--------------------------------------+
+
+Job
+===
+
+A sqoop job holds the ``From`` and ``To`` parts for transferring data from the ``From`` data source to the ``To`` data source. Both the ``From`` and the ``To`` are uniquely identified by their corresponding connector Link Ids, i.e. when creating a job we have to specify the ``FromLinkId`` and the ``ToLinkId``. Thus the prerequisite for creating a job is to first create the links as described above.
+
+Once the link names for the ``From`` and ``To`` are given, then the job 
configs for the associated connector for the link object have to be filled. You 
can get the list of all the from and to job config/inputs using `Display Config 
and Input Names For Connector`_ for that connector. A connector can have one or 
more links. We then use the links in the ``From`` and ``To`` direction to 
populate the corresponding ``MFromConfig`` and ``MToConfig`` respectively.
+
+In addition to filling the job configs for the ``From`` and the ``To`` representing the link, we also need to fill the driver configs that control the job execution engine environment. For example, if the job execution engine happens to be MapReduce, we will specify the number of mappers to be used in reading data from the ``From`` data source.
+
+Save Job
+---------
+Here is the code to create and then save a job
+::
+
+  String url = "http://localhost:12000/sqoop/";
+  SqoopClient client = new SqoopClient(url);
+  //Creating dummy job object
+  MJob job = client.createJob("fromLinkName", "toLinkName");
+  job.setName("Vampire");
+  job.setCreationUser("Buffy");
+  // set the "FROM" link job config values
+  MFromConfig fromJobConfig = job.getFromJobConfig();
+  fromJobConfig.getStringInput("fromJobConfig.schemaName").setValue("sqoop");
+  fromJobConfig.getStringInput("fromJobConfig.tableName").setValue("sqoop");
+  fromJobConfig.getStringInput("fromJobConfig.partitionColumn").setValue("id");
+  // set the "TO" link job config values
+  MToConfig toJobConfig = job.getToJobConfig();
+  
toJobConfig.getStringInput("toJobConfig.outputDirectory").setValue("/usr/tmp");
+  // set the driver config values
+  MDriverConfig driverConfig = job.getDriverConfig();
+  driverConfig.getStringInput("throttlingConfig.numExtractors").setValue("3");
+
+  Status status = client.saveJob(job);
+  if(status.canProceed()) {
+   System.out.println("Created Job with Job Name: "+ job.getName());
+  } else {
+   System.out.println("Something went wrong creating the job");
+  }
+
+A user can retrieve a job using the following methods:
+
++----------------------------+--------------------------------------+
+|   Method                   | Description                          |
++============================+======================================+
+| ``getJob(jobName)``        | Returns a job by name                |
++----------------------------+--------------------------------------+
+| ``getJobs()``              | Returns list of jobs in the sqoop    |
++----------------------------+--------------------------------------+
+
+
+List of status codes
+--------------------
+
++------------------+------------------------------------------------------------------------------------------------------------+
+| Function         | Description                                               
                                                 |
++==================+============================================================================================================+
+| ``OK``           | There are no issues, no warnings.                         
                                                 |
++------------------+------------------------------------------------------------------------------------------------------------+
+| ``WARNING``      | Validated entity is correct enough to proceed. Not a fatal error                                           |
++------------------+------------------------------------------------------------------------------------------------------------+
+| ``ERROR``        | There are serious issues with the validated entity. We can't proceed until reported issues are resolved.   |
++------------------+------------------------------------------------------------------------------------------------------------+
+
+View Error or Warning validation message
+----------------------------------------
+
+In case of any WARNING or ERROR status, the user has to iterate over the list of validation messages.
+
+::
+
+ printMessage(link.getConnectorLinkConfig().getConfigs());
+
+ private static void printMessage(List<MConfig> configs) {
+   for(MConfig config : configs) {
+     List<MInput<?>> inputlist = config.getInputs();
+     if (config.getValidationMessages() != null) {
+       // print every validation message
+       for(Message message : config.getValidationMessages()) {
+         System.out.println("Config validation message: " + message.getMessage());
+       }
+     }
+     for (MInput minput : inputlist) {
+       if (minput.getValidationStatus() == Status.WARNING) {
+         for(Message message : minput.getValidationMessages()) {
+           System.out.println("Config Input Validation Warning: " + message.getMessage());
+         }
+       }
+       else if (minput.getValidationStatus() == Status.ERROR) {
+         for(Message message : minput.getValidationMessages()) {
+           System.out.println("Config Input Validation Error: " + message.getMessage());
+         }
+       }
+     }
+   }
+ }
+
+Updating link and job
+---------------------
+After creating a link or job in the repository, you can update or delete a link or job using the following functions:
+
++----------------------------------+------------------------------------------------------------------------------------+
+|   Method                         | Description                               
                                         |
++==================================+====================================================================================+
+| ``updateLink(link)``             | Invoke update with link and check status 
for any errors or warnings                |
++----------------------------------+------------------------------------------------------------------------------------+
+| ``deleteLink(linkName)``         | Delete link. Deletes only if specified 
link is not used by any job                 |
++----------------------------------+------------------------------------------------------------------------------------+
+| ``updateJob(job)``               | Invoke update with job and check status 
for any errors or warnings                 |
++----------------------------------+------------------------------------------------------------------------------------+
+| ``deleteJob(jobName)``           | Delete job                                
                                         |
++----------------------------------+------------------------------------------------------------------------------------+
+
+Job Start
+==============
+
+Starting a job requires a job name. On successful start, the getStatus() method returns "BOOTING" or "RUNNING".
+
+::
+
+  //Job start
+  MSubmission submission = client.startJob("jobName");
+  System.out.println("Job Submission Status : " + submission.getStatus());
+  if(submission.getStatus().isRunning() && submission.getProgress() != -1) {
+    System.out.println("Progress : " + String.format("%.2f %%", 
submission.getProgress() * 100));
+  }
+  System.out.println("Hadoop job id :" + submission.getExternalId());
+  System.out.println("Job link : " + submission.getExternalLink());
+  Counters counters = submission.getCounters();
+  if(counters != null) {
+    System.out.println("Counters:");
+    for(CounterGroup group : counters) {
+      System.out.print("\t");
+      System.out.println(group.getName());
+      for(Counter counter : group) {
+        System.out.print("\t\t");
+        System.out.print(counter.getName());
+        System.out.print(": ");
+        System.out.println(counter.getValue());
+      }
+    }
+  }
+  if(submission.getExceptionInfo() != null) {
+    System.out.println("Exception info : " +submission.getExceptionInfo());
+  }
+
+
+  //Check job status for a running job
+  MSubmission submission = client.getJobStatus("jobName");
+  if(submission.getStatus().isRunning() && submission.getProgress() != -1) {
+    System.out.println("Progress : " + String.format("%.2f %%", 
submission.getProgress() * 100));
+  }
+
+  //Stop a running job
+  client.stopJob("jobName");
+
+In the above code block, the job start is asynchronous. For a synchronous job start, use the ``startJob(jobName, callback, pollTime)`` method. If you are not interested in getting the job status, then invoke the same method with "null" as the value for the callback parameter; this returns the final job status. ``pollTime`` is the request interval for getting the job status from the sqoop server and the value should be greater than zero. The sqoop server will be hit frequently if a low value is given for ``pollTime``. When a synchronous job is started with a non-null callback, it first invokes the callback's ``submitted(MSubmission)`` method on successful start, then invokes the ``updated(MSubmission)`` method on the callback API after every poll time interval, and finally invokes the ``finished(MSubmission)`` method on the callback API when the job execution finishes.
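+
+For example, a blocking start that simply waits for the final status might look like this (a sketch; the poll interval value shown is only illustrative)::
+
+  // synchronous start: pass a null callback and block until the job finishes
+  MSubmission finalSubmission = client.startJob("jobName", null, 100);
+  System.out.println("Final status: " + finalSubmission.getStatus());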
+
+Display Config and Input Names For Connector
+============================================
+
+You can view the config/input names for the link and job config types per 
connector
+
+::
+
+  String url = "http://localhost:12000/sqoop/";
+  SqoopClient client = new SqoopClient(url);
+  String connectorName = "connectorName";
+  // link config for connector
+  describe(client.getConnector(connectorName).getLinkConfig().getConfigs(), 
client.getConnectorConfigBundle(connectorName));
+  // from job config for connector
+  describe(client.getConnector(connectorName).getFromConfig().getConfigs(), 
client.getConnectorConfigBundle(connectorName));
+  // to job config for the connector
+  describe(client.getConnector(connectorName).getToConfig().getConfigs(), 
client.getConnectorConfigBundle(connectorName));
+
+  void describe(List<MConfig> configs, ResourceBundle resource) {
+    for (MConfig config : configs) {
+      System.out.println(resource.getString(config.getLabelKey())+":");
+      List<MInput<?>> inputs = config.getInputs();
+      for (MInput input : inputs) {
+        System.out.println(resource.getString(input.getLabelKey()) + " : " + 
input.getValue());
+      }
+      System.out.println();
+    }
+  }
+
+
+The above Sqoop 2 Client API tutorial explained how to create a link, create a job and then start the job.

