SQOOP-2694: Sqoop2: Doc: Register structure in sphinx for our docs
(Jarek Jarcec Cecho via Kate Ting)


Project: http://git-wip-us.apache.org/repos/asf/sqoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/sqoop/commit/3613843a
Tree: http://git-wip-us.apache.org/repos/asf/sqoop/tree/3613843a
Diff: http://git-wip-us.apache.org/repos/asf/sqoop/diff/3613843a

Branch: refs/heads/sqoop2
Commit: 3613843a7c52fb872f7365fae58b2410a1047b4b
Parents: 25b0df5
Author: Kate Ting <[email protected]>
Authored: Wed Nov 18 14:58:57 2015 -0800
Committer: Kate Ting <[email protected]>
Committed: Wed Nov 18 14:58:57 2015 -0800

----------------------------------------------------------------------
 docs/pom.xml                                    |    5 +-
 docs/src/site/sphinx/BuildingSqoop2.rst         |   76 -
 docs/src/site/sphinx/ClientAPI.rst              |  304 ----
 docs/src/site/sphinx/CommandLineClient.rst      |  533 ------
 docs/src/site/sphinx/Connector-FTP.rst          |   81 -
 docs/src/site/sphinx/Connector-GenericJDBC.rst  |  194 ---
 docs/src/site/sphinx/Connector-HDFS.rst         |  159 --
 docs/src/site/sphinx/Connector-Kafka.rst        |   64 -
 docs/src/site/sphinx/Connector-Kite.rst         |  110 --
 docs/src/site/sphinx/Connector-SFTP.rst         |   91 -
 docs/src/site/sphinx/ConnectorDevelopment.rst   |  595 -------
 docs/src/site/sphinx/DevEnv.rst                 |   57 -
 docs/src/site/sphinx/Installation.rst           |  103 --
 docs/src/site/sphinx/RESTAPI.rst                | 1601 ------------------
 docs/src/site/sphinx/Repository.rst             |  335 ----
 docs/src/site/sphinx/SecurityGuideOnSqoop2.rst  |  239 ---
 docs/src/site/sphinx/Sqoop5MinutesDemo.rst      |  242 ---
 docs/src/site/sphinx/Tools.rst                  |  129 --
 docs/src/site/sphinx/Upgrade.rst                |   84 -
 docs/src/site/sphinx/admin.rst                  |   24 +
 docs/src/site/sphinx/admin/Installation.rst     |  103 ++
 docs/src/site/sphinx/admin/Tools.rst            |  129 ++
 docs/src/site/sphinx/admin/Upgrade.rst          |   84 +
 docs/src/site/sphinx/conf.py                    |    4 +-
 docs/src/site/sphinx/dev.rst                    |   24 +
 docs/src/site/sphinx/dev/BuildingSqoop2.rst     |   76 +
 docs/src/site/sphinx/dev/ClientAPI.rst          |  304 ++++
 .../site/sphinx/dev/ConnectorDevelopment.rst    |  595 +++++++
 docs/src/site/sphinx/dev/DevEnv.rst             |   57 +
 docs/src/site/sphinx/dev/RESTAPI.rst            | 1601 ++++++++++++++++++
 docs/src/site/sphinx/dev/Repository.rst         |  335 ++++
 docs/src/site/sphinx/index.rst                  |   76 +-
 docs/src/site/sphinx/security.rst               |   24 +
 .../sphinx/security/SecurityGuideOnSqoop2.rst   |  239 +++
 docs/src/site/sphinx/user.rst                   |   24 +
 docs/src/site/sphinx/user/CommandLineClient.rst |  533 ++++++
 docs/src/site/sphinx/user/Connectors.rst        |   24 +
 docs/src/site/sphinx/user/Sqoop5MinutesDemo.rst |  242 +++
 .../sphinx/user/connectors/Connector-FTP.rst    |   81 +
 .../user/connectors/Connector-GenericJDBC.rst   |  194 +++
 .../sphinx/user/connectors/Connector-HDFS.rst   |  159 ++
 .../sphinx/user/connectors/Connector-Kafka.rst  |   64 +
 .../sphinx/user/connectors/Connector-Kite.rst   |  110 ++
 .../sphinx/user/connectors/Connector-SFTP.rst   |   91 +
 44 files changed, 5152 insertions(+), 5047 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/pom.xml
----------------------------------------------------------------------
diff --git a/docs/pom.xml b/docs/pom.xml
index 079e896..c96a582 100644
--- a/docs/pom.xml
+++ b/docs/pom.xml
@@ -70,7 +70,10 @@ limitations under the License.
           <plugin>
             <groupId>org.tomdz.maven</groupId>
             <artifactId>sphinx-maven-plugin</artifactId>
-            <version>1.0.2</version>
+            <version>1.0.3</version>
+            <configuration>
+              <warningsAsErrors>true</warningsAsErrors>
+            </configuration>
           </plugin>
           <!-- Turning off standard reports as they collide with sphinx -->
           <plugin>

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/BuildingSqoop2.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/BuildingSqoop2.rst b/docs/src/site/sphinx/BuildingSqoop2.rst
deleted file mode 100644
index 7fbbb6b..0000000
--- a/docs/src/site/sphinx/BuildingSqoop2.rst
+++ /dev/null
@@ -1,76 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-================================
-Building Sqoop2 from source code
-================================
-
-This guide will show you how to build Sqoop2 from source code. Sqoop uses `maven <http://maven.apache.org/>`_ as its build system. You will need at least version 3.0, as older versions will not work correctly. All other dependencies will be downloaded by maven automatically, with the exception of special JDBC drivers that are needed only for advanced integration tests.
-
-Downloading source code
------------------------
-
-The Sqoop project uses git as its revision control system, hosted at the Apache Software Foundation. You can clone the entire repository using the following command:
-
-::
-
-  git clone https://git-wip-us.apache.org/repos/asf/sqoop.git sqoop2
-
-Sqoop2 is currently developed in the dedicated branch ``sqoop2`` that you need to check out after cloning:
-
-::
-
-  cd sqoop2
-  git checkout sqoop2
-
-Building project
-----------------
-
-You can use the usual maven targets like ``compile`` or ``package`` to build the project. Sqoop supports one major Hadoop revision at the moment - 2.x. As compiled code for one Hadoop major version can't be used on another, you must compile Sqoop against the appropriate Hadoop version.
-
-::
-
-  mvn compile
-
-The maven target ``package`` can be used to create Sqoop packages similar to the ones that are officially available for download. Sqoop will build only the source tarball by default; you need to specify ``-Pbinary`` to build the binary distribution.
-
-::
-
-  mvn package -Pbinary
-
-Running tests
--------------
-
-Sqoop supports two different sets of tests. The first, smaller and much faster, set is called **unit tests** and will be executed by the maven target ``test``. The second, larger set of **integration tests** will be executed by the maven target ``integration-test``. Please note that integration tests might require manual steps to install various JDBC drivers into your local maven cache.
-
-Example for running unit tests:
-
-::
-
-  mvn test
-
-Example for running integration tests:
-
-::
-
-  mvn integration-test
-
-For the **unit tests**, there are two helpful profiles: **fast** and **slow**. The **fast** unit tests do not start or use any services. The **slow** unit tests may start services or use an external service (e.g. MySQL).
-::
-
-  mvn test -Pfast,hadoop200
-  mvn test -Pslow,hadoop200
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/ClientAPI.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/ClientAPI.rst b/docs/src/site/sphinx/ClientAPI.rst
deleted file mode 100644
index 9626878..0000000
--- a/docs/src/site/sphinx/ClientAPI.rst
+++ /dev/null
@@ -1,304 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-===========================
-Sqoop Java Client API Guide
-===========================
-
-This document explains how to use the Sqoop Java Client API from an external application. The Client API allows you to execute the functions of sqoop commands. It requires the Sqoop Client JAR and its dependencies.
-
-The main class that provides wrapper methods for all the supported operations is the following:
-::
-
-  public class SqoopClient {
-    ...
-  }
-
-The Java Client API is explained using a Generic JDBC Connector example. Before executing an application using the sqoop client API, check that the sqoop server is running.
-
-Workflow
-========
-
-The following workflow has to be followed for executing a sqoop job on the Sqoop server.
-
-  1. Create a LINK object for a given connectorId           - creates a Link object and returns linkId (lid)
-  2. Create a JOB for a given "from" and "to" linkId        - creates a Job object and returns jobId (jid)
-  3. Start the JOB for a given jobId                        - starts the job on the server and creates a submission record
-
-Project Dependencies
-====================
-Here is the required maven dependency:
-
-::
-
-  <dependency>
-    <groupId>org.apache.sqoop</groupId>
-    <artifactId>sqoop-client</artifactId>
-    <version>${requestedVersion}</version>
-  </dependency>
-
-Initialization
-==============
-
-First, initialize the SqoopClient class with the server URL as its argument.
-
-::
-
-  String url = "http://localhost:12000/sqoop/";
-  SqoopClient client = new SqoopClient(url);
-
-The server URL value can be modified by passing the new value to the ``setServerUrl(String)`` method:
-
-::
-
-  client.setServerUrl(newUrl);
-
-
-Link
-====
-Connectors provide the facility to interact with many data sources and thus can be used as a means to transfer data between them in Sqoop. The registered connector implementation provides the logic to read from and/or write to the data source that it represents. A connector can have one or more links associated with it. The java client API allows you to create, update and delete a link for any registered connector. Creating or updating a link requires you to populate the Link Config for that particular connector. Hence the first thing to do is get the list of registered connectors and select the connector for which you would like to create a link. Then you can get the list of all the config/inputs using `Display Config and Input Names For Connector`_ for that connector.
-
-
-Save Link
----------
-
-First, create a new link by invoking the ``createLink(cid)`` method with a connector Id; it returns an MLink object with a dummy id and the unfilled link config inputs for that connector. Then fill the config inputs with relevant values and invoke ``saveLink``, passing it the filled MLink object.
-
-::
-
-  // create a placeholder for link
-  long connectorId = 1;
-  MLink link = client.createLink(connectorId);
-  link.setName("Vampire");
-  link.setCreationUser("Buffy");
-  MLinkConfig linkConfig = link.getConnectorLinkConfig();
-  // fill in the link config values
-  linkConfig.getStringInput("linkConfig.connectionString").setValue("jdbc:mysql://localhost/my");
-  linkConfig.getStringInput("linkConfig.jdbcDriver").setValue("com.mysql.jdbc.Driver");
-  linkConfig.getStringInput("linkConfig.username").setValue("root");
-  linkConfig.getStringInput("linkConfig.password").setValue("root");
-  // save the link object that was filled
-  Status status = client.saveLink(link);
-  if(status.canProceed()) {
-   System.out.println("Created Link with Link Id : " + link.getPersistenceId());
-  } else {
-   System.out.println("Something went wrong creating the link");
-  }
-
-``status.canProceed()`` returns true if the status is OK or WARNING. Before returning the status, the link config values are validated using the corresponding validator associated with the link config inputs.
-
-On successful execution of the saveLink method, a new link Id is assigned to the link object; otherwise an exception is thrown. The ``link.getPersistenceId()`` method returns the unique Id for this object persisted in the sqoop repository.
-
-A user can retrieve a link using the following methods:
-
-+----------------------------+--------------------------------------+
-|   Method                   | Description                          |
-+============================+======================================+
-| ``getLink(lid)``           | Returns a link by id                 |
-+----------------------------+--------------------------------------+
-| ``getLinks()``             | Returns list of links in the sqoop   |
-+----------------------------+--------------------------------------+
-
-Job
-===
-
-A sqoop job holds the ``From`` and ``To`` parts for transferring data from the ``From`` data source to the ``To`` data source. Both the ``From`` and the ``To`` are uniquely identified by their corresponding connector Link Ids, i.e. when creating a job we have to specify the ``FromLinkId`` and the ``ToLinkId``. Thus the prerequisite for creating a job is to first create the links as described above.
-
-Once the linkIds for the ``From`` and ``To`` are given, the job configs for the associated connector for the link object have to be filled. You can get the list of all the from and to job config/inputs using `Display Config and Input Names For Connector`_ for that connector. A connector can have one or more links. We then use the links in the ``From`` and ``To`` directions to populate the corresponding ``MFromConfig`` and ``MToConfig`` respectively.
-
-In addition to filling the job configs for the ``From`` and the ``To`` representing the link, we also need to fill the driver configs that control the job execution engine environment. For example, if the job execution engine happens to be MapReduce, we will specify the number of mappers to be used in reading data from the ``From`` data source.
-
-Save Job
----------
-Here is the code to create and then save a job
-::
-
-  String url = "http://localhost:12000/sqoop/";
-  SqoopClient client = new SqoopClient(url);
-  //Creating dummy job object
-  long fromLinkId = 1;// for jdbc connector
-  long toLinkId = 2; // for HDFS connector
-  MJob job = client.createJob(fromLinkId, toLinkId);
-  job.setName("Vampire");
-  job.setCreationUser("Buffy");
-  // set the "FROM" link job config values
-  MFromConfig fromJobConfig = job.getFromJobConfig();
-  fromJobConfig.getStringInput("fromJobConfig.schemaName").setValue("sqoop");
-  fromJobConfig.getStringInput("fromJobConfig.tableName").setValue("sqoop");
-  fromJobConfig.getStringInput("fromJobConfig.partitionColumn").setValue("id");
-  // set the "TO" link job config values
-  MToConfig toJobConfig = job.getToJobConfig();
-  toJobConfig.getStringInput("toJobConfig.outputDirectory").setValue("/usr/tmp");
-  // set the driver config values
-  MDriverConfig driverConfig = job.getDriverConfig();
-  driverConfig.getStringInput("throttlingConfig.numExtractors").setValue("3");
-
-  Status status = client.saveJob(job);
-  if(status.canProceed()) {
-   System.out.println("Created Job with Job Id: "+ job.getPersistenceId());
-  } else {
-   System.out.println("Something went wrong creating the job");
-  }
-
-A user can retrieve a job using the following methods:
-
-+----------------------------+--------------------------------------+
-|   Method                   | Description                          |
-+============================+======================================+
-| ``getJob(jid)``            | Returns a job by id                  |
-+----------------------------+--------------------------------------+
-| ``getJobs()``              | Returns list of jobs in the sqoop    |
-+----------------------------+--------------------------------------+
-
-
-List of status codes
---------------------
-
-+------------------+--------------------------------------------------------------------------------------------------------------+
-| Status           | Description                                                                                                  |
-+==================+==============================================================================================================+
-| ``OK``           | There are no issues, no warnings.                                                                            |
-+------------------+--------------------------------------------------------------------------------------------------------------+
-| ``WARNING``      | Validated entity is correct enough to proceed. Not a fatal error.                                            |
-+------------------+--------------------------------------------------------------------------------------------------------------+
-| ``ERROR``        | There are serious issues with the validated entity. We can't proceed until the reported issues are resolved. |
-+------------------+--------------------------------------------------------------------------------------------------------------+
-
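The ``canProceed()`` semantics implied by the status table can be sketched with a minimal, self-contained stand-in. The ``Status`` enum below is a simplified illustration of the behavior described above, not Sqoop's actual class:

```java
// Simplified stand-in for Sqoop's validation Status, for illustration only.
enum Status {
  OK, WARNING, ERROR;

  // OK and WARNING are good enough to proceed; only ERROR is fatal.
  boolean canProceed() {
    return this != ERROR;
  }
}

public class StatusSketch {
  public static void main(String[] args) {
    System.out.println(Status.OK.canProceed());      // true
    System.out.println(Status.WARNING.canProceed()); // true
    System.out.println(Status.ERROR.canProceed());   // false
  }
}
```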
-View Error or Warning validation messages
------------------------------------------
-
-In case of any WARNING or ERROR status, the user has to iterate over the list of validation messages.
-
-::
-
- printMessage(link.getConnectorLinkConfig().getConfigs());
-
- private static void printMessage(List<MConfig> configs) {
-   for (MConfig config : configs) {
-     List<MInput<?>> inputlist = config.getInputs();
-     if (config.getValidationMessages() != null) {
-       // print every config-level validation message
-       for (Message message : config.getValidationMessages()) {
-         System.out.println("Config validation message: " + message.getMessage());
-       }
-     }
-     for (MInput minput : inputlist) {
-       if (minput.getValidationStatus() == Status.WARNING) {
-         for (Message message : minput.getValidationMessages()) {
-           System.out.println("Config Input Validation Warning: " + message.getMessage());
-         }
-       } else if (minput.getValidationStatus() == Status.ERROR) {
-         for (Message message : minput.getValidationMessages()) {
-           System.out.println("Config Input Validation Error: " + message.getMessage());
-         }
-       }
-     }
-   }
- }
-
-Updating link and job
----------------------
-After creating a link or job in the repository, you can update or delete it using the following functions:
-
-+----------------------------------+------------------------------------------------------------------------------------+
-|   Method                         | Description                                                                        |
-+==================================+====================================================================================+
-| ``updateLink(link)``             | Invoke update with link and check status for any errors or warnings                |
-+----------------------------------+------------------------------------------------------------------------------------+
-| ``deleteLink(lid)``              | Delete link. Deletes only if specified link is not used by any job                 |
-+----------------------------------+------------------------------------------------------------------------------------+
-| ``updateJob(job)``               | Invoke update with job and check status for any errors or warnings                 |
-+----------------------------------+------------------------------------------------------------------------------------+
-| ``deleteJob(jid)``               | Delete job                                                                         |
-+----------------------------------+------------------------------------------------------------------------------------+
-
-Job Start
-==============
-
-Starting a job requires a job id. On successful start, the ``getStatus()`` method returns "BOOTING" or "RUNNING".
-
-::
-
-  //Job start
-  long jobId = 1;
-  MSubmission submission = client.startJob(jobId);
-  System.out.println("Job Submission Status : " + submission.getStatus());
-  if(submission.getStatus().isRunning() && submission.getProgress() != -1) {
-    System.out.println("Progress : " + String.format("%.2f %%", submission.getProgress() * 100));
-  }
-  System.out.println("Hadoop job id :" + submission.getExternalId());
-  System.out.println("Job link : " + submission.getExternalLink());
-  Counters counters = submission.getCounters();
-  if(counters != null) {
-    System.out.println("Counters:");
-    for(CounterGroup group : counters) {
-      System.out.print("\t");
-      System.out.println(group.getName());
-      for(Counter counter : group) {
-        System.out.print("\t\t");
-        System.out.print(counter.getName());
-        System.out.print(": ");
-        System.out.println(counter.getValue());
-      }
-    }
-  }
-  if(submission.getExceptionInfo() != null) {
-    System.out.println("Exception info : " + submission.getExceptionInfo());
-  }
-
-
-  //Check job status for a running job
-  submission = client.getJobStatus(jobId);
-  if(submission.getStatus().isRunning() && submission.getProgress() != -1) {
-    System.out.println("Progress : " + String.format("%.2f %%", submission.getProgress() * 100));
-  }
-
-  //Stop a running job
-  client.stopJob(jobId);
-
-In the above code block, the job start is asynchronous. For a synchronous job start, use the ``startJob(jid, callback, pollTime)`` method. If you are not interested in getting the job status, invoke the same method with "null" as the value for the callback parameter; it returns the final job status. ``pollTime`` is the request interval for getting the job status from the sqoop server and the value should be greater than zero; a low ``pollTime`` will hit the sqoop server frequently. When a synchronous job is started with a non-null callback, it first invokes the callback's ``submitted(MSubmission)`` method on successful start, then after every poll time interval invokes the ``updated(MSubmission)`` method on the callback API, and finally on finishing the job execution invokes the ``finished(MSubmission)`` method on the callback API.
-
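The synchronous flow described above can be sketched with a small, self-contained poll loop. The ``SubmissionCallback`` interface, the simulated state sequence, and ``startJobSync`` below are simplified stand-ins for illustration, not the real Sqoop classes:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for Sqoop's callback API; submissions are plain strings here.
interface SubmissionCallback {
  void submitted(String submission);
  void updated(String submission);
  void finished(String submission);
}

public class SyncStartSketch {
  // Simulated job states returned by successive status polls.
  static final String[] STATES = {"BOOTING", "RUNNING", "RUNNING", "SUCCEEDED"};

  // Poll until the job finishes, invoking the callback in the order described
  // above: submitted once, updated on each poll, finished at the end.
  static String startJobSync(SubmissionCallback cb, long pollTime) throws InterruptedException {
    String status = STATES[0];
    if (cb != null) cb.submitted(status);
    int poll = 1;
    while (poll < STATES.length) {
      Thread.sleep(pollTime);          // request interval; must be > 0
      status = STATES[poll++];
      if (poll < STATES.length) {
        if (cb != null) cb.updated(status);
      } else {
        if (cb != null) cb.finished(status);
      }
    }
    return status;                     // final job status
  }

  public static void main(String[] args) throws InterruptedException {
    List<String> events = new ArrayList<>();
    SubmissionCallback cb = new SubmissionCallback() {
      public void submitted(String s) { events.add("submitted:" + s); }
      public void updated(String s)   { events.add("updated:" + s); }
      public void finished(String s)  { events.add("finished:" + s); }
    };
    String finalStatus = startJobSync(cb, 10);
    System.out.println(events);      // [submitted:BOOTING, updated:RUNNING, updated:RUNNING, finished:SUCCEEDED]
    System.out.println(finalStatus); // SUCCEEDED
  }
}
```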
-Display Config and Input Names For Connector
-============================================
-
-You can view the config/input names for the link and job config types per 
connector
-
-::
-
-  String url = "http://localhost:12000/sqoop/";
-  SqoopClient client = new SqoopClient(url);
-  long connectorId = 1;
-  // link config for connector
-  describe(client.getConnector(connectorId).getLinkConfig().getConfigs(), client.getConnectorConfigBundle(connectorId));
-  // from job config for connector
-  describe(client.getConnector(connectorId).getFromConfig().getConfigs(), client.getConnectorConfigBundle(connectorId));
-  // to job config for the connector
-  describe(client.getConnector(connectorId).getToConfig().getConfigs(), client.getConnectorConfigBundle(connectorId));
-
-  void describe(List<MConfig> configs, ResourceBundle resource) {
-    for (MConfig config : configs) {
-      System.out.println(resource.getString(config.getLabelKey())+":");
-      List<MInput<?>> inputs = config.getInputs();
-      for (MInput input : inputs) {
-        System.out.println(resource.getString(input.getLabelKey()) + " : " + input.getValue());
-      }
-      System.out.println();
-    }
-  }
-
-
-The above Sqoop 2 Client API tutorial explained how to create a link, create a job and then start the job.

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/CommandLineClient.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/CommandLineClient.rst b/docs/src/site/sphinx/CommandLineClient.rst
deleted file mode 100644
index 8c4c592..0000000
--- a/docs/src/site/sphinx/CommandLineClient.rst
+++ /dev/null
@@ -1,533 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-===================
-Command Line Shell
-===================
-
-Sqoop 2 provides a command line shell that is capable of communicating with the Sqoop 2 server using the REST interface. The client is able to run in two modes - interactive and batch. The commands ``create``, ``update`` and ``clone`` are not currently supported in batch mode; interactive mode supports all available commands.
-
-You can start the Sqoop 2 client in interactive mode using the command ``sqoop2-shell``::
-
-  sqoop2-shell
-
-Batch mode can be started by adding an additional argument representing the path to your Sqoop client script: ::
-
-  sqoop2-shell /path/to/your/script.sqoop
-
-A Sqoop client script is expected to contain valid Sqoop client commands, empty lines, and lines starting with ``#`` denoting comments. Comments and empty lines are ignored; all other lines are interpreted. Example script: ::
-
-  # Specify company server
-  set server --host sqoop2.company.net
-
-  # Executing given job
-  start job  --jid 1
-
-
-.. contents:: Table of Contents
-
-Resource file
-=============
-
-The Sqoop 2 client has the ability to load resource files, similarly to other command line tools. At the beginning of execution the Sqoop client will check for the existence of the file ``.sqoop2rc`` in the home directory of the currently logged-in user. If such a file exists, it will be interpreted before any additional actions. This file is loaded in both interactive and batch mode. It can be used to execute any batch-compatible commands.
-
-Example resource file: ::
-
-  # Configure our Sqoop 2 server automatically
-  set server --host sqoop2.company.net
-
-  # Run in verbose mode by default
-  set option --name verbose --value true
-
-Commands
-========
-
-Sqoop 2 contains several commands that are documented in this section. Each command has one or more functions that accept various arguments. Not all commands are supported in both interactive and batch mode.
-
-Auxiliary Commands
-------------------
-
-Auxiliary commands are commands that improve the user experience and run purely on the client side; thus they do not need a working connection to the server.
-
-* ``exit`` Exit the client immediately. This command can also be executed by sending the EOT (end of transmission) character. It's CTRL+D on most common Linux shells like Bash or Zsh.
-* ``history`` Print out the command history. Please note that the Sqoop client saves history across executions and thus you might see commands that you've executed in previous runs.
-* ``help`` Show all available commands with short in-shell documentation.
-
-::
-
- sqoop:000> help
- For information about Sqoop, visit: http://sqoop.apache.org/
-
- Available commands:
-   exit    (\x  ) Exit the shell
-   history (\H  ) Display, manage and recall edit-line history
-   help    (\h  ) Display this help message
-   set     (\st ) Configure various client options and settings
-   show    (\sh ) Display various objects and configuration options
-   create  (\cr ) Create new object in Sqoop repository
-   delete  (\d  ) Delete existing object in Sqoop repository
-   update  (\up ) Update objects in Sqoop repository
-   clone   (\cl ) Create new object based on existing one
-   start   (\sta) Start job
-   stop    (\stp) Stop job
-   status  (\stu) Display status of a job
-   enable  (\en ) Enable object in Sqoop repository
-   disable (\di ) Disable object in Sqoop repository
-
-Set Command
------------
-
-The set command allows you to set various properties of the client. Like the auxiliary commands, set does not require a connection to the Sqoop server. The set command is not used to reconfigure the Sqoop server.
-
-Available functions:
-
-+---------------+------------------------------------------+
-| Function      | Description                              |
-+===============+==========================================+
-| ``server``    | Set connection configuration for server  |
-+---------------+------------------------------------------+
-| ``option``    | Set various client side options          |
-+---------------+------------------------------------------+
-
-Set Server Function
-~~~~~~~~~~~~~~~~~~~
-
-Configure the connection to the Sqoop server - host, port and web application name. Available arguments:
-
-+-----------------------+---------------+--------------------------------------------------+
-| Argument              | Default value | Description                                      |
-+=======================+===============+==================================================+
-| ``-h``, ``--host``    | localhost     | Server name (FQDN) where Sqoop server is running |
-+-----------------------+---------------+--------------------------------------------------+
-| ``-p``, ``--port``    | 12000         | TCP Port                                         |
-+-----------------------+---------------+--------------------------------------------------+
-| ``-w``, ``--webapp``  | sqoop         | Jetty's web application name                     |
-+-----------------------+---------------+--------------------------------------------------+
-| ``-u``, ``--url``     |               | Sqoop server in url format                       |
-+-----------------------+---------------+--------------------------------------------------+
-
-Example: ::
-
-  set server --host sqoop2.company.net --port 80 --webapp sqoop
-
-or ::
-
-  set server --url http://sqoop2.company.net:80/sqoop
-
-Note: When the ``--url`` option is given, the ``--host``, ``--port`` and ``--webapp`` options will be ignored.
-
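The precedence note above can be illustrated with a small helper. This is a hypothetical sketch for illustration only; the real shell resolves these arguments internally:

```java
public class ServerUrlSketch {
  // Derive the effective server URL: an explicit --url wins,
  // otherwise compose it from host, port and webapp.
  static String effectiveUrl(String url, String host, int port, String webapp) {
    if (url != null && !url.isEmpty()) {
      return url;                      // --url overrides the other options
    }
    return "http://" + host + ":" + port + "/" + webapp;
  }

  public static void main(String[] args) {
    // Defaults: localhost, 12000, sqoop
    System.out.println(effectiveUrl(null, "localhost", 12000, "sqoop"));
    // --url given: host, port and webapp are ignored
    System.out.println(effectiveUrl("http://sqoop2.company.net:80/sqoop",
                                    "ignored", 1, "ignored"));
  }
}
```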
-Set Option Function
-~~~~~~~~~~~~~~~~~~~
-
-Configure Sqoop client related options. This function has two required arguments, ``name`` and ``value``. Name represents the internal property name and value holds the new value that should be set. The list of available option names follows:
-
-+-------------------+---------------+---------------------------------------------------------------------+
-| Option name       | Default value | Description                                                         |
-+===================+===============+=====================================================================+
-| ``verbose``       | false         | Client will print additional information if verbose mode is enabled |
-+-------------------+---------------+---------------------------------------------------------------------+
-| ``poll-timeout``  | 10000         | Server poll timeout in milliseconds                                 |
-+-------------------+---------------+---------------------------------------------------------------------+
-
-Example: ::
-
-  set option --name verbose --value true
-  set option --name poll-timeout --value 20000
-
-Show Command
-------------
-
-Show command displays various information as described below.
-
-Available functions:
-
-+----------------+--------------------------------------------------------------------------------------------------------+
-| Function       | Description                                                                                            |
-+================+========================================================================================================+
-| ``server``     | Display connection information to the sqoop server (host, port, webapp)                                |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``option``     | Display various client side options                                                                    |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``version``    | Show client build version; ``--all`` also shows server build version and supported api versions        |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``connector``  | Show connector configurable and its related configs                                                    |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``driver``     | Show driver configurable and its related configs                                                       |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``link``       | Show links in sqoop                                                                                    |
-+----------------+--------------------------------------------------------------------------------------------------------+
-| ``job``        | Show jobs in sqoop                                                                                     |
-+----------------+--------------------------------------------------------------------------------------------------------+
-
-Show Server Function
-~~~~~~~~~~~~~~~~~~~~
-
-Show details about connection to Sqoop server.
-
-+-----------------------+--------------------------------------------------------------+
-| Argument              |  Description                                                 |
-+=======================+==============================================================+
-| ``-a``, ``--all``     | Show all connection related information (host, port, webapp) |
-+-----------------------+--------------------------------------------------------------+
-| ``-h``, ``--host``    | Show host                                                    |
-+-----------------------+--------------------------------------------------------------+
-| ``-p``, ``--port``    | Show port                                                    |
-+-----------------------+--------------------------------------------------------------+
-| ``-w``, ``--webapp``  | Show web application name                                    |
-+-----------------------+--------------------------------------------------------------+
-
-Example: ::
-
-  show server --all
-
-Show Option Function
-~~~~~~~~~~~~~~~~~~~~
-
-Show values of various client side options. This function will show all client options when called without arguments.
-
-+-----------------------+--------------------------------------------------------------+
-| Argument              |  Description                                                 |
-+=======================+==============================================================+
-| ``-n``, ``--name``    | Show client option value with given name                     |
-+-----------------------+--------------------------------------------------------------+
-
-Please check the table in the `Set Option Function`_ section to get a list of all supported option names.
-
-Example: ::
-
-  show option --name verbose
-
-Show Version Function
-~~~~~~~~~~~~~~~~~~~~~
-
-Show build versions of both client and server, as well as the supported REST API versions.
-
-+------------------------+-----------------------------------------------+
-| Argument               |  Description                                  |
-+========================+===============================================+
-| ``-a``, ``--all``      | Show all versions (server, client, api)       |
-+------------------------+-----------------------------------------------+
-| ``-c``, ``--client``   | Show client build version                     |
-+------------------------+-----------------------------------------------+
-| ``-s``, ``--server``   | Show server build version                     |
-+------------------------+-----------------------------------------------+
-| ``-p``, ``--api``      | Show supported api versions                   |
-+------------------------+-----------------------------------------------+
-
-Example: ::
-
-  show version --all
-
-Show Connector Function
-~~~~~~~~~~~~~~~~~~~~~~~
-
-Show persisted connector configurable and its related configs used in creating associated link and job objects.
-
-+-----------------------+------------------------------------------------+
-| Argument              |  Description                                   |
-+=======================+================================================+
-| ``-a``, ``--all``     | Show information for all connectors            |
-+-----------------------+------------------------------------------------+
-| ``-c``, ``--cid <x>`` | Show information for connector with id ``<x>`` |
-+-----------------------+------------------------------------------------+
-
-Example: ::
-
-  show connector --all
-
-or ::
-
-  show connector
-
-Show Driver Function
-~~~~~~~~~~~~~~~~~~~~
-
-Show persisted driver configurable and its related configs used in creating job objects.
-
-This function does not have any extra arguments. There is only one registered driver in Sqoop.
-
-Example: ::
-
-  show driver
-
-Show Link Function
-~~~~~~~~~~~~~~~~~~
-
-Show persisted link objects.
-
-+-----------------------+------------------------------------------------------+
-| Argument              |  Description                                         |
-+=======================+======================================================+
-| ``-a``, ``--all``     | Show all available links                             |
-+-----------------------+------------------------------------------------------+
-| ``-x``, ``--lid <x>`` | Show link with id ``<x>``                            |
-+-----------------------+------------------------------------------------------+
-
-Example: ::
-
-  show link --all
-
-or ::
-
-  show link
-
-Show Job Function
-~~~~~~~~~~~~~~~~~
-
-Show persisted job objects.
-
-+-----------------------+----------------------------------------------+
-| Argument              |  Description                                 |
-+=======================+==============================================+
-| ``-a``, ``--all``     | Show all available jobs                      |
-+-----------------------+----------------------------------------------+
-| ``-j``, ``--jid <x>`` | Show job with id ``<x>``                     |
-+-----------------------+----------------------------------------------+
-
-Example: ::
-
-  show job --all
-
-or ::
-
-  show job
-
-Show Submission Function
-~~~~~~~~~~~~~~~~~~~~~~~~
-
-Show persisted job submission objects.
-
-+-----------------------+---------------------------------------------+
-| Argument              |  Description                                |
-+=======================+=============================================+
-| ``-j``, ``--jid <x>`` | Show available submissions for given job    |
-+-----------------------+---------------------------------------------+
-| ``-d``, ``--detail``  | Show job submissions in full details        |
-+-----------------------+---------------------------------------------+
-
-Example: ::
-
-  show submission
-  show submission --jid 1
-  show submission --jid 1 --detail
-
-Create Command
---------------
-
-Creates new link and job objects. This command is supported only in interactive mode. It will ask the user to enter the link config when creating a link object, and the FROM/TO and driver configs when creating a job object.
-
-Available functions:
-
-+----------------+-------------------------------------------------+
-| Function       | Description                                     |
-+================+=================================================+
-| ``link``       | Create new link object                          |
-+----------------+-------------------------------------------------+
-| ``job``        | Create new job object                           |
-+----------------+-------------------------------------------------+
-
-Create Link Function
-~~~~~~~~~~~~~~~~~~~~
-
-Create new link object.
-
-+------------------------+-------------------------------------------------------------+
-| Argument               |  Description                                                |
-+========================+=============================================================+
-| ``-c``, ``--cid <x>``  | Create new link object for connector with id ``<x>``        |
-+------------------------+-------------------------------------------------------------+
-
-
-Example: ::
-
-  create link --cid 1
-
-or ::
-
-  create link -c 1
-
-Create Job Function
-~~~~~~~~~~~~~~~~~~~
-
-Create new job object.
-
-+------------------------+------------------------------------------------------------------+
-| Argument               |  Description                                                     |
-+========================+==================================================================+
-| ``-f``, ``--from <x>`` | Create new job object with a FROM link with id ``<x>``           |
-+------------------------+------------------------------------------------------------------+
-| ``-t``, ``--to <x>``   | Create new job object with a TO link with id ``<x>``             |
-+------------------------+------------------------------------------------------------------+
-
-Example: ::
-
-  create job --from 1 --to 2
-
-or ::
-
-  create job -f 1 -t 2
-
-Update Command
---------------
-
-Update command allows you to edit link and job objects. This command is supported only in interactive mode.
-
-Update Link Function
-~~~~~~~~~~~~~~~~~~~~
-
-Update existing link object.
-
-+-----------------------+---------------------------------------------+
-| Argument              |  Description                                |
-+=======================+=============================================+
-| ``-x``, ``--lid <x>`` |  Update existing link with id ``<x>``       |
-+-----------------------+---------------------------------------------+
-
-Example: ::
-
-  update link --lid 1
-
-Update Job Function
-~~~~~~~~~~~~~~~~~~~
-
-Update existing job object.
-
-+-----------------------+--------------------------------------------+
-| Argument              |  Description                               |
-+=======================+============================================+
-| ``-j``, ``--jid <x>`` | Update existing job object with id ``<x>`` |
-+-----------------------+--------------------------------------------+
-
-Example: ::
-
-  update job --jid 1
-
-
-Delete Command
---------------
-
-Deletes link and job objects from Sqoop server.
-
-Delete Link Function
-~~~~~~~~~~~~~~~~~~~~
-
-Delete existing link object.
-
-+-----------------------+-------------------------------------------+
-| Argument              |  Description                              |
-+=======================+===========================================+
-| ``-x``, ``--lid <x>`` |  Delete link object with id ``<x>``       |
-+-----------------------+-------------------------------------------+
-
-Example: ::
-
-  delete link --lid 1
-
-
-Delete Job Function
-~~~~~~~~~~~~~~~~~~~
-
-Delete existing job object.
-
-+-----------------------+------------------------------------------+
-| Argument              |  Description                             |
-+=======================+==========================================+
-| ``-j``, ``--jid <x>`` | Delete job object with id ``<x>``        |
-+-----------------------+------------------------------------------+
-
-Example: ::
-
-  delete job --jid 1
-
-
-Clone Command
--------------
-
-Clone command will load an existing link or job object from the Sqoop server and allow the user to make in-place updates that will result in the creation of a new link or job object. This command is not supported in batch mode.
-
-Clone Link Function
-~~~~~~~~~~~~~~~~~~~
-
-Clone existing link object.
-
-+-----------------------+------------------------------------------+
-| Argument              |  Description                             |
-+=======================+==========================================+
-| ``-x``, ``--lid <x>`` |  Clone link object with id ``<x>``       |
-+-----------------------+------------------------------------------+
-
-Example: ::
-
-  clone link --lid 1
-
-
-Clone Job Function
-~~~~~~~~~~~~~~~~~~
-
-Clone existing job object.
-
-+-----------------------+------------------------------------------+
-| Argument              |  Description                             |
-+=======================+==========================================+
-| ``-j``, ``--jid <x>`` | Clone job object with id ``<x>``         |
-+-----------------------+------------------------------------------+
-
-Example: ::
-
-  clone job --jid 1
-
-Start Command
--------------
-
-Start command will begin execution of an existing Sqoop job.
-
-Start Job Function
-~~~~~~~~~~~~~~~~~~
-
-Start a job (submit a new submission). Starting an already running job is considered an invalid operation.
-
-+----------------------------+----------------------------+
-| Argument                   |  Description               |
-+============================+============================+
-| ``-j``, ``--jid <x>``      | Start job with id ``<x>``  |
-+----------------------------+----------------------------+
-| ``-s``, ``--synchronous``  | Synchronous job execution  |
-+----------------------------+----------------------------+
-
-Example: ::
-
-  start job --jid 1
-  start job --jid 1 --synchronous
-
-Stop Command
-------------
-
-Stop command will interrupt a job execution.
-
-Stop Job Function
-~~~~~~~~~~~~~~~~~
-
-Interrupt running job.
-
-+-----------------------+------------------------------------------+
-| Argument              |  Description                             |
-+=======================+==========================================+
-| ``-j``, ``--jid <x>`` | Interrupt running job with id ``<x>``    |
-+-----------------------+------------------------------------------+
-
-Example: ::
-
-  stop job --jid 1
-
-Status Command
---------------
-
-Status command will retrieve the last status of a job.
-
-Status Job Function
-~~~~~~~~~~~~~~~~~~~
-
-Retrieve last status for given job.
-
-+-----------------------+------------------------------------------+
-| Argument              |  Description                             |
-+=======================+==========================================+
-| ``-j``, ``--jid <x>`` | Retrieve status for job with id ``<x>``  |
-+-----------------------+------------------------------------------+
-
-Example: ::
-
-  status job --jid 1
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/Connector-FTP.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/Connector-FTP.rst 
b/docs/src/site/sphinx/Connector-FTP.rst
deleted file mode 100644
index cc10d68..0000000
--- a/docs/src/site/sphinx/Connector-FTP.rst
+++ /dev/null
@@ -1,81 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-==================
-FTP Connector
-==================
-
-The FTP connector supports moving data between an FTP server and other supported Sqoop2 connectors.
-
-Currently only the TO direction is supported, for writing records to an FTP server. A FROM connector is pending (SQOOP-2127).
-
-.. contents::
-   :depth: 3
-
------
-Usage
------
-
-To use the FTP Connector, create a link for the connector and a job that uses the link.
-
-**Link Configuration**
-++++++++++++++++++++++
-
-Inputs associated with the link configuration include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Input                       | Type    | Description                                                           | Example                    |
-+=============================+=========+=======================================================================+============================+
-| FTP server hostname         | String  | Hostname for the FTP server.                                          | ftp.example.com            |
-|                             |         | *Required*.                                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| FTP server port             | Integer | Port number for the FTP server. Defaults to 21.                       | 2100                       |
-|                             |         | *Optional*.                                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Username                    | String  | The username to provide when connecting to the FTP server.            | sqoop                      |
-|                             |         | *Required*.                                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Password                    | String  | The password to provide when connecting to the FTP server.            | sqoop                      |
-|                             |         | *Required*                                                            |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-
-**Notes**
-=========
-
-1. The FTP connector will attempt to connect to the FTP server as part of the link validation process. If for some reason a connection cannot be established, you'll see a corresponding warning message.
-
-**TO Job Configuration**
-++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the TO direction include:
-
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Input                       | Type    | Description                                                             | Example                           |
-+=============================+=========+=========================================================================+===================================+
-| Output directory            | String  | The location on the FTP server that the connector will write files to.  | uploads                           |
-|                             |         | *Required*                                                              |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-
-**Notes**
-=========
-
-1. The *output directory* value needs to be an existing directory on the FTP server.
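-
-Putting the link and job configuration together, a session in the Sqoop shell might look like the following sketch; the ``<...>`` ids are placeholders for the values reported by your server: ::
-
-  create link -c <ftp-connector-id>
-  create job -f <from-link-id> -t <ftp-link-id>
-  start job -j <job-id>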
-
-------
-Loader
-------
-
-During the *loading* phase, the connector will create uniquely named files in the *output directory* for each partition of data received from the **FROM** connector.

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/Connector-GenericJDBC.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/Connector-GenericJDBC.rst 
b/docs/src/site/sphinx/Connector-GenericJDBC.rst
deleted file mode 100644
index 347547d..0000000
--- a/docs/src/site/sphinx/Connector-GenericJDBC.rst
+++ /dev/null
@@ -1,194 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-======================
-Generic JDBC Connector
-======================
-
-The Generic JDBC Connector can connect to any data source that adheres to the **JDBC 4** specification.
-
-.. contents::
-   :depth: 3
-
------
-Usage
------
-
-To use the Generic JDBC Connector, create a link for the connector and a job that uses the link.
-
-**Link Configuration**
-++++++++++++++++++++++
-
-Inputs associated with the link configuration include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
-| Input                       | Type    | Description                                                           | Example                                  |
-+=============================+=========+=======================================================================+==========================================+
-| JDBC Driver Class           | String  | The full class name of the JDBC driver.                               | com.mysql.jdbc.Driver                    |
-|                             |         | *Required* and accessible by the Sqoop server.                        |                                          |
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
-| JDBC Connection String      | String  | The JDBC connection string to use when connecting to the data source. | jdbc:mysql://localhost/test              |
-|                             |         | *Required*. Connectivity upon creation is optional.                   |                                          |
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
-| Username                    | String  | The username to provide when connecting to the data source.           | sqoop                                    |
-|                             |         | *Optional*. Connectivity upon creation is optional.                   |                                          |
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
-| Password                    | String  | The password to provide when connecting to the data source.           | sqoop                                    |
-|                             |         | *Optional*. Connectivity upon creation is optional.                   |                                          |
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
-| JDBC Connection Properties  | Map     | A map of JDBC connection properties to pass to the JDBC driver        | profileSQL=true&useFastDateParsing=false |
-|                             |         | *Optional*.                                                           |                                          |
-+-----------------------------+---------+-----------------------------------------------------------------------+------------------------------------------+
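-
-The connection string format is driver specific. For illustration only, a few common forms (host and database names are hypothetical): ::
-
-  jdbc:mysql://localhost/test
-  jdbc:postgresql://localhost/test
-  jdbc:oracle:thin:@localhost:1521:orcl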
-
-**FROM Job Configuration**
-++++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the FROM direction include:
-
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Input                       | Type    | Description                                                             | Example                                     |
-+=============================+=========+=========================================================================+=============================================+
-| Schema name                 | String  | The schema name the table is part of.                                   | sqoop                                       |
-|                             |         | *Optional*                                                              |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Table name                  | String  | The table name to import data from.                                     | test                                        |
-|                             |         | *Optional*. See note below.                                             |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Table SQL statement         | String  | The SQL statement used to perform a **free form query**.                | ``SELECT COUNT(*) FROM test ${CONDITIONS}`` |
-|                             |         | *Optional*. See notes below.                                            |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Table column names          | String  | Columns to extract from the JDBC data source.                           | col1,col2                                   |
-|                             |         | *Optional*. Comma separated list of columns.                            |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Partition column name       | String  | The column name used to partition the data transfer process.            | col1                                        |
-|                             |         | *Optional*. Defaults to table's first column of primary key.            |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Null value allowed for      | Boolean | True or false depending on whether NULL values are allowed in data      | true                                        |
-| the partition column        |         | of the Partition column. *Optional*.                                    |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-| Boundary query              | String  | The query used to define an upper and lower boundary when partitioning. |                                             |
-|                             |         | *Optional*.                                                             |                                             |
-+-----------------------------+---------+-------------------------------------------------------------------------+---------------------------------------------+
-
-**Notes**
-=========
-
-1. *Table name* and *Table SQL statement* are mutually exclusive. If *Table name* is provided, the *Table SQL statement* should not be provided. If *Table SQL statement* is provided then *Table name* should not be provided.
-2. *Table column names* should be provided only if *Table name* is provided.
-3. If there are columns with similar names, column aliases are required. For example: ``SELECT table1.id as "i", table2.id as "j" FROM table1 INNER JOIN table2 ON table1.id = table2.id``.
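-
-Combining notes 1 and 3, a free form query with aliased join columns might look like the following sketch (table names are hypothetical); the ``${CONDITIONS}`` placeholder is appended just as in the example above: ::
-
-  SELECT table1.id as "i", table2.id as "j" FROM table1 INNER JOIN table2 ON table1.id = table2.id ${CONDITIONS}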
-
-**TO Job Configuration**
-++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the TO direction include:
-
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Input                       | Type    | Description                                                             | Example                                         |
-+=============================+=========+=========================================================================+=================================================+
-| Schema name                 | String  | The schema name the table is part of.                                   | sqoop                                           |
-|                             |         | *Optional*                                                              |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Table name                  | String  | The table name to insert data into.                                     | test                                            |
-|                             |         | *Optional*. See note below.                                             |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Table SQL statement         | String  | The SQL statement used to perform a **free form query**.                | ``INSERT INTO test (col1, col2) VALUES (?, ?)`` |
-|                             |         | *Optional*. See note below.                                             |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Table column names          | String  | Columns to insert into the JDBC data source.                            | col1,col2                                       |
-|                             |         | *Optional*. Comma separated list of columns.                            |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Stage table name            | String  | The name of the table used as a *staging table*.                        | staging                                         |
-|                             |         | *Optional*.                                                             |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-| Should clear stage table    | Boolean | True or false depending on whether the staging table should be cleared  | true                                            |
-|                             |         | after the data transfer has finished. *Optional*.                       |                                                 |
-+-----------------------------+---------+-------------------------------------------------------------------------+-------------------------------------------------+
-
-**Notes**
-=========
-
-1. *Table name* and *Table SQL statement* are mutually exclusive: provide one or the other, never both.
-2. *Table column names* should be provided only if *Table name* is provided.
-
------------
-Partitioner
------------
-
-The Generic JDBC Connector partitioner generates the conditions used by the extractor.
-How it partitions the data transfer depends on the partition column's data type.
-Each strategy, however, roughly computes a partition stride of the following form:
-::
-
-  (upper boundary - lower boundary) / (max partitions)
-
-By default, the *primary key* will be used to partition the data unless otherwise specified.
-
-The following data types are currently supported:
-
-1. TINYINT
-2. SMALLINT
-3. INTEGER
-4. BIGINT
-5. REAL
-6. FLOAT
-7. DOUBLE
-8. NUMERIC
-9. DECIMAL
-10. BIT
-11. BOOLEAN
-12. DATE
-13. TIME
-14. TIMESTAMP
-15. CHAR
-16. VARCHAR
-17. LONGVARCHAR
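To make the stride formula concrete, here is a minimal Python sketch of the integer-range strategy. This is an illustration only, not the connector's actual Java implementation; the column name ``id`` and the rounding of boundaries are assumptions.

```python
def integer_partitions(lower, upper, max_partitions, column="id"):
    """Split [lower, upper] into at most max_partitions contiguous ranges,
    using the (upper boundary - lower boundary) / (max partitions) stride."""
    span = upper - lower
    count = min(max_partitions, span) or 1  # never more partitions than values
    stride = span / count
    conditions = []
    for i in range(count):
        lo = lower + round(i * stride)
        if i < count - 1:
            hi = lower + round((i + 1) * stride)
            conditions.append(f"{column} >= {lo} AND {column} < {hi}")
        else:
            # The last range is closed so the upper boundary is included.
            conditions.append(f"{column} >= {lo} AND {column} <= {upper}")
    return conditions
```

Each returned condition scopes one extractor to a disjoint slice of the partition column's value range.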
-
----------
-Extractor
----------
-
-During the *extraction* phase, the JDBC data source is queried using SQL. This SQL will vary based on your configuration.
-
-- If *Table name* is provided, then the generated SQL statement takes the form ``SELECT * FROM <table name>``.
-- If *Table name* and *Table column names* are provided, then the generated SQL statement takes the form ``SELECT <columns> FROM <table name>``.
-- If *Table SQL statement* is provided, then the provided SQL statement is used as is.
-
-The conditions generated by the *partitioner* are appended to the end of the SQL query so that each extractor reads only its own section of the data.
-
-The Generic JDBC Connector extracts CSV data usable by the *CSV Intermediate Data Format*.
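The selection rules above can be sketched in Python. This is a simplified illustration with hypothetical function and parameter names; the real connector's handling of free-form statements and condition substitution is more involved.

```python
def build_extract_query(table_name=None, columns=None, table_sql=None,
                        condition=None):
    """Assemble the extraction SQL per the rules above; `condition` is the
    partitioner-generated predicate that scopes the read to one partition."""
    if table_sql is not None:
        query = table_sql
    elif columns:
        query = f"SELECT {', '.join(columns)} FROM {table_name}"
    else:
        query = f"SELECT * FROM {table_name}"
    if condition:
        # The partitioner condition is appended to restrict the slice read.
        query += f" WHERE {condition}"
    return query
```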
-
-------
-Loader
-------
-
-During the *loading* phase, the JDBC data source is written to using SQL. This SQL will vary based on your configuration.
-
-- If *Table name* is provided, then the generated SQL statement takes the form ``INSERT INTO <table name> (col1, col2, ...) VALUES (?,?,..)``.
-- If *Table name* and *Table column names* are provided, then the generated SQL statement takes the form ``INSERT INTO <table name> (<columns>) VALUES (?,?,..)``.
-- If *Table SQL statement* is provided, then the provided SQL statement is used as is.
-
-This connector expects to receive CSV data consumable by the *CSV Intermediate Data Format*.
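The INSERT generation above can be sketched as follows (an illustration with hypothetical names, not the connector's code; one ``?`` placeholder is emitted per column so values are bound rather than inlined):

```python
def build_insert_statement(table, columns=None, column_count=None):
    """Build the parameterized INSERT described above: one '?' per column.
    Without an explicit column list, the connector would read the column
    count from table metadata; here it is passed in for simplicity."""
    n = len(columns) if columns else column_count
    placeholders = ", ".join(["?"] * n)
    if columns:
        return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
    return f"INSERT INTO {table} VALUES ({placeholders})"
```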
-
-----------
-Destroyers
-----------
-
-In the TO direction, the Generic JDBC Connector destroyer performs two operations:
-
-1. Copy the contents of the staging table to the desired table.
-2. Clear the staging table.
-
-No operations are performed in the FROM direction.
\ No newline at end of file
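The two destroyer operations can be expressed as plain SQL, sketched here in Python. The statement shapes are illustrative assumptions, not the connector's literal SQL.

```python
def staging_destroyer_statements(stage_table, target_table):
    """The two TO-direction destroyer steps expressed as SQL:
    copy the staged rows into the target table, then clear the stage."""
    return [
        f"INSERT INTO {target_table} SELECT * FROM {stage_table}",
        f"DELETE FROM {stage_table}",
    ]
```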

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/Connector-HDFS.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/Connector-HDFS.rst 
b/docs/src/site/sphinx/Connector-HDFS.rst
deleted file mode 100644
index c44b1b6..0000000
--- a/docs/src/site/sphinx/Connector-HDFS.rst
+++ /dev/null
@@ -1,159 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-==============
-HDFS Connector
-==============
-
-.. contents::
-   :depth: 3
-
------
-Usage
------
-
-To use the HDFS Connector, create a link for the connector and a job that uses the link.
-
-**Link Configuration**
-++++++++++++++++++++++
-
-Inputs associated with the link configuration include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Input                       | Type    | Description                                                           | Example                    |
-+=============================+=========+=======================================================================+============================+
-| URI                         | String  | The URI of the HDFS File System.                                      | hdfs://example.com:8020/   |
-|                             |         | *Optional*. See note below.                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Configuration directory     | String  | Path to the cluster's configuration directory.                        | /etc/conf/hadoop           |
-|                             |         | *Optional*.                                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-
-**Notes**
-=========
-
-1. The specified URI will override the declared URI in your configuration.
-
-**FROM Job Configuration**
-++++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the FROM direction include:
-
-+-----------------------------+---------+-------------------------------------------------------------------------+------------------+
-| Input                       | Type    | Description                                                             | Example          |
-+=============================+=========+=========================================================================+==================+
-| Input directory             | String  | The location in HDFS that the connector should look for files in.       | /tmp/sqoop2/hdfs |
-|                             |         | *Required*. See note below.                                             |                  |
-+-----------------------------+---------+-------------------------------------------------------------------------+------------------+
-| Null value                  | String  | The value of NULL in the contents of each file extracted.               | \N               |
-|                             |         | *Optional*. See note below.                                             |                  |
-+-----------------------------+---------+-------------------------------------------------------------------------+------------------+
-| Override null value         | Boolean | Tells the connector to replace the specified NULL value.                | true             |
-|                             |         | *Optional*. See note below.                                             |                  |
-+-----------------------------+---------+-------------------------------------------------------------------------+------------------+
-
-**Notes**
-=========
-
-1. All files in *Input directory* will be extracted.
-2. *Null value* and *override null value* should be used in conjunction. If *override null value* is not set to true, then *null value* will not be used when extracting data.
-
-**TO Job Configuration**
-++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the TO direction include:
-
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Input                       | Type    | Description                                                             | Example                           |
-+=============================+=========+=========================================================================+===================================+
-| Output directory            | String  | The location in HDFS that the connector will load files to.             | /tmp/sqoop2/hdfs                  |
-|                             |         | *Optional*                                                              |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Output format               | Enum    | The format to output data to.                                           | CSV                               |
-|                             |         | *Optional*. See note below.                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Compression                 | Enum    | Compression class.                                                      | GZIP                              |
-|                             |         | *Optional*. See note below.                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Custom compression          | String  | Custom compression class.                                               | org.apache.sqoop.SqoopCompression |
-|                             |         | *Optional*.                                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Null value                  | String  | The value of NULL in the contents of each file loaded.                  | \N                                |
-|                             |         | *Optional*. See note below.                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Override null value         | Boolean | Tells the connector to replace the specified NULL value.                | true                              |
-|                             |         | *Optional*. See note below.                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-| Append mode                 | Boolean | Append to an existing output directory.                                 | true                              |
-|                             |         | *Optional*.                                                             |                                   |
-+-----------------------------+---------+-------------------------------------------------------------------------+-----------------------------------+
-
-**Notes**
-=========
-
-1. *Output format* only supports CSV at the moment.
-2. *Compression* supports all Hadoop compression classes.
-3. *Null value* and *override null value* should be used in conjunction. If *override null value* is not set to true, then *null value* will not be used when loading data.
-
------------
-Partitioner
------------
-
-The HDFS Connector partitioner partitions based on the total number of blocks across all files in the specified input directory.
-The connector attempts to place blocks in splits based on the *node* and *rack* they reside on.
-
----------
-Extractor
----------
-
-During the *extraction* phase, the FileSystem API is used to query files from HDFS. The HDFS cluster used is the one defined by:
-
-1. The HDFS URI in the link configuration
-2. The Hadoop configuration in the link configuration
-3. The Hadoop configuration used by the execution framework
-
-The format of the data must be CSV. The NULL value in the CSV can be chosen via *null value*. For example::
-
-    1,\N
-    2,null
-    3,NULL
-
-In the above example, if *null value* is set to \N, then only the first row's value will be interpreted as NULL.
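The null-substitution rule can be sketched as follows (hypothetical helper names, not HDFS connector code): the configured token maps to a real null only when *override null value* is enabled.

```python
def parse_field(raw, null_value=None, override_null_value=False):
    """Interpret one CSV field per the rules above: the configured token
    maps to None only when 'override null value' is enabled."""
    if override_null_value and raw == null_value:
        return None
    return raw

# Apply the rule to the example rows above with null value set to \N.
rows = [line.split(",") for line in ["1,\\N", "2,null", "3,NULL"]]
parsed = [[parse_field(f, null_value="\\N", override_null_value=True) for f in row]
          for row in rows]
# Only the first row's second field is recognized as NULL.
```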
-
-------
-Loader
-------
-
-During the *loading* phase, HDFS is written to via the FileSystem API. The number of files created is equal to the number of loaders that run. The format of the data currently can only be CSV. The NULL value in the CSV can be chosen via *null value*. For example:
-
-+--------------+-------+
-| Id           | Value |
-+==============+=======+
-| 1            | NULL  |
-+--------------+-------+
-| 2            | value |
-+--------------+-------+
-
-If *null value* is set to \N, then this is how the data will look in HDFS::
-
-    1,\N
-    2,value
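The write-side counterpart of the rule can be sketched as (an illustrative helper, not connector code):

```python
def format_row(values, null_value="NULL"):
    """Render one record as a CSV line; None becomes the configured token."""
    return ",".join(null_value if v is None else str(v) for v in values)
```

With *null value* set to ``\N``, the two rows of the table above render exactly as shown.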
-
-----------
-Destroyers
-----------
-
-The HDFS TO destroyer moves all created files to the proper output directory.

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/Connector-Kafka.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/Connector-Kafka.rst 
b/docs/src/site/sphinx/Connector-Kafka.rst
deleted file mode 100644
index b6bca14..0000000
--- a/docs/src/site/sphinx/Connector-Kafka.rst
+++ /dev/null
@@ -1,64 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-===============
-Kafka Connector
-===============
-
-Currently, only the TO direction is supported.
-
-.. contents::
-   :depth: 3
-
------
-Usage
------
-
-To use the Kafka Connector, create a link for the connector and a job that uses the link.
-
-**Link Configuration**
-++++++++++++++++++++++
-
-Inputs associated with the link configuration include:
-
-+----------------------+---------+-----------------------------------------------------------+-------------------------------------+
-| Input                | Type    | Description                                               | Example                             |
-+======================+=========+===========================================================+=====================================+
-| Broker list          | String  | Comma separated list of Kafka brokers.                    | example.com:10000,example.com:11000 |
-|                      |         | *Required*.                                               |                                     |
-+----------------------+---------+-----------------------------------------------------------+-------------------------------------+
-| Zookeeper connection | String  | Comma separated list of ZooKeeper servers in your quorum. | example.com:2181,example.com:2182   |
-|                      |         | *Required*.                                               |                                     |
-+----------------------+---------+-----------------------------------------------------------+-------------------------------------+
-
-**TO Job Configuration**
-++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the TO direction include:
-
-+-------+---------+---------------------------------+----------+
-| Input | Type    | Description                     | Example  |
-+=======+=========+=================================+==========+
-| topic | String  | The Kafka topic to transfer to. | my_topic |
-|       |         | *Required*.                     |          |
-+-------+---------+---------------------------------+----------+
-
-------
-Loader
-------
-
-During the *loading* phase, Kafka is written to directly from each loader. The order in which data is loaded into Kafka is not guaranteed.
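A rough sketch of that loading behavior, where ``RecordingProducer`` is a hypothetical stand-in for a real Kafka producer client:

```python
class RecordingProducer:
    """Stand-in for a real Kafka producer client; it only records sends."""
    def __init__(self):
        self.sent = []

    def send(self, topic, value):
        self.sent.append((topic, value))

def load_partition(records, producer, topic):
    """Each loader pushes its own records straight to the topic; with many
    loaders running in parallel, no global ordering is guaranteed."""
    for record in records:
        producer.send(topic, record.encode("utf-8"))
```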
-

http://git-wip-us.apache.org/repos/asf/sqoop/blob/3613843a/docs/src/site/sphinx/Connector-Kite.rst
----------------------------------------------------------------------
diff --git a/docs/src/site/sphinx/Connector-Kite.rst 
b/docs/src/site/sphinx/Connector-Kite.rst
deleted file mode 100644
index 414ad8a..0000000
--- a/docs/src/site/sphinx/Connector-Kite.rst
+++ /dev/null
@@ -1,110 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one or more
-   contributor license agreements.  See the NOTICE file distributed with
-   this work for additional information regarding copyright ownership.
-   The ASF licenses this file to You under the Apache License, Version 2.0
-   (the "License"); you may not use this file except in compliance with
-   the License.  You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-
-==============
-Kite Connector
-==============
-
-.. contents::
-   :depth: 3
-
------
-Usage
------
-
-To use the Kite Connector, create a link for the connector and a job that uses the link. For more information on Kite, check out the Kite documentation: http://kitesdk.org/docs/1.0.0/Kite-SDK-Guide.html.
-
-**Link Configuration**
-++++++++++++++++++++++
-
-Inputs associated with the link configuration include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Input                       | Type    | Description                                                           | Example                    |
-+=============================+=========+=======================================================================+============================+
-| authority                   | String  | The authority of the Kite dataset.                                    | hdfs://example.com:8020/   |
-|                             |         | *Optional*. See note below.                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-
-**Notes**
-=========
-
-1. The authority is useful for specifying the Hive metastore or HDFS URI.
-
-**FROM Job Configuration**
-++++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the FROM direction include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Input                       | Type    | Description                                                           | Example                    |
-+=============================+=========+=======================================================================+============================+
-| URI                         | String  | The Kite dataset URI to use.                                          | dataset:hdfs:/tmp/ns/ds    |
-|                             |         | *Required*. See notes below.                                          |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-
-**Notes**
-=========
-
-1. The URI and the authority from the link configuration will be merged to create a complete dataset URI internally. If the given dataset URI contains an authority, the authority from the link configuration will be ignored.
-2. Only *hdfs* and *hive* are supported currently.
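The merge rule in note 1 can be sketched as follows. This is an illustration only: the ``authority`` value is simplified to ``host:port`` here, and only ``hdfs`` dataset URIs are handled.

```python
def merge_dataset_uri(uri, authority=None):
    """Apply the merge rule above: an authority already present in the
    dataset URI wins; otherwise the link's authority is spliced in."""
    scheme, sep, rest = uri.partition("hdfs:")
    if not sep:
        raise ValueError("this sketch only handles hdfs dataset URIs")
    if rest.startswith("//") or not authority:
        return uri  # URI already carries an authority, or nothing to merge
    return f"{scheme}hdfs://{authority}{rest}"
```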
-
-**TO Job Configuration**
-++++++++++++++++++++++++
-
-Inputs associated with the Job configuration for the TO direction include:
-
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| Input                       | Type    | Description                                                           | Example                    |
-+=============================+=========+=======================================================================+============================+
-| URI                         | String  | The Kite dataset URI to use.                                          | dataset:hdfs:/tmp/ns/ds    |
-|                             |         | *Required*. See note below.                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-| File format                 | Enum    | The format of the data the Kite dataset should write out.             | PARQUET                    |
-|                             |         | *Optional*. See note below.                                           |                            |
-+-----------------------------+---------+-----------------------------------------------------------------------+----------------------------+
-
-**Notes**
-=========
-
-1. The URI and the authority from the link configuration will be merged to create a complete dataset URI internally. If the given dataset URI contains an authority, the authority from the link configuration will be ignored.
-2. Only *hdfs* and *hive* are supported currently.
-
------------
-Partitioner
------------
-
-The Kite connector currently creates only one partition.
-
----------
-Extractor
----------
-
-During the *extraction* phase, Kite is used to query a dataset. Since there is only one dataset to query, only a single reader is created to read the dataset.
-
-**NOTE**: The Avro schema Kite generates will be slightly different from the original schema, because Avro identifiers have strict naming requirements.
-
-------
-Loader
-------
-
-During the *loading* phase, Kite is used to write several temporary datasets. The number of temporary datasets is equal to the number of *loaders* being used.
-
-----------
-Destroyers
-----------
-
-The Kite connector TO destroyer merges all the temporary datasets into a single dataset.
\ No newline at end of file
