[
https://issues.apache.org/jira/browse/SQOOP-2201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14364470#comment-14364470
]
Sqoop QA bot commented on SQOOP-2201:
-------------------------------------
Testing file
[SQOOP-2201.patch|https://issues.apache.org/jira/secure/attachment/12704976/SQOOP-2201.patch]
against branch sqoop2 took 0:44:50.548976.
{color:red}Overall:{color} -1 due to 5 errors
{color:red}ERROR:{color} Some of the fast integration tests failed
([report|https://builds.apache.org/job/PreCommit-SQOOP-Build/1168/artifact/patch-process/test_integration_fast.txt],
executed 23 tests)
{color:red}ERROR:{color} Failed fast integration test:
{{org.apache.sqoop.integration.repository.derby.upgrade.Derby1_99_4UpgradeTest}}
{color:red}ERROR:{color} Failed fast integration test:
{{org.apache.sqoop.integration.server.SubmissionWithDisabledModelObjectsTest}}
{color:red}ERROR:{color} Failed fast integration test:
{{org.apache.sqoop.integration.repository.derby.upgrade.Derby1_99_3UpgradeTest}}
{color:red}ERROR:{color} Failed fast integration test:
{{org.apache.sqoop.integration.connector.jdbc.generic.IncrementalReadTest}}
{color:green}SUCCESS:{color} Clean was successful
{color:green}SUCCESS:{color} Patch applied correctly
{color:green}SUCCESS:{color} Patch adds/modifies test cases
{color:green}SUCCESS:{color} License check passed
{color:green}SUCCESS:{color} Patch compiled
{color:green}SUCCESS:{color} All fast unit tests passed (executed 272 tests)
{color:green}SUCCESS:{color} All slow unit tests passed (executed 1 test)
{color:green}SUCCESS:{color} All slow integration tests passed (executed 0 tests)
Console output is available
[here|https://builds.apache.org/job/PreCommit-SQOOP-Build/1168/console].
This message is automatically generated.
> Sqoop2: Add possibility to read Hadoop configuration files to HDFS connector
> ----------------------------------------------------------------------------
>
> Key: SQOOP-2201
> URL: https://issues.apache.org/jira/browse/SQOOP-2201
> Project: Sqoop
> Issue Type: Bug
> Affects Versions: 1.99.5
> Reporter: Jarek Jarcec Cecho
> Assignee: Jarek Jarcec Cecho
> Fix For: 1.99.6
>
> Attachments: SQOOP-2201.patch
>
>
> Currently the HDFS connector does not explicitly read Hadoop configuration
> files. During the
> [Initialization|https://github.com/apache/sqoop/blob/sqoop2/connector/connector-hdfs/src/main/java/org/apache/sqoop/connector/hdfs/HdfsToInitializer.java]
> phase it doesn't do anything, so the configuration files are not needed there.
> During other parts of the workflow, we're [explicitly
> casting|https://github.com/apache/sqoop/blob/sqoop2/connector/connector-hdfs/src/main/java/org/apache/sqoop/connector/hdfs/HdfsExtractor.java#L61]
> the general {{Context}} object to Hadoop {{Configuration}}.
> This is unfortunate because:
> * It couples the HDFS connector to the MapReduce execution engine, and will
> break once a non-MapReduce-based execution engine is added.
> * We can't do any HDFS-specific checks in the {{Initializer}}, as the Hadoop
> {{Configuration}} object is not available there.
> As a result, I would like to propose breaking this coupling between the HDFS
> connector and the MapReduce execution engine by adding a configuration option
> to the HDFS Link that specifies the directory from which we should read the
> appropriate Hadoop configuration files (with a reasonable default such as
> {{/etc/conf/hadoop}}).
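The proposed link option could be wired up roughly as sketched below. This is only an illustrative sketch, not the actual patch: the class and method names are hypothetical, and a real implementation would pass the discovered files to {{org.apache.hadoop.conf.Configuration#addResource()}} inside the connector's {{Initializer}}.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch of resolving the Hadoop configuration directory for
 * the HDFS Link: use the link-configured directory when set, otherwise fall
 * back to the default, then collect the *-site.xml files an Initializer
 * could feed into a Hadoop Configuration via addResource().
 */
public class HadoopConfDirResolver {
    // Default taken from the proposal in this issue
    static final String DEFAULT_CONF_DIR = "/etc/conf/hadoop";

    /** Prefer the directory configured on the link; fall back to the default. */
    static String resolveConfDir(String linkConfiguredDir) {
        if (linkConfiguredDir == null || linkConfiguredDir.trim().isEmpty()) {
            return DEFAULT_CONF_DIR;
        }
        return linkConfiguredDir;
    }

    /** Collect whichever Hadoop site files actually exist in the directory. */
    static List<Path> findSiteFiles(String confDir) {
        List<Path> found = new ArrayList<>();
        for (String name : new String[] {"core-site.xml", "hdfs-site.xml"}) {
            Path candidate = Paths.get(confDir, name);
            if (Files.isRegularFile(candidate)) {
                found.add(candidate);
            }
        }
        return found;
    }

    public static void main(String[] args) {
        System.out.println(resolveConfDir(null));        // default directory
        System.out.println(resolveConfDir("/tmp/conf")); // explicit directory wins
    }
}
```

Because the directory is resolved from link configuration rather than by casting the {{Context}} to a MapReduce-provided {{Configuration}}, the same code could run under any execution engine.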
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)