[jira] [Commented] (SQOOP-3462) Sqoop ant build fails due to outdated maven repo URLs

2020-01-25 Thread Hudson (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3462?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17023648#comment-17023648
 ] 

Hudson commented on SQOOP-3462:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1263 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1263/])
SQOOP-3462: Sqoop ant build fails due to outdated maven repo URLs (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=a9f051c5af31a8e70be1e3c2684dc351e4311243])
* (edit) build.xml
* (edit) ivy/ivysettings.xml


> Sqoop ant build fails due to outdated maven repo URLs
> -
>
> Key: SQOOP-3462
> URL: https://issues.apache.org/jira/browse/SQOOP-3462
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Reporter: Istvan Toth
>Assignee: Istvan Toth
>Priority: Critical
> Fix For: 1.5.0, 3.0.0
>
> Attachments: SQOOP-3462.v1.patch
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> Sqoop can no longer be built with ant, as the maven central repos no longer 
> support HTTP.
> {{[ivy:resolve] SERVER ERROR: HTTPS Required 
> url=http://repo1.maven.org/maven2/org/apache/avro/avro/1.8.1/avro-1.8.1.pom}}
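> The fix referenced in the commit above amounts to pointing the Ant/Ivy build 
> at the HTTPS endpoints; an illustrative before/after (the exact resolver 
> entries live in build.xml and ivy/ivysettings.xml):
> {noformat}
> # rejected with "SERVER ERROR: HTTPS Required"
> http://repo1.maven.org/maven2/
> # accepted
> https://repo1.maven.org/maven2/
> {noformat}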



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (SQOOP-3428) Fix the CI

2019-10-16 Thread Hudson (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16952847#comment-16952847
 ] 

Hudson commented on SQOOP-3428:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1261 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1261/])
SQOOP-3428: Try to fix the CI (#85) (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=875e431af15c514a6c3dbf30e3afefdf640c66f9])
* (edit) build.gradle
* (edit) gradle.properties
* (delete) src/scripts/thirdpartytest/docker-compose/db2scripts/db2entrypoint.sh
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
* (edit) src/test/org/apache/sqoop/metastore/JobToolTestBase.java


> Fix the CI
> --
>
> Key: SQOOP-3428
> URL: https://issues.apache.org/jira/browse/SQOOP-3428
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> Currently, the CI is broken because the Oracle 11 XE Docker image isn't 
> available anymore.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (SQOOP-3441) Prepare Sqoop for Java 11 support

2019-06-12 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3441?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862096#comment-16862096
 ] 

Hudson commented on SQOOP-3441:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1260 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1260/])
SQOOP-3441: Prepare Sqoop for Java 11 support (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=cee4ab15c729e213e622f8f91674a480aca6bd49])
* (edit) gradle/wrapper/gradle-wrapper.properties
* (edit) gradlew.bat
* (edit) src/test/org/apache/sqoop/accumulo/AccumuloTestCase.java
* (edit) COMPILING.adoc
* (edit) gradlew
* (edit) src/test/org/apache/sqoop/TestSqoopOptions.java
* (edit) gradle/wrapper/gradle-wrapper.jar


> Prepare Sqoop for Java 11 support
> -
>
> Key: SQOOP-3441
> URL: https://issues.apache.org/jira/browse/SQOOP-3441
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> A couple of code changes will be required for Sqoop to work with Java 11, 
> and we'll also have to bump a couple of dependencies and the Gradle version.
> I'm not sure what's required for Ant; that is to be figured out in a separate 
> Jira, if we keep the Ant build.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3438) Sqoop Import with create hcatalog table for ORC will not work with Hive3 as the table created would be a ACID table and transactional

2019-06-05 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3438?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16856532#comment-16856532
 ] 

Hudson commented on SQOOP-3438:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1259 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1259/])
SQOOP-3438: Sqoop Import with create hcatalog table for ORC will not (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=be260e3096a6a2710e661c7fe33f6b402ce66476])
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/java/org/apache/sqoop/mapreduce/hcat/SqoopHCatUtilities.java
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (edit) src/test/org/apache/sqoop/hcat/HCatalogTestUtils.java
* (edit) src/test/org/apache/sqoop/hcat/HCatalogImportTest.java
* (edit) src/docs/user/hcatalog.txt


> Sqoop Import with create hcatalog table for ORC will not work with Hive3 as 
> the table created would be a ACID table and transactional
> -
>
> Key: SQOOP-3438
> URL: https://issues.apache.org/jira/browse/SQOOP-3438
> Project: Sqoop
>  Issue Type: Improvement
>  Components: hive-integration
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Critical
> Fix For: 1.5.0
>
>  Time Spent: 3h 20m
>  Remaining Estimate: 0h
>
> PROBLEM: Running a sqoop import command with the option 
> --create-hcatalog-table does not work on Hive 3, for the following reasons:
> When --create-hcatalog-table is used, the table is created as a managed ACID 
> table, and HCatalog does not support transactional or bucketed tables.
> So customers who need an ORC-based table cannot use Sqoop to create it, which 
> means any existing code that relies on Sqoop to create these tables will fail.
> The current workaround is a two-step process:
> 1. Create the ORC table in Hive with the EXTERNAL keyword and transactional 
> set to false.
> 2. Then use the sqoop command to load the data into the ORC table.
> The request is to add an extra argument to the sqoop command line to specify 
> that the table is external (for example: --hcatalog-external-table), so we 
> can use the option --hcatalog-storage-stanza "stored as orc tblproperties 
> (\"transactional\"=\"false\")".
> 
> Thank you [~mbalakrishnan] for your findings. This ticket is created based on 
> your work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3423) Let user pass password to connect Hive when it set to LDAP authentication

2019-05-06 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16833863#comment-16833863
 ] 

Hudson commented on SQOOP-3423:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1258 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1258/])
SQOOP-3423: Let user pass password to connect Hive when it set to LDAP (ebogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=dfeb14534ff0c57ebe27625aa9c79e37b4e12558])
* (edit) 
src/test/org/apache/sqoop/hive/HiveServer2ConnectionFactoryInitializerTest.java
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (edit) src/docs/user/hive.txt
* (edit) src/docs/user/hive-args.txt
* (edit) 
src/java/org/apache/sqoop/hive/HiveServer2ConnectionFactoryInitializer.java


> Let user pass password to connect Hive when it set to LDAP authentication
> -
>
> Key: SQOOP-3423
> URL: https://issues.apache.org/jira/browse/SQOOP-3423
> Project: Sqoop
>  Issue Type: Improvement
>  Components: hive-integration
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3423-001.patch
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> If HiveServer2 uses password-based authentication, additional 
> username/password information has to be provided to be able to connect to it.
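> A hedged illustration of the kind of invocation this change enables 
> (--hs2-url and --hs2-user are pre-existing options; the new password option 
> is deliberately not spelled out here, see the committed hive-args.txt and 
> hive.txt for its exact name):
> {noformat}
> sqoop import --connect jdbc:mysql://localhost/db --username root --table employee \
>   --hive-import --hs2-url "jdbc:hive2://hs2-host:10000/default" --hs2-user hiveuser
> # ...plus the password option introduced by this change
> {noformat}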



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3134) --class-name should override default Avro schema name

2019-04-05 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16810898#comment-16810898
 ] 

Hudson commented on SQOOP-3134:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1257 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1257/])
SQOOP-3134: --class-name should override default Avro schema name (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=1a04d2007d7e4e111d9cde9efbe3485547075fc6])
* (edit) src/docs/user/import.txt
* (edit) src/test/org/apache/sqoop/TestAvroImport.java
* (edit) src/java/org/apache/sqoop/orm/AvroSchemaGenerator.java


> --class-name should override default Avro schema name
> -
>
> Key: SQOOP-3134
> URL: https://issues.apache.org/jira/browse/SQOOP-3134
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Markus Kemper
>Assignee: Daniel Voros
>Priority: Major
> Fix For: 1.5.0, 3.0.0
>
> Attachments: SQOOP-3134.1.patch
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Please consider adding an option to configure the name of the Avro schema 
> output file created by Sqoop (import + --as-avrodatafile); example cases 
> below.
> {noformat}
> #
> # STEP 01 - Create Data
> #
> export MYCONN=jdbc:mysql://mysql.cloudera.com:3306/db_coe
> export MYUSER=sqoop
> export MYPSWD=cloudera
> sqoop list-tables --connect $MYCONN --username $MYUSER --password $MYPSWD
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "drop table t1"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "create table t1 (c1 int, c2 date, c3 varchar(10))"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "insert into t1 values (1, current_date, 'some data')"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "select * from t1"
> -
> | c1  | c2 | c3 | 
> -
> | 1   | 2017-02-13 | some data  | 
> -
> #
> # STEP 02 - Import + --table + --as-avrodatafile
> #
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table 
> t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1 
> --as-avrodatafile 
> ls -l ./*
> Output:
> 17/02/13 12:14:52 INFO mapreduce.ImportJobBase: Transferred 413 bytes in 
> 20.6988 seconds (19.9529 bytes/sec)
> 17/02/13 12:14:52 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> 
> -rw-r--r-- 1 root root   492 Feb 13 12:14 ./t1.avsc < want option to 
> configure this file name
> -rw-r--r-- 1 root root 12462 Feb 13 12:14 ./t1.java
> #
> # STEP 03 - Import + --query + --as-avrodatafile
> #
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "select * from t1 where \$CONDITIONS" --split-by c1 --target-dir 
> /user/root/t1 --delete-target-dir --num-mappers 1 --as-avrodatafile 
> ls -l ./*
> Output:
> 17/02/13 12:16:58 INFO mapreduce.ImportJobBase: Transferred 448 bytes in 
> 25.2757 seconds (17.7245 bytes/sec)
> 17/02/13 12:16:58 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~
> -rw-r--r-- 1 root root   527 Feb 13 12:16 ./AutoGeneratedSchema.avsc < 
> want option to configure this file name
> -rw-r--r-- 1 root root 12590 Feb 13 12:16 ./QueryResult.java
> {noformat}
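> A hedged sketch of the requested behaviour: re-running STEP 03 with the 
> existing --class-name option, which this ticket proposes should also drive 
> the schema file name (MyRecord is an illustrative class name):
> {noformat}
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD \
>   --query "select * from t1 where \$CONDITIONS" --split-by c1 \
>   --target-dir /user/root/t1 --delete-target-dir --num-mappers 1 \
>   --as-avrodatafile --class-name MyRecord
> ls -l ./*
> Expected output (instead of AutoGeneratedSchema.avsc / QueryResult.java):
> -rw-r--r-- 1 root root ... ./MyRecord.avsc
> -rw-r--r-- 1 root root ... ./MyRecord.java
> {noformat}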



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3435) Avoid NullPointerException due to different JSONObject library in classpath

2019-04-03 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16808512#comment-16808512
 ] 

Hudson commented on SQOOP-3435:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1256 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1256/])
SQOOP-3435: Avoid NullPointerException due to different JSONObject (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=e90e244396ecffeb332633c89e641357adcf8eda])
* (edit) src/java/org/apache/sqoop/util/SqoopJsonUtil.java


> Avoid NullPointerException due to different JSONObject library in classpath
> ---
>
> Key: SQOOP-3435
> URL: https://issues.apache.org/jira/browse/SQOOP-3435
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7, 1.5.0
>Reporter: Tak Lon (Stephen) Wu
>Assignee: Tak Lon (Stephen) Wu
>Priority: Major
> Fix For: 1.4.7, 1.5.0
>
> Attachments: SQOOP-3435.trunk.001.patch, SQOOP-3435.trunk.002.patch
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> The line at 
> [SqoopOptions.java#L785|https://github.com/apache/sqoop/blob/branch-1.4.7/src/java/org/apache/sqoop/SqoopOptions.java#L785],
> {{SqoopJsonUtil.getJsonStringforMap((Map) f.get(this)));}},
> should include a null check like the one at 
> [SqoopOptions.java#L778|https://github.com/apache/sqoop/blob/branch-1.4.7/src/java/org/apache/sqoop/SqoopOptions.java#L778],
> which has
> {{f.get(this) == null ? "null" : f.get(this).toString()}}
> Please see the stacktrace below when running command:
> {{sqoop job --create myjob -- import --connect jdbc:mysql://localhost/db 
> --username root --table employee --m 1}}
> {code:java}
> 19/02/02 01:09:21 ERROR sqoop.Sqoop: Got exception running Sqoop: 
> java.lang.NullPointerException
> java.lang.NullPointerException
> at org.json.JSONObject.<init>(JSONObject.java:144)
> at 
> org.apache.sqoop.util.SqoopJsonUtil.getJsonStringforMap(SqoopJsonUtil.java:43)
> at org.apache.sqoop.SqoopOptions.writeProperties(SqoopOptions.java:785)
> at 
> org.apache.sqoop.metastore.hsqldb.HsqldbJobStorage.createInternal(HsqldbJobStorage.java:399)
> at 
> org.apache.sqoop.metastore.hsqldb.HsqldbJobStorage.create(HsqldbJobStorage.java:379)
> at org.apache.sqoop.tool.JobTool.createJob(JobTool.java:181)
> at org.apache.sqoop.tool.JobTool.run(JobTool.java:294)
> at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> {code}
> The above {{NullPointerException}} is due to the use of 
> [{{com.tdunning}}|https://github.com/tdunning/open-json/blob/rc1.8/src/main/java/org/json/JSONObject.java#L141-L155]
>  (pulled in with the Hive libs when one reuses the {{HADOOP_CLASSPATH}}) in the 
> classpath, but I think we would be better off adding a {{null}} check in 
> {{SqoopJsonUtil.getJsonStringforMap(Map map)}} before calling 
> {{JSONObject pathPartMap = new JSONObject(map);}}
> I am reporting this bug; the right behavior needs to be decided by the assignee.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3427) Add Travis badge to the Readme

2019-03-18 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3427?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16794895#comment-16794895
 ] 

Hudson commented on SQOOP-3427:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1255 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1255/])
SQOOP-3427: Add Travis badge to the Readme (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=0216f7fbb7b71f610d02dfa95dafb2c81acbc690])
* (add) README.md
* (add) COMPILING.adoc
* (delete) README.txt
* (delete) COMPILING.txt


> Add Travis badge to the Readme
> --
>
> Key: SQOOP-3427
> URL: https://issues.apache.org/jira/browse/SQOOP-3427
> Project: Sqoop
>  Issue Type: Improvement
>  Components: docs
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> We love badges



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3429) Bump Hadoop to 2.9.2

2019-03-13 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3429?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791869#comment-16791869
 ] 

Hudson commented on SQOOP-3429:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1254 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1254/])
SQOOP-3429: Bump Hadoop to 2.9.2 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=3007648c0801c0ef2ad3ef1eb9e3b8681f1c2e02])
* (edit) ivy/libraries.properties
* (edit) gradle.properties


> Bump Hadoop to 2.9.2
> 
>
> Key: SQOOP-3429
> URL: https://issues.apache.org/jira/browse/SQOOP-3429
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
> Fix For: 1.5.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> I would like to bump Sqoop to Hadoop 2.9.2.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3430) Remove the old maven pom

2019-03-13 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3430?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791577#comment-16791577
 ] 

Hudson commented on SQOOP-3430:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1253 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1253/])
SQOOP-3430: Remove the old maven pom (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=279f84c53db93c47845f6be7124774b1f5e40c76])
* (delete) pom-old.xml


> Remove the old maven pom
> 
>
> Key: SQOOP-3430
> URL: https://issues.apache.org/jira/browse/SQOOP-3430
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
> Fix For: 1.5.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> I think both ant and gradle are enough :)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3428) Remove the old Maven pom

2019-03-08 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16788003#comment-16788003
 ] 

Hudson commented on SQOOP-3428:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1252 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1252/])
SQOOP-3428: Fix the CI (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=20fe120706039adbf7e58985f93b2020f688abc0])
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
* (edit) 
src/scripts/thirdpartytest/docker-compose/oraclescripts/startup/oracleusersetup.sql
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleCompatTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleManagerTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/util/OracleUtils.java
* (edit) src/test/org/apache/sqoop/testutil/ManagerCompatTestCase.java


> Remove the old Maven pom
> 
>
> Key: SQOOP-3428
> URL: https://issues.apache.org/jira/browse/SQOOP-3428
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fokko Driesprong
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> I think both ant and gradle are enough :-)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3420) Invalid ERROR message initiates false alarms

2019-02-01 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3420?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16758476#comment-16758476
 ] 

Hudson commented on SQOOP-3420:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1251 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1251/])
SQOOP-3420: Invalid ERROR message initiates false alarms (ferroriuss: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=5ab5190304d0e13d1c5eaa56d1f486f3106529d3])
* (edit) src/java/org/apache/sqoop/orm/CompilationManager.java


> Invalid ERROR message initiates false alarms
> 
>
> Key: SQOOP-3420
> URL: https://issues.apache.org/jira/browse/SQOOP-3420
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Critical
>  Labels: usability
> Attachments: SQOOP-3420_001.patch
>
>
> In SQOOP-3042, a debug message was raised to error level, which causes false 
> alarms in customers' log analysers. After understanding the functionality, it 
> is recommended to log at info level instead of error when ImportTool is 
> unable to back up the generated .java file.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3237) Mainframe FTP transfer option to insert custom FTP commands prior to transfer

2018-12-13 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16720298#comment-16720298
 ] 

Hudson commented on SQOOP-3237:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1249 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1249/])
SQOOP-3237: Mainframe FTP transfer option to insert custom FTP commands (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=4a22691f45d7d66157ff6dfaa8fca5581e0a8955])
* (edit) src/test/org/apache/sqoop/tool/TestMainframeImportTool.java
* (edit) src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java
* (edit) src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java
* (edit) src/docs/user/import-mainframe.txt
* (edit) src/java/org/apache/sqoop/tool/MainframeImportTool.java
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java


> Mainframe FTP transfer option to insert custom FTP commands prior to transfer
> -
>
> Key: SQOOP-3237
> URL: https://issues.apache.org/jira/browse/SQOOP-3237
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Chris Teoh
>Assignee: Chris Teoh
>Priority: Minor
> Fix For: 1.5.0, 3.0.0
>
>
> To get some data included in the FTP transfer, some extra FTP commands need 
> to be added prior to the transfer. A sample interaction is shown below:
> ftp> binary
> 200 Representation type is Image
> ftp> {color:red}quote SITE RDW{color}
> 200 SITE command was accepted
> ftp> {color:red}quote SITE RDW READTAPEFORMAT=V{color}
> 200 SITE command was accepted
> ftp>
> Proposed approach: --ftpcmds {csv list of commands}
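> A hedged sketch of the proposed syntax (the --ftpcmds name and its CSV value 
> follow the proposal above and mirror the sample interaction; the host, 
> dataset and user are placeholders):
> {noformat}
> sqoop import-mainframe --connect mainframe.example.com \
>   --dataset SOME.DATASET --username myuser -P \
>   --ftpcmds "SITE RDW,SITE RDW READTAPEFORMAT=V"
> {noformat}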



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3396) Add parquet numeric support for Parquet in Hive import

2018-12-07 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16712831#comment-16712831
 ] 

Hudson commented on SQOOP-3396:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1246 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1246/])
SQOOP-3396: Add parquet numeric support for Parquet in Hive import (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=a50394977bcdec8ae2274618b3a5c9e7e6a1082b])
* (edit) src/java/org/apache/sqoop/hive/HiveTypes.java
* (delete) 
src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/parquet/SqlServerNumericTypesParquetImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/HiveTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/hive/numerictypes/NumericTypesHiveImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesParquetImportTestBase.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/parquet/OracleNumericTypesParquetImportTest.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/OracleImportJobTestConfigurationForNumber.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/avro/MysqlNumericTypesAvroImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/avro/SqlServerNumericTypesAvroImportTest.java
* (add) 
src/test/org/apache/sqoop/hive/numerictypes/NumericTypesHiveImportTestBase.java
* (delete) 
src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveTypesForAvroTypeMapping.java
* (edit) 
src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesImportTestBase.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/OracleImportJobTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/avro/PostgresNumericTypesAvroImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/parquet/PostgresNumericTypesParquetImportTest.java
* (delete) 
src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
* (add) src/test/org/apache/sqoop/testutil/NumericTypesTestUtils.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesAvroImportTestBase.java
* (delete) 
src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
* (add) src/test/org/apache/sqoop/testutil/ThirdPartyTestBase.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/avro/OracleNumericTypesAvroImportTest.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/SqlServerImportJobTestConfiguration.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
* (edit) src/java/org/apache/sqoop/hive/TableDefWriter.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/parquet/MysqlNumericTypesParquetImportTest.java


> Add parquet numeric support for Parquet in Hive import
> --
>
> Key: SQOOP-3396
> URL: https://issues.apache.org/jira/browse/SQOOP-3396
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0, 3.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3417) Execute Oracle XE tests on Travis CI

2018-12-04 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16708859#comment-16708859
 ] 

Hudson commented on SQOOP-3417:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1245 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1245/])
SQOOP-3417: Execute Oracle XE tests on Travis CI (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=302674d96b18bae3c5283d16603afb985b892795])
* (edit) COMPILING.txt
* (edit) .travis.yml
* (edit) build.gradle
* (edit) gradle.properties


> Execute Oracle XE tests on Travis CI
> 
>
> Key: SQOOP-3417
> URL: https://issues.apache.org/jira/browse/SQOOP-3417
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
>
> The task is to enable Travis CI to execute the Oracle XE tests 
> automatically as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3393) TestNetezzaExternalTableExportMapper hangs

2018-12-03 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16707074#comment-16707074
 ] 

Hudson commented on SQOOP-3393:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1244 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1244/])
SQOOP-3393: TestNetezzaExternalTableExportMapper hangs (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=3c1fb870e20bf2d0316a7eb0aa524e437499ab05])
* (edit) 
src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java


> TestNetezzaExternalTableExportMapper hangs
> --
>
> Key: SQOOP-3393
> URL: https://issues.apache.org/jira/browse/SQOOP-3393
> Project: Sqoop
>  Issue Type: Bug
>  Components: test
>Affects Versions: 1.5.0, 3.0.0
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Major
> Fix For: 1.5.0, 3.0.0
>
>
> Introduced in SQOOP-3378, spotted by [~vasas].



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3415) Fix gradle test+build when clean applied as the first command + warning issue fixes

2018-12-03 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3415?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16707019#comment-16707019
 ] 

Hudson commented on SQOOP-3415:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1243 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1243/])
SQOOP-3415: Fix gradle test+build when clean applied as the first (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=122d1c0b27c0f5e4532f7a23ce016a5c5756321a])
* (edit) build.gradle


> Fix gradle test+build when clean applied as the first command + warning issue 
> fixes
> ---
>
> Key: SQOOP-3415
> URL: https://issues.apache.org/jira/browse/SQOOP-3415
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.5.0
>Reporter: Attila Szabo
>Assignee: Attila Szabo
>Priority: Major
> Fix For: 1.5.0
>
>
> If the user builds with a command like the following:
> gradlew clean unittest
> the Gradle process ends up in an exception and the whole build is left 
> hanging forever. The root cause is the following:
> tasks.withType runs in the configuration phase of the build, where we ensure 
> the necessary directories exist;
> after that, clean is executed and all of the directories get deleted.
> Proposed fix:
> Apply directory creation as the first step of the test tasks.
> On top of that:
> some options are missing because of the JUnit annotation processors, and 
> Xlint information is currently swallowed. We aim to fix these things as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3414) Introduce a Gradle build parameter to set the ignoreTestFailures of the test tasks

2018-11-28 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16701673#comment-16701673
 ] 

Hudson commented on SQOOP-3414:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1242 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1242/])
SQOOP-3414: Introduce a Gradle build parameter to set the (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6a159ed282c1816452a46dbc7acc2a56b5d2dc46])
* (edit) build.gradle
* (edit) COMPILING.txt


> Introduce a Gradle build parameter to set the ignoreTestFailures of the test 
> tasks
> --
>
> Key: SQOOP-3414
> URL: https://issues.apache.org/jira/browse/SQOOP-3414
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: test_with_ignoreTestFailures=true.txt, 
> test_without_ignoreTestFailures.txt
>
>
> The 
> [ignoreFailures|https://docs.gradle.org/current/dsl/org.gradle.api.tasks.testing.Test.html#org.gradle.api.tasks.testing.Test:ignoreFailures]
>  parameter of the Gradle test tasks is set to false, which means that if a 
> Gradle test task fails the Gradle process returns with a non-zero exit code. 
> In some CI tools (e.g. Jenkins) this makes the status of the job red instead 
> of yellow, which usually indicates a more serious issue than a test failure.
> I would like to introduce a build parameter to be able to set this property 
> of the test tasks.
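> A hedged sketch of the intended usage, assuming the parameter keeps the 
> ignoreTestFailures name visible in the attached logs (the -D form and exact 
> property name are assumptions; the committed build.gradle defines the 
> authoritative form):
> {noformat}
> ./gradlew test -DignoreTestFailures=true
> {noformat}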



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3413) TestMainframeManager does not restore the inner state of AccumuloUtil

2018-11-27 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16700666#comment-16700666
 ] 

Hudson commented on SQOOP-3413:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1239 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1239/])
SQOOP-3413: TestMainframeManager does not restore the inner state of (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f8fe691e21aadaf51cc51905d94b3dc22a7f818c])
* (edit) src/test/org/apache/sqoop/manager/TestMainframeManager.java


> TestMainframeManager does not restore the inner state of AccumuloUtil
> -
>
> Key: SQOOP-3413
> URL: https://issues.apache.org/jira/browse/SQOOP-3413
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
>
> org.apache.sqoop.manager.TestMainframeManager#testImportTableNoAccumuloJarPresent
>  sets the testingMode field of AccumuloUtil to true but does not restore it, 
> so Accumulo tests will fail if they are executed after 
> TestMainframeManager.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3411) PostgresMetaConnectIncrementalImportTest fails if metastore tables are absent from the database

2018-11-26 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3411?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16698830#comment-16698830
 ] 

Hudson commented on SQOOP-3411:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1237 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1237/])
SQOOP-3411: PostgresMetaConnectIncrementalImportTest fails if metastore (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=696187bb16ae31ca2aa78be93c342604cc1290d1])
* (edit) 
src/test/org/apache/sqoop/metastore/MetaConnectIncrementalImportTestBase.java


> PostgresMetaConnectIncrementalImportTest fails if metastore tables are absent 
> from the database
> ---
>
> Key: SQOOP-3411
> URL: https://issues.apache.org/jira/browse/SQOOP-3411
> Project: Sqoop
>  Issue Type: Test
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3410) Test S3 import with fs.s3a.security.credential.provider.path

2018-11-23 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697284#comment-16697284
 ] 

Hudson commented on SQOOP-3410:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1236 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1236/])
SQOOP-3410: Test S3 import with fs.s3a.security.credential.provider.path 
(vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=08eb5bdc405c22a25cf680390bbd438f6513199c])
* (edit) src/java/org/apache/sqoop/util/password/CredentialProviderHelper.java
* (edit) src/docs/user/s3.txt
* (edit) src/test/org/apache/sqoop/credentials/TestPassingSecurePassword.java
* (edit) src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java


> Test S3 import with fs.s3a.security.credential.provider.path
> 
>
> Key: SQOOP-3410
> URL: https://issues.apache.org/jira/browse/SQOOP-3410
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3410.patch
>
>
> Based on 
> [https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Configure_the_hadoop.security.credential.provider.path_property],
>  the property fs.s3a.security.credential.provider.path can also be used for 
> passing the location of the credential store. This should also be tested and 
> documented.
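> A hedged sketch of the kind of invocation being tested and documented here 
> (the jceks path, bucket and connection values are placeholders; the property 
> name is the one from the Hadoop documentation linked above):
> {noformat}
> sqoop import \
>   -D fs.s3a.security.credential.provider.path=jceks://hdfs/user/sqoop/aws-creds.jceks \
>   --connect $MYCONN --username $MYUSER --password $MYPSWD \
>   --table t1 --target-dir s3a://example-bucket/t1
> {noformat}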



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3289) Add .travis.yml

2018-11-23 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697237#comment-16697237
 ] 

Hudson commented on SQOOP-3289:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1235 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1235/])
Revert "SQOOP-3289: Add .travis.yml" (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=cbc39c3bfa04001a411fda456429e686220ecbba])
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java
* (edit) gradle.properties
* (edit) src/test/org/apache/sqoop/manager/db2/DB2XmlTypeImportManualTest.java
* (delete) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/OracleEeTest.java
* (edit) build.gradle
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2SavedJobsTest.java
* (edit) src/test/org/apache/sqoop/manager/mysql/MySQLTestUtils.java
* (delete) 
src/scripts/thirdpartytest/docker-compose/oraclescripts/ee-healthcheck.sh
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresJobToolTest.java
* (edit) 
src/test/org/apache/sqoop/metastore/postgres/PostgresMetaConnectIncrementalImportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/PGBulkloadManagerManualTest.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlTestUtil.java
* (edit) src/test/org/apache/sqoop/manager/sqlserver/MSSQLTestUtils.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2ManagerImportManualTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ImportTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/TimestampDataTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerExportTest.java
* (edit) COMPILING.txt
* (delete) src/test/org/apache/sqoop/manager/db2/DB2TestUtils.java
* (edit) src/scripts/thirdpartytest/docker-compose/oraclescripts/healthcheck.sh
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresSavedJobsTest.java
* (edit) build.xml
* (edit) 
src/test/org/apache/sqoop/manager/db2/DB2ImportAllTableWithSchemaManualTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ExportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/DirectPostgreSQLExportManualTest.java
* (edit) 
src/test/org/apache/sqoop/metastore/db2/DB2MetaConnectIncrementalImportTest.java
* (delete) 
src/scripts/thirdpartytest/docker-compose/oraclescripts/startup/.cache
* (edit) src/test/org/apache/sqoop/manager/cubrid/CubridTestUtils.java
* (edit) src/test/org/apache/sqoop/manager/oracle/util/OracleUtils.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTypesTest.java
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2JobToolTest.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlExportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/OracleConnectionFactoryTest.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/PostgresqlExternalTableImportTest.java
* (delete) .travis.yml
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTestCase.java
SQOOP-3289: Add .travis.yml (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=2bf6f3ccd0b59c9ff29d56a32b25f3a659dcfe19])
* (edit) build.xml
* (add) 
src/scripts/thirdpartytest/docker-compose/oraclescripts/ee-healthcheck.sh
* (edit) src/test/org/apache/sqoop/manager/cubrid/CubridTestUtils.java
* (edit) src/test/org/apache/sqoop/manager/mysql/MySQLTestUtils.java
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2JobToolTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ExportTest.java
* (add) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/OracleEeTest.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2ManagerImportManualTest.java
* (add) .travis.yml
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/PGBulkloadManagerManualTest.java
* (edit) build.gradle
* (edit) 
src/test/org/apache/sqoop/metastore/db2/DB2MetaConnectIncrementalImportTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTypesTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2SavedJobsTest.java
* (edit) gradle.properties
* (edit) COMPILING.txt
* (edit) 
src/test/org/apache/sqoop/manager/db2/DB2ImportAllTableWithSchemaManualTest.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2XmlTypeImportManualTest.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/DirectPostgreSQLExportManualTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ImportTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/TimestampDataTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTestCase.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlExportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerExportTest.java
* (edit) 

[jira] [Commented] (SQOOP-3289) Add .travis.yml

2018-11-23 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697093#comment-16697093
 ] 

Hudson commented on SQOOP-3289:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1234 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1234/])
SQOOP-3289: Add .travis.yml (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=83a18e195111adb9f906401b0c030666378bae69])
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/DirectPostgreSQLExportManualTest.java
* (edit) COMPILING.txt
* (edit) build.gradle
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTestCase.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/PostgresqlExternalTableImportTest.java
* (add) .travis.yml
* (add) 
src/scripts/thirdpartytest/docker-compose/oraclescripts/ee-healthcheck.sh
* (add) src/scripts/thirdpartytest/docker-compose/oraclescripts/startup/.cache
* (add) src/test/org/apache/sqoop/manager/db2/DB2TestUtils.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ExportTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/ImportTest.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlExportTest.java
* (edit) 
src/test/org/apache/sqoop/metastore/db2/DB2MetaConnectIncrementalImportTest.java
* (add) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/OracleEeTest.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2XmlTypeImportManualTest.java
* (edit) src/test/org/apache/sqoop/manager/mysql/MySQLTestUtils.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlTestUtil.java
* (edit) 
src/test/org/apache/sqoop/manager/db2/DB2ImportAllTableWithSchemaManualTest.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2ManagerImportManualTest.java
* (edit) src/scripts/thirdpartytest/docker-compose/oraclescripts/healthcheck.sh
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerExportTest.java
* (edit) src/test/org/apache/sqoop/manager/sqlserver/MSSQLTestUtils.java
* (edit) 
src/test/org/apache/sqoop/metastore/postgres/PostgresMetaConnectIncrementalImportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresSavedJobsTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/TimestampDataTest.java
* (edit) gradle.properties
* (edit) build.xml
* (edit) src/test/org/apache/sqoop/manager/oracle/util/OracleUtils.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/OracleConnectionFactoryTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTypesTest.java
* (edit) 
src/test/org/apache/sqoop/manager/postgresql/PGBulkloadManagerManualTest.java
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2SavedJobsTest.java
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
* (edit) src/test/org/apache/sqoop/manager/cubrid/CubridTestUtils.java
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2JobToolTest.java
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresJobToolTest.java


> Add .travis.yml
> ---
>
> Key: SQOOP-3289
> URL: https://issues.apache.org/jira/browse/SQOOP-3289
> Project: Sqoop
>  Issue Type: Sub-task
>  Components: build
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Szabolcs Vasas
>Priority: Minor
> Fix For: 1.5.0, 3.0.0
>
> Attachments: SQOOP-3289.patch
>
>
> Adding a .travis.yml would enable running builds/tests on travis-ci.org. 
> Currently if you wish to use Travis for testing your changes, you have to 
> manually add a .travis.yml to your branch. Having it committed to trunk would 
> save us this extra step.
> I currently have an example 
> [{{.travis.yml}}|https://github.com/dvoros/sqoop/blob/93a4c06c1a3da1fd5305c99e379484507797b3eb/.travis.yml]
>  on my travis branch running unit tests for every commit and every pull 
> request: https://travis-ci.org/dvoros/sqoop/builds
> Later we could add the build status to the project readme as well, see: 
> https://github.com/dvoros/sqoop/tree/travis
> Also, an example of a pull request: https://github.com/dvoros/sqoop/pull/1



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3409) Fix temporary rootdir clean up in Sqoop-S3 tests

2018-11-22 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3409?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16696133#comment-16696133
 ] 

Hudson commented on SQOOP-3409:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1233 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1233/])
SQOOP-3409: Fix temporary rootdir clean up in Sqoop-S3 tests (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=264cddbecf6d99a793600885e4c0ded72158695c])
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java


> Fix temporary rootdir clean up in Sqoop-S3 tests
> 
>
> Key: SQOOP-3409
> URL: https://issues.apache.org/jira/browse/SQOOP-3409
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3409.patch
>
>
> Temporary root directory cleanup doesn't work as expected: many generated 
> temp root dirs are kept in the bucket after test runs. The cleanup logic 
> should be checked and fixed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3408) Introduce a Gradle build parameter to set the default forkEvery value for the tests

2018-11-22 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3408?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16696060#comment-16696060
 ] 

Hudson commented on SQOOP-3408:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1232 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1232/])
SQOOP-3408 Introduce a Gradle build parameter to set the default (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=74252163f0455747ed2a99eb6c2c9d0e48f3dedc])
* (edit) COMPILING.txt
* (edit) build.gradle


> Introduce a Gradle build parameter to set the default forkEvery value for the 
> tests
> ---
>
> Key: SQOOP-3408
> URL: https://issues.apache.org/jira/browse/SQOOP-3408
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3408.patch
>
>
> The [forkEvery 
> parameter|https://docs.gradle.org/current/dsl/org.gradle.api.tasks.testing.Test.html#org.gradle.api.tasks.testing.Test:forkEvery]
>  of the Gradle test tasks is currently set to 0, which means that all of the 
> tests run in a single JVM (the only exception is the kerberizedTest task, 
> which requires a new JVM for every test class).
> The benefit of this setup is that the test tasks finish much faster, since 
> JVM creation is a slow operation. However, the Sqoop test framework seems to 
> consume/leak too much memory, which can lead to an OutOfMemoryError during 
> the build if there is not enough memory on the machine running the tests.
> The goal of this JIRA is to introduce a new parameter to the Gradle build 
> which can be used to set the default forkEvery value and thus prevent the 
> JVM from running out of memory.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3405) Refactor: break up Parameterized tests on a per database basis

2018-11-22 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16696017#comment-16696017
 ] 

Hudson commented on SQOOP-3405:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1231 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1231/])
SQOOP-3405: Refactor: break up Parameterized tests on a per database (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=0a7407613b12a4bb25e737506ef0f091d3a7dae1])
* (delete) 
src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/testutil/adapter/MSSQLServerDatabaseAdapter.java
* (add) src/test/org/apache/sqoop/importjob/splitby/SplitByImportTestBase.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/SqlServerImportJobTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
* (add) src/test/org/apache/sqoop/testutil/adapter/SqlServerDatabaseAdapter.java
* (delete) src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java
* (delete) src/test/org/apache/sqoop/importjob/SplitByImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesImportTestBase.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/splitby/PostgresSplitByImportTest.java
* (add) 
src/test/org/apache/sqoop/importjob/splitby/SqlServerSplitByImportTest.java
* (add) src/test/org/apache/sqoop/importjob/splitby/OracleSplitByImportTest.java
* (add) src/test/org/apache/sqoop/testutil/adapter/MysqlDatabaseAdapter.java
* (delete) src/test/org/apache/sqoop/testutil/adapter/MySqlDatabaseAdapter.java
* (add) 
src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
* (add) src/test/org/apache/sqoop/importjob/splitby/MysqlSplitByImportTest.java
* (add) src/test/org/apache/sqoop/importjob/DatabaseAdapterFactory.java


> Refactor: break up Parameterized tests on a per database basis
> --
>
> Key: SQOOP-3405
> URL: https://issues.apache.org/jira/browse/SQOOP-3405
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 3.0.0
>
>
> Follow the example of the abstract class SavedJobsTestBase and its 
> subclasses!
> We need this to be able to add test categories (and thus Travis integration) 
> as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3406) Sqoop should not try to execute test category interfaces as tests with Ant

2018-11-21 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16694780#comment-16694780
 ] 

Hudson commented on SQOOP-3406:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1230 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1230/])
SQOOP-3406 Sqoop should not try to execute test category interfaces as (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=65b51b959eefd1e3eef540f570e048877f5a4026])
* (edit) build.xml


> Sqoop should not try to execute test category interfaces as tests with Ant
> --
>
> Key: SQOOP-3406
> URL: https://issues.apache.org/jira/browse/SQOOP-3406
> Project: Sqoop
>  Issue Type: Test
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3406.patch, SQOOP-3406.patch
>
>
> When the Ant third-party test suite is run, Ant tries to execute the test 
> category interfaces too, because their names end with the 'Test' postfix.
> These "tests" obviously fail, so we need to make sure that Ant does not 
> execute them.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3407) Introduce methods instead of TEMP_BASE_DIR and LOCAL_WAREHOUSE_DIR static fields

2018-11-21 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3407?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16694679#comment-16694679
 ] 

Hudson commented on SQOOP-3407:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1229 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1229/])
SQOOP-3407 Introduce methods instead of TEMP_BASE_DIR and (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=28896c8df1d151da29b3697c08fe5e81da4f3e94])
* (edit) src/test/org/apache/sqoop/TestMerge.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java
* (edit) src/test/org/apache/sqoop/orm/TestClassWriter.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (edit) src/test/org/apache/sqoop/io/TestSplittableBufferedWriter.java
* (edit) src/test/org/apache/sqoop/credentials/TestPassingSecurePassword.java
* (edit) src/test/org/apache/sqoop/hbase/HBaseImportAddRowKeyTest.java
* (edit) src/test/org/apache/sqoop/TestIncrementalImport.java


> Introduce methods instead of TEMP_BASE_DIR and LOCAL_WAREHOUSE_DIR static 
> fields
> 
>
> Key: SQOOP-3407
> URL: https://issues.apache.org/jira/browse/SQOOP-3407
> Project: Sqoop
>  Issue Type: Test
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3407.patch
>
>
> BaseSqoopTestCase.TEMP_BASE_DIR and BaseSqoopTestCase.LOCAL_WAREHOUSE_DIR are 
> public static fields which get initialized once at JVM startup and store 
> the paths of the test temp and warehouse directories.
> The problem is that HBase test cases change the value of the test.build.data 
> system property, which can cause tests using these static fields to fail.
> Since we do not own the code in HBase that changes the system property, we 
> need to turn these static fields into methods which evaluate the 
> test.build.data system property every time they are invoked, ensuring that 
> the invoking tests succeed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3404) Categorize all tests in the project

2018-11-19 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16691553#comment-16691553
 ] 

Hudson commented on SQOOP-3404:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1228 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1228/])
SQOOP-3404: Categorize all tests in the project (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=bb9c2dd85b0cac84503e69906a05c76d6a0413e1])
* (edit) 
src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java
* (edit) 
src/test/org/apache/sqoop/tool/TestS3IncrementalImportOptionValidations.java
* (edit) src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java
* (edit) src/test/org/apache/sqoop/s3/TestS3TextImport.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetBinaryRecord.java
* (edit) src/test/org/apache/sqoop/s3/TestS3AvroImport.java
* (edit) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendParquetImport.java
* (edit) src/test/org/apache/sqoop/manager/TestMainframeManager.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableImportMapper.java
* (edit) src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileGdgEntryParser.java
* (edit) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendTextImport.java
* (edit) src/test/org/apache/sqoop/s3/TestS3IncrementalMergeParquetImport.java
* (edit) 
src/test/org/apache/sqoop/s3/TestS3IncrementalAppendSequenceFileImport.java
* (edit) src/test/org/apache/sqoop/s3/TestS3ParquetImport.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/TestOraOopDBInputSplitGetDebugDetails.java
* (edit) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendAvroImport.java
* (add) src/test/org/apache/sqoop/testcategories/thirdpartytest/S3Test.java
* (edit) src/test/org/apache/sqoop/importjob/SplitByImportTest.java
* (edit) src/test/org/apache/sqoop/s3/TestS3IncrementalMergeTextImport.java
* (edit) src/test/org/apache/sqoop/s3/TestS3SequenceFileImport.java
* (edit) COMPILING.txt
* (edit) build.gradle


> Categorize all tests in the project
> ---
>
> Key: SQOOP-3404
> URL: https://issues.apache.org/jira/browse/SQOOP-3404
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3404.patch
>
>
> SQOOP-3104 introduced test categories, but while it was under review many 
> other patches with new test cases were committed.
> The task is to make sure that all of the new tests are properly categorized 
> and that the test tasks are cleaned up.
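For context, a hedged sketch of how JUnit 4 categories are typically applied; the test class below is invented for illustration, while the real marker interfaces live under org.apache.sqoop.testcategories (e.g. ThirdPartyTest, S3Test):

{code:java}
import org.junit.Test;
import org.junit.experimental.categories.Category;

// Stand-in for a marker interface such as
// org.apache.sqoop.testcategories.thirdpartytest.ThirdPartyTest.
interface ThirdPartyTest {}

// Tagging the class makes every test in it selectable by category,
// e.g. from a test task that includes or excludes ThirdPartyTest.
@Category(ThirdPartyTest.class)
public class ExampleOracleImportTest {

  @Test
  public void importsSingleTable() {
    // test body omitted in this sketch
  }
}
{code}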



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3104) Create test categories instead of test suites and naming conventions

2018-11-14 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16686819#comment-16686819
 ] 

Hudson commented on SQOOP-3104:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1227 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1227/])
SQOOP-3104: Create test categories instead of test suites and naming (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=d58e5f106ba932879112d7b69997301e442335f1])
* (edit) src/test/org/apache/sqoop/manager/sqlserver/SQLServerSplitByTest.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (add) 
src/test/org/apache/sqoop/util/BlockJUnit4ClassRunnerWithParametersFactory.java
* (add) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/SqlServerTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerDatatypeExportSequenceFileTest.java
* (edit) src/test/org/apache/sqoop/manager/sqlserver/SQLServerMultiMapsTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleCompatTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleSplitterTest.java
* (edit) src/test/org/apache/sqoop/tool/TestImportTool.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleLobAvroImportTest.java
* (add) src/test/org/apache/sqoop/testcategories/thirdpartytest/CubridTest.java
* (edit) src/test/org/apache/sqoop/manager/mysql/MySQLAuthTest.java
* (edit) src/test/org/apache/sqoop/manager/netezza/NetezzaExportManualTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerExportTest.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveTypesForAvroTypeMapping.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetFTPRecordReader.java
* (edit) src/test/org/apache/sqoop/manager/postgresql/PostgresqlImportTest.java
* (edit) 
src/test/org/apache/sqoop/metastore/sqlserver/SqlServerMetaConnectIncrementalImportTest.java
* (edit) src/test/org/apache/sqoop/tool/TestExportToolValidateOptions.java
* (add) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/MainFrameTest.java
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresJobToolTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleCallExportTest.java
* (edit) src/test/org/apache/sqoop/testutil/TestArgumentArrayBuilder.java
* (edit) src/test/org/apache/sqoop/util/TestOptionsFileExpansion.java
* (edit) src/test/org/apache/sqoop/metastore/db2/DB2JobToolTest.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/sqlserver/SqlServerUpsertOutputFormatTest.java
* (add) src/test/org/apache/sqoop/testcategories/thirdpartytest/NetezzaTest.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveServer2Client.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileEntryParser.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerDatatypeImportDelimitedFileTest.java
* (edit) src/test/org/apache/sqoop/manager/sqlserver/SQLServerWhereTest.java
* (edit) src/test/org/apache/sqoop/util/TestFileSystemUtil.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerParseMethodsTest.java
* (edit) src/test/org/apache/sqoop/metastore/postgres/PostgresSavedJobsTest.java
* (edit) src/test/org/apache/sqoop/mapreduce/db/TestIntegerSplitter.java
* (edit) src/test/org/apache/sqoop/hive/TestTableDefWriter.java
* (edit) src/test/org/apache/sqoop/orm/TestClassWriter.java
* (add) src/test/org/apache/sqoop/testcategories/thirdpartytest/MysqlTest.java
* (edit) 
src/test/org/apache/sqoop/manager/netezza/DirectNetezzaHCatImportManualTest.java
* (edit) 
src/test/org/apache/sqoop/manager/netezza/DirectNetezzaExportManualTest.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerHiveImportTest.java
* (edit) 
src/test/org/apache/sqoop/db/decorator/TestKerberizedConnectionFactoryDecorator.java
* (edit) src/test/org/apache/sqoop/mapreduce/TestJdbcExportJob.java
* (edit) src/test/org/apache/sqoop/metastore/PasswordRedactorTest.java
* (edit) src/test/org/apache/sqoop/io/TestLobFile.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2XmlTypeImportManualTest.java
* (edit) src/test/org/apache/sqoop/accumulo/TestAccumuloUtil.java
* (edit) src/test/org/apache/sqoop/util/TestSubstitutionUtils.java
* (edit) src/test/org/apache/sqoop/manager/cubrid/CubridManagerImportTest.java
* (edit) src/test/org/apache/sqoop/validation/AbortOnFailureHandlerTest.java
* (edit) src/test/org/apache/sqoop/hbase/HBaseUtilTest.java
* (edit) src/test/org/apache/sqoop/manager/mysql/DirectMySQLTest.java
* (edit) src/test/org/apache/sqoop/manager/db2/DB2ManagerImportManualTest.java
* (edit) src/test/org/apache/sqoop/metastore/mysql/MySqlJobToolTest.java
* (edit) src/test/org/apache/sqoop/util/TestDirCleanupHook.java
* (edit) src/test/org/apache/sqoop/manager/cubrid/CubridAuthTest.java
* (add) 
src/test/org/apache/sqoop/testcategories/thirdpartytest/ThirdPartyTest.java
* (edit) 

[jira] [Commented] (SQOOP-3382) Add parquet numeric support for Parquet in hdfs import

2018-11-14 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3382?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16686375#comment-16686375
 ] 

Hudson commented on SQOOP-3382:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1226 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1226/])
SQOOP-3382: Add parquet numeric support for Parquet in hdfs import (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=00a02dec2f7507f813ee4899096c470ba1112a9e])
* (edit) src/test/org/apache/sqoop/util/ParquetReader.java
* (edit) src/java/org/apache/sqoop/config/ConfigurationConstants.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
* (edit) src/java/org/apache/sqoop/mapreduce/ParquetImportMapper.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
* (edit) 
src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/ParquetTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfigurationForNumber.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/MySQLImportJobTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/AvroTestConfiguration.java
* (delete) src/test/org/apache/sqoop/importjob/ImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/OracleImportJobTestConfigurationForNumber.java
* (add) src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java
* (edit) src/java/org/apache/sqoop/orm/AvroSchemaGenerator.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/ImportJobTestConfiguration.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/configuration/MSSQLServerImportJobTestConfiguration.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/OracleImportJobTestConfiguration.java
* (edit) src/test/org/apache/sqoop/importjob/SplitByImportTest.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetImportJobConfigurator.java
* (delete) 
src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java
* (add) 
src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java


> Add parquet numeric support for Parquet in hdfs import
> --
>
> Key: SQOOP-3382
> URL: https://issues.apache.org/jira/browse/SQOOP-3382
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 3.0.0
>
>
> The current Avro numeric tests are suitable to be reused as Parquet tests with 
> very minor modifications, as Parquet can be written with the same input and 
> nearly the same arguments. Since we write Parquet through its Avro support, it 
> would be good to cover this code with the same or similar tests (including 
> the edge cases related to padding and to missing scale and precision).
> Differences are:
>  * the expected output, since it is stored in a Parquet file
>  * the input arguments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3403) Sqoop2: Add Fero Szabo to committer list in our pom file

2018-11-09 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16681280#comment-16681280
 ] 

Hudson commented on SQOOP-3403:
---

FAILURE: Integrated in Jenkins build Sqoop2 #1056 (See 
[https://builds.apache.org/job/Sqoop2/1056/])
SQOOP-3403: Sqoop2: Add Fero Szabo to committer list in our pom file (fero: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=8382ffb9ce74e5a7ac37dac010eeebf034a6d5fb])
* (edit) pom.xml


> Sqoop2: Add Fero Szabo to committer list in our pom file
> 
>
> Key: SQOOP-3403
> URL: https://issues.apache.org/jira/browse/SQOOP-3403
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.99.8
>Reporter: Boglarka Egyed
>Assignee: Fero Szabo
>Priority: Major
>
> Now that [~fero] is committer we should update our committer list in the root 
> pom.xml file:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-2949) SQL Syntax error when split-by column is of character type and min or max value has single quote inside it

2018-10-31 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-2949?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16670420#comment-16670420
 ] 

Hudson commented on SQOOP-2949:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1225 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1225/])
SQOOP-2949: SQL Syntax error when split-by column is of character type (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6dd6a4fc863074d570919b183fdf5c20e86c5e0b])
* (add) 
src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
* (edit) src/java/org/apache/sqoop/mapreduce/db/TextSplitter.java
* (add) src/test/org/apache/sqoop/importjob/SplitByImportTest.java


> SQL Syntax error when split-by column is of character type and min or max 
> value has single quote inside it
> --
>
> Key: SQOOP-2949
> URL: https://issues.apache.org/jira/browse/SQOOP-2949
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
> Environment: Sqoop 1.4.6
> Run on Hadoop 2.6.0
> On Ubuntu
>Reporter: Gireesh Puthumana
>Assignee: Gireesh Puthumana
>Priority: Major
> Fix For: 3.0.0
>
>
> Did a sqoop import from mysql table "emp", with split-by column "ename", 
> which is a varchar(100) type.
> +Used below command:+
> sqoop import --connect jdbc:mysql://localhost/testdb --username root 
> --password * --table emp --m 2 --target-dir /sqoopTest/5 --split-by ename;
> +Ename has following records:+
> | ename   |
> | gireesh |
> | aavesh  |
> | shiva'  |
> | jamir   |
> | balu|
> | santosh |
> | sameer  |
> Min value is "aavesh" and max value is "shiva'" (please note the single quote 
> inside the max value).
> When run, it tried to execute the query below in mapper 2 and failed:
> SELECT `ename`, `eid`, `deptid` FROM `emp` AS `emp` WHERE ( `ename` >= 
> 'jd聯聭聪G耀' ) AND ( `ename` <= 'shiva'' )
> +Stack trace:+
> {quote}
> 2016-06-05 16:54:06,749 ERROR [main] 
> org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception: 
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error 
> in your SQL syntax; check the manual that corresponds to your MySQL server 
> version for the right syntax to use near ''shiva'' )' at line 1
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>   at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
>   at com.mysql.jdbc.Util.getInstance(Util.java:387)
>   at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:942)
>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3966)
>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3902)
>   at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2526)
>   at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2673)
>   at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2549)
>   at 
> com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1861)
>   at 
> com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1962)
>   at 
> org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
>   at 
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
>   at 
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
>   at 
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
>   at 
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
>   at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>   at 
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>   at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>   at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>   at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> {quote}
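A minimal sketch of the kind of fix this change applies (illustrative helper, not the exact TextSplitter code): any single quote inside a split boundary value has to be doubled before the value is embedded in a SQL string literal.

{code:java}
public class SqlLiteralExample {

  // Doubles embedded single quotes so the resulting literal stays valid SQL.
  static String toSqlLiteral(String value) {
    return "'" + value.replace("'", "''") + "'";
  }

  public static void main(String[] args) {
    // Prints 'shiva''' which MySQL parses as the string shiva'
    System.out.println(toSqlLiteral("shiva'"));
  }
}
{code}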



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3395) Document Hadoop CredentialProvider usage in case of import into S3

2018-10-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3395?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16663849#comment-16663849
 ] 

Hudson commented on SQOOP-3395:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1224 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1224/])
SQOOP-3395: Document Hadoop CredentialProvider usage in case of import (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=c2211d6118197f2ee2800e0e6cd6c0a8879f0d0a])
* (edit) src/docs/user/s3.txt


> Document Hadoop CredentialProvider usage in case of import into S3
> --
>
> Key: SQOOP-3395
> URL: https://issues.apache.org/jira/browse/SQOOP-3395
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3395.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3399) TestS3ImportWithHadoopCredProvider fails if credential generator command is not provided

2018-10-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16663418#comment-16663418
 ] 

Hudson commented on SQOOP-3399:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1223 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1223/])
SQOOP-3399: TestS3ImportWithHadoopCredProvider fails if credential (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=7f61ae21e39a45b2b5e0107c4165bc4a255a27eb])
* (edit) src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java


> TestS3ImportWithHadoopCredProvider fails if credential generator command is 
> not provided
> 
>
> Key: SQOOP-3399
> URL: https://issues.apache.org/jira/browse/SQOOP-3399
> Project: Sqoop
>  Issue Type: Test
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3399.patch
>
>
> The BeforeClass method of TestS3ImportWithHadoopCredProvider should not throw 
> NullPointerException when the credential generator command is not provided, 
> since this fails the test run under Gradle.
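A hedged sketch of the guard pattern (the property name and the class are hypothetical, not the actual Sqoop change): a JUnit assumption in @BeforeClass skips the whole class instead of failing it with a NullPointerException.

{code:java}
import static org.junit.Assume.assumeNotNull;

import org.junit.BeforeClass;
import org.junit.Test;

public class ExampleS3CredProviderTest {

  // Hypothetical property carrying the credential generator command.
  private static final String GENERATOR_COMMAND = System.getProperty("s3.generator.command");

  @BeforeClass
  public static void requireGeneratorCommand() {
    // Marks every test in the class as skipped when the command is absent.
    assumeNotNull(GENERATOR_COMMAND);
  }

  @Test
  public void importsIntoS3() {
    // test body omitted in this sketch
  }
}
{code}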



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-1905) add --schema option for import-all-tables and list-tables against db2

2018-10-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-1905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16663390#comment-16663390
 ] 

Hudson commented on SQOOP-1905:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1222 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1222/])
SQOOP-3355: Document SQOOP-1905 DB2 --schema option (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=7ab99f41cf75da22e523c132cb782bb20102b348])
* (edit) src/docs/user/connectors.txt


> add --schema option for import-all-tables and list-tables against db2
> -
>
> Key: SQOOP-1905
> URL: https://issues.apache.org/jira/browse/SQOOP-1905
> Project: Sqoop
>  Issue Type: Improvement
>  Components: connectors
>Affects Versions: 1.4.5
> Environment: RedHat6.4+Sqoop1.4.5+Hadoop2.4.1+Hive0.13.0+DB2 10.5
>Reporter: xieshiju
>Assignee: Ying Cao
>Priority: Major
>  Labels: features
> Fix For: 1.4.7
>
> Attachments: SQOOP-1905.patch, SQOOP-1905.patch, SQOOP-1905.patch
>
>   Original Estimate: 504h
>  Remaining Estimate: 504h
>
> For the Sqoop import-all-tables and list-tables functions, we can use the 
> --schema option to qualify the table name.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3355) Document SQOOP-1905 DB2 --schema option

2018-10-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16663389#comment-16663389
 ] 

Hudson commented on SQOOP-3355:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1222 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1222/])
SQOOP-3355: Document SQOOP-1905 DB2 --schema option (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=7ab99f41cf75da22e523c132cb782bb20102b348])
* (edit) src/docs/user/connectors.txt


> Document SQOOP-1905 DB2 --schema option
> ---
>
> Key: SQOOP-3355
> URL: https://issues.apache.org/jira/browse/SQOOP-3355
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 3.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3398) Tests using HiveMiniCluster can be unstable on some platforms

2018-10-24 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3398?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16662497#comment-16662497
 ] 

Hudson commented on SQOOP-3398:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1221 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1221/])
SQOOP-3398: Tests using HiveMiniCluster can be unstable on some (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f6455c86486e48dacd67885fb2b1d29aaed32c67])
* (edit) ivy.xml
* (edit) build.gradle


> Tests using HiveMiniCluster can be unstable on some platforms
> -
>
> Key: SQOOP-3398
> URL: https://issues.apache.org/jira/browse/SQOOP-3398
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3398.patch
>
>
> Since the last Hive upgrade, TestHiveMiniCluster fails on some platforms 
> because an older version of the ASM library is picked up.
> The task is to exclude the older ASM library in ivy and gradle to make sure 
> the test passes on all platforms.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3390) Document S3Guard usage with Sqoop

2018-10-24 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16662441#comment-16662441
 ] 

Hudson commented on SQOOP-3390:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1220 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1220/])
SQOOP-3390: Document S3Guard usage with Sqoop (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=9e328a53e1740ca1ed85861311281f8ea5846ecf])
* (edit) src/docs/user/s3.txt


> Document S3Guard usage with Sqoop
> -
>
> Key: SQOOP-3390
> URL: https://issues.apache.org/jira/browse/SQOOP-3390
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3390.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3394) External Hive table tests should use unique external dir names

2018-10-19 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16656576#comment-16656576
 ] 

Hudson commented on SQOOP-3394:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1218 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1218/])
SQOOP-3394: External Hive table tests should use unique external dir (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=15097756c7fc694ec86d003843f40a793bcd9633])
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java


> External Hive table tests should use unique external dir names
> --
>
> Key: SQOOP-3394
> URL: https://issues.apache.org/jira/browse/SQOOP-3394
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3394.patch, SQOOP-3394.patch
>
>
> The current external Hive table tests on S3 use the same external directory name 
> in every unit test case, which can cause problems when running them in an 
> automated environment. These names should be unique in every test case.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3391) Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-18 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16655365#comment-16655365
 ] 

Hudson commented on SQOOP-3391:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1217 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1217/])
SQOOP-3391: Test storing AWS credentials in Hadoop CredentialProvider (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=1f7b0bf841f9d1cbafb58062d0bb0b6f19312874])
* (edit) ivy.xml
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java
* (add) conf/password-file.txt
* (edit) build.gradle
* (edit) build.xml
* (add) conf/wrong-password-file.txt
* (edit) gradle.properties
* (edit) ivy/libraries.properties
* (add) src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java
* (edit) src/java/org/apache/sqoop/util/password/CredentialProviderHelper.java


> Test storing AWS credentials in Hadoop CredentialProvider during import
> ---
>
> Key: SQOOP-3391
> URL: https://issues.apache.org/jira/browse/SQOOP-3391
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3391.patch, SQOOP-3391.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3381) Upgrade the Parquet library from 1.6.0 to 1.9.0

2018-10-17 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16653217#comment-16653217
 ] 

Hudson commented on SQOOP-3381:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1216 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1216/])
SQOOP-3381: Upgrade the Parquet library from 1.6.0 to 1.9.0 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=5dd8c8aad1c7732754fae190eb5424371ed6fef4])
* (edit) src/test/org/apache/sqoop/util/ParquetReader.java
* (add) src/java/org/apache/sqoop/mapreduce/hcat/DerbyPolicy.java
* (edit) src/test/org/apache/sqoop/TestParquetExport.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetExportJobConfigurator.java
* (edit) gradle/sqoop-package.gradle
* (edit) src/java/org/apache/sqoop/avro/AvroUtil.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetImportJobConfigurator.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveServer2ParquetImport.java
* (edit) src/test/org/apache/sqoop/TestParquetImport.java
* (edit) src/java/org/apache/sqoop/mapreduce/hcat/SqoopHCatUtilities.java
* (edit) ivy.xml
* (edit) src/java/org/apache/sqoop/hive/HiveImport.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetMergeJobConfigurator.java
* (edit) src/test/org/apache/sqoop/hive/minicluster/HiveMiniCluster.java
* (edit) build.gradle
* (edit) gradle.properties
* (edit) src/test/org/apache/sqoop/TestParquetIncrementalImportMerge.java
* (edit) testdata/hcatalog/conf/hive-site.xml
* (edit) ivy/libraries.properties


> Upgrade the Parquet library from 1.6.0 to 1.9.0
> ---
>
> Key: SQOOP-3381
> URL: https://issues.apache.org/jira/browse/SQOOP-3381
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 3.0.0
>
>
> As we will need to register a data supplier in the fix for parquet decimal 
> support, we will need a version that contains PARQUET-243.
> We need to upgrade the Parquet library to a version that contains this fix 
> and is compatible with Hadoop. Most probably, the newest version will be 
> adequate. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3384) Document import into external Hive table backed by S3

2018-10-15 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16650292#comment-16650292
 ] 

Hudson commented on SQOOP-3384:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1215 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1215/])
SQOOP-3384: Document import into external Hive table backed by S3 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=c329f360dd08ef3b9bd82897fcd611e7431d32c8])
* (edit) src/docs/user/s3.txt


> Document import into external Hive table backed by S3
> -
>
> Key: SQOOP-3384
> URL: https://issues.apache.org/jira/browse/SQOOP-3384
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3384.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3378) Error during direct Netezza import/export can interrupt process in uncontrolled ways

2018-10-11 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16646179#comment-16646179
 ] 

Hudson commented on SQOOP-3378:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1212 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1212/])
SQOOP-3378: Error during direct Netezza import/export can interrupt (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=40f0b74c012da917c6750a0fcce1f0ae13bd5f46])
* (add) 
src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableImportMapper.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableExportMapper.java
* (add) 
src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableImportMapper.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaJDBCStatementRunner.java


> Error during direct Netezza import/export can interrupt process in 
> uncontrolled ways
> 
>
> Key: SQOOP-3378
> URL: https://issues.apache.org/jira/browse/SQOOP-3378
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Major
> Fix For: 1.5.0, 3.0.0
>
>
> SQLException during JDBC operation in direct Netezza import/export signals 
> parent thread to fail fast by interrupting it (see 
> [here|https://github.com/apache/sqoop/blob/c814e58348308b05b215db427412cd6c0b21333e/src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaJDBCStatementRunner.java#L92]).
> We're [trying to process the interrupt in the 
> parent|https://github.com/apache/sqoop/blob/c814e58348308b05b215db427412cd6c0b21333e/src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableExportMapper.java#L232]
>  (main) thread, but there's no guarantee that we're not in some blocking 
> internal call that will process the interrupted flag and reset it before 
> we're able to check.
> It is also possible that the parent thread has passed the "checking part" 
> when it gets interrupted. In case of {{NetezzaExternalTableExportMapper}} 
> this can interrupt the upload of log files.
> I'd recommend using some other means of communication between the threads 
> than interrupts.
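An illustrative sketch of what "other means of communication" could look like (the class is invented, not the actual Sqoop change): the worker publishes its outcome through shared state and the parent waits on a latch, so no interrupt can be swallowed or reset by an unrelated blocking call.

{code:java}
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicReference;

class WorkerOutcome {

  private final AtomicReference<Throwable> failure = new AtomicReference<>();
  private final CountDownLatch done = new CountDownLatch(1);

  // Called by the JDBC worker thread when the statement fails.
  void fail(Throwable t) {
    failure.compareAndSet(null, t);
    done.countDown();
  }

  // Called by the JDBC worker thread on success.
  void complete() {
    done.countDown();
  }

  // Called by the parent (mapper) thread; returns null on success.
  Throwable awaitOutcome() throws InterruptedException {
    done.await();
    return failure.get();
  }
}
{code}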



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3327) Mainframe FTP needs to Include "Migrated" datasets when parsing the FTP list

2018-10-11 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16646023#comment-16646023
 ] 

Hudson commented on SQOOP-3327:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1211 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1211/])
SQOOP-3327: Mainframe FTP needs to Include "Migrated" datasets when (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=71523079bc61061867ced9b6a597150a3c72a964])
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeFTPFileEntryParser.java
* (edit) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileEntryParser.java


> Mainframe FTP needs to Include "Migrated" datasets when parsing the FTP list
> 
>
> Key: SQOOP-3327
> URL: https://issues.apache.org/jira/browse/SQOOP-3327
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Chris Teoh
>Assignee: Chris Teoh
>Priority: Minor
> Fix For: 3.0.0
>
>
> Need to include "Migrated" datasets when parsing the FTP list.
>  
> ** This applies to sequential datasets as well as GDG members **
>  
> Identifying migrated datasets when performing a manual FTP session:
>  
> ftp> open abc.def.ghi.jkl.mno
> Connected to abc.def.ghi.jkl.mno (11.22.33.444).
> 220-TCPFTP01 Some FTP Server at abc.def.ghi.jkl.mno, 22:34:11 on 2018-01-22.
> 220 Connection will close if idle for more than 10 minutes.
> Name (abc.def.ghi.jkl.mno:some_user): some_user
> 331 Send password please.
> Password:
> 230 some_user is logged on.  Working directory is "some_user.".
> Remote system type is MVS.
> ftp> dir
> 227 Entering Passive Mode (33,44,555,66,7,8)
> 125 List started OK
> Volume Unit    Referred Ext Used Recfm Lrecl BlkSz Dsorg Dsname
> Migrated    DEV.DATA
> Migrated    DUMMY.DATA
> OVR343 3390   2018/01/23  1    1  FB 132 27984  PS  EMPTY
> Migrated    JCL.CNTL
> OVR346 3390   2018/01/22  1    1  FB  80 27920  PS  MIXED.FB80
> Migrated    PLAIN.FB80
> OVR341 3390   2018/01/23  1    9  VA 125   129  PS  PRDA.SPFLOG1.LIST
> G20427 Tape 
> UNLOAD.ABCDE.ZZ9UYT.FB.TAPE
> SEM352 3390   2018/01/23  1    1  FB 150  1500  PS  USER.BRODCAST
> OVR346 3390   2018/01/23  3    3  FB  80  6160  PO  USER.ISPPROF
> 250 List completed successfully.
>  
> "Migrated" should be included as one of the regex pattern searches.
> Assuming space-delimited output, the first column will be "Migrated" and the 
> second (and final) column will contain the dataset name.
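A hypothetical sketch of the extra pattern (the class and method names are invented): a "Migrated" line carries only the keyword and the dataset name, so a dedicated regex alongside the normal volume-based one is enough.

{code:java}
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class MigratedEntryExample {

  // First column "Migrated", second (and final) column the dataset name.
  private static final Pattern MIGRATED = Pattern.compile("^\\s*Migrated\\s+(\\S+)\\s*$");

  static String parseMigratedDatasetName(String listingLine) {
    Matcher m = MIGRATED.matcher(listingLine);
    return m.matches() ? m.group(1) : null;
  }

  public static void main(String[] args) {
    System.out.println(parseMigratedDatasetName("Migrated    JCL.CNTL")); // prints JCL.CNTL
  }
}
{code}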



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3326) Mainframe FTP listing for GDG should filter out non-GDG datasets in a heterogeneous listing

2018-10-10 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16645013#comment-16645013
 ] 

Hudson commented on SQOOP-3326:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1210 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1210/])
SQOOP-3326: Mainframe FTP listing for GDG should filter out non-GDG (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=18212becec800336533697c5795a06bcd38e6242])
* (edit) build.xml
* (edit) src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
* (add) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileGdgEntryParser.java
* (add) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeFTPFileGdgEntryParser.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java
* (edit) 
src/test/org/apache/sqoop/manager/mainframe/MainframeManagerImportTest.java
* (edit) src/test/org/apache/sqoop/manager/mainframe/MainframeTestUtil.java
* (edit) src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java


> Mainframe FTP listing for GDG should filter out non-GDG datasets in a 
> heterogeneous listing
> ---
>
> Key: SQOOP-3326
> URL: https://issues.apache.org/jira/browse/SQOOP-3326
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Chris Teoh
>Assignee: Chris Teoh
>Priority: Minor
> Fix For: 3.0.0
>
>
> The FTP listing will automatically assume the first file in the listing is 
> the most recent GDG file. This is a problem when there are mixed datasets in 
> the listing, because the listing code doesn't filter them out.
>  
> GDG base is : HLQ.ABC.DEF.AB15HUP
>  
> The sequential dataset in the middle of the GDG member listing is : 
> HLQ.ABC.DEF.AB15HUP.DATA
>  
> The pattern for listing GDG members should be: <GDG base name>.G\d{4}V\d{2} 
> (a filtering sketch follows the sample listing below).
>  
>  Sample below:-
> {{   Menu  Options  View  Utilities  Compilers  Help  
>
>  
> ss
>  DSLIST - Data Sets Matching HLQ.ABC.DEF.GDGBASE   Row 1 of 8
>  Command ===>  Scroll ===> 
> PAGE
>   
>   
>  Command - Enter "/" to select action  Message   
> Volume
>  
> ---
>   HLQ.ABC.DEF.GDGBASE  ??
>   HLQ.ABC.DEF.GDGBASE.DUMMYSHT331
>   HLQ.ABC.DEF.GDGBASE.G0034V00 H19761
>   HLQ.ABC.DEF.GDGBASE.G0035V00 H81751
>   HLQ.ABC.DEF.GDGBASE.G0035V00.COPYSHT337
>   HLQ.ABC.DEF.GDGBASE.G0036V00 H73545
>   HLQ.ABC.DEF.GDGBASE.G0037V00 G10987
>   HLQ.ABC.DEF.GDGBASE.HELLOSHT33A
>  * End of Data Set list 
> 
> ftp> open some.machine.network.zxc.au
> Connected to some.machine.network.zxc.au (11.22.33.44).
> 220-TCPFTP01 IBM FTP CS V2R1 at some.machine.network.zxc.au, 00:12:29 on 
> 2018-05-29.
> 220 Connection will close if idle for more than 10 minutes.
> Name (some.machine.network.zxc.au:someuser):
> 331 Send password please.
> Password:
> 230 someuser is logged on.  Working directory is "someuser.".
> Remote system type is MVS.
> ftp> cd  'HLQ.ABC.DEF.GDGBASE'
> 250 "HLQ.ABC.DEF.GDGBASE." is the working directory name prefix.
> ftp> dir
> 227 Entering Passive Mode (11,22,33,44,55,66)
> 125 List started OK
> Volume UnitReferred Ext Used Recfm Lrecl BlkSz Dsorg Dsname
> H19761 Tape G0034V00
> H81751 Tape G0035V00
> H73545 Tape G0036V00
> G10987 Tape G0037V00
> SHT331 3390   **NONE**1   15  VB 114 27998  PS  DUMMY
> SHT337 3390   **NONE**1   15  VB 114 27998  PS  G0035V00.COPY
> SHT33A 3390   **NONE**1   15  VB 114 27998  PS  HELLO
> 250 List completed successfully.
> ftp>}}
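A sketch of the filtering idea only, assuming the GddddVdd member-name format shown in the sample above; the class and method are invented, not the actual parser:

{code:java}
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

class GdgMemberFilterExample {

  // A GDG generation is named GddddVdd, e.g. G0034V00.
  private static final Pattern GDG_MEMBER = Pattern.compile("G\\d{4}V\\d{2}");

  // Keeps only real generations; DUMMY, HELLO and G0035V00.COPY from the sample are dropped.
  static List<String> keepGdgMembers(List<String> datasetNames) {
    return datasetNames.stream()
        .filter(name -> GDG_MEMBER.matcher(name).matches())
        .collect(Collectors.toList());
  }
}
{code}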



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3376) Test import into external Hive table backed by S3

2018-10-10 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3376?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16644959#comment-16644959
 ] 

Hudson commented on SQOOP-3376:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1209 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1209/])
SQOOP-3376: Test import into external Hive table backed by S3 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=973629912ce9b849228e0bce94e245f016dd6101])
* (edit) src/test/org/apache/sqoop/testutil/HiveServer2TestUtil.java
* (add) src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java


> Test import into external Hive table backed by S3
> -
>
> Key: SQOOP-3376
> URL: https://issues.apache.org/jira/browse/SQOOP-3376
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3376.patch, SQOOP-3376.patch, SQOOP-3376.patch, 
> SQOOP-3376.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3383) Disable FileSystem static cache in S3 tests

2018-09-12 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3383?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16611697#comment-16611697
 ] 

Hudson commented on SQOOP-3383:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1208 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1208/])
SQOOP-3383: Disable FileSystem static cache in S3 tests (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=932822aa8fdddeb0aff6445d4f585a599ccb0084])
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java


> Disable FileSystem static cache in S3 tests
> ---
>
> Key: SQOOP-3383
> URL: https://issues.apache.org/jira/browse/SQOOP-3383
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Minor
> Fix For: 3.0.0
>
> Attachments: SQOOP-3383.patch, SQOOP-3383.patch
>
>
> FileSystem has a static cache, meaning that once the authentication happens in the 
> org.apache.sqoop.testutil.S3TestUtils#setS3CredentialsInHadoopConf method, 
> the Sqoop import will get the same FileSystem object from the cache, thus its 
> authentication via the {{-Dfs.s3a.access.key}} and {{-Dfs.s3a.secret.key}} 
> properties is ineffective. See 
> org.apache.hadoop.fs.FileSystem#get(java.net.URI, 
> org.apache.hadoop.conf.Configuration). 
> This static cache should be disabled (by setting {{fs.s3a.impl.disable.cache}} 
> to true) in the setup phase of the S3 tests to make sure Sqoop relies on the 
> S3 credentials set via the -D properties.
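A minimal sketch of the setup step (illustrative, not the actual S3TestUtils code): {{fs.s3a.impl.disable.cache}} is Hadoop's standard per-scheme cache switch, so setting it in the test Configuration forces FileSystem.get() to build a fresh, freshly authenticated instance.

{code:java}
import org.apache.hadoop.conf.Configuration;

class S3TestConfExample {

  static Configuration newS3TestConf() {
    Configuration conf = new Configuration();
    // Bypass the static FileSystem cache for the s3a scheme.
    conf.setBoolean("fs.s3a.impl.disable.cache", true);
    return conf;
  }
}
{code}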



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3380) parquet-configurator-implementation is not recognized as an option

2018-09-11 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16611500#comment-16611500
 ] 

Hudson commented on SQOOP-3380:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1207 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1207/])
SQOOP-3380: parquet-configurator-implementation is not recognized as an (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=b37e0c6053b92f2d7a56026e7fde5637b7f70588])
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (edit) src/test/org/apache/sqoop/tool/TestBaseSqoopTool.java


> parquet-configurator-implementation is not recognized as an option
> --
>
> Key: SQOOP-3380
> URL: https://issues.apache.org/jira/browse/SQOOP-3380
> Project: Sqoop
>  Issue Type: Bug
>Reporter: Fero Szabo
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3380.patch
>
>
> The parquet-configurator-implementation option was added to Sqoop with 
> SQOOP-3329: Remove Kite dependency from the Sqoop project, but the command 
> line parser doesn't recognize it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3375) HiveMiniCluster does not restore hive-site.xml location

2018-09-03 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16601986#comment-16601986
 ] 

Hudson commented on SQOOP-3375:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1206 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1206/])
SQOOP-3375: HiveMiniCluster does not restore hive-site.xml location (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=c814e58348308b05b215db427412cd6c0b21333e])
* (edit) src/test/org/apache/sqoop/hive/minicluster/HiveMiniCluster.java


> HiveMiniCluster does not restore hive-site.xml location
> ---
>
> Key: SQOOP-3375
> URL: https://issues.apache.org/jira/browse/SQOOP-3375
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3375.patch
>
>
> HiveMiniCluster sets the hive-site.xml location using the 
> org.apache.hadoop.hive.conf.HiveConf#setHiveSiteLocation static method during 
> startup, but it does not restore the original location during shutdown.
> This makes HCatalogImportTest and HCatalogExportTest fail if they are run in 
> the same JVM after any test using HiveMiniCluster.
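An illustrative save-and-restore sketch, assuming HiveConf exposes a matching getter for the hive-site location; the wrapper class is invented, not the actual HiveMiniCluster code:

{code:java}
import java.net.URL;
import org.apache.hadoop.hive.conf.HiveConf;

class HiveSiteGuardExample {

  private URL originalHiveSiteLocation;

  void start(URL miniClusterHiveSite) {
    // Remember whatever was configured before the mini cluster overrides it.
    originalHiveSiteLocation = HiveConf.getHiveSiteLocation();
    HiveConf.setHiveSiteLocation(miniClusterHiveSite);
  }

  void stop() {
    // Put the original location back so later tests in the same JVM are unaffected.
    HiveConf.setHiveSiteLocation(originalHiveSiteLocation);
  }
}
{code}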



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3373) Document simple and incremental import into S3

2018-08-31 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16598520#comment-16598520
 ] 

Hudson commented on SQOOP-3373:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1205 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1205/])
SQOOP-3373: Document simple and incremental import into S3 (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=35556651e2eabc768f765ffd0db3379727c78276])
* (edit) src/docs/user/SqoopUserGuide.txt
* (add) src/docs/user/s3.txt


> Document simple and incremental import into S3
> --
>
> Key: SQOOP-3373
> URL: https://issues.apache.org/jira/browse/SQOOP-3373
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3373.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3368) Add fail-fast scenarios to S3 incremental import use cases without --temporary-rootdir option

2018-08-31 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16598421#comment-16598421
 ] 

Hudson commented on SQOOP-3368:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1204 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1204/])
SQOOP-3368: Add fail-fast scenarios to S3 incremental import use cases (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6fa45a95a41f80778338cd2a2bb13418b2c37376])
* (add) 
src/test/org/apache/sqoop/tool/TestS3IncrementalImportOptionValidations.java
* (edit) src/java/org/apache/sqoop/tool/ImportTool.java


> Add fail-fast scenarios to S3 incremental import use cases without 
> --temporary-rootdir option
> -
>
> Key: SQOOP-3368
> URL: https://issues.apache.org/jira/browse/SQOOP-3368
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3368.patch, SQOOP-3368.patch, SQOOP-3368.patch
>
>
> The current implementation of Sqoop treats HDFS as the default filesystem, 
> i.e. it creates temporary directories on HDFS in case of incremental append 
> or merge imports. To make these incremental import use cases work with S3, the 
> user needs to set {{--temporary-rootdir}} to an S3 location.
> There should be fail-fast scenarios without the {{--temporary-rootdir}} 
> option.
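A hedged sketch of such a fail-fast check (a hypothetical helper, not the actual ImportTool validation): reject an incremental import whose target is on S3 unless {{--temporary-rootdir}} also points to an S3 location.

{code:java}
class S3IncrementalValidationExample {

  static void validate(String targetDir, String temporaryRootDir, boolean incremental) {
    boolean s3Target = targetDir != null && targetDir.startsWith("s3a://");
    boolean s3TempRoot = temporaryRootDir != null && temporaryRootDir.startsWith("s3a://");
    if (incremental && s3Target && !s3TempRoot) {
      throw new IllegalArgumentException(
          "Incremental import into S3 requires --temporary-rootdir to point to an S3 location.");
    }
  }
}
{code}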



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3367) Improve third party tests to be able to execute them in a single JVM

2018-08-28 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3367?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16595000#comment-16595000
 ] 

Hudson commented on SQOOP-3367:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1203 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1203/])
SQOOP-3367: Improve third party tests to be able to execute them in a (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=327aec8bf1b7a40936503a9c5a94b759caf16e11])
* (edit) testdata/hive/scripts/normalImport.q
* (edit) testdata/hive/scripts/numericImport.q
* (edit) testdata/hive/scripts/customDelimImport.q
* (edit) testdata/hive/scripts/failingImport.q
* (edit) src/java/org/apache/sqoop/hive/HiveImport.java
* (edit) src/test/org/apache/sqoop/manager/oracle/SystemImportTest.java
* (edit) testdata/hive/scripts/partitionImport.q
* (edit) testdata/hive/scripts/fieldWithNewlineImport.q
* (edit) testdata/hive/scripts/incrementalHiveAppend20.q
* (edit) src/test/org/apache/sqoop/testutil/LobAvroImportTestCase.java
* (edit) testdata/hive/scripts/dateImport.q
* (edit) testdata/hive/scripts/fieldWithNewlineReplacementImport.q
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OraOopTypesTest.java
* (edit) testdata/hive/scripts/incrementalHiveAppendEmpty.q
* (edit) testdata/hive/scripts/incrementalHiveAppend10.q
* (edit) testdata/hive/scripts/decimalMapImport.q


> Improve third party tests to be able to execute them in a single JVM
> 
>
> Key: SQOOP-3367
> URL: https://issues.apache.org/jira/browse/SQOOP-3367
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3367.patch
>
>
> The goal of this JIRA is to improve the third party tests to be able to 
> execute them in a single JVM. See the parent JIRA for the details.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3363) Test incremental import with S3

2018-08-28 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594969#comment-16594969
 ] 

Hudson commented on SQOOP-3363:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1202 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1202/])
SQOOP-3363: Test incremental import with S3 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=816146df567cf46a972d16873c46595f778b136c])
* (edit) src/test/org/apache/sqoop/TestAppendUtils.java
* (edit) src/test/org/apache/sqoop/testutil/TextFileTestUtils.java
* (add) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendAvroImport.java
* (edit) src/test/org/apache/sqoop/testutil/AvroTestUtils.java
* (add) src/test/org/apache/sqoop/s3/TestS3IncrementalMergeParquetImport.java
* (edit) src/test/org/apache/sqoop/testutil/SequenceFileTestUtils.java
* (edit) src/java/org/apache/sqoop/util/FileSystemUtil.java
* (edit) src/test/org/apache/sqoop/s3/TestS3TextImport.java
* (edit) src/test/org/apache/sqoop/s3/TestS3SequenceFileImport.java
* (edit) src/java/org/apache/sqoop/util/AppendUtils.java
* (add) src/test/org/apache/sqoop/s3/TestS3ParquetImport.java
* (add) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendParquetImport.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (add) src/test/org/apache/sqoop/s3/TestS3IncrementalAppendTextImport.java
* (add) 
src/test/org/apache/sqoop/s3/TestS3IncrementalAppendSequenceFileImport.java
* (edit) src/test/org/apache/sqoop/testutil/S3TestUtils.java
* (edit) src/test/org/apache/sqoop/s3/TestS3AvroImport.java
* (add) src/test/org/apache/sqoop/s3/TestS3IncrementalMergeTextImport.java


> Test incremental import with S3
> ---
>
> Key: SQOOP-3363
> URL: https://issues.apache.org/jira/browse/SQOOP-3363
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3363.patch, SQOOP-3363.patch, SQOOP-3363.patch, 
> SQOOP-3363.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3371) Fix tests using HiveMiniCluster

2018-08-27 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16593463#comment-16593463
 ] 

Hudson commented on SQOOP-3371:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1201 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1201/])
SQOOP-3371: Fix tests using HiveMiniCluster (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=1f73341070c56bda9c22820b2fa1cb7449239d32])
* (edit) gradle.properties
* (edit) build.gradle
* (edit) ivy.xml
* (edit) ivy/libraries.properties


> Fix tests using HiveMiniCluster
> ---
>
> Key: SQOOP-3371
> URL: https://issues.apache.org/jira/browse/SQOOP-3371
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3371.patch
>
>
> It seems that SQOOP-3360 broke our tests that use HiveMiniCluster, because 
> org.apache.calcite is not present in the hive-exec:core JAR but this 
> dependency seems to be needed by these tests.
> I am not sure why our Jenkins job did not catch the issue earlier, but I get 
> consistent failures when I run these tests with a clean ivy cache and ant, so 
> the dependency issue needs to be fixed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3366) Improve unit tests to be able to execute them in a single JVM

2018-08-24 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16591469#comment-16591469
 ] 

Hudson commented on SQOOP-3366:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1200 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1200/])
SQOOP-3366: Improve unit tests to be able to execute them in a single (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f0215808447f40bcddd50b42f48a765d55094273])
* (edit) 
src/test/org/apache/sqoop/metastore/TestMetastoreConfigurationParameters.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (edit) src/test/org/apache/sqoop/TestIncrementalImport.java
* (edit) src/test/org/apache/sqoop/TestSqoopOptions.java
* (edit) src/test/org/apache/sqoop/testutil/HsqldbTestServer.java
* (edit) src/test/org/apache/sqoop/tool/TestMainframeImportTool.java


> Improve unit tests to be able to execute them in a single JVM
> -
>
> Key: SQOOP-3366
> URL: https://issues.apache.org/jira/browse/SQOOP-3366
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Nguyen Truong
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3366.patch
>
>
> The goal of this JIRA is to improve the unit tests to be able to execute them 
> in a single JVM. See the parent JIRA for the details.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3224) Mainframe FTP transfer should have an option to use binary mode for transfer

2018-08-23 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3224?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590384#comment-16590384
 ] 

Hudson commented on SQOOP-3224:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1199 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1199/])
SQOOP-3224: Mainframe FTP transfer should have an option to use binary (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=0d6c455e5bbfc092d4a90f352eb262f347758132])
* (edit) src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java
* (add) 
src/java/org/apache/sqoop/mapreduce/mainframe/AbstractMainframeDatasetImportMapper.java
* (add) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetBinaryRecord.java
* (add) 
src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetBinaryRecord.java
* (edit) src/test/org/apache/sqoop/tool/TestMainframeImportTool.java
* (edit) src/docs/user/import-mainframe.txt
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (add) src/java/org/apache/sqoop/mapreduce/ByteKeyOutputFormat.java
* (edit) src/java/org/apache/sqoop/mapreduce/RawKeyTextOutputFormat.java
* (add) src/java/org/apache/sqoop/mapreduce/KeyRecordWriters.java
* (edit) build.xml
* (edit) src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java
* (edit) src/test/org/apache/sqoop/manager/mainframe/MainframeTestUtil.java
* (edit) src/java/org/apache/sqoop/tool/ImportTool.java
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/java/org/apache/sqoop/tool/MainframeImportTool.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetFTPRecordReader.java
* (edit) 
src/test/org/apache/sqoop/manager/mainframe/MainframeManagerImportTest.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetImportMapper.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java
* (add) 
src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetBinaryImportMapper.java


> Mainframe FTP transfer should have an option to use binary mode for transfer
> 
>
> Key: SQOOP-3224
> URL: https://issues.apache.org/jira/browse/SQOOP-3224
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Chris Teoh
>Assignee: Chris Teoh
>Priority: Minor
> Fix For: 3.0.0
>
>
> Currently the mainframe FTP module is hard-coded to use ASCII transfer mode. 
> Propose a mainframe module flag to be able to change transfer modes.
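A hedged sketch of the mode switch itself using the commons-net API; the helper class and the boolean flag are invented, not the actual Sqoop option wiring:

{code:java}
import java.io.IOException;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

class TransferModeExample {

  // Chooses between binary and ASCII transfer mode based on a configuration flag.
  static void applyTransferMode(FTPClient client, boolean binary) throws IOException {
    client.setFileType(binary ? FTP.BINARY_FILE_TYPE : FTP.ASCII_FILE_TYPE);
  }
}
{code}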



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3364) Upgrade Gradle version to 4.9

2018-08-22 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16588980#comment-16588980
 ] 

Hudson commented on SQOOP-3364:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1198 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1198/])
SQOOP-3364: Upgrade Gradle version to 4.9 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f5eda1e208fba9058c250dd1b142a97eb118ced9])
* (edit) gradle/wrapper/gradle-wrapper.properties
* (edit) src/test/org/apache/sqoop/hbase/HBaseTestCase.java
* (edit) build.gradle
* (edit) gradle/wrapper/gradle-wrapper.jar
* (edit) settings.gradle


> Upgrade Gradle version to 4.9
> -
>
> Key: SQOOP-3364
> URL: https://issues.apache.org/jira/browse/SQOOP-3364
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3364.patch
>
>
> Sqoop currently uses Gradle 3.5.1, which is a fairly old version; let's 
> upgrade it to the newest 4.9 version.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3362) Fix toString() methods of OraOopOracleDataChunk

2018-08-16 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3362?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16582385#comment-16582385
 ] 

Hudson commented on SQOOP-3362:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1197 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1197/])
SQOOP-3362: Fix toString() methods of OraOopOracleDataChunk (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=005e5d6af82679b64a9683366aa5ae67e216f18d])
* (edit) 
src/java/org/apache/sqoop/manager/oracle/OraOopOracleDataChunkExtent.java
* (add) 
src/test/org/apache/sqoop/manager/oracle/TestOraOopDBInputSplitGetDebugDetails.java
* (edit) src/java/org/apache/sqoop/manager/oracle/OraOopOracleDataChunk.java
* (edit) src/java/org/apache/sqoop/manager/oracle/OraOopDBInputSplit.java
* (edit) 
src/java/org/apache/sqoop/manager/oracle/OraOopOracleDataChunkPartition.java


> Fix toString() methods of OraOopOracleDataChunk
> ---
>
> Key: SQOOP-3362
> URL: https://issues.apache.org/jira/browse/SQOOP-3362
> Project: Sqoop
>  Issue Type: Bug
>Reporter: Nguyen Truong
>Assignee: Nguyen Truong
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3362.patch
>
>
> The method currently returns the hash of the data chunk object. It would be 
> more useful to include the values of the class's fields.
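> For illustration, a minimal sketch of a field-based toString(); the field 
> names below are hypothetical and only show the intended shape of the output:
> {code:java}
> public class DataChunkSketch {
>   private final String id;
>   private final long startBlock;
>   private final long blockCount;
> 
>   public DataChunkSketch(String id, long startBlock, long blockCount) {
>     this.id = id;
>     this.startBlock = startBlock;
>     this.blockCount = blockCount;
>   }
> 
>   @Override
>   public String toString() {
>     // Print the actual field values instead of the default hash-based form.
>     return "DataChunk{id=" + id
>         + ", startBlock=" + startBlock
>         + ", blockCount=" + blockCount + "}";
>   }
> }
> {code}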



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3359) Add LOG message for git hash

2018-08-10 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16576273#comment-16576273
 ] 

Hudson commented on SQOOP-3359:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1196 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1196/])
SQOOP-3359: Add LOG message for git hash (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=96593b1a9b8168954f1d5b13f802f2f3ee5beab8])
* (edit) src/java/org/apache/sqoop/Sqoop.java


> Add LOG message for git hash
> 
>
> Key: SQOOP-3359
> URL: https://issues.apache.org/jira/browse/SQOOP-3359
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Nguyen Truong
>Assignee: Nguyen Truong
>Priority: Minor
> Fix For: 3.0.0
>
> Attachments: SQOOP-3359.patch
>
>
> Besides the version, it would be nice to also know the git hash of the 
> running Sqoop.
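> For illustration, a minimal sketch of logging a build-time git hash read from 
> a classpath resource; the resource name and property key are assumptions for 
> this example, not Sqoop's actual names:
> {code:java}
> import java.io.InputStream;
> import java.util.Properties;
> 
> public class GitHashLogSketch {
>   public static void main(String[] args) throws Exception {
>     Properties props = new Properties();
>     // Hypothetical resource generated at build time.
>     try (InputStream in =
>         GitHashLogSketch.class.getResourceAsStream("/sqoop-version.properties")) {
>       if (in != null) {
>         props.load(in);
>       }
>     }
>     System.out.println("Running Sqoop git hash: "
>         + props.getProperty("gitHash", "unknown"));
>   }
> }
> {code}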



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3348) Add hadoop-aws dependency and S3 credential generator logic for tests

2018-08-10 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16576228#comment-16576228
 ] 

Hudson commented on SQOOP-3348:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1195 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1195/])
SQOOP-3348: Add hadoop-aws dependency and S3 credential generator logic (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=282d41ee376d67c229dc8e4d775432ba732d9b5f])
* (add) src/test/org/apache/sqoop/testutil/DefaultS3CredentialGenerator.java
* (add) src/test/org/apache/sqoop/s3/TestS3SequenceFileImport.java
* (add) src/test/org/apache/sqoop/testutil/S3CredentialGenerator.java
* (add) src/test/org/apache/sqoop/testutil/SequenceFileTestUtils.java
* (add) src/test/org/apache/sqoop/s3/TestS3TextImport.java
* (add) src/test/org/apache/sqoop/testutil/S3TestUtils.java
* (edit) build.gradle
* (add) src/test/org/apache/sqoop/testutil/TextFileTestUtils.java
* (edit) COMPILING.txt
* (edit) ivy.xml
* (edit) build.xml
* (add) src/test/org/apache/sqoop/s3/TestS3AvroImport.java


> Add hadoop-aws dependency and S3 credential generator logic for tests
> -
>
> Key: SQOOP-3348
> URL: https://issues.apache.org/jira/browse/SQOOP-3348
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3348.patch, SQOOP-3348.patch
>
>
> Task is to add {{hadoop-aws}} dependency to Sqoop and implement an S3 
> credential generator logic to enable automated testing.
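> For illustration, a rough sketch of what a test-only credential generator 
> could look like; the class shape and method names are assumptions and do not 
> reflect the actual S3CredentialGenerator API:
> {code:java}
> public abstract class S3CredentialGeneratorSketch {
>   protected String accessKey;
>   protected String secretKey;
> 
>   // Subclasses decide how the credentials are produced, e.g. by running an
>   // external command or reading environment variables.
>   protected abstract String[] generateCredentials() throws Exception;
> 
>   public void initialize() throws Exception {
>     String[] credentials = generateCredentials();
>     this.accessKey = credentials[0];
>     this.secretKey = credentials[1];
>   }
> 
>   public String getAccessKey() { return accessKey; }
>   public String getSecretKey() { return secretKey; }
> }
> {code}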



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3360) Fix hive-exec dependency issues in Gradle

2018-08-10 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3360?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16575956#comment-16575956
 ] 

Hudson commented on SQOOP-3360:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1194 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1194/])
SQOOP-3360: Fix hive-exec dependency issues in Gradle (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=ec18e6f47ebe4308495052afb4a2fd6f9ad0102d])
* (edit) build.gradle
* (edit) ivy.xml
* (edit) ivy/libraries.properties
* (edit) gradle.properties


> Fix hive-exec dependency issues in Gradle
> -
>
> Key: SQOOP-3360
> URL: https://issues.apache.org/jira/browse/SQOOP-3360
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3360.patch
>
>
> While making SQOOP-3345 work with Gradle it turned out that there are issues 
> with the dependency handling which cause problems with Gradle. The root cause 
> seems to be the hive-exec dependency, which also bundles its own dependencies 
> and pulls in incorrect libraries and versions at runtime. This has to be 
> cleaned up via better dependency management.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3357) Change MainframeImportTool to refer to MainframeManager class directly

2018-08-09 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16575016#comment-16575016
 ] 

Hudson commented on SQOOP-3357:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1193 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1193/])
SQOOP-3357: Change MainframeImportTool to refer to MainframeManager (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=c3d916ab20ca02c9d6b85bb51d6c31eb31aaff88])
* (edit) src/java/org/apache/sqoop/tool/MainframeImportTool.java
* (edit) src/test/org/apache/sqoop/manager/TestMainframeManager.java


> Change MainframeImportTool to refer to MainframeManager class directly
> --
>
> Key: SQOOP-3357
> URL: https://issues.apache.org/jira/browse/SQOOP-3357
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Nguyen Truong
>Assignee: Nguyen Truong
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3357.patch
>
>
> Currently MainframeImportTool refers to the MainframeManager with a string, 
> which can cause problems during refactoring. It would be beneficial to replace 
> the string with a MainframeManager class reference.
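> For illustration, a minimal sketch of the direction of the change (the 
> constant names are made up for this example):
> {code:java}
> import org.apache.sqoop.manager.MainframeManager;
> 
> public class ManagerReferenceSketch {
>   // Before: a hard-coded string that silently breaks when the class is renamed.
>   static final String MANAGER_CLASS_OLD =
>       "org.apache.sqoop.manager.MainframeManager";
> 
>   // After: derive the name from the class itself, so refactorings are
>   // checked by the compiler.
>   static final String MANAGER_CLASS_NEW = MainframeManager.class.getName();
> }
> {code}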



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3349) Remove Kite dependency from the Gradle dependencies

2018-07-27 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3349?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16559851#comment-16559851
 ] 

Hudson commented on SQOOP-3349:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1192 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1192/])
SQOOP-3349: Remove Kite dependency from the Gradle dependencies (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=b56b1767d2615abba03f07bab28a9161e58bb6f6])
* (edit) testdata/hcatalog/conf/hive-site.xml
* (edit) gradle/sqoop-package.gradle
* (edit) LICENSE.txt
* (edit) gradle.properties


> Remove Kite dependency from the Gradle dependencies
> ---
>
> Key: SQOOP-3349
> URL: https://issues.apache.org/jira/browse/SQOOP-3349
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3349.patch
>
>
> Since Sqoop can be built using Gradle as well we need to make sure that the 
> Kite dependency is removed from sqoop-package.gradle too.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3350) Fix tests which use warehouse-dir as target-dir

2018-07-27 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16559507#comment-16559507
 ] 

Hudson commented on SQOOP-3350:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1191 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1191/])
SQOOP-3350: Fix tests which use warehouse-dir as target-dir (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=1213b7725a65e561d8db071141e04944c85d1f94])
* (edit) 
src/test/org/apache/sqoop/manager/mysql/MySqlColumnEscapeImportTest.java
* (edit) src/test/org/apache/sqoop/TestFreeFormQueryImport.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/OracleColumnEscapeImportTest.java
* (edit) src/test/org/apache/sqoop/manager/oracle/OracleSplitterTest.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/OracleIncrementalImportTest.java
* (edit) 
src/test/org/apache/sqoop/manager/oracle/OracleSpecialCharacterTableImportTest.java


> Fix tests which use warehouse-dir as target-dir
> ---
>
> Key: SQOOP-3350
> URL: https://issues.apache.org/jira/browse/SQOOP-3350
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3350.patch
>
>
> Some Sqoop tests use the value of the 
> org.apache.sqoop.testutil.BaseSqoopTestCase#getWarehouseDir as target-dir:
> {code:java}
> args.add("--target-dir");
> args.add(getWarehouseDir());
> {code}
> This leads to an error when the warehouse directory already exists, which can 
> happen if a previous test does not clean up properly.
> The issue was found when executing the tests with Gradle, since Gradle 
> probably executes the tests in a different order.
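> For illustration, a minimal sketch of the fix direction, pointing target-dir 
> at a per-test subdirectory instead of the warehouse directory itself; the 
> helper methods are stand-ins for the test base class and are assumptions here:
> {code:java}
> import java.util.ArrayList;
> import java.util.List;
> 
> public class TargetDirSketch {
>   // Hypothetical stand-ins for BaseSqoopTestCase helpers.
>   static String getWarehouseDir() { return "/tmp/sqoop-test-warehouse"; }
>   static String getTableName() { return "IMPORT_TABLE_1"; }
> 
>   public static void main(String[] unused) {
>     List<String> args = new ArrayList<>();
>     args.add("--target-dir");
>     // A per-test subdirectory, so an already existing warehouse dir no
>     // longer makes the import fail.
>     args.add(getWarehouseDir() + "/" + getTableName());
>     System.out.println(args);
>   }
> }
> {code}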



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3353) Sqoop should not check incremental constraints for HBase imports

2018-07-27 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3353?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16559467#comment-16559467
 ] 

Hudson commented on SQOOP-3353:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1190 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1190/])
SQOOP-3353: Sqoop should not check incremental constraints for HBase (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=a06f2f3a8305b7a58b35e27f1fa6aea820dda5aa])
* (edit) src/test/org/apache/sqoop/tool/TestImportTool.java
* (edit) src/java/org/apache/sqoop/tool/ImportTool.java
* (edit) src/test/org/apache/sqoop/TestIncrementalImport.java


> Sqoop should not check incremental constraints for HBase imports
> 
>
> Key: SQOOP-3353
> URL: https://issues.apache.org/jira/browse/SQOOP-3353
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3353.patch
>
>
> The ImportTool#initIncrementalConstraints method is invoked for every import 
> in Sqoop, and it can throw the following error even for HBase imports:
> {code:java}
> --merge-key or --append is required when using --incremental lastmodified and 
> the output directory exists.{code}
> The task is to fix the validation so that it does not throw an exception when 
> importing into an HBase table, since in that case it does not matter whether 
> the table directory exists on HDFS.
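> For illustration, a minimal sketch of the intended guard; the Options class 
> below is a stand-in, assuming the HBase table name is exposed by something 
> like getHBaseTable():
> {code:java}
> public class IncrementalGuardSketch {
>   // Simplified stand-in for SqoopOptions.
>   static class Options {
>     String hbaseTable;
>     String getHBaseTable() { return hbaseTable; }
>   }
> 
>   // Only plain HDFS imports need the existing-output-directory validation.
>   static boolean needsIncrementalDirCheck(Options opts) {
>     return opts.getHBaseTable() == null;
>   }
> 
>   public static void main(String[] args) {
>     Options hbaseImport = new Options();
>     hbaseImport.hbaseTable = "events";
>     System.out.println(needsIncrementalDirCheck(hbaseImport)); // false, skip it
>   }
> }
> {code}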



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3352) Bump java target version to 1.8

2018-07-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3352?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16555831#comment-16555831
 ] 

Hudson commented on SQOOP-3352:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1189 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1189/])
SQOOP-3352: Bump java target version to 1.8 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=eefb7a0f3e28956fc51346b91f8a9337c5be9aca])
* (edit) src/docs/man/sqoop.txt
* (edit) build.gradle
* (edit) README.txt
* (edit) build.xml
* (edit) COMPILING.txt
* (edit) gradle.properties


> Bump java target version to 1.8
> ---
>
> Key: SQOOP-3352
> URL: https://issues.apache.org/jira/browse/SQOOP-3352
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3352.patch, SQOOP-3352.patch, SQOOP-3352.patch
>
>
> We should move to Java 8 as a minimum requirement.
> Java 7 is EOL'ed for more than 3 years now: 
> [http://www.oracle.com/technetwork/java/eol-135779.html]
> Many Apache projects are adopting Java 8 as a minimum requirement, for 
> instance:
>  * Hadoop 3: HADOOP-11858
>  * Hbase 2: HBASE-15624
>  * Flume 1.8: FLUME-2945
> This change affects the Ant and the Gradle build too.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3052) Introduce Gradle based build for Sqoop to make it more developer friendly / open

2018-07-23 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16552759#comment-16552759
 ] 

Hudson commented on SQOOP-3052:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1186 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1186/])
SQOOP-3052: Introduce Gradle based build for Sqoop to make it more (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=b148d54000da8d718f95917f8bf805860a6cbfe6])
* (edit) .gitignore
* (add) config/checkstyle/checkstyle-java-header.txt
* (add) gradle/wrapper/gradle-wrapper.properties
* (add) gradle/customUnixStartScript.txt
* (add) gradle/sqoop-version-gen.gradle
* (add) config/checkstyle/checkstyle-noframes.xsl
* (edit) testdata/hcatalog/conf/hive-site.xml
* (edit) COMPILING.txt
* (add) build.gradle
* (add) gradle/sqoop-package.gradle
* (add) gradlew
* (add) gradle.properties
* (add) gradlew.bat
* (add) config/checkstyle/checkstyle.xml
* (add) gradle/customWindowsStartScript.txt
* (add) settings.gradle
* (edit) src/scripts/rat-violations.sh
* (add) gradle/wrapper/gradle-wrapper.jar


> Introduce Gradle based build for Sqoop to make it more developer friendly / 
> open
> 
>
> Key: SQOOP-3052
> URL: https://issues.apache.org/jira/browse/SQOOP-3052
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Attila Szabo
>Assignee: Anna Szonyi
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3052.patch
>
>
> The current trunk version can only be built with the Ant/Ivy combination, 
> which has some painful limitations (resolve is slow / needs to be tweaked to 
> use only caches, the current profile / variable based settings do not work in 
> IDEs out of the box, the current solution does not download the related 
> sources, etc.).
> It would be nice to provide a solution which gives developers the possibility 
> to choose between the build infrastructures that are widely used nowadays 
> (e.g. Maven, Gradle, etc.). For this solution it would also be essential to 
> keep the different build files (if there is more than one) easily 
> synchronized, so that the configuration does not diverge over time. Test 
> execution has to be solved as well, and should cover all the available test 
> cases.
> In this scenario:
> providing one good working solution is much better than providing three 
> different ones which easily get out of sync.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3347) Make verify() more generic in AvroTestUtils

2018-07-20 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3347?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16550883#comment-16550883
 ] 

Hudson commented on SQOOP-3347:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1185 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1185/])
SQOOP-3347: Make verify() more generic in AvroTestUtils (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6c6963abe8f0513a19bf61a8c9055563ff245b1c])
* (edit) src/test/org/apache/sqoop/manager/hsqldb/TestHsqldbAvroPadding.java
* (edit) src/test/org/apache/sqoop/testutil/AvroTestUtils.java
* (edit) 
src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java


> Make verify() more generic in AvroTestUtils
> ---
>
> Key: SQOOP-3347
> URL: https://issues.apache.org/jira/browse/SQOOP-3347
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3347.patch
>
>
> AvroTestUtils is a utility class for Avro-related tests; however, its verify() 
> method contains decimal conversion logic which should be extracted and used 
> only in the relevant test cases.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3346) Upgrade Hadoop version to 2.8.0

2018-07-20 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3346?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16550554#comment-16550554
 ] 

Hudson commented on SQOOP-3346:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1184 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1184/])
SQOOP-3346: Upgrade Hadoop version to 2.8.0 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=b3e941be050d615f6aae5b4fafb9e0754c8c6a72])
* (edit) src/java/org/apache/sqoop/tool/ImportTool.java
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerParseMethodsTest.java
* (edit) src/java/org/apache/sqoop/mapreduce/JobBase.java
* (edit) src/test/org/apache/sqoop/orm/TestParseMethods.java
* (edit) src/java/org/apache/sqoop/config/ConfigurationConstants.java
* (edit) src/java/org/apache/sqoop/config/ConfigurationHelper.java
* (edit) src/test/org/apache/sqoop/TestSqoopOptions.java
* (edit) ivy/libraries.properties


> Upgrade Hadoop version to 2.8.0
> ---
>
> Key: SQOOP-3346
> URL: https://issues.apache.org/jira/browse/SQOOP-3346
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3346.patch
>
>
> Support for AWS temporary credentials was introduced in Hadoop 2.8.0 
> based on HADOOP-12537, and it would make sense to test and support this 
> capability with Sqoop too.
> There is [SQOOP-3305|https://reviews.apache.org/r/66300/bugs/SQOOP-3305/] 
> open for upgrading Hadoop to 3.0.0, however it currently has several issues 
> described in [https://reviews.apache.org/r/66300/], thus I would 
> like to proceed with an "intermediate" upgrade to 2.8.0 to enable development 
> on the S3 front. [~dvoros] are you OK with this?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3329) Remove Kite dependency from the Sqoop project

2018-07-20 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16550366#comment-16550366
 ] 

Hudson commented on SQOOP-3329:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1183 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1183/])
SQOOP-3329: Remove Kite dependency from the Sqoop project (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=739bbce48593a82575435f1cc48ca7ebd48537c9])
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/test/org/apache/sqoop/TestParquetImport.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetJobConfiguratorFactory.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetExportJobConfigurator.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetExportMapper.java
* (edit) src/docs/user/hive-notes.txt
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/ParquetJobConfiguratorImplementation.java
* (edit) src/test/org/apache/sqoop/TestParquetExport.java
* (edit) ivy.xml
* (edit) ivy/libraries.properties
* (edit) src/docs/user/import.txt
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetUtils.java
* (edit) src/test/org/apache/sqoop/tool/TestBaseSqoopTool.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetImportJobConfigurator.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetMergeJobConfigurator.java
* (edit) src/test/org/apache/sqoop/TestMerge.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveImport.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetImportMapper.java
* (delete) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteMergeParquetReducer.java


> Remove Kite dependency from the Sqoop project
> -
>
> Key: SQOOP-3329
> URL: https://issues.apache.org/jira/browse/SQOOP-3329
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3329.patch
>
>
> Now that we have an alternative solution for reading and writing Parquet 
> files we can remove the Kite dependency and the classes using Kite from the 
> Sqoop project.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3338) Document Parquet support

2018-07-16 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16545657#comment-16545657
 ] 

Hudson commented on SQOOP-3338:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1182 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1182/])
SQOOP-3338: Document Parquet support (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=17461e91db01bf67663caf0fb35e8920128c1aba])
* (edit) src/docs/user/import.txt
* (edit) src/docs/user/hive-notes.txt
* (edit) src/docs/user/hive-args.txt


> Document Parquet support
> 
>
> Key: SQOOP-3338
> URL: https://issues.apache.org/jira/browse/SQOOP-3338
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3338.patch, SQOOP-3338.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3335) Add Hive support to the new Parquet writing implementation

2018-07-16 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3335?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16545124#comment-16545124
 ] 

Hudson commented on SQOOP-3335:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1181 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1181/])
SQOOP-3335: Add Hive support to the new Parquet writing implementation (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=e639053251b65f943481666a62ed137bfef15b76])
* (edit) src/test/org/apache/sqoop/hive/TestHiveServer2TextImport.java
* (add) src/test/org/apache/sqoop/hive/TestHiveServer2ParquetImport.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetImportJobConfigurator.java
* (edit) src/test/org/apache/sqoop/TestParquetIncrementalImportMerge.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (add) src/test/org/apache/sqoop/hive/TestHiveTypesForAvroTypeMapping.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/ParquetImportJobConfigurator.java
* (edit) 
src/java/org/apache/sqoop/mapreduce/parquet/kite/KiteParquetImportJobConfigurator.java
* (edit) src/java/org/apache/sqoop/hive/HiveTypes.java
* (edit) src/java/org/apache/sqoop/tool/ImportTool.java
* (edit) src/test/org/apache/sqoop/tool/TestHiveServer2OptionValidations.java
* (edit) src/java/org/apache/sqoop/hive/TableDefWriter.java
* (edit) src/test/org/apache/sqoop/hive/TestTableDefWriter.java


> Add Hive support to the new Parquet writing implementation
> --
>
> Key: SQOOP-3335
> URL: https://issues.apache.org/jira/browse/SQOOP-3335
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3335.patch, SQOOP-3335.patch
>
>
> SQOOP-3328 adds a new Parquet reading and writing implementation to Sqoop, but 
> it does not add support for Hive Parquet imports. The task of this Jira is to 
> add this missing functionality.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3334) Improve ArgumentArrayBuilder, so arguments are replaceable

2018-06-25 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16522327#comment-16522327
 ] 

Hudson commented on SQOOP-3334:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1176 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1176/])
SQOOP-3334: Improve ArgumentArrayBuilder, so arguments are replaceable (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f4f9543010e6bacc8db938ab81c239c2273691ea])
* (add) src/test/org/apache/sqoop/testutil/TestArgumentArrayBuilder.java
* (edit) src/test/org/apache/sqoop/testutil/ArgumentArrayBuilder.java


> Improve ArgumentArrayBuilder, so arguments are replaceable
> --
>
> Key: SQOOP-3334
> URL: https://issues.apache.org/jira/browse/SQOOP-3334
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3334-4.patch
>
>
> The current implementation of the ArgumentArrayBuilder allows duplicating 
> options. Instead, we should be able to override options that were already 
> specified.
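> For illustration, a minimal sketch of a builder where re-adding an option 
> replaces the previous value instead of duplicating it; this is not the actual 
> ArgumentArrayBuilder implementation, just the idea:
> {code:java}
> import java.util.ArrayList;
> import java.util.LinkedHashMap;
> import java.util.List;
> import java.util.Map;
> 
> public class ReplaceableArgsSketch {
>   private final Map<String, String> options = new LinkedHashMap<>();
> 
>   public ReplaceableArgsSketch withOption(String name, String value) {
>     options.put(name, value); // a second call with the same name overrides
>     return this;
>   }
> 
>   public String[] build() {
>     List<String> args = new ArrayList<>();
>     for (Map.Entry<String, String> e : options.entrySet()) {
>       args.add("--" + e.getKey());
>       args.add(e.getValue());
>     }
>     return args.toArray(new String[0]);
>   }
> 
>   public static void main(String[] unused) {
>     String[] args = new ReplaceableArgsSketch()
>         .withOption("num-mappers", "4")
>         .withOption("num-mappers", "1") // replaces "4" instead of duplicating
>         .build();
>     System.out.println(String.join(" ", args)); // --num-mappers 1
>   }
> }
> {code}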



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3333) Change default behavior of the MS SQL connector to non-resilient.

2018-06-20 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16518033#comment-16518033
 ] 

Hudson commented on SQOOP-:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1175 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1175/])
SQOOP-: Change default behavior of the MS SQL connector to (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=57278b9e3370dc65cb465e9ed3cb225203dc7eab])
* (edit) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java
* (add) 
src/test/org/apache/sqoop/manager/sqlserver/TestSqlServerManagerContextConfigurator.java
* (edit) src/docs/user/connectors.txt
* (edit) src/java/org/apache/sqoop/manager/ExportJobContext.java
* (edit) src/java/org/apache/sqoop/manager/SQLServerManager.java
* (add) 
src/java/org/apache/sqoop/manager/SqlServerManagerContextConfigurator.java


> Change default behavior of the MS SQL connector to non-resilient.
> -
>
> Key: SQOOP-
> URL: https://issues.apache.org/jira/browse/SQOOP-
> Project: Sqoop
>  Issue Type: Task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
>
> The default behavior of Sqoop is to use a "resilient" retry mechanism in the 
> SQL Server connector. However, this relies on the split-by column being 
> unique and ordered ascending. This can lead to obscure errors (duplicate or 
> missing records in imports / exports), so it should only be used when 
> specifically requested by the user.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3331) Add Mainframe FTP integration test for GDG dataset.

2018-06-15 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3331?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16513937#comment-16513937
 ] 

Hudson commented on SQOOP-3331:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1174 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1174/])
SQOOP-3331: Add Mainframe FTP integration test for GDG dataset. (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=b2643d5094da405acfc695d1da22a54d32aa74ed])
* (add) src/test/org/apache/sqoop/manager/mainframe/MainframeTestUtil.java
* (add) 
src/test/org/apache/sqoop/manager/mainframe/MainframeManagerImportTest.java
* (edit) build.xml
* (edit) src/java/org/apache/sqoop/manager/MainframeManager.java
* (edit) 
src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml


> Add Mainframe FTP integration test for GDG dataset.
> ---
>
> Key: SQOOP-3331
> URL: https://issues.apache.org/jira/browse/SQOOP-3331
> Project: Sqoop
>  Issue Type: Test
>Reporter: Chris Teoh
>Assignee: Chris Teoh
>Priority: Minor
>
> Current mainframe import functionality doesn't have integration tests, only 
> unit tests. Adding this test improves validation of this functionality.
> Dockerfile and accompanying contents at 
> https://github.com/christeoh/zos-ftpmock/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3082) Sqoop import fails after TCP connection reset if split by datetime column

2018-05-18 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3082?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16480457#comment-16480457
 ] 

Hudson commented on SQOOP-3082:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1167 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1167/])
SQOOP-3082: Sqoop import fails after TCP connection reset if split by (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=ad7d046ef1d68b46aacdf6e124056f5901e95e5c])
* (edit) src/java/org/apache/sqoop/mapreduce/db/SQLServerDBRecordReader.java


> Sqoop import fails after TCP connection reset if split by datetime column
> -
>
> Key: SQOOP-3082
> URL: https://issues.apache.org/jira/browse/SQOOP-3082
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Sergey Svynarchuk
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3082-1.patch, SQOOP-3082.patch
>
>
> If the sqoop-to-mssqlserver connection is reset, the whole command fails with 
> "Connection reset with com.microsoft.sqlserver.jdbc.SQLServerException: 
> Incorrect syntax near '00'". On reestablishing the connection, Sqoop tries to 
> resume the import from the last record that was successfully read by:
> {code}
> 2016-12-10 15:18:54,523 INFO [main] 
> org.apache.sqoop.mapreduce.db.DBRecordReader: Executing query: select * from 
> test.dbo.test1 WITH (nolock) where Date >= '2015-01-10' and Date <= 
> '2016-11-24' and ( Date > 2015-09-18 00:00:00.0 ) AND ( Date < '2015-09-23 
> 11:48:00.0' ) 
> {code}
> Note that 2015-09-18 00:00:00.0 is not quoted in the SQL.
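> For illustration, a minimal sketch of the quoting problem when the resume 
> query is rebuilt; this is not SQLServerDBRecordReader's actual code, only the 
> idea that temporal boundaries must be rendered as quoted SQL literals:
> {code:java}
> import java.sql.Timestamp;
> 
> public class SplitBoundaryQuotingSketch {
>   static String boundary(Object lowerBound) {
>     // Quote temporal values instead of concatenating them raw.
>     if (lowerBound instanceof Timestamp) {
>       return "'" + lowerBound + "'";
>     }
>     return String.valueOf(lowerBound);
>   }
> 
>   public static void main(String[] args) {
>     Timestamp last = Timestamp.valueOf("2015-09-18 00:00:00.0");
>     System.out.println("... and ( Date > " + boundary(last) + " )");
>     // prints: ... and ( Date > '2015-09-18 00:00:00.0' )
>   }
> }
> {code}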



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-816) Scoop and support for external Hive tables

2018-05-17 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16479182#comment-16479182
 ] 

Hudson commented on SQOOP-816:
--

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1166 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1166/])
SQOOP-3324: Document SQOOP-816: Sqoop add support for external Hive (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6c0b3201280f3c7c4c56530909fbb2977e58542e])
* (edit) src/docs/man/hive-args.txt
* (edit) src/docs/user/hive-args.txt
* (edit) src/docs/user/hive.txt


> Scoop and support for external Hive tables
> --
>
> Key: SQOOP-816
> URL: https://issues.apache.org/jira/browse/SQOOP-816
> Project: Sqoop
>  Issue Type: Improvement
>  Components: hive-integration
>Reporter: Santosh Achhra
>Assignee: Chris Teoh
>Priority: Minor
>  Labels: External, Hive,, Scoop,, Tables, newbie
> Fix For: 1.4.7
>
>
> Sqoop does not support Hive external tables at the moment. Any import using 
> Sqoop creates a managed table, but in real-world scenarios it is very 
> important to have EXTERNAL tables. As of now we have to execute an ALTER 
> statement to change the table properties and make the table external, which 
> is not a big deal, but it would be nice to have an option in Sqoop to specify 
> the type of table that is required.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3324) Document SQOOP-816: Sqoop add support for external Hive tables

2018-05-17 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16479180#comment-16479180
 ] 

Hudson commented on SQOOP-3324:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1166 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1166/])
SQOOP-3324: Document SQOOP-816: Sqoop add support for external Hive (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=6c0b3201280f3c7c4c56530909fbb2977e58542e])
* (edit) src/docs/man/hive-args.txt
* (edit) src/docs/user/hive-args.txt
* (edit) src/docs/user/hive.txt


> Document SQOOP-816: Sqoop add support for external Hive tables
> --
>
> Key: SQOOP-3324
> URL: https://issues.apache.org/jira/browse/SQOOP-3324
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3322) Version differences between ivy configurations

2018-05-10 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3322?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16470839#comment-16470839
 ] 

Hudson commented on SQOOP-3322:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1165 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1165/])
SQOOP-3322: Version differences between ivy configurations (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=2ca85527fd9f927add9127f91f3f3ef0c98fed6e])
* (edit) ivy.xml
* (edit) ivy/libraries.properties


> Version differences between ivy configurations
> --
>
> Key: SQOOP-3322
> URL: https://issues.apache.org/jira/browse/SQOOP-3322
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Minor
> Fix For: 3.0.0
>
>
> We have multiple ivy configurations defined in ivy.xml.
>  - The {{redist}} configuration is used to select the artifacts that need to 
> be distributed with Sqoop in its tar.gz.
>  - The {{common}} configuration is used to set the classpath during 
> compilation (also referred to as the 'hadoop classpath')
>  -  The {{test}} configuration is used to set the classpath during junit 
> execution. It extends the {{common}} config.
> Some artifacts end up having different versions between these three 
> configurations, which means we're using different versions during 
> compilation/testing/runtime.
> Differences:
> ||Artifact||redist||common (compilation)||test||
> |commons-pool|not in redist|1.5.4|*1.6*|
> |commons-codec|1.4|1.9|*1.9*|
> |commons-io|1.4|2.4|*2.4*|
> |commons-logging|1.1.1|1.2|*1.2*|
> |slf4j-api|1.6.1|1.7.7|*1.7.7*|
> I'd suggest using the version *in bold* in all three configurations to use 
> the latest versions.
> To achieve this we should exclude these artifacts from the transitive 
> dependencies and define them explicitly.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3321) Fix TestHiveImport failing on Jenkins and Linux

2018-05-10 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3321?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16470615#comment-16470615
 ] 

Hudson commented on SQOOP-3321:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1164 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1164/])
SQOOP-3321: Fix TestHiveImport failing on Jenkins and Linux (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=03a895c5881b6dc9f748480b7d0562377d3cc978])
* (edit) src/test/org/apache/sqoop/hive/TestHiveImport.java


> Fix TestHiveImport failing on Jenkins and Linux
> ---
>
> Key: SQOOP-3321
> URL: https://issues.apache.org/jira/browse/SQOOP-3321
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Daniel Voros
>Priority: Major
> Attachments: TEST-org.apache.sqoop.hive.TestHiveImport.txt
>
>
> org.apache.sqoop.hive.TestHiveImport has been failing since 
> [SQOOP-3318|https://reviews.apache.org/r/66761/bugs/SQOOP-3318/] was 
> committed. The test seems to fail only in the Jenkins environment, as it 
> passes on several local machines. There may be some difference in the 
> filesystem that causes this issue; it shall be investigated. I am 
> attaching the log from a failed run.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3318) Remove Kite dependency from test cases

2018-04-27 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16456134#comment-16456134
 ] 

Hudson commented on SQOOP-3318:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1162 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1162/])
SQOOP-3318: Remove Kite dependency from test cases (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=11c83f68386add243762929ecf7f6f25a99efbf4])
* (edit) src/test/org/apache/sqoop/TestParquetImport.java
* (edit) src/test/org/apache/sqoop/TestMerge.java
* (add) src/test/org/apache/sqoop/util/ParquetReader.java
* (edit) src/java/org/apache/sqoop/util/FileSystemUtil.java
* (edit) src/test/org/apache/sqoop/hive/TestHiveImport.java
* (edit) src/test/org/apache/sqoop/TestAllTables.java
* (edit) src/test/org/apache/sqoop/TestParquetExport.java


> Remove Kite dependency from test cases
> --
>
> Key: SQOOP-3318
> URL: https://issues.apache.org/jira/browse/SQOOP-3318
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3318.patch
>
>
> Some Sqoop tests use Kite to create test data and verify test results.
> Since we want to remove the Kite dependency from Sqoop we should rewrite 
> these test cases not to use Kite anymore.
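> For illustration, a rough sketch of reading Parquet test output with 
> parquet-avro directly instead of Kite, which is roughly the role of the new 
> ParquetReader test utility (the real utility's API may differ):
> {code:java}
> import java.util.ArrayList;
> import java.util.List;
> 
> import org.apache.avro.generic.GenericRecord;
> import org.apache.hadoop.fs.Path;
> import org.apache.parquet.avro.AvroParquetReader;
> import org.apache.parquet.hadoop.ParquetReader;
> 
> public class ParquetReadSketch {
>   public static List<GenericRecord> readAll(Path file) throws Exception {
>     List<GenericRecord> records = new ArrayList<>();
>     try (ParquetReader<GenericRecord> reader =
>              AvroParquetReader.<GenericRecord>builder(file).build()) {
>       GenericRecord record;
>       while ((record = reader.read()) != null) {
>         records.add(record);
>       }
>     }
>     return records;
>   }
> }
> {code}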



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3216) Expanded Metastore support for MySql, Oracle, Postgresql, MSSql, and DB2

2018-04-10 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16431980#comment-16431980
 ] 

Hudson commented on SQOOP-3216:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1159 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1159/])
SQOOP-3301: Document SQOOP-3216 - metastore related change (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=af7a594d987ece6c1990be950c48d94bbab8271f])
* (edit) src/docs/man/sqoop-job.txt
* (edit) src/docs/user/saved-jobs.txt
* (edit) src/docs/user/metastore-purpose.txt


> Expanded Metastore support for MySql, Oracle, Postgresql, MSSql, and DB2
> 
>
> Key: SQOOP-3216
> URL: https://issues.apache.org/jira/browse/SQOOP-3216
> Project: Sqoop
>  Issue Type: New Feature
>  Components: metastore
>Reporter: Zach Berkowitz
>Assignee: Zach Berkowitz
>Priority: Minor
> Fix For: 1.5.0
>
> Attachments: SQOOP-3216-2.patch, SQOOP-3216-3.patch, 
> SQOOP-3216-4.patch, SQOOP-3216.patch
>
>
> Reconfigured the HsqldbJobStorage class to support MySql, Oracle, Postgresql, 
> MSSql, and DB2 databases in addition to Hsqldb, and renamed HsqldbJobStorage 
> to GenericJobStorage. This new class also serves the function of 
> AutoHsqldbStorage, which has been removed.
> Two new options, --meta-username and --meta-password, have been added to 
> connect to metastore databases that require a username and password.
> Added an enum class JdbcDrivers that holds JDBC connection information.
> Added two test classes, MetaConnectIncrementalImportTest and JobToolTest, and 
> modified TestSavedJobs (now SavedJobsTest) to test with all supported 
> database services.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3301) Document SQOOP-3216 - metastore related change

2018-04-10 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3301?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16431979#comment-16431979
 ] 

Hudson commented on SQOOP-3301:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1159 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1159/])
SQOOP-3301: Document SQOOP-3216 - metastore related change (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=af7a594d987ece6c1990be950c48d94bbab8271f])
* (edit) src/docs/man/sqoop-job.txt
* (edit) src/docs/user/saved-jobs.txt
* (edit) src/docs/user/metastore-purpose.txt


> Document SQOOP-3216 - metastore related change
> --
>
> Key: SQOOP-3301
> URL: https://issues.apache.org/jira/browse/SQOOP-3301
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3307) Don't create HTML during Ivy report

2018-03-29 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16419167#comment-16419167
 ] 

Hudson commented on SQOOP-3307:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1158 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1158/])
SQOOP-3307: Don't create HTML during Ivy report (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=c146b3f94e937a8c03b6ae85a60800d1c62bdc94])
* (edit) build.xml


> Don't create HTML during Ivy report
> ---
>
> Key: SQOOP-3307
> URL: https://issues.apache.org/jira/browse/SQOOP-3307
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Minor
> Fix For: 1.5.0
>
>
> {{ant clean report}} invokes the [ivy:report 
> |https://ant.apache.org/ivy/history/2.1.0/use/report.html] task and creates 
> both HTML and GraphML reports.
> Creation of the HTML reports takes ~7 minutes and results in a ~700MB HTML 
> file that is hard to make use of, while the GraphML reporting is fast and 
> easier to read.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3308) Mock ConnManager field in TestTableDefWriter

2018-03-29 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16419089#comment-16419089
 ] 

Hudson commented on SQOOP-3308:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1157 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1157/])
SQOOP-3308: Mock ConnManager field in TestTableDefWriter (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=d2c40366a7458b0f4804a382403bb40ac72fbb4f])
* (edit) src/java/org/apache/sqoop/hive/TableDefWriter.java
* (edit) src/test/org/apache/sqoop/hive/TestTableDefWriter.java


> Mock ConnManager field in TestTableDefWriter
> 
>
> Key: SQOOP-3308
> URL: https://issues.apache.org/jira/browse/SQOOP-3308
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3308.patch, SQOOP-3308.patch, SQOOP-3308.patch, 
> SQOOP-3308.patch
>
>
> TableDefWriter depends on ConnManager to retrieve the column names and 
> types of the table. It also introduces a field called _externalColTypes_ for 
> testing purposes, and TestTableDefWriter uses this field to inject the test 
> table's column names and types instead of mocking the ConnManager field.
> This setup makes it harder to add test cases to TestTableDefWriter and is not 
> a good practice, so it should be fixed.
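> For illustration, a minimal sketch of mocking the ConnManager instead of 
> injecting column types through _externalColTypes_; the getColumnNames and 
> getColumnTypes signatures are assumed here for the example:
> {code:java}
> import static org.mockito.Mockito.mock;
> import static org.mockito.Mockito.when;
> 
> import java.sql.Types;
> import java.util.Collections;
> 
> import org.apache.sqoop.manager.ConnManager;
> 
> public class ConnManagerMockSketch {
>   public static ConnManager fakeManager() {
>     ConnManager manager = mock(ConnManager.class);
>     // Stub the metadata lookups that TableDefWriter relies on.
>     when(manager.getColumnNames("test_table"))
>         .thenReturn(new String[] {"id"});
>     when(manager.getColumnTypes("test_table"))
>         .thenReturn(Collections.singletonMap("id", Types.INTEGER));
>     return manager;
>   }
> }
> {code}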



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3303) Fix warnings during Sqoop compilation

2018-03-25 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3303?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16413077#comment-16413077
 ] 

Hudson commented on SQOOP-3303:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1156 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1156/])
SQOOP-3303: Fix warnings during Sqoop compilation (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=7186b9d654c54fbe575e273c0a182307d3d48893])
* (edit) src/test/org/apache/sqoop/TestParquetExport.java
* (edit) src/test/org/apache/sqoop/TestAvroExport.java


> Fix warnings during Sqoop compilation
> -
>
> Key: SQOOP-3303
> URL: https://issues.apache.org/jira/browse/SQOOP-3303
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3303.patch
>
>
> Ant prints the following warning during every Sqoop compilation:
> {code:java}
> /Users/szabolcsvasas/Documents/workspace/sqoop/sqoop-vasas-gradle/sqoop/src/test/org/apache/sqoop/TestAvroExport.java:477:
>  warning: non-varargs call of varargs method with inexact argument type for 
> last parameter;
>     createAvroFile(0, TOTAL_RECORDS, null);
>                                      ^
>   cast to ColumnGenerator for a varargs call
>   cast to ColumnGenerator[] for a non-varargs call and to suppress this 
> warning
> /Users/szabolcsvasas/Documents/workspace/sqoop/sqoop-vasas-gradle/sqoop/src/test/org/apache/sqoop/TestAvroExport.java:492:
>  warning: non-varargs call of varargs method with inexact argument type for 
> last parameter;
>     createAvroFile(0, TOTAL_RECORDS, null);
>                                      ^
>   cast to ColumnGenerator for a varargs call
>   cast to ColumnGenerator[] for a non-varargs call and to suppress this 
> warning
> /Users/szabolcsvasas/Documents/workspace/sqoop/sqoop-vasas-gradle/sqoop/src/test/org/apache/sqoop/TestParquetExport.java:422:
>  warning: non-varargs call of varargs method with inexact argument type for 
> last parameter;
>     createParquetFile(0, TOTAL_RECORDS, null);
>                                         ^
>   cast to ColumnGenerator for a varargs call
>   cast to ColumnGenerator[] for a non-varargs call and to suppress this 
> warning
> /Users/szabolcsvasas/Documents/workspace/sqoop/sqoop-vasas-gradle/sqoop/src/test/org/apache/sqoop/TestParquetExport.java:435:
>  warning: non-varargs call of varargs method with inexact argument type for 
> last parameter;
>     createParquetFile(0, TOTAL_RECORDS, null);
>                                         ^
>   cast to ColumnGenerator for a varargs call
>   cast to ColumnGenerator[] for a non-varargs call and to suppress this 
> warning
> {code}
> It is kind of annoying and would be very easy to fix.
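> For illustration, the fix the compiler itself suggests is to make the intent 
> explicit with a cast; a self-contained sketch (createAvroFile here is a 
> stand-in for the real test helper):
> {code:java}
> public class VarargsCastSketch {
>   static class ColumnGenerator { }
> 
>   static void createAvroFile(int start, int total, ColumnGenerator... extraCols) {
>     int count = (extraCols == null) ? 0 : extraCols.length;
>     System.out.println("records=" + total + ", extra columns=" + count);
>   }
> 
>   public static void main(String[] args) {
>     // A plain null is ambiguous; cast to the array type to suppress the
>     // non-varargs warning (or cast to ColumnGenerator for a varargs call).
>     createAvroFile(0, 100, (ColumnGenerator[]) null);
>   }
> }
> {code}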



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3302) Add missing license header to HBaseKerberizedConnectivityTest and TestSQLServerDBRecordReader

2018-03-25 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16413067#comment-16413067
 ] 

Hudson commented on SQOOP-3302:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1155 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1155/])
SQOOP-3302: Add missing license header to (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=a69e716d1ebece53036a242ad88ebd1f52404b38])
* (edit) src/test/org/apache/sqoop/hbase/HBaseKerberizedConnectivityTest.java
* (edit) src/test/org/apache/sqoop/mapreduce/db/TestSQLServerDBRecordReader.java


> Add missing license header to HBaseKerberizedConnectivityTest and 
> TestSQLServerDBRecordReader
> -
>
> Key: SQOOP-3302
> URL: https://issues.apache.org/jira/browse/SQOOP-3302
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3302.patch
>
>
> HBaseKerberizedConnectivityTest and TestSQLServerDBRecordReader do not have 
> the Apache license header which should be fixed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3304) Increase the maximum memory for JUnit tests

2018-03-25 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3304?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16413042#comment-16413042
 ] 

Hudson commented on SQOOP-3304:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1154 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1154/])
SQOOP-3304: Increase the maximum memory for JUnit tests (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=5de9603cd2dd74c1210ff69d9a55f24417f83513])
* (edit) build.xml


> Increase the maximum memory for JUnit tests
> ---
>
> Key: SQOOP-3304
> URL: https://issues.apache.org/jira/browse/SQOOP-3304
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3304.patch
>
>
> TestKerberosAuthenticator committed in SQOOP-3300 fails in our Jenkins build 
> with an OutOfMemoryError:
> {code:java}
> [junit] Running org.apache.sqoop.authentication.TestKerberosAuthenticator
> [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 50.61 sec
> [junit] Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
> [junit] at java.util.Arrays.copyOf(Arrays.java:2367)
> [junit] at 
> java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
> [junit] at 
> java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
> [junit] at 
> java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
> [junit] at java.lang.StringBuffer.append(StringBuffer.java:237)
> [junit] at 
> org.apache.tools.ant.taskdefs.optional.junit.PlainJUnitResultFormatter.endTestSuite(PlainJUnitResultFormatter.java:141)
> [junit] at 
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.fireEndTestSuite(JUnitTestRunner.java:731)
> [junit] at 
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:553)
> [junit] at 
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1060)
> [junit] at 
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:911)
> [junit] Running org.apache.sqoop.authentication.TestKerberosAuthenticator
> [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
> {code}
> As this test case worked fine in our local environments, increasing the 
> maximum memory available for JUnit tests should solve the problem.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3300) Implement JDBC and Kerberos tools for HiveServer2 support

2018-03-23 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3300?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16411390#comment-16411390
 ] 

Hudson commented on SQOOP-3300:
---

FAILURE: Integrated in Jenkins build Sqoop-hadoop200 #1152 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1152/])
SQOOP-3300: Implement JDBC and Kerberos tools for HiveServer2 support (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=d67bb816ce35cdf59a3504845a0bc5639f690dd6])
* (add) src/test/org/apache/sqoop/db/TestDriverManagerJdbcConnectionFactory.java
* (add) src/java/org/apache/sqoop/authentication/KerberosAuthenticator.java
* (add) src/java/org/apache/sqoop/db/JdbcConnectionFactory.java
* (add) 
src/test/org/apache/sqoop/db/decorator/TestKerberizedConnectionFactoryDecorator.java
* (edit) 
src/test/org/apache/sqoop/infrastructure/kerberos/MiniKdcInfrastructureRule.java
* (add) 
src/java/org/apache/sqoop/db/decorator/JdbcConnectionFactoryDecorator.java
* (add) src/java/org/apache/sqoop/db/DriverManagerJdbcConnectionFactory.java
* (add) 
src/java/org/apache/sqoop/db/decorator/KerberizedConnectionFactoryDecorator.java
* (add) src/test/org/apache/sqoop/authentication/TestKerberosAuthenticator.java
* (edit) src/test/org/apache/sqoop/hbase/HBaseTestCase.java


> Implement JDBC and Kerberos tools for HiveServer2 support
> -
>
> Key: SQOOP-3300
> URL: https://issues.apache.org/jira/browse/SQOOP-3300
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3300.patch
>
>
> The idea of the Sqoop HS2 support is to connect to HS2 using JDBC and execute 
> the Hive commands on this connection. Sqoop should also support Kerberos 
> authentication when building this JDBC connection.
> The goal of this JIRA is to implement the necessary classes for building JDBC 
> connections and authenticating with Kerberos.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-2976) Flag to expand decimal values to fit AVRO schema

2018-03-21 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408059#comment-16408059
 ] 

Hudson commented on SQOOP-2976:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1151 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1151/])
SQOOP-3293: Document SQOOP-2976 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=d57f9fb06b55650adc75cd1972df0024d7e4dba1])
* (edit) src/docs/user/import.txt
* (edit) COMPILING.txt


> Flag to expand decimal values to fit AVRO schema
> 
>
> Key: SQOOP-2976
> URL: https://issues.apache.org/jira/browse/SQOOP-2976
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.6
>Reporter: Thomas Scott
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-2976.patch, SQOOP-2976.patch
>
>
> As per https://issues.apache.org/jira/browse/AVRO-1864 when importing from 
> Oracle (or any other database that truncates decimals) Sqoop jobs can fail 
> because the scale of the decimal produced by the database does not match the 
> scale in the AVRO file.
> For instance, if the value 3.15 is produced by Oracle and the AVRO decimal 
> scale is 3 (this can happen even if the Oracle column is defined with a scale 
> of 3), then the job will fail.
> Can we have a flag (--pad-decimals) that pads incoming values with zeros to 
> fit the AVRO schema (e.g. 3.15 becomes 3.150)?
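> For illustration, a minimal sketch of the padding idea using BigDecimal, 
> rescaling the incoming value to the Avro schema's scale (not the flag's 
> actual code path):
> {code:java}
> import java.math.BigDecimal;
> import java.math.RoundingMode;
> 
> public class DecimalPadSketch {
>   public static void main(String[] args) {
>     BigDecimal fromDb = new BigDecimal("3.15");
>     int avroScale = 3;
>     // Pads with zeros; RoundingMode.UNNECESSARY guarantees the value itself
>     // is never changed (it would throw if rounding were required).
>     BigDecimal padded = fromDb.setScale(avroScale, RoundingMode.UNNECESSARY);
>     System.out.println(padded); // 3.150
>   }
> }
> {code}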



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3293) Document SQOOP-2976

2018-03-21 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3293?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408058#comment-16408058
 ] 

Hudson commented on SQOOP-3293:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1151 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1151/])
SQOOP-3293: Document SQOOP-2976 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=d57f9fb06b55650adc75cd1972df0024d7e4dba1])
* (edit) src/docs/user/import.txt
* (edit) COMPILING.txt


> Document SQOOP-2976
> ---
>
> Key: SQOOP-3293
> URL: https://issues.apache.org/jira/browse/SQOOP-3293
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3292.1.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3288) Incremental import's upper bound ignores session time zone in Oracle

2018-02-23 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3288?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16374158#comment-16374158
 ] 

Hudson commented on SQOOP-3288:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1150 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1150/])
SQOOP-3288: Changing OracleManager to use CURRENT_TIMESTAMP instead of (maugli: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=a7f5e0d298ffbf8e674bd35ee10f2accc1da5453])
* (edit) src/java/org/apache/sqoop/manager/OracleManager.java


> Incremental import's upper bound ignores session time zone in Oracle
> 
>
> Key: SQOOP-3288
> URL: https://issues.apache.org/jira/browse/SQOOP-3288
> Project: Sqoop
>  Issue Type: Bug
>  Components: connectors/oracle
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3288.1.patch
>
>
> At the moment we're using [{{SELECT SYSDATE FROM 
> dual}}|https://github.com/apache/sqoop/blob/3153c3610da7e5db388bfb14f3681d308e9e89c6/src/java/org/apache/sqoop/manager/OracleManager.java#L652]
>  when getting the current time from Oracle.
> SYSDATE returns the underlying operating system's current time, while 
> CURRENT_TIMESTAMP uses the session time zone. This can lead to problems 
> during incremental imports *when Oracle's time zone is different from the OS's*.
> Consider the following scenario, where Oracle is configured to {{+0:00}} while 
> the OS is at {{+5:00}}:
> ||Oracle time||OS time||Event||
> |2:00|7:00|{{sqoop import --last-value 1:00 ...}} => imports {{[1:00, 7:00)}}|
> |2:30|7:30|{{update ... set last_updated = current_timestamp ...}} => set to 
> {{2:30}} *Won't be imported!*|
> |3:00|8:00|{{sqoop import --last-value 7:00 ...}} => imports {{[7:00, 8:00)}}|
> This way, records updated within 5 hours after the last Sqoop import won't get 
> imported.
> Please note that the example above assumes that the user/administrator 
> who's updating the Oracle table uses Oracle's current session time 
> when setting the "last updated" column of the table.
> I think the solution is to use CURRENT_TIMESTAMP instead of SYSDATE. Other 
> connection managers, such as MySQL and PostgreSQL, already use it.
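> As a hedged illustration of the proposed direction (the real change lives in 
> {{OracleManager}}; the method below is only a sketch), the query simply switches 
> from SYSDATE to CURRENT_TIMESTAMP:
> {code:java}
> import java.sql.Connection;
> import java.sql.ResultSet;
> import java.sql.SQLException;
> import java.sql.Statement;
> import java.sql.Timestamp;
> 
> public class OracleCurrentTimestampSketch {
>   // Read the database's notion of "now" using the session time zone
>   // (CURRENT_TIMESTAMP) instead of the OS clock behind SYSDATE.
>   public static Timestamp currentDbTimestamp(Connection conn) throws SQLException {
>     try (Statement stmt = conn.createStatement();
>          ResultSet rs = stmt.executeQuery("SELECT CURRENT_TIMESTAMP FROM dual")) {
>       return rs.next() ? rs.getTimestamp(1) : null;
>     }
>   }
> }
> {code}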



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3267) Incremental import to HBase deletes only last version of column

2018-02-22 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3267?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16372867#comment-16372867
 ] 

Hudson commented on SQOOP-3267:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1149 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1149/])
SQOOP-3267: Incremental import to HBase deletes only last version of (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=69463f0b3ed3af28581202ef59079b9df7bc0bad])
* (edit) src/docs/user/hbase.txt
* (edit) src/java/org/apache/sqoop/hbase/HBasePutProcessor.java
* (edit) src/java/org/apache/sqoop/tool/BaseSqoopTool.java
* (edit) src/java/org/apache/sqoop/hbase/ToStringPutTransformer.java
* (edit) src/docs/man/hbase-args.txt
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/test/org/apache/sqoop/hbase/HBaseTestCase.java
* (edit) src/java/org/apache/sqoop/mapreduce/HBaseImportJob.java
* (edit) src/test/org/apache/sqoop/hbase/HBaseImportTest.java
* (edit) src/docs/user/hbase-args.txt
* (edit) src/test/org/apache/sqoop/TestSqoopOptions.java


> Incremental import to HBase deletes only last version of column
> ---
>
> Key: SQOOP-3267
> URL: https://issues.apache.org/jira/browse/SQOOP-3267
> Project: Sqoop
>  Issue Type: Bug
>  Components: hbase-integration
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3267.1.patch, SQOOP-3267.2.patch
>
>
> Deletes have been supported since SQOOP-3149, but we only delete the last 
> version of a column when the corresponding cell is set to NULL in the source 
> table.
> This can lead to unexpected and misleading results if the row has been 
> transferred multiple times, which can easily happen if it's being modified on 
> the source side.
> Also, SQOOP-3149 uses a new Put command for every column instead of a 
> single Put per row as before. This could probably lead to a performance drop 
> for wide tables (for which HBase is otherwise usually recommended).
> [~jilani], [~anna.szonyi], could you please comment on what you think the 
> expected behavior should be here?
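> For reference, the HBase client API already distinguishes the two delete behaviours 
> discussed here; a minimal illustrative sketch (row, family and column names are made up), 
> not the Sqoop code itself:
> {code:java}
> import org.apache.hadoop.hbase.client.Delete;
> import org.apache.hadoop.hbase.client.Put;
> import org.apache.hadoop.hbase.util.Bytes;
> 
> public class HBaseVersionedDeleteSketch {
>   public static void main(String[] args) {
>     byte[] row = Bytes.toBytes("row1");
>     byte[] cf = Bytes.toBytes("cf");
> 
>     // One Put per row carrying every column, instead of one Put per column.
>     Put put = new Put(row)
>         .addColumn(cf, Bytes.toBytes("col1"), Bytes.toBytes("v1"))
>         .addColumn(cf, Bytes.toBytes("col2"), Bytes.toBytes("v2"));
> 
>     Delete delete = new Delete(row);
>     delete.addColumn(cf, Bytes.toBytes("col1"));   // deletes only the latest version
>     delete.addColumns(cf, Bytes.toBytes("col2"));  // deletes all versions of the column
>   }
> }
> {code}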



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3283) MySQL thirdparty tests hang if there's no USER environment variable

2018-02-14 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16364622#comment-16364622
 ] 

Hudson commented on SQOOP-3283:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1148 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1148/])
SQOOP-3283: Fixing MySQL 3rd party test hanging issue by getting (maugli: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=3153c3610da7e5db388bfb14f3681d308e9e89c6])
* (edit) src/test/org/apache/sqoop/manager/mysql/MySQLTestUtils.java


> MySQL thirdparty tests hang if there's no USER environment variable
> ---
>
> Key: SQOOP-3283
> URL: https://issues.apache.org/jira/browse/SQOOP-3283
> Project: Sqoop
>  Issue Type: Bug
>  Components: connectors/mysql, test
>Affects Versions: 1.4.7
>Reporter: Daniel Voros
>Assignee: Daniel Voros
>Priority: Minor
> Fix For: 1.5.0
>
> Attachments: SQOOP-3283.1.patch, SQOOP-3283.2.patch
>
>
> {{org.apache.sqoop.manager.mysql.MySQLTestUtils#getCurrentUser()}} executes 
> {{whoami}} in a subprocess if there's no USER environment variable (this happened 
> to me while running tests from Docker). However, it waits for the Process 
> variable to become null, which never happens:
> {code:java}
> // wait for whoami to exit.
> while (p != null) {
>   try {
> int ret = p.waitFor();
> if (0 != ret) {
>   LOG.error("whoami exited with error status " + ret);
>   // suppress original return value from this method.
>   return null;
> }
>   } catch (InterruptedException ie) {
> continue; // loop around.
>   }
> }
> {code}
> We could get rid of the while loop, since {{Process#waitFor()}} already blocks 
> until the process completes.
> Note that it's easy to work around the issue by setting the USER environment 
> variable when running the tests.
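> A minimal sketch of the simplification suggested above (reusing the {{p}} and {{LOG}} 
> variables from the snippet); illustrative only, not necessarily the committed fix:
> {code:java}
> // Process#waitFor() already blocks until the subprocess exits,
> // so no surrounding while loop is needed.
> try {
>   int ret = p.waitFor();
>   if (0 != ret) {
>     LOG.error("whoami exited with error status " + ret);
>     // suppress original return value from this method.
>     return null;
>   }
> } catch (InterruptedException ie) {
>   Thread.currentThread().interrupt();
>   LOG.error("Interrupted while waiting for whoami", ie);
>   return null;
> }
> {code}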



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-2976) Flag to expand decimal values to fit AVRO schema

2018-02-14 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16364122#comment-16364122
 ] 

Hudson commented on SQOOP-2976:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1146 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1146/])
SQOOP-2976: Flag to expand decimal values to fit AVRO schema (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f7b460b3f57c1bc81e2e0a1e8c28a331729f4213])
* (delete) src/test/org/apache/sqoop/testutil/ArgumentUtils.java
* (edit) src/test/org/apache/sqoop/TestAvroImport.java
* (edit) 
src/test/org/apache/sqoop/metastore/TestMetastoreConfigurationParameters.java
* (add) 
src/test/org/apache/sqoop/manager/sqlserver/SQLServerAvroPaddingImportTest.java
* (edit) src/java/org/apache/sqoop/avro/AvroUtil.java
* (add) src/test/org/apache/sqoop/manager/hsqldb/TestHsqldbAvroPadding.java
* (edit) src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java
* (edit) src/java/org/apache/sqoop/mapreduce/AvroImportMapper.java
* (edit) src/java/org/apache/sqoop/config/ConfigurationConstants.java
* (add) 
src/test/org/apache/sqoop/manager/oracle/OracleAvroPaddingImportTest.java
* (add) src/test/org/apache/sqoop/testutil/ArgumentArrayBuilder.java
* (add) src/test/org/apache/sqoop/testutil/AvroTestUtils.java


> Flag to expand decimal values to fit AVRO schema
> 
>
> Key: SQOOP-2976
> URL: https://issues.apache.org/jira/browse/SQOOP-2976
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.6
>Reporter: Thomas Scott
>Assignee: Ferenc Szabo
>Priority: Major
> Attachments: SQOOP-2976.patch, SQOOP-2976.patch
>
>
> As per https://issues.apache.org/jira/browse/AVRO-1864, when importing from 
> Oracle (or any other database that truncates decimals) Sqoop jobs can fail 
> because the scale of the decimal produced by the database does not match the 
> scale in the AVRO schema.
> For instance, if the value 3.15 is produced by Oracle and the AVRO decimal 
> scale is 3 (this can happen even if the Oracle column is defined with a scale 
> of 3), then the job will fail.
> Can we have a flag (--pad-decimals) that pads incoming values with zeros to 
> fit the AVRO schema (e.g. 3.15 becomes 3.150)?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3275) HBase test cases should start mini DFS cluster as well

2018-01-08 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3275?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16316648#comment-16316648
 ] 

Hudson commented on SQOOP-3275:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1144 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1144/])
SQOOP-3275: HBase test cases should start mini DFS cluster as well (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=bf8b8a6acd9f69c01c1bc8e42a2070f5cc596ccd])
* (edit) ivy.xml
* (edit) src/test/com/cloudera/sqoop/hbase/HBaseTestCase.java


> HBase test cases should start mini DFS cluster as well
> --
>
> Key: SQOOP-3275
> URL: https://issues.apache.org/jira/browse/SQOOP-3275
> Project: Sqoop
>  Issue Type: Test
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
> Attachments: SQOOP-3275.patch
>
>
> The HBase test cases in Sqoop start mini HBase and mini ZooKeeper clusters, 
> but they do not start a mini DFS cluster.
> The recommended way to use the mini HBase cluster is to back it with the mini DFS 
> cluster too, so it would be good if Sqoop started it as well.
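> For illustration, {{HBaseTestingUtility}} can bring up all three mini clusters together; 
> a minimal sketch of that pattern (not the actual patch):
> {code:java}
> import org.apache.hadoop.hbase.HBaseTestingUtility;
> 
> public class MiniClusterSketch {
>   public static void main(String[] args) throws Exception {
>     HBaseTestingUtility util = new HBaseTestingUtility();
>     // startMiniCluster() brings up a mini DFS, mini ZooKeeper and mini HBase
>     // cluster together, so HBase is backed by HDFS rather than the local FS.
>     util.startMiniCluster();
>     try {
>       // ... run HBase-backed test logic here ...
>     } finally {
>       util.shutdownMiniCluster();
>     }
>   }
> }
> {code}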



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (SQOOP-3255) Sqoop ignores metastore properties defined in sqoop-site.xml

2018-01-05 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3255?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16313325#comment-16313325
 ] 

Hudson commented on SQOOP-3255:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1143 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1143/])
SQOOP-3255: Sqoop ignores metastore properties defined in sqoop-site.xml (bogi: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=010042dd47296e0132944cba146c077754b15195])
* (edit) src/java/org/apache/sqoop/tool/JobTool.java
* (edit) src/test/com/cloudera/sqoop/TestIncrementalImport.java
* (edit) src/java/org/apache/sqoop/metastore/GenericJobStorage.java
* (edit) src/test/com/cloudera/sqoop/metastore/SavedJobsTestBase.java
* (add) src/java/org/apache/sqoop/metastore/AutoGenericJobStorage.java
* (add) src/test/org/apache/sqoop/metastore/TestAutoGenericJobStorage.java
* (add) src/test/org/apache/sqoop/testutil/ArgumentUtils.java
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (add) 
src/test/com/cloudera/sqoop/metastore/TestMetastoreConfigurationParameters.java
* (edit) src/test/com/cloudera/sqoop/testutil/HsqldbTestServer.java
* (add) src/test/org/apache/sqoop/metastore/TestGenericJobStorageValidate.java
* (edit) src/java/org/apache/sqoop/metastore/JobStorageFactory.java
* (add) src/test/org/apache/sqoop/metastore/TestGenericJobStorage.java
* (add) src/test/org/apache/sqoop/testutil/Argument.java


> Sqoop ignores metastore properties defined in sqoop-site.xml
> 
>
> Key: SQOOP-3255
> URL: https://issues.apache.org/jira/browse/SQOOP-3255
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.5.0
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
> Attachments: SQOOP-3255.patch, SQOOP-3255.patch, SQOOP-3255.patch
>
>
> Sqoop ignores the following configuration parameters defined in 
> sqoop-site.xml:
> sqoop.metastore.client.autoconnect.url
> sqoop.metastore.client.autoconnect.username
> sqoop.metastore.client.autoconnect.password
> These parameters are ignored even if they are specified with the -D option for 
> Sqoop.
> This bug was introduced in SQOOP-3216; the task is to restore the original 
> behavior and continue supporting these parameters.
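> As a hedged illustration of the expected behaviour (not the actual JobStorage code): these 
> keys are ordinary Hadoop configuration properties, so values coming from sqoop-site.xml or 
> from -D should both be visible through the same Configuration object:
> {code:java}
> import org.apache.hadoop.conf.Configuration;
> 
> public class MetastoreConfigSketch {
>   public static void main(String[] args) {
>     Configuration conf = new Configuration();
>     // Loads sqoop-site.xml from the classpath, if present; -D values passed
>     // through the generic options end up in the same Configuration.
>     conf.addResource("sqoop-site.xml");
> 
>     String url = conf.get("sqoop.metastore.client.autoconnect.url");
>     String user = conf.get("sqoop.metastore.client.autoconnect.username");
> 
>     System.out.println("metastore autoconnect url = " + url + ", user = " + user);
>   }
> }
> {code}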



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (SQOOP-3241) ImportAllTablesTool uses the same SqoopOptions object for every table import

2018-01-05 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16313200#comment-16313200
 ] 

Hudson commented on SQOOP-3241:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1142 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1142/])
SQOOP-3241: ImportAllTablesTool uses the same SqoopOptions object for (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=7ef6a8013a99738e38a41b20f1f0edccc927a9b8])
* (edit) src/java/org/apache/sqoop/tool/ImportAllTablesTool.java
* (edit) src/java/org/apache/sqoop/SqoopOptions.java
* (edit) src/test/org/apache/sqoop/TestSqoopOptions.java
* (edit) ivy.xml
* (edit) ivy/libraries.properties


> ImportAllTablesTool uses the same SqoopOptions object for every table import
> 
>
> Key: SQOOP-3241
> URL: https://issues.apache.org/jira/browse/SQOOP-3241
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Szabolcs Vasas
>Assignee: Ferenc Szabo
> Attachments: SQOOP-3241.patch, SQOOP-3241.patch
>
>
> ImportAllTablesTool queries the list of tables from the database and invokes 
> ImportTool#importTable method for each table.
> The problem is that it passes the same SqoopOptions object in every 
> invocation, and since SqoopOptions is not immutable, this can lead to issues.
> For example, in the case of Parquet imports, the CodeGenTool#generateORM method 
> modifies the className field of the SqoopOptions object, which then remains 
> the same for all subsequent table imports and can cause job failures.
> One solution could be to create a new SqoopOptions object with the same field 
> values for every ImportTool#importTable invocation.
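> One possible shape of that suggestion, sketched as a fragment of ImportAllTablesTool#run; 
> the cloning step and the single-argument importTable call are simplifying assumptions, 
> not the committed patch:
> {code:java}
> // Give every table import its own copy of the options, so per-table
> // mutations (e.g. the generated class name) cannot leak into later imports.
> for (String tableName : tables) {
>   SqoopOptions perTableOptions = (SqoopOptions) options.clone();
>   perTableOptions.setTableName(tableName);
>   importTool.importTable(perTableOptions);
> }
> {code}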



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (SQOOP-3246) Update 1.4.7 change log entries + updated the related JIRA tasks (patch available + TODO tasks) by moving them to next version (1.5.0)

2017-12-19 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16297446#comment-16297446
 ] 

Hudson commented on SQOOP-3246:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1141 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1141/])
SQOOP-3246: Update 1.4.7 change log entries + updated the related JIRA (maugli: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=735389f7bd07faa98114da4fe66dccee365cbf2f])
* (edit) CHANGELOG.txt


> Update 1.4.7 change log entries + updated the related JIRA tasks (patch 
> available + TODO tasks) by moving them to next version (1.5.0)
> --
>
> Key: SQOOP-3246
> URL: https://issues.apache.org/jira/browse/SQOOP-3246
> Project: Sqoop
>  Issue Type: Sub-task
>Reporter: Attila Szabo
>Assignee: Attila Szabo
> Fix For: no-release
>
> Attachments: SQOOP-3246.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

