Re: VOTE: Move Apache Sqoop to attic

2021-05-16 Thread Boglarka Egyed
Hi Venkat,

Thanks for initiating the community survey and this vote thread.

Based on the activity in the last couple of years, here is my +1

Regards,
Bogi

Venkat wrote (on Sat, May 15, 2021, 1:42):

> Dear Sqoop PMCs,
>
> More than a week ago, I sent an email [1] requesting suggestions for
> roadmap items and contributions from the Sqoop community.  Since we
> have not been successful in eliciting roadmap or contribution feedback,
> I am proposing that we move the Apache Sqoop PMC to the Apache Attic.
>
> One of the requirements [2] in the process of moving to the attic is that
> we conduct the PMC vote on the public dev list.  I would like the
> PMCs to cast their votes in this thread.  The voting will end on May
> 17th, 2021 at 5 PM PST.
>
> [+1] Move to Apache Attic
> [0] No objection/No opinion.
> [-1] Do NOT move to Apache Attic
>
> Here is my +1
>
> Thanks
>
> Venkat
> [1] - https://s.apache.org/nvs0i
> [2] - https://attic.apache.org/process.html
>


[jira] [Commented] (SQOOP-3455) Sqoop job fails while importing to S3 as Parquet

2020-05-19 Thread Boglarka Egyed (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3455?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17110959#comment-17110959
 ] 

Boglarka Egyed commented on SQOOP-3455:
---

Hi [~shailendra...@gmail.com], there is no active effort around putting 
together a next release at the moment, so I'm not able to provide any ETA. 
If it is urgent, this may be worth a question on the dev mailing list, to see 
if there is someone in the community who would take the lead on a next release.

> Sqoop job fails while importing to S3 as Parquet
> 
>
> Key: SQOOP-3455
> URL: https://issues.apache.org/jira/browse/SQOOP-3455
> Project: Sqoop
>  Issue Type: Bug
>  Components: sqoop2-kite-connector
>Affects Versions: 1.4.7
>Reporter: Kriti Jha
>Priority: Blocker
>
> A Sqoop job to import data from a MySQL database into S3 fails on using 
> --as-parquetfile with the error as shown below:
> 
> {code}
> ERROR sqoop.Sqoop: Got exception running Sqoop: 
> org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://sqoop-trial-bucket/sqoop-trial/trial
> Check that JARs for s3 datasets are on the classpath
> org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://sqoop-trial-bucket/sqoop-trial/trial
> Check that JARs for s3 datasets are on the classpath
> 	at org.kitesdk.data.spi.Registration.lookupDatasetUri(Registration.java:128)
> 	at org.kitesdk.data.Datasets.exists(Datasets.java:624)
> 	at org.kitesdk.data.Datasets.exists(Datasets.java:646)
> 	at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:118)
> 	at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:132)
> 	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:264)
> 	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
> 	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
> 	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
> 	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> {code}
> All the JARs for S3 are present in the classpath. Further, the same import 
> works on simply removing the argument --as-parquetfile, i.e. with any other 
> format.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (SQOOP-3462) Sqoop ant build fails due to outdated maven repo URLs

2020-01-17 Thread Boglarka Egyed (Jira)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3462:
-

Assignee: (was: Boglarka Egyed)

> Sqoop ant build fails due to outdated maven repo URLs
> -
>
> Key: SQOOP-3462
> URL: https://issues.apache.org/jira/browse/SQOOP-3462
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Reporter: Istvan Toth
>Priority: Critical
> Attachments: SQOOP-3462.v1.patch
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Sqoop can no longer be built with ant, as the maven central repos no longer 
> support HTTP.
> {{[ivy:resolve] SERVER ERROR: HTTPS Required 
> url=http://repo1.maven.org/maven2/org/apache/avro/avro/1.8.1/avro-1.8.1.pom}}
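The underlying fix is just moving the resolver roots to HTTPS. A self-contained demonstration of the rewrite (the real file is the build's Ivy settings, path assumed to be ivy/ivysettings.xml; here it runs on a throwaway copy):

```shell
# Demonstrate the required change on a scratch copy: Maven Central now
# rejects plain HTTP, so Ivy resolver roots must use https://.
cat > /tmp/ivysettings-snippet.xml <<'EOF'
<property name="repo.maven.org" value="http://repo1.maven.org/maven2/" override="false"/>
EOF
sed -i 's|http://repo1\.maven\.org|https://repo1.maven.org|g' /tmp/ivysettings-snippet.xml
grep 'repo1.maven.org' /tmp/ivysettings-snippet.xml
```

The attached SQOOP-3462.v1.patch presumably makes the equivalent edit directly in the checked-in settings file.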





[jira] [Assigned] (SQOOP-3462) Sqoop ant build fails due to outdated maven repo URLs

2020-01-17 Thread Boglarka Egyed (Jira)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3462:
-

Assignee: Boglarka Egyed

> Sqoop ant build fails due to outdated maven repo URLs
> -
>
> Key: SQOOP-3462
> URL: https://issues.apache.org/jira/browse/SQOOP-3462
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Reporter: Istvan Toth
>    Assignee: Boglarka Egyed
>Priority: Critical
> Attachments: SQOOP-3462.v1.patch
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Sqoop can no longer be built with ant, as the maven central repos no longer 
> support HTTP.
> {{[ivy:resolve] SERVER ERROR: HTTPS Required 
> url=http://repo1.maven.org/maven2/org/apache/avro/avro/1.8.1/avro-1.8.1.pom}}





[jira] [Commented] (SQOOP-3455) Sqoop job fails while importing to S3 as Parquet

2019-11-04 Thread Boglarka Egyed (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3455?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16966604#comment-16966604
 ] 

Boglarka Egyed commented on SQOOP-3455:
---

[~kritijha] this is the same issue as SQOOP-3453.

Sqoop import into S3 is not supported in this version: it was introduced by 
SQOOP-3345 but has not been included in any official release yet. It does 
work on trunk, though. (Some use cases already work with 1.4.7 too, but these 
are not tested at all.)

Furthermore, Parquet import into S3 will require some more options to be set: 
traditionally Sqoop used the Kite SDK to read/write Parquet, which was 
replaced in SQOOP-3313 because of its many limitations. This change has also 
not been released yet but can be found on trunk.
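For context, a minimal sketch of what an S3 import looks like with the trunk-based support from SQOOP-3345 (bucket, database, and credential details below are placeholders, not taken from this ticket):

```shell
# Hedged sketch: S3 import as supported on trunk (SQOOP-3345), writing via
# the s3a filesystem instead of Kite's dataset: URIs. All hostnames, paths,
# and names here are placeholders.
sqoop import \
  --connect jdbc:mysql://mysql.example.com/shop \
  --username user --password-file file:///home/user/.pw \
  --table trial \
  --target-dir s3a://example-bucket/sqoop-trial/trial \
  --as-parquetfile
```

On a 1.4.7 release this command path is expected to fail as described in this ticket, since both the S3 support and the Parquet rework exist only on trunk.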

> Sqoop job fails while importing to S3 as Parquet
> 
>
> Key: SQOOP-3455
> URL: https://issues.apache.org/jira/browse/SQOOP-3455
> Project: Sqoop
>  Issue Type: Bug
>  Components: sqoop2-kite-connector
>Affects Versions: 1.4.7
>Reporter: Kriti Jha
>Priority: Blocker
>
> A Sqoop job to import data from a MySQL database into S3 fails on using 
> --as-parquetfile with the error as shown below:
> 
> {code}
> ERROR sqoop.Sqoop: Got exception running Sqoop: 
> org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://sqoop-trial-bucket/sqoop-trial/trial
> Check that JARs for s3 datasets are on the classpath
> org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://sqoop-trial-bucket/sqoop-trial/trial
> Check that JARs for s3 datasets are on the classpath
> 	at org.kitesdk.data.spi.Registration.lookupDatasetUri(Registration.java:128)
> 	at org.kitesdk.data.Datasets.exists(Datasets.java:624)
> 	at org.kitesdk.data.Datasets.exists(Datasets.java:646)
> 	at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:118)
> 	at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:132)
> 	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:264)
> 	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
> 	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
> 	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
> 	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> {code}
> All the JARs for S3 are present in the classpath. Further, the same import 
> works on simply removing the argument --as-parquetfile, i.e. with any other 
> format.





[jira] [Commented] (SQOOP-3453) Kite sdk issue with sqoop version 1.4.7

2019-10-30 Thread Boglarka Egyed (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3453?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16962916#comment-16962916
 ] 

Boglarka Egyed commented on SQOOP-3453:
---

[~saiyam1712] Sqoop import into S3 is not supported in this version: it was 
introduced by SQOOP-3345 but has not been included in any official release 
yet. It does work on trunk, though.

Furthermore, Parquet import into S3 will require some more options to be set: 
traditionally Sqoop used the Kite SDK to read/write Parquet, which was 
replaced in SQOOP-3313 because of its many limitations. This change has also 
not been released yet but can be found on trunk.

> Kite sdk issue with sqoop version 1.4.7
> ---
>
> Key: SQOOP-3453
> URL: https://issues.apache.org/jira/browse/SQOOP-3453
> Project: Sqoop
>  Issue Type: Bug
>  Components: connectors/oracle
>Affects Versions: 1.4.7
>Reporter: Saiyam Agarwal
>Priority: Major
>
>  
> Trying to run sqoop import command with sqoop version 1.4.7 and kite sdk 
> version 1.1.0.
> {code:java}
> /usr/lib/sqoop/bin/sqoop import -Dmapreduce.map.memory.mb=2048 
> -Dmapred.fairscheduler.pool=sqoop-coop -Doraoop.timestamp.string=false 
> --connect jdbc:oracle:thin:@abc.net:1521/coopdvm --username abc 
> --password-file s3://abc/user/app.ab.core_etl/credentials/coopdvm_coop 
> --target-dir 's3://abc/dev/temp/TEST_MEIS/T1/master/timestamp=20191017103331' 
> --direct --as-parquetfile -z --map-column-java ID=Integer --query "select ID 
> from TEST_MEIS.T1 where \$CONDITIONS" -m 1 --verbose{code}
> Failing with below issue:
> {code:java}
> App > 19/10/23 05:58:04 DEBUG util.ClassLoaderStack: Restoring classloader: 
> sun.misc.Launcher$AppClassLoader@642c39d2
> App > 19/10/23 05:58:04 DEBUG manager.OracleManager$ConnCache: Caching 
> released connection for jdbc:oracle:thin:@abc:1521/coopdvm/coop
> App > 19/10/23 05:58:04 ERROR sqoop.Sqoop: Got exception running Sqoop: 
> org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://abc/dev/temp/TEST_MEIS/T1/master/timestamp=20191017103331
> App > Check that JARs for s3 datasets are on the classpath
> App > org.kitesdk.data.DatasetNotFoundException: Unknown dataset URI pattern: 
> dataset:s3://abc/dev/temp/TEST_MEIS/T1/master/timestamp=20191017103331
> App > Check that JARs for s3 datasets are on the classpath
> App > at 
> org.kitesdk.data.spi.Registration.lookupDatasetUri(Registration.java:128)
> App > at org.kitesdk.data.Datasets.exists(Datasets.java:624)
> App > at org.kitesdk.data.Datasets.exists(Datasets.java:646)
> App > at 
> org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:118)
> {code}
>  Can anyone help in debugging or suggest any workaround?
>  
>  





[jira] [Commented] (SQOOP-2903) Add Kudu connector for Sqoop

2019-10-10 Thread Boglarka Egyed (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-2903?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16948590#comment-16948590
 ] 

Boglarka Egyed commented on SQOOP-2903:
---

[~sanysand...@gmail.com], if [~sabhyankar] is OK with you taking the patch 
over, I think you just need to rework it against the trunk branch and then 
open a new pull request so that:
 * the community can review it
 * automated tests will run with the change

Thanks,
Bogi

> Add Kudu connector for Sqoop
> 
>
> Key: SQOOP-2903
> URL: https://issues.apache.org/jira/browse/SQOOP-2903
> Project: Sqoop
>  Issue Type: Improvement
>  Components: connectors
>Reporter: Sameer Abhyankar
>Assignee: Sameer Abhyankar
>Priority: Major
> Attachments: SQOOP-2903.1.patch, SQOOP-2903.2.patch, SQOOP-2903.patch
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Sqoop currently does not have a connector for Kudu. We should add the 
> functionality to allow Sqoop to ingest data directly into Kudu.





[jira] [Commented] (SQOOP-3449) Support for DB2 DECFLOAT data type when importing to HDFS/Hive

2019-09-06 Thread Boglarka Egyed (Jira)


[ 
https://issues.apache.org/jira/browse/SQOOP-3449?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16924145#comment-16924145
 ] 

Boglarka Egyed commented on SQOOP-3449:
---

Hi [~chiran54321],

Thanks for creating this JIRA. I have added you to the Contributors group and 
assigned this ticket to you.

For getting your change reviewed, please open a pull request at 
[https://github.com/apache/sqoop] against the trunk branch so that automated 
tests will run with your change.

Thank you,

Bogi

> Support for DB2 DECFLOAT data type when importing to HDFS/Hive
> --
>
> Key: SQOOP-3449
> URL: https://issues.apache.org/jira/browse/SQOOP-3449
> Project: Sqoop
>  Issue Type: Improvement
>  Components: connectors
>Affects Versions: 1.4.7
>Reporter: Chiran Ravani
>Assignee: Chiran Ravani
>Priority: Major
> Attachments: SQOOP-3449.patch
>
>
> The DB2 connector should add DECFLOAT data type support; otherwise you will 
> get an error that Hive does not support the SQL type.
> Example with column DESCRIPTION data type DECFLOAT:
> $SQOOP_HOME/bin/sqoop import --connect 
> 'jdbc:db2://xxx.svl.ibm.com:6/testdb1' --username db2inst1 --password 
> db2inst1 --hive-table product_ct --table dexfloattest
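Until such connector support lands, a common stopgap is forcing the column's Hive type on the command line via Sqoop's --map-column-hive option; a hedged sketch (connection details are placeholders, and whether this bypasses the DECFLOAT check on 1.4.7 has not been verified here):

```shell
# Hypothetical workaround sketch: explicitly map the DECFLOAT column so
# Hive schema generation never has to infer the unsupported SQL type.
# DOUBLE is lossy for DECFLOAT; STRING would preserve full precision.
sqoop import \
  --connect 'jdbc:db2://db2.example.com:60000/testdb1' \
  --username db2inst1 --password-file file:///home/db2inst1/.pw \
  --table DEXFLOATTEST \
  --map-column-hive DESCRIPTION=DOUBLE \
  --hive-import --hive-table product_ct
```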





[jira] [Assigned] (SQOOP-3449) Support for DB2 DECFLOAT data type when importing to HDFS/Hive

2019-09-06 Thread Boglarka Egyed (Jira)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3449:
-

Assignee: Chiran Ravani  (was: Boglarka Egyed)

> Support for DB2 DECFLOAT data type when importing to HDFS/Hive
> --
>
> Key: SQOOP-3449
> URL: https://issues.apache.org/jira/browse/SQOOP-3449
> Project: Sqoop
>  Issue Type: Improvement
>  Components: connectors
>Affects Versions: 1.4.7
>Reporter: Chiran Ravani
>Assignee: Chiran Ravani
>Priority: Major
> Attachments: SQOOP-3449.patch
>
>
> The DB2 connector should add DECFLOAT data type support; otherwise you will 
> get an error that Hive does not support the SQL type.
> Example with column DESCRIPTION data type DECFLOAT:
> $SQOOP_HOME/bin/sqoop import --connect 
> 'jdbc:db2://xxx.svl.ibm.com:6/testdb1' --username db2inst1 --password 
> db2inst1 --hive-table product_ct --table dexfloattest





[jira] [Assigned] (SQOOP-3449) Support for DB2 DECFLOAT data type when importing to HDFS/Hive

2019-09-06 Thread Boglarka Egyed (Jira)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3449:
-

Assignee: Boglarka Egyed

> Support for DB2 DECFLOAT data type when importing to HDFS/Hive
> --
>
> Key: SQOOP-3449
> URL: https://issues.apache.org/jira/browse/SQOOP-3449
> Project: Sqoop
>  Issue Type: Improvement
>  Components: connectors
>Affects Versions: 1.4.7
>Reporter: Chiran Ravani
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3449.patch
>
>
> The DB2 connector should add DECFLOAT data type support; otherwise you will 
> get an error that Hive does not support the SQL type.
> Example with column DESCRIPTION data type DECFLOAT:
> $SQOOP_HOME/bin/sqoop import --connect 
> 'jdbc:db2://xxx.svl.ibm.com:6/testdb1' --username db2inst1 --password 
> db2inst1 --hive-table product_ct --table dexfloattest





[jira] [Resolved] (SQOOP-3441) Prepare Sqoop for Java 11 support

2019-06-12 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed resolved SQOOP-3441.
---
   Resolution: Fixed
Fix Version/s: 1.5.0

Thanks [~fero] for this preparatory effort!

> Prepare Sqoop for Java 11 support
> -
>
> Key: SQOOP-3441
> URL: https://issues.apache.org/jira/browse/SQOOP-3441
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Fero Szabo
>Assignee: Fero Szabo
>Priority: Major
> Fix For: 1.5.0
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> A couple of code changes will be required in order for Sqoop to work with 
> Java 11, and we'll also have to bump a couple of dependencies and the Gradle 
> version.
> I'm not sure what's required for Ant; that is to be figured out in a separate 
> Jira, if we keep the Ant build.





[jira] [Resolved] (SQOOP-3438) Sqoop Import with create hcatalog table for ORC will not work with Hive3 as the table created would be a ACID table and transactional

2019-06-05 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3438?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed resolved SQOOP-3438.
---
   Resolution: Fixed
Fix Version/s: 1.5.0

Thank you [~dionusos] for your contribution! Your change is now merged to trunk.

> Sqoop Import with create hcatalog table for ORC will not work with Hive3 as 
> the table created would be a ACID table and transactional
> -
>
> Key: SQOOP-3438
> URL: https://issues.apache.org/jira/browse/SQOOP-3438
> Project: Sqoop
>  Issue Type: Improvement
>  Components: hive-integration
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Critical
> Fix For: 1.5.0
>
>  Time Spent: 3h 20m
>  Remaining Estimate: 0h
>
> PROBLEM: Running a sqoop import command with the option 
> --create-hcatalog-table will not work, for the following reasons:
> When --create-hcatalog-table is used, it creates the table as a managed ACID 
> table, and HCatalog does not support transactional or bucketed tables.
> So customers who need to create an ORC-based table cannot use Sqoop to 
> create it, which means any existing code that relies on Sqoop to create 
> these tables would fail.
> The current workaround is a two-step process:
> 1. Create the ORC table in Hive with the keyword external and set 
> transactional to false.
> 2. Then use the sqoop command to load the data into the ORC table.
> The request is to add an extra argument to the sqoop command line to specify 
> that the table is external (example: --hcatalog-external-table) so we can 
> use the option --hcatalog-storage-stanza "stored as orc tblproperties 
> (\"transactional\"=\"false\")".
> 
> Thank you [~mbalakrishnan] for your findings. This ticket is created based 
> on your work.
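The two-step workaround described in the ticket can be sketched concretely as follows (all connection strings, database, and table names are placeholders):

```shell
# Step 1: create the ORC table in Hive as external and non-transactional,
# so HCatalog can write into it.
beeline -u jdbc:hive2://hive.example.com:10000 -e '
  CREATE EXTERNAL TABLE shop.orders (id INT, name STRING)
  STORED AS ORC
  TBLPROPERTIES ("transactional"="false")'

# Step 2: load the data into the pre-created table with Sqoop.
sqoop import \
  --connect jdbc:oracle:thin:@db.example.com:1521/svc \
  --username user --password-file file:///home/user/.pw \
  --table ORDERS \
  --hcatalog-database shop --hcatalog-table orders
```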





[jira] [Commented] (SQOOP-3438) Sqoop Import with create hcatalog table for ORC will not work with Hive3 as the table created would be a ACID table and transactional

2019-05-10 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3438?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16837050#comment-16837050
 ] 

Boglarka Egyed commented on SQOOP-3438:
---

[~dionusos] I have reviewed your change, please find it at your pull request.

I think this change is more of an Improvement than a Bug; would you mind 
changing the type of this issue?

> Sqoop Import with create hcatalog table for ORC will not work with Hive3 as 
> the table created would be a ACID table and transactional
> -
>
> Key: SQOOP-3438
> URL: https://issues.apache.org/jira/browse/SQOOP-3438
> Project: Sqoop
>  Issue Type: Bug
>  Components: hive-integration
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Critical
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> PROBLEM: Running a sqoop import command with the option 
> --create-hcatalog-table will not work, for the following reasons:
> When --create-hcatalog-table is used, it creates the table as a managed ACID 
> table, and HCatalog does not support transactional or bucketed tables.
> So customers who need to create an ORC-based table cannot use Sqoop to 
> create it, which means any existing code that relies on Sqoop to create 
> these tables would fail.
> The current workaround is a two-step process:
> 1. Create the ORC table in Hive with the keyword external and set 
> transactional to false.
> 2. Then use the sqoop command to load the data into the ORC table.
> The request is to add an extra argument to the sqoop command line to specify 
> that the table is external (example: --hcatalog-external-table) so we can 
> use the option --hcatalog-storage-stanza "stored as orc tblproperties 
> (\"transactional\"=\"false\")".
> 
> Thank you [~mbalakrishnan] for your findings. This ticket is created based 
> on your work.





[jira] [Resolved] (SQOOP-3423) Let user pass password to connect Hive when it set to LDAP authentication

2019-05-06 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3423?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed resolved SQOOP-3423.
---
   Resolution: Fixed
Fix Version/s: 1.5.0

Thanks for your contribution [~dionusos]!

> Let user pass password to connect Hive when it set to LDAP authentication
> -
>
> Key: SQOOP-3423
> URL: https://issues.apache.org/jira/browse/SQOOP-3423
> Project: Sqoop
>  Issue Type: Improvement
>  Components: hive-integration
>Affects Versions: 1.4.7
>Reporter: Denes Bodo
>Assignee: Denes Bodo
>Priority: Major
> Fix For: 1.5.0
>
> Attachments: SQOOP-3423-001.patch
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> If HiveServer2 is using password based authentication, additional 
> username/password information has to be provided to be able to connect to it.





[jira] [Commented] (SQOOP-3429) Bump Sqoop to Hadoop 2.9.x

2019-02-22 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3429?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16775131#comment-16775131
 ] 

Boglarka Egyed commented on SQOOP-3429:
---

Hi [~Fokko],

I have added you to Contributors group so that you can assign tickets to 
yourself from now.

Thanks for opening this Jira, but please be aware that another effort 
regarding the upgrade to Hadoop 3 was started last year; please take a look 
at SQOOP-3305.

Is there any particular reason why you want to upgrade the Hadoop version to 
2.9.x?

Cheers,

Bogi

> Bump Sqoop to Hadoop 2.9.x
> --
>
> Key: SQOOP-3429
> URL: https://issues.apache.org/jira/browse/SQOOP-3429
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Fokko Driesprong
>Priority: Major
> Fix For: 1.5.0
>
>
> I would like to bump Sqoop to Hadoop 2.9.3





Re: Mandatory relocation of Sqoop git repository to gitbox

2018-12-11 Thread Boglarka Egyed
Hi All,

Thanks Szabolcs for driving this discussion!

+1 to start the migration in the initial phase.

Regards,
Bogi

On Mon, Dec 10, 2018 at 8:51 PM Attila Szabó  wrote:

> Hello everyone,
>
> +1
>
> @Szabi:
> Thanks for owning the consensus process!!!
>
> Cheers,
> Attila
>
> On Mon, Dec 10, 2018, 3:50 PM Fero Szabo 
> > Hi Szabi,
> >
> > +1 from my side to get this done in the initial phase!
> >
> > Best Regards,
> > Fero
> >
> > On Mon, Dec 10, 2018 at 2:29 PM Szabolcs Vasas  wrote:
> >
> > > Hi All,
> > >
> > > According to this email
> > > <http://mail-archives.apache.org/mod_mbox/incubator-general/201812.mbox/%3c6edbcae6-4eb9-6f61-beac-4198fd750...@apache.org%3E>
> > > the git-wip-us server, which hosts the Apache Sqoop git repository too,
> > > is going to be decommissioned soon and all the projects are going to be
> > > migrated to gitbox.
> > > For the detailed description of the planned change please refer to the
> > > linked email, but the bottom line is that after the migration we are
> > > going to be able to merge pull requests on the GitHub UI as well, which
> > > will greatly simplify our commit process.
> > >
> > > This relocation is mandatory; however, we have the option to execute it
> > > in the initial phase, which would be great in my opinion because we
> > > could start enjoying the benefits very soon.
> > >
> > > Please reply to this chain with your opinion, because we need a
> > > consensus to be able to start the migration in the initial phase.
> > >
> > > Thanks and regards,
> > > Szabolcs
> > >
> >
> >
> > --
> > *Ferenc Szabo* | Software Engineer
> > t. (+361) 701 1201 <+361+701+1201>
> > cloudera.com 
> >
> > --
> >
>


[jira] [Commented] (SQOOP-3410) Test S3 import with fs.s3a.security.credential.provider.path

2018-11-30 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16704947#comment-16704947
 ] 

Boglarka Egyed commented on SQOOP-3410:
---

Hey [~ste...@apache.org],

Thanks for pointing this out! Do you know the exact Hadoop version in which 
these bucket-specific properties have been introduced? Also, has 
fs.s3a.security.credential.provider.path been officially deprecated/removed in 
a newer release?

Thanks,
Bogi

> Test S3 import with fs.s3a.security.credential.provider.path
> 
>
> Key: SQOOP-3410
> URL: https://issues.apache.org/jira/browse/SQOOP-3410
> Project: Sqoop
>  Issue Type: Improvement
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3410.patch
>
>
> Based on 
> [https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Configure_the_hadoop.security.credential.provider.path_property]
>  property fs.s3a.security.credential.provider.path can also be used for 
> passing the location of the credential store. This should be also tested and 
> documented.
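As a sketch of the flow being tested (the credential store location and key values below are placeholders; the hadoop credential CLI is described at the Hadoop documentation page linked in the ticket):

```shell
# Put the S3 keys into a JCEKS credential store instead of plain config.
hadoop credential create fs.s3a.access.key -value EXAMPLE_ACCESS_KEY \
  -provider jceks://hdfs/user/alice/s3.jceks
hadoop credential create fs.s3a.secret.key -value EXAMPLE_SECRET_KEY \
  -provider jceks://hdfs/user/alice/s3.jceks

# Point the S3A connector at the store via the property under test.
sqoop import \
  -Dfs.s3a.security.credential.provider.path=jceks://hdfs/user/alice/s3.jceks \
  --connect jdbc:mysql://mysql.example.com/shop \
  --table trial \
  --target-dir s3a://example-bucket/trial
```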





Re: Next release

2018-11-29 Thread Boglarka Egyed
Hi All,

+1 for Fero as the release manager of the next release.

Here is the link for the previous discussion:
https://lists.apache.org/thread.html/f2b3477f4c904ec150c6046f2e230733b2a17c12136618734c3ec1c7@%3Cdev.sqoop.apache.org%3E
Based on it, Hadoop/Hive/HBase upgrades are missing from the original scope,
and as these can have serious dependencies on each other, it may not be a
trivial effort to get them done quickly.

I think a good starting point would be to catch up on
https://issues.apache.org/jira/browse/SQOOP-3305 that is being handled by
Daniel and got blocked by Hive and HBase related issues.

Regards,
Bogi

On Thu, Nov 29, 2018 at 5:00 PM Fero Szabo 
wrote:

> Hi Attila, Ferenc
>
> I would be happy to drive the next release!
>
> As for scoping, there is another thread that discusses this topic with the
> subject: Release to support Hadoop 3
>
> There is only one item missing from that original scope: upgrading Hadoop
> to 3.x. We'll need to make this upgrade in order to support the newest
> versions of Hive, HBase and Accumulo. I remember that we were waiting for a
> new release from HBase. HBase version 2.1.1 was released on 2018/10/31, but
> I'm not sure if / when / how the Hadoop upgrade will move forward and if
> this new HBase has what we need exactly.
>
> Best Regards,
> Fero
>
> On Tue, Nov 27, 2018 at 5:55 PM Attila Szabó  wrote:
>
> > Hi Ferenc,
> >
> > I did not have any specific plan for the end of the year, but as I did the
> > very last release I would be happy to help anyone who would like to drive
> > it (or if no one volunteers I might own it, but IMHO from a community POV
> > it would be better if someone else executed it this time).
> >
> > On the scoping front :
> > AFAIR there are a few tickets which seem to have been abandoned for a
> > while and are targeted to this release. A grooming pass there would be a
> > great start for the scoping.
> >
> > My2cents,
> > Attila
> >
> > On Tue, Nov 27, 2018, 2:10 PM Ferenc Szabo  >
> > > Hi sqoop developers,
> > >
> > > Do you have a plan for the next release?
> > >
> > > Regards
> > > Ferenc
> > >
> >
>
>
> --
> *Ferenc Szabo* | Software Engineer
> t. (+361) 701 1201 <+361+701+1201>
> cloudera.com 
>
> --
>


Re: Sqoop build infrastructure improvements

2018-11-29 Thread Boglarka Egyed
Thank you Szabolcs for driving these efforts!

+1 for using pull requests from now.

Thanks,
Bogi

On Wed, Nov 28, 2018 at 4:54 PM Szabolcs Vasas 
wrote:

> Dear Sqoop community,
>
> We have been working on quite a few exciting build infrastructure
> improvements recently, I am sending this email to summarize them.
>
> *Gradle can now execute all the Sqoop tests in a single JVM*
> This improvement makes the Gradle test tasks significantly faster since we
> do not have to start up a new JVM for every test class. It also made it
> possible to introduce fine-grained test categories, which were essential for
> parallelizing the test execution in our CI systems. For more
> information please refer to COMPILING.txt
> .
>
> *Apache Sqoop Jenkins job
>  now builds and tests with
> Gradle*
> Since our Gradle build became much more stable and faster it made sense to
> reconfigure our Jenkins job to benefit from these improvements. The job is
> faster now (~30 minutes instead of ~40) and it executes all of the tests
> which can be run without external RDBMS or cloud systems (while the old Ant
> based job executed the unit test suite only).
>
> *Travis CI is enabled for Apache Sqoop*
> The new Travis CI job  now runs for
> every commit and every pull request on Apache Sqoop GitHub repository and
> it executes all of the tests except the Oracle third party test cases. One
> of the biggest benefit of Travis CI is that it can be really easily
> configured for the individual forks as well so contributors get a well
> configured CI job for their own feature branches for free. For more
> information please refer to COMPILING.txt
> .
>
>
> Since we have a CI job now which integrates very well with GitHub pull
> requests, I suggest deprecating the old Review Board and patch-file-based
> contribution process and using pull requests in the future. We had a mail
> chain about the same proposal last year and it seemed that the community
> was happy about the idea, so I think we can evaluate it for some time and,
> if everything goes well, update our how-to-contribute wiki.
>
> Feel free to reply to this chain with your questions and suggestions on the
> above!
>
> Regards,
> Szabolcs
>


[jira] [Resolved] (SQOOP-3414) Introduce a Gradle build parameter to set the ignoreTestFailures of the test tasks

2018-11-28 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3414?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed resolved SQOOP-3414.
---
   Resolution: Fixed
Fix Version/s: 3.0.0

Thanks for improving our CI [~vasas]!

> Introduce a Gradle build parameter to set the ignoreTestFailures of the test 
> tasks
> --
>
> Key: SQOOP-3414
> URL: https://issues.apache.org/jira/browse/SQOOP-3414
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: test_with_ignoreTestFailures=true.txt, 
> test_without_ignoreTestFailures.txt
>
>
> The 
> [ignoreFailures|https://docs.gradle.org/current/dsl/org.gradle.api.tasks.testing.Test.html#org.gradle.api.tasks.testing.Test:ignoreFailures]
>  parameter of the Gradle test tasks is set to false, which means that if a 
> Gradle test task fails, the Gradle
> process returns a non-zero exit code. In some CI tools (e.g. Jenkins) this makes 
> the status of the job red instead of yellow,
> which usually indicates a more serious issue than a test failure.
> I would like to introduce a build parameter to be able to set this property of the 
> test tasks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
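A build parameter like the one in SQOOP-3414 could be wired into build.gradle roughly as follows. This is a sketch, not the committed implementation; the system-property name is taken from the ticket title and the default of false matches the behavior described above:

```groovy
// Sketch: pass -DignoreTestFailures=true to make the Gradle invocation
// return zero even when tests fail (CI job goes yellow instead of red).
tasks.withType(Test) {
    ignoreFailures = Boolean.parseBoolean(
        System.getProperty('ignoreTestFailures', 'false'))
}
```

It would then be invoked e.g. as `./gradlew -DignoreTestFailures=true test`.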


Review Request 69434: SQOOP-3410: Test S3 import with fs.s3a.security.credential.provider.path

2018-11-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69434/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3410
https://issues.apache.org/jira/browse/SQOOP-3410


Repository: sqoop-trunk


Description
---

Based on 
https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Configure_the_hadoop.security.credential.provider.path_property
 the property fs.s3a.security.credential.provider.path can also be used to pass 
the location of the credential store. This should also be tested and documented.
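As an illustration of the property under test, an import using a credential store might be invoked like this. This is a sketch with made-up paths, bucket and table names; the access key would be stored under fs.s3a.access.key the same way:

```shell
# Store the S3 secret key in a JCEKS credential store...
hadoop credential create fs.s3a.secret.key \
    -provider jceks://hdfs/user/example/s3.jceks

# ...then point the import at the store instead of passing plaintext keys.
sqoop import \
    -D fs.s3a.security.credential.provider.path=jceks://hdfs/user/example/s3.jceks \
    --connect jdbc:mysql://db.example.com/sales --table orders \
    --target-dir s3a://example-bucket/orders
```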


Diffs
-

  src/docs/user/s3.txt 6ff828c497e0711a2394f768ed5d61ecaf9ec273 
  src/java/org/apache/sqoop/util/password/CredentialProviderHelper.java 
4e79f0ae252969c4a426d1ff69072695eb37b7a6 
  src/test/org/apache/sqoop/credentials/TestPassingSecurePassword.java 
dca3195b8051048c5c7c2fb3bf30774e9d19eda8 
  src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java 
e1d7cbda2c65aa59a89715adff52b85fb3730477 


Diff: https://reviews.apache.org/r/69434/diff/1/


Testing
---

ant clean test
./gradlew -Ds3.bucket.url= 
-Ds3.generator.command= s3Test --tests 
TestS3ImportWithHadoopCredProvider

ant clean docs
./gradlew docs


Thanks,

Boglarka Egyed



[jira] [Updated] (SQOOP-3410) Test S3 import with fs.s3a.security.credential.provider.path

2018-11-22 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3410?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3410:
--
Attachment: SQOOP-3410.patch

> Test S3 import with fs.s3a.security.credential.provider.path
> 
>
> Key: SQOOP-3410
> URL: https://issues.apache.org/jira/browse/SQOOP-3410
> Project: Sqoop
>  Issue Type: Improvement
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3410.patch
>
>
> Based on 
> [https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Configure_the_hadoop.security.credential.provider.path_property]
>  the property fs.s3a.security.credential.provider.path can also be used to 
> pass the location of the credential store. This should also be tested and 
> documented.





[jira] [Created] (SQOOP-3410) Test S3 import with fs.s3a.security.credential.provider.path

2018-11-22 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3410:
-

 Summary: Test S3 import with 
fs.s3a.security.credential.provider.path
 Key: SQOOP-3410
 URL: https://issues.apache.org/jira/browse/SQOOP-3410
 Project: Sqoop
  Issue Type: Improvement
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed


Based on 
[https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Configure_the_hadoop.security.credential.provider.path_property]
 the property fs.s3a.security.credential.provider.path can also be used to pass 
the location of the credential store. This should also be tested and documented.





Re: Review Request 69429: Introduce a Gradle build parameter to set the default forkEvery value for the tests

2018-11-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69429/#review210798
---


Ship it!




Thanks Szabolcs for this improvement and the thorough documentation around it! 
Your change LGTM.

- Boglarka Egyed


On Nov. 22, 2018, 1:11 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69429/
> ---
> 
> (Updated Nov. 22, 2018, 1:11 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3408
> https://issues.apache.org/jira/browse/SQOOP-3408
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Documented forkEvery.default in COMPILING.txt.
> Needed to move the definition of the kerberizedTest task below 
> tasks.withType(Test) block to preserve forkEvery 1 setting.
> 
> 
> Diffs
> -
> 
>   COMPILING.txt 0383707f689102a3a543d94646cfaaf21710 
>   build.gradle 954935daeaaaf45e1b2fd83f74e11f5ed2d58377 
> 
> 
> Diff: https://reviews.apache.org/r/69429/diff/1/
> 
> 
> Testing
> ---
> 
> ./gradlew test : runs the test task with forkEvery=0
> ./gradlew -DforkEvery.default=5 test : runs the test task with forkEvery=5
> 
> ./gradlew kerberizedTest : runs the kerberizedTest task with forkEvery=1
> ./gradlew -DforkEvery.default=5 kerberizedTest : runs the kerberizedTest task 
> with forkEvery=1, so the forkEvery.default parameter does not affect 
> kerberizedTest
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>
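For reference, the arrangement described in this review — a generic forkEvery default plus a kerberizedTest task defined afterwards so it keeps forkEvery 1 — could be sketched in build.gradle like this. Names mirror the review; the exact committed code may differ:

```groovy
// Sketch: -DforkEvery.default=N sets how many test classes run per forked
// JVM (0 = one JVM for all); defining kerberizedTest below the generic
// block preserves its hard-coded forkEvery = 1.
tasks.withType(Test) {
    forkEvery = Integer.parseInt(System.getProperty('forkEvery.default', '0'))
}

task kerberizedTest(type: Test) {
    forkEvery = 1  // Kerberos tests need a fresh JVM per test class
}
```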



Re: Review Request 69407: Refactor: break up Parameterized tests on a per database basis

2018-11-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69407/#review210796
---


Ship it!




Ship It!

- Boglarka Egyed


On Nov. 22, 2018, 1:39 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69407/
> ---
> 
> (Updated Nov. 22, 2018, 1:39 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3405
> https://issues.apache.org/jira/browse/SQOOP-3405
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Breaking up the parameterized test classes on a per-database basis. This 
> provides better readability and is needed for proper test categorization (and 
> thus for Travis integration).
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/importjob/DatabaseAdapterFactory.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java af310cbe2 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 90b7cbbd3 
>   
> src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
>  4ad7defe1 
>   
> src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java
>  fbcbdebeb 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/MysqlSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/OracleSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/PostgresSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SqlServerSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/MSSQLServerDatabaseAdapter.java 
> 22567162d 
>   src/test/org/apache/sqoop/testutil/adapter/MySqlDatabaseAdapter.java 
> ebd014688 
> 
> 
> Diff: https://reviews.apache.org/r/69407/diff/7/
> 
> 
> Testing
> ---
> 
> unit and 3rd party tests.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



[jira] [Updated] (SQOOP-3409) Fix temporary rootdir clean up in Sqoop-S3 tests

2018-11-22 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3409?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3409:
--
Attachment: SQOOP-3409.patch

> Fix temporary rootdir clean up in Sqoop-S3 tests
> 
>
> Key: SQOOP-3409
> URL: https://issues.apache.org/jira/browse/SQOOP-3409
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3409.patch
>
>
> Temporary root directory cleanup doesn't work as expected: many generated 
> temp root dirs are left in the bucket after test runs. The cleanup 
> logic should be checked and fixed.





Review Request 69430: SQOOP-3409: Fix temporary rootdir clean up in Sqoop-S3 tests

2018-11-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69430/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3409
https://issues.apache.org/jira/browse/SQOOP-3409


Repository: sqoop-trunk


Description
---

Temporary root directory cleanup doesn't work as expected: many generated 
temp root dirs are left in the bucket after test runs. This was caused by the 
target directory cleanup and name reset happening before the temp root dir 
cleanup, even though the temp root dir name depends on the target dir name in 
the tests.
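The ordering bug described above can be illustrated with a stdlib-only sketch. All names here are made up for illustration (they are not Sqoop's actual test utilities): the temp root dir path is derived from the target dir name, so resetting the target dir before cleaning the temp root dir makes the cleanup compute the wrong path and leak the real directory.

```java
// Sketch of the ordering bug, with hypothetical names: tearDown() must
// remove the temp root dir *before* resetting the target dir name.
public class S3CleanupSketch {
    static String targetDirName = "orders";

    static String tempRootDir() {
        return "s3a://example-bucket/_sqoop_temp_" + targetDirName;
    }

    public static void main(String[] args) {
        String created = tempRootDir();   // dir actually written during the test
        targetDirName = "default";        // buggy order: reset happens first...
        String cleaned = tempRootDir();   // ...so cleanup computes a different path
        System.out.println(created.equals(cleaned)); // prints false: the dir leaks
    }
}
```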


Diffs
-

  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
2fc606115196a7a2b6088be104e2a421888f8798 


Diff: https://reviews.apache.org/r/69430/diff/1/


Testing
---

./gradlew s3Test -Ds3.bucket.url= 
-Ds3.generator.command=, all the used 
temprootdirs have been cleaned up


Thanks,

Boglarka Egyed



[jira] [Created] (SQOOP-3409) Fix temporary rootdir clean up in Sqoop-S3 tests

2018-11-22 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3409:
-

 Summary: Fix temporary rootdir clean up in Sqoop-S3 tests
 Key: SQOOP-3409
 URL: https://issues.apache.org/jira/browse/SQOOP-3409
 Project: Sqoop
  Issue Type: Task
Affects Versions: 1.4.7
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed


Temporary root directory cleanup doesn't work as expected: many generated 
temp root dirs are left in the bucket after test runs. The cleanup logic 
should be checked and fixed.





Re: Review Request 69407: Refactor: break up Parameterized tests on a per database basis

2018-11-22 Thread Boglarka Egyed


> On Nov. 22, 2018, 8:29 a.m., Boglarka Egyed wrote:
> > src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
> > Lines 24 (patched)
> > <https://reviews.apache.org/r/69407/diff/6/?file=2109532#file2109532line24>
> >
> > Renamed files are shown as new files now which compromises the diff. 
> > Could you please take a look and regenerate the diff?

Sorry, I wanted to say that it corrupts the diff.


- Boglarka


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69407/#review210785
---


On Nov. 21, 2018, 3 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69407/
> ---
> 
> (Updated Nov. 21, 2018, 3 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3405
> https://issues.apache.org/jira/browse/SQOOP-3405
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Breaking up the parameterized test classes on a per-database basis. This 
> provides better readability and is needed for proper test categorization (and 
> thus for Travis integration).
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/importjob/DatabaseAdapterFactory.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java af310cb 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 90b7cbb 
>   
> src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
>  4ad7def 
>   
> src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java
>  fbcbdeb 
>   
> src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/SqlServerImportJobTestConfiguration.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesImportTestBase.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/MysqlSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/OracleSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/PostgresSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SplitByImportTestBase.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SqlServerSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/MSSQLServerDatabaseAdapter.java 
> 2256716 
>   src/test/org/apache/sqoop/testutil/adapter/MySqlDatabaseAdapter.java 
> ebd0146 
>   src/test/org/apache/sqoop/testutil/adapter/MysqlDatabaseAdapter.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/SqlServerDatabaseAdapter.java 
> PRE-CREATION 
> 
> 
> Diff: https://reviews.apache.org/r/69407/diff/6/
> 
> 
> Testing
> ---
> 
> unit and 3rd party tests.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69407: Refactor: break up Parameterized tests on a per database basis

2018-11-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69407/#review210785
---




src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
Lines 24 (patched)
<https://reviews.apache.org/r/69407/#comment295545>

Renamed files are shown as new files now which compromises the diff. Could 
you please take a look and regenerate the diff?


- Boglarka Egyed


On Nov. 21, 2018, 3 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69407/
> ---
> 
> (Updated Nov. 21, 2018, 3 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3405
> https://issues.apache.org/jira/browse/SQOOP-3405
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Breaking up the parameterized test classes on a per-database basis. This 
> provides better readability and is needed for proper test categorization (and 
> thus for Travis integration).
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/importjob/DatabaseAdapterFactory.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java af310cb 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 90b7cbb 
>   
> src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
>  4ad7def 
>   
> src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java
>  fbcbdeb 
>   
> src/test/org/apache/sqoop/importjob/configuration/MysqlImportJobTestConfiguration.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/SqlServerImportJobTestConfiguration.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/NumericTypesImportTestBase.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/MysqlSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/OracleSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/PostgresSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SplitByImportTestBase.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SqlServerSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/MSSQLServerDatabaseAdapter.java 
> 2256716 
>   src/test/org/apache/sqoop/testutil/adapter/MySqlDatabaseAdapter.java 
> ebd0146 
>   src/test/org/apache/sqoop/testutil/adapter/MysqlDatabaseAdapter.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/SqlServerDatabaseAdapter.java 
> PRE-CREATION 
> 
> 
> Diff: https://reviews.apache.org/r/69407/diff/6/
> 
> 
> Testing
> ---
> 
> unit and 3rd party tests.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69407: Refactor: break up Parameterized tests on a per database basis

2018-11-21 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69407/#review210762
---


Fix it, then Ship it!




Hi Fero,

Your change generally looks good to me, I only have a minor finding.

Unit and 3rd party tests passed with your patch.

Cheers,
Bogi


src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
Lines 1 (patched)
<https://reviews.apache.org/r/69407/#comment295517>

Apache headers are missing from new files.


- Boglarka Egyed


On Nov. 21, 2018, 10:20 a.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69407/
> ---
> 
> (Updated Nov. 21, 2018, 10:20 a.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3405
> https://issues.apache.org/jira/browse/SQOOP-3405
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Breaking up the parameterized test classes on a per-database basis. This 
> provides better readability and is needed for proper test categorization (and 
> thus for Travis integration).
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/importjob/DatabaseAdapterFactory.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/NumericTypesImportTest.java af310cbe 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 90b7cbbd 
>   
> src/test/org/apache/sqoop/importjob/configuration/MSSQLServerImportJobTestConfiguration.java
>  4ad7defe 
>   
> src/test/org/apache/sqoop/importjob/configuration/MySQLImportJobTestConfiguration.java
>  fbcbdebe 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/MysqlNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/OracleNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/PostgresNumericTypesImportTest.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/numerictypes/SqlServerNumericTypesImportTest.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/MysqlSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/OracleSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/PostgresSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/importjob/splitby/SqlServerSplitByImportTest.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/adapter/MSSQLServerDatabaseAdapter.java 
> 22567162 
>   src/test/org/apache/sqoop/testutil/adapter/MySqlDatabaseAdapter.java 
> ebd01468 
> 
> 
> Diff: https://reviews.apache.org/r/69407/diff/5/
> 
> 
> Testing
> ---
> 
> unit and 3rd party tests.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69414: Sqoop should not try to execute test category interfaces as tests with Ant

2018-11-21 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69414/#review210757
---


Ship it!




Ship It!

- Boglarka Egyed


On Nov. 21, 2018, 12:22 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69414/
> ---
> 
> (Updated Nov. 21, 2018, 12:22 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3406
> https://issues.apache.org/jira/browse/SQOOP-3406
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> When the Ant third party test suite is run, Ant tries to execute the test 
> category interfaces too, because their names end with the 'Test' postfix.
> 
> These "tests" obviously fail so we need to make sure that Ant does not 
> execute them.
> 
> 
> Diffs
> -
> 
>   build.xml 995a513040f85b6c2043a977a09e93b56913bbed 
> 
> 
> Diff: https://reviews.apache.org/r/69414/diff/2/
> 
> 
> Testing
> ---
> 
> ant unit and third party test
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>



Re: Review Request 69413: Introduce methods instead of TEMP_BASE_DIR and LOCAL_WAREHOUSE_DIR static fields

2018-11-21 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69413/#review210755
---


Ship it!




Hi Szabolcs, 

Thanks for this fix! Unit and 3rd party tests ran successfully for me.

Cheers,
Bogi

- Boglarka Egyed


On Nov. 20, 2018, 5:29 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69413/
> ---
> 
> (Updated Nov. 20, 2018, 5:29 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3407
> https://issues.apache.org/jira/browse/SQOOP-3407
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> BaseSqoopTestCase.TEMP_BASE_DIR and BaseSqoopTestCase.LOCAL_WAREHOUSE_DIR are 
> public static fields which get initialized once at the JVM startup and store 
> the paths for the test temp and warehouse directories.
> 
> The problem is that HBase test cases change the value of the test.build.data 
> system property which can cause tests using these static fields to fail.
> 
> Since we do not own the code in HBase which changes the system property, we 
> need to turn these static fields into methods which evaluate the 
> test.build.data system property every time they are invoked, which will make 
> sure that the invoking tests succeed.
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/TestIncrementalImport.java 
> dbdd05c13e77af514bd996a92f7ebea3a27aedd5 
>   src/test/org/apache/sqoop/TestMerge.java 
> b283174b8b3df7c16c496795fcbae2f91dd1c375 
>   src/test/org/apache/sqoop/credentials/TestPassingSecurePassword.java 
> 9c1e9f9a93323655bc313303bf84d566b551ee00 
>   src/test/org/apache/sqoop/hbase/HBaseImportAddRowKeyTest.java 
> df1840b37ce29ffb303b31e1fcbfe4c5842e7c36 
>   src/test/org/apache/sqoop/io/TestSplittableBufferedWriter.java 
> 71d6971489e489ae501739fdad5a7409375b6ec1 
>   src/test/org/apache/sqoop/manager/sqlserver/SQLServerManagerImportTest.java 
> ea7942f62d623895f242e69e77cf9920bbb7e18c 
>   src/test/org/apache/sqoop/orm/TestClassWriter.java 
> 59a8908f13c51b9caca42e8602413ee0b8634b0a 
>   src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java 
> e23aad3ee997780e5708e9180550339d834b74d9 
> 
> 
> Diff: https://reviews.apache.org/r/69413/diff/1/
> 
> 
> Testing
> ---
> 
> Executed unit and third party tests.
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>
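The fix pattern from SQOOP-3407 above can be shown with a stdlib-only sketch. Class and method names here are illustrative, not Sqoop's actual code: a method re-reads the test.build.data system property on every call, so a change made mid-run (e.g. by HBase minicluster setup) is picked up, where a static field captured at class-load time would go stale.

```java
// Sketch: re-evaluate the system property on every invocation instead of
// caching it once in a public static field.
public class TestPathsSketch {

    static String getTempBaseDir() {
        return System.getProperty("test.build.data", "/tmp") + "/sqoop/";
    }

    public static void main(String[] args) {
        System.setProperty("test.build.data", "/tmp/run1");
        String before = getTempBaseDir();
        System.setProperty("test.build.data", "/tmp/run2"); // property changes mid-run
        String after = getTempBaseDir();
        // A field initialized at class load would still return the run1 path here.
        System.out.println(before.equals(after)); // prints false: the change is visible
    }
}
```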



Re: Review Request 69346: Categorize all tests in the project

2018-11-15 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69346/#review210577
---


Fix it, then Ship it!




Hi Szabolcs,

Thanks for this improvement, it makes Sqoop tests pretty neat!

I have checked all the categories and they seem to be OK for me. I have two 
minor findings, please find them below.

Thank you,
Bogi


COMPILING.txt
Lines 374-430 (patched)
<https://reviews.apache.org/r/69346/#comment295282>

Shouldn't we include S3 related properties here too, as those are part of the 
third party test suite as well?



build.gradle
Line 188 (original), 204 (patched)
<https://reviews.apache.org/r/69346/#comment295284>

Typo: This the same


- Boglarka Egyed


On Nov. 15, 2018, 1:40 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69346/
> ---
> 
> (Updated Nov. 15, 2018, 1:40 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3404
> https://issues.apache.org/jira/browse/SQOOP-3404
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> - All tests are categorized now
> - Introduced S3Test category as a subcategory of ThirdPartyTest
> - Reorganized test tasks: we have unitTest, integrationTest, kerberizedTest, 
> test, s3Test, allTest tasks now
> - jacocoTestReport task is fixed to contain the coverage information of the 
> kerberizedTest task too. This is needed because kerberizedTest needs the 
> forkEvery parameter set to 1 and therefore has to be a 
> separate task, which generates separate coverage information. However, it 
> is automatically triggered after the test task, so the invocation is more 
> convenient for the tester.
> 
> 
> Diffs
> -
> 
>   COMPILING.txt 835ba33b1e89158bed0e05698b188ab3323eb881 
>   build.gradle cb9eeca74bbf278c3e5fd15de608d8c37c917ddb 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 
> c6fe4f2e8a80c96ad667f4fe4a26510af96562dc 
>   src/test/org/apache/sqoop/manager/TestMainframeManager.java 
> c84f05f660c396a06a5031e00abdae77ffbcf2aa 
>   
> src/test/org/apache/sqoop/manager/oracle/TestOraOopDBInputSplitGetDebugDetails.java
>  6f33ad3b650436b7f268b4ef5bfd451bd5e6958e 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java
>  5e558717c0d43301ecbf81a37d5ee3fd35756d65 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableImportMapper.java
>  1a6943786834d27f27523f484d76cf678f18cf48 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetBinaryRecord.java
>  b4cba28c3611400b5c4227a5166b6c91e9152dc4 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileGdgEntryParser.java
>  521a04266e8806321fe7aa6a89c064f369174523 
>   src/test/org/apache/sqoop/s3/TestS3AvroImport.java 
> 7f5f5d62c5cab10f932aa22c3a713b13fefc2b58 
>   src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java 
> 0c3161e5a783446e35f4754124f86715d103ec0b 
>   src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java 
> 3a0d6365dc20f8eef5bdd67a4a2dc9c68ff74d7f 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalAppendAvroImport.java 
> 5faf59ea80c48fe025294cabd100e7d176032138 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalAppendParquetImport.java 
> a4f986423ea299716a29f9d02f7c8453a7f2ba02 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalAppendSequenceFileImport.java 
> d271588c5af060bbc3d301a845f45c46d0f6a2ba 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalAppendTextImport.java 
> 52d89c775b5f1219471df44d222fd92a59ed408c 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalMergeParquetImport.java 
> 39238c5fab56b54a85dde5aed0d4bb2c77382fa6 
>   src/test/org/apache/sqoop/s3/TestS3IncrementalMergeTextImport.java 
> 597e3def2cc33adebeeb3bc1ee35ad8a7f4b990d 
>   src/test/org/apache/sqoop/s3/TestS3ParquetImport.java 
> c9785d816d4a7a5870d74c51a9faa229f6d3818e 
>   src/test/org/apache/sqoop/s3/TestS3SequenceFileImport.java 
> bba8b74ebe639df26e977abf377f4904144dcfaa 
>   src/test/org/apache/sqoop/s3/TestS3TextImport.java 
> 114f97cbb8857a7633cae5d030769ac4a90e36aa 
>   src/test/org/apache/sqoop/testcategories/thirdpartytest/S3Test.java 
> PRE-CREATION 
>   
> src/test/org/apache/sqoop/tool/TestS3IncrementalImportOptionValidations.java 
> 7745f1b07e6d6c457b0164deeace12587ec058d0 
> 
> 
> Diff: https://reviews.apache.org/r/69346/diff/2/
> 
> 
> Testing
> ---
> 
> ./gradlew unitTest
> ./gradlew integrationTest
> ./gradlew kerberizedTest
> ./gradlew ... s3Test
> ./gradlew test
> ./gradlew ... thirdPartyTest
> ./gradlew allTest
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>



[jira] [Assigned] (SQOOP-3361) Test compressing imported data with S3

2018-11-15 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3361:
-

Assignee: (was: Boglarka Egyed)

> Test compressing imported data with S3
> --
>
> Key: SQOOP-3361
> URL: https://issues.apache.org/jira/browse/SQOOP-3361
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>Priority: Minor
>






Re: Review Request 68541: SQOOP-3104: Create test categories instead of test suites and naming conventions

2018-11-14 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68541/#review210545
---


Ship it!




Hi Nguyen,

Thank you very much for the update! Now it looks good to me, let's ship it! :)

Thanks,
Bogi

- Boglarka Egyed


On Nov. 13, 2018, 6:18 a.m., Nguyen Truong wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68541/
> ---
> 
> (Updated Nov. 13, 2018, 6:18 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3104
> https://issues.apache.org/jira/browse/SQOOP-3104
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> We are currently using test naming conventions to differentiate between 
> manual tests, unit tests and 3rd party tests. Instead of that, I implemented 
> JUnit categories, which will allow us to have more categories in the future. 
> This would also remove the reliance on the test class name.
> 
> Test categories skeleton:
>   SqoopTest _ UnitTest
>   |__ IntegrationTest
>   |__ ManualTest
> 
>   ThirdPartyTest _ CubridTest
>|__ Db2Test
>|__ MainFrameTest
>|__ MysqlTest
>|__ NetezzaTest
>|__ OracleTest
>|__ PostgresqlTest
>|__ SqlServerTest
> 
>   KerberizedTest
> 
> Categories explanation:
> * SqoopTest: Group of the big categories, including:
> - UnitTest: It tests one class only, with its dependencies mocked or, 
> if the dependency
> is lightweight, kept as-is. It must not start a minicluster or an 
> hsqldb database.
> It does not need JDBC drivers.
> - IntegrationTest: It usually tests a whole scenario. It may start up 
> miniclusters,
> hsqldb and connect to external resources like RDBMSs.
> - ManualTest: This should be a deprecated category which should not 
> be used in the future.
> It only exists to mark the currently existing manual tests.
> * ThirdPartyTest: An orthogonal hierarchy for tests that need a JDBC 
> driver and/or a docker
> container/external RDBMS instance to run. Subcategories express what kind 
> of external
> resource the test needs. E.g: OracleTest needs an Oracle RDBMS and Oracle 
> driver on the classpath
> * KerberizedTest: Test that needs Kerberos, which needs to be run on a 
> separate JVM.
> 
> Opinions are very welcomed. Thanks!
> 
> 
> Diffs
> -
> 
>   build.gradle 2014b5cf5 
>   src/test/org/apache/sqoop/TestConnFactory.java fb6c94059 
>   src/test/org/apache/sqoop/TestIncrementalImport.java 29c477954 
>   src/test/org/apache/sqoop/TestSqoopOptions.java e55682edf 
>   src/test/org/apache/sqoop/accumulo/TestAccumuloUtil.java 631eeff5e 
>   src/test/org/apache/sqoop/authentication/TestKerberosAuthenticator.java 
> f5700ce65 
>   src/test/org/apache/sqoop/db/TestDriverManagerJdbcConnectionFactory.java 
> 244831672 
>   
> src/test/org/apache/sqoop/db/decorator/TestKerberizedConnectionFactoryDecorator.java
>  d3e3fb23e 
>   src/test/org/apache/sqoop/hbase/HBaseImportAddRowKeyTest.java c4caafba5 
>   src/test/org/apache/sqoop/hbase/HBaseKerberizedConnectivityTest.java 
> 3bfb39178 
>   src/test/org/apache/sqoop/hbase/HBaseTestCase.java 94b71b61c 
>   src/test/org/apache/sqoop/hbase/HBaseUtilTest.java c6a808c33 
>   src/test/org/apache/sqoop/hbase/TestHBasePutProcessor.java e78a535f4 
>   src/test/org/apache/sqoop/hcat/TestHCatalogBasic.java ba05cabbb 
>   
> src/test/org/apache/sqoop/hive/HiveServer2ConnectionFactoryInitializerTest.java
>  4d2cb2f88 
>   src/test/org/apache/sqoop/hive/TestHiveClientFactory.java a3c2dc939 
>   src/test/org/apache/sqoop/hive/TestHiveMiniCluster.java 419f888c0 
>   src/test/org/apache/sqoop/hive/TestHiveServer2Client.java 02617295e 
>   src/test/org/apache/sqoop/hive/TestHiveServer2ParquetImport.java 65f079467 
>   src/test/org/apache/sqoop/hive/TestHiveServer2TextImport.java 410724f37 
>   src/test/org/apache/sqoop/hive/TestHiveTypesForAvroTypeMapping.java 
> 276e9eaa4 
>   src/test/org/apache/sqoop/hive/TestTableDefWriter.java 626ad22f6 
>   src/test/org/apache/sqoop/hive/TestTableDefWriterForExternalTable.java 
> f1768ee76 
>   src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java 
> ff13dc3bc 
>   src/test/org/apache/sqoop/io/TestCodecMap.java e71921823 
>

Re: Review Request 62523: SQOOP-3237: Mainframe FTP transfer option to insert custom FTP commands prior to transfer

2018-11-14 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/62523/#review210537
---



Hi Chris,

Thanks for the update, MainframeManagerImportTest passes now.

I have, however, a couple of new findings regarding your latest update and 
extra FTP command usage in general; please find them below.

Thank you,
Bogi


src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java
Lines 86-90 (patched)
<https://reviews.apache.org/r/62523/#comment295259>

This block is embedded under the condition of 'if 
(SqoopOptions.FileLayout.BinaryFile == options.getFileLayout())'. Does this 
mean that these extra commands can work only with binary file? If yes, could 
you please explain why?



src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java
Lines 56 (patched)
<https://reviews.apache.org/r/62523/#comment295256>

You never use DEFAULT_FTP_URL anywhere.



src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java
Line 120 (original), 129 (patched)
<https://reviews.apache.org/r/62523/#comment295257>

I think clean up of these lines has been missed. You are setting the 
username twice now. This applies for the other test cases too.



src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java
Lines 192-194 (original), 197-199 (patched)
<https://reviews.apache.org/r/62523/#comment295258>

These should have been cleaned up too I guess. This applies for the other 
test cases too.


- Boglarka Egyed


On Nov. 13, 2018, 11:24 p.m., Chris Teoh wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/62523/
> ---
> 
> (Updated Nov. 13, 2018, 11:24 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3237
> https://issues.apache.org/jira/browse/SQOOP-3237
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Added --ftpcmds command to allow comma separated list of FTP commands to send.
> 
> 
> Diffs
> -
> 
>   src/docs/user/import-mainframe.txt 3ecfb7e4 
>   src/java/org/apache/sqoop/SqoopOptions.java f06872f9 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java 
> 9842daa6 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java 
> 90dc2ddd 
>   src/java/org/apache/sqoop/tool/MainframeImportTool.java fbc8c3db 
>   src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java e7c48a6b 
>   src/test/org/apache/sqoop/tool/TestMainframeImportTool.java 00e57bd0 
>   src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java 0714bdcf 
> 
> 
> Diff: https://reviews.apache.org/r/62523/diff/9/
> 
> 
> Testing
> ---
> 
> Unit tests.
> 
> 
> File Attachments
> 
> 
> SQOOP-3237-1.patch
>   
> https://reviews.apache.org/media/uploaded/files/2017/09/26/56041556-e355-4372-83ab-1bcc01680201__SQOOP-3237-1.patch
> 
> 
> Thanks,
> 
> Chris Teoh
> 
>



Re: Review Request 69060: SQOOP-3382 Add parquet numeric support for Parquet in hdfs import

2018-11-14 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69060/#review210533
---


Ship it!




Thanks for the updates Fero, let's ship this!

- Boglarka Egyed


On Nov. 12, 2018, 4:33 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69060/
> ---
> 
> (Updated Nov. 12, 2018, 4:33 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3382
> https://issues.apache.org/jira/browse/SQOOP-3382
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> This patch is about adding support for fixed point decimal types in parquet 
> import.
> 
> The implementation is simple after the fact that parquet was upgraded to 
> 1.9.0 in SQOOP-3381: we just need to register the GenericDataSupplier with 
> AvroParquetOutputFormat.
> 
> For testing, we can reuse the existing Avro tests, because Sqoop uses Avro 
> under the hood to write parquet.
> 
> I also moved around and renamed the classes involved in this change so their 
> name and package reflect their purpose.
> 
> ** Note: A key design decision can be seen in the ImportJobTestConfiguration 
> interface **
> - I decided to create a new function to get the expected results for each 
> file format, since we seldom add new file formats. 
> - However, this also forces future configurations to always define their 
> expected result for every file format, or throw a NotImplementedException 
> should they lack support for one.
> - The alternative for this is to define the fileLayout as an input parameter 
> instead. This would allow for better extendability.
> _Please share your thoughts on this!_
> 
> 
> Diffs
> -
> 
>   src/java/org/apache/sqoop/config/ConfigurationConstants.java 3724f250e 
>   src/java/org/apache/sqoop/mapreduce/ParquetImportMapper.java 62334f8ab 
>   
> src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetImportJobConfigurator.java
>  e82154309 
>   src/java/org/apache/sqoop/orm/AvroSchemaGenerator.java 7a2a5f9cd 
>   src/test/org/apache/sqoop/importjob/ImportJobTestConfiguration.java 
> 14de910b9 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 7977c0b0f 
>   src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java 
> ff13dc3bc 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MSSQLServerImportJobTestConfiguration.java
>  182d2967f 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MySQLImportJobTestConfiguration.java
>  e9bf9912a 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfiguration.java
>  b7bad08c0 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfigurationForNumber.java
>  465e61f4b 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
>  66715c171 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
>  ec4db41bd 
>   
> src/test/org/apache/sqoop/importjob/configuration/AvroTestConfiguration.java 
> PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
>  f137b56b7 
>   
> src/test/org/apache/sqoop/importjob/configuration/ParquetTestConfiguration.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/util/ParquetReader.java 908ce566f 
> 
> 
> Diff: https://reviews.apache.org/r/69060/diff/4/
> 
> 
> Testing
> ---
> 
> 3rd party tests and unit tests, both gradle and ant
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69060: SQOOP-3382 Add parquet numeric support for Parquet in hdfs import

2018-11-13 Thread Boglarka Egyed


> On Nov. 9, 2018, 2:26 p.m., Boglarka Egyed wrote:
> > src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java
> > Lines 220 (patched)
> > <https://reviews.apache.org/r/69060/diff/3/?file=2106486#file2106486line224>
> >
> > I think these tests could be parameterized as they are doing the same 
> > but with different file formats (Avro and Parquet).
> 
> Fero Szabo wrote:
> Hi Bogi,
> 
> Thanks for the review!
> 
> There is a tiny difference: to enable logical types in parquet, there is a 
> new flag (sqoop.parquet.logical_types.decimal.enable), which is only used in 
> the parquet tests.
> 
> I'd keep this code as is, as deduplication might lead to spaghetti code 
> here (since these are different features after all).
> 
> Even though this is a bit of a compromise, I'd like to drop this issue if 
> that's OK with you (?)

Thanks for pointing this out, Fero. I think you are right about creating 
spaghetti code here, please feel free to drop my previous comment. Also, the 
tests are now easy to read, which is good.


- Boglarka
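
For reference, the flag discussed above (sqoop.parquet.logical_types.decimal.enable) is a Hadoop-style property, so it would normally be passed with -D on the command line. A hypothetical invocation (the connection string, credentials, table, and target directory are placeholders, not taken from the patch):

```shell
sqoop import \
  -Dsqoop.parquet.logical_types.decimal.enable=true \
  --connect jdbc:mysql://db.example.com/sales \
  --username sqoop \
  --table orders \
  --as-parquetfile \
  --target-dir /user/example/orders_parquet
```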


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69060/#review210439
---


On Nov. 12, 2018, 4:33 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69060/
> ---
> 
> (Updated Nov. 12, 2018, 4:33 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3382
> https://issues.apache.org/jira/browse/SQOOP-3382
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> This patch is about adding support for fixed point decimal types in parquet 
> import.
> 
> The implementation is simple after the fact that parquet was upgraded to 
> 1.9.0 in SQOOP-3381: we just need to register the GenericDataSupplier with 
> AvroParquetOutputFormat.
> 
> For testing, we can reuse the existing Avro tests, because Sqoop uses Avro 
> under the hood to write parquet.
> 
> I also moved around and renamed the classes involved in this change so their 
> name and package reflect their purpose.
> 
> ** Note: A key design decision can be seen in the ImportJobTestConfiguration 
> interface **
> - I decided to create a new function to get the expected results for each 
> file format, since we seldom add new file formats. 
> - However, this also forces future configurations to always define their 
> expected result for every file format, or throw a NotImplementedException 
> should they lack support for one.
> - The alternative for this is to define the fileLayout as an input parameter 
> instead. This would allow for better extendability.
> _Please share your thoughts on this!_
> 
> 
> Diffs
> -
> 
>   src/java/org/apache/sqoop/config/ConfigurationConstants.java 3724f250e 
>   src/java/org/apache/sqoop/mapreduce/ParquetImportMapper.java 62334f8ab 
>   
> src/java/org/apache/sqoop/mapreduce/parquet/hadoop/HadoopParquetImportJobConfigurator.java
>  e82154309 
>   src/java/org/apache/sqoop/orm/AvroSchemaGenerator.java 7a2a5f9cd 
>   src/test/org/apache/sqoop/importjob/ImportJobTestConfiguration.java 
> 14de910b9 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 7977c0b0f 
>   src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java 
> ff13dc3bc 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MSSQLServerImportJobTestConfiguration.java
>  182d2967f 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MySQLImportJobTestConfiguration.java
>  e9bf9912a 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfiguration.java
>  b7bad08c0 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfigurationForNumber.java
>  465e61f4b 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
>  66715c171 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
>  ec4db41bd 
>   
> src/test/org/apache/sqoop/importjob/configuration/AvroTestConfiguration.java 
> PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
>  f137b56b7 
>   
> src/test/org/apache/sqoop/importjob/configuration/ParquetTestConfiguration.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/util/ParquetReader.java 908ce566f 
> 
> 
> Diff: https://reviews.apache.org/r/69060/diff/4/
> 
> 
> Testing
> ---
> 
> 3rd party tests and unit tests, both gradle and ant
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69060: SQOOP-3382 Add parquet numeric support for Parquet in hdfs import

2018-11-09 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69060/#review210439
---



Hi Fero,

Thanks for this improvement!

I have left a couple of comments related to testing, please find them below.

Thanks,
Bogi


src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java
Lines 220 (patched)
<https://reviews.apache.org/r/69060/#comment295098>

I think these tests could be parameterized as they are doing the same but 
with different file formats (Avro and Parquet).



src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java
Lines 299 (patched)
<https://reviews.apache.org/r/69060/#comment295099>

Why don't you assert the result as a list?



src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java
Lines 301-305 (patched)
<https://reviews.apache.org/r/69060/#comment295100>

With this logic you no longer test whether the sizes of the expected result 
and the output are the same.


- Boglarka Egyed


On Nov. 8, 2018, 3:34 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69060/
> ---
> 
> (Updated Nov. 8, 2018, 3:34 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3382
> https://issues.apache.org/jira/browse/SQOOP-3382
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> This patch is about adding support for fixed point decimal types in parquet 
> import.
> 
> The implementation is simple after the fact that parquet was upgraded to 
> 1.9.0 in SQOOP-3381: we just need to register the GenericDataSupplier with 
> AvroParquetOutputFormat.
> 
> For testing, we can reuse the existing Avro tests, because Sqoop uses Avro 
> under the hood to write parquet.
> 
> I also moved around and renamed the classes involved in this change so their 
> name and package reflect their purpose.
> 
> ** Note: A key design decision can be seen in the ImportJobTestConfiguration 
> interface **
> - I decided to create a new function to get the expected results for each 
> file format, since we seldom add new file formats. 
> - However, this also forces future configurations to always define their 
> expected result for every file format, or throw a NotImplementedException 
> should they lack support for one.
> - The alternative for this is to define the fileLayout as an input parameter 
> instead. This would allow for better extendability.
> _Please share your thoughts on this!_
> 
> 
> Diffs
> -
> 
>   src/java/org/apache/sqoop/config/ConfigurationConstants.java 3724f250e 
>   src/java/org/apache/sqoop/mapreduce/ImportJobBase.java 80c069888 
>   src/java/org/apache/sqoop/mapreduce/ParquetImportMapper.java 62334f8ab 
>   src/java/org/apache/sqoop/orm/AvroSchemaGenerator.java 7a2a5f9cd 
>   src/test/org/apache/sqoop/importjob/ImportJobTestConfiguration.java 
> 14de910b9 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java 7977c0b0f 
>   src/test/org/apache/sqoop/importjob/avro/AvroImportForNumericTypesTest.java 
> ff13dc3bc 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MSSQLServerImportJobTestConfiguration.java
>  182d2967f 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/MySQLImportJobTestConfiguration.java
>  e9bf9912a 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfiguration.java
>  b7bad08c0 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/OracleImportJobTestConfigurationForNumber.java
>  465e61f4b 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationForNumeric.java
>  66715c171 
>   
> src/test/org/apache/sqoop/importjob/avro/configuration/PostgresqlImportJobTestConfigurationPaddingShouldSucceed.java
>  ec4db41bd 
>   
> src/test/org/apache/sqoop/importjob/configuration/AvroTestConfiguration.java 
> PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
>  f137b56b7 
>   
> src/test/org/apache/sqoop/importjob/configuration/ParquetTestConfiguration.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/util/ParquetReader.java 908ce566f 
> 
> 
> Diff: https://reviews.apache.org/r/69060/diff/3/
> 
> 
> Testing
> ---
> 
> 3rd party tests and unit tests, both gradle and ant
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



[jira] [Updated] (SQOOP-3403) Sqoop2: Add Fero Szabo to committer list in our pom file

2018-11-09 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3403:
--
Description: Now that [~fero] is committer we should update our committer 
list in the root pom.xml file:

> Sqoop2: Add Fero Szabo to committer list in our pom file
> 
>
> Key: SQOOP-3403
> URL: https://issues.apache.org/jira/browse/SQOOP-3403
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.99.8
>        Reporter: Boglarka Egyed
>Assignee: Fero Szabo
>Priority: Major
>
> Now that [~fero] is committer we should update our committer list in the root 
> pom.xml file:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (SQOOP-3403) Sqoop2: Add Fero Szabo to committer list in our pom file

2018-11-09 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3403:
-

 Summary: Sqoop2: Add Fero Szabo to committer list in our pom file
 Key: SQOOP-3403
 URL: https://issues.apache.org/jira/browse/SQOOP-3403
 Project: Sqoop
  Issue Type: Task
Affects Versions: 1.99.8
Reporter: Boglarka Egyed
Assignee: Fero Szabo






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[ANNOUNCE] New Sqoop committer: Fero Szabo

2018-11-09 Thread Boglarka Egyed
On behalf of the Apache Sqoop PMC, I am excited to welcome Fero Szabo
as a new committer to Apache Sqoop. Please join me in congratulating him
for this accomplishment!

Fero has contributed substantially to the code[1], including new features,
bug fixes and documentation updates, and he is one of the most active
reviewers[2] on the project. He constantly helps others on the mailing
lists and in Jira; as part of this effort he is also trying to resurrect
old patches that could be useful but were abandoned.

We look forward to Fero's continued contributions!

1: https://s.apache.org/Im35
2: https://reviews.apache.org/users/fero/reviews/


[jira] [Commented] (SQOOP-3387) Include Column-Remarks

2018-11-08 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3387?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16679587#comment-16679587
 ] 

Boglarka Egyed commented on SQOOP-3387:
---

Yes [~hatala91], review has been created, referenced correctly and is open now.

> Include Column-Remarks
> --
>
> Key: SQOOP-3387
> URL: https://issues.apache.org/jira/browse/SQOOP-3387
> Project: Sqoop
>  Issue Type: Wish
>  Components: connectors, metastore
>Affects Versions: 1.4.7
>Reporter: Tomas Sebastian Hätälä
>Assignee: Tomas Sebastian Hätälä
>Priority: Critical
>  Labels: easy-fix, features, pull-request-available
> Fix For: 1.5.0
>
> Attachments: SQOOP_3387.patch
>
>
> In most RDBMS it is possible to enter comments/ remarks for table and view 
> columns. That way a user can obtain additional information regarding the data 
> and how to use it.
> With the avro file format it would be possible to store this information in 
> the schema file using the "doc"-tag. At the moment this is, however, left 
> blank.
> Review: https://reviews.apache.org/r/68989/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 62523: SQOOP-3237: Mainframe FTP transfer option to insert custom FTP commands prior to transfer

2018-11-08 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/62523/#review210413
---



Hi Chris,

Many thanks for your continued efforts on Mainframe side!

I took a look at your patch and found a bug, please find it below. 
MainframeManagerImportTest failed for me with the following error message, I 
believe the root cause is the bug you introduced with the latest version of 
your patch:

ERROR - org.apache.sqoop.tool.ImportTool.run(ImportTool.java:635)] Import 
failed: The value of property mainframe.ftp.commands must not be null

Could you please take a look?

Thanks,
Bogi


src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java
Lines 86 (patched)
<https://reviews.apache.org/r/62523/#comment295058>

Shouldn't this condition be negated? StringUtils.isBlank will be true if 
the character sequence is null, empty or whitespace. I'm wondering how 
TestMainframeImportTool could pass with your patch.
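
For context, StringUtils.isBlank treats null, empty, and whitespace-only sequences as blank, which is why the guard needs negating. A plain-Java sketch of those semantics (the commons-lang class itself is not used here, and the ftpCmds value is purely illustrative):

```java
public class BlankCheckDemo {

    // Mirrors the documented semantics of
    // org.apache.commons.lang3.StringUtils.isBlank:
    // true for null, empty, or whitespace-only input.
    public static boolean isBlank(CharSequence cs) {
        if (cs == null || cs.length() == 0) {
            return true;
        }
        for (int i = 0; i < cs.length(); i++) {
            if (!Character.isWhitespace(cs.charAt(i))) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        String ftpCmds = "quote SITE RDW";  // illustrative value
        // The guard must be negated: only proceed when commands ARE present.
        if (!isBlank(ftpCmds)) {
            System.out.println("applying extra FTP commands");
        }
        System.out.println(isBlank(null) && isBlank("  ") && !isBlank("x")); // prints: true
    }
}
```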



src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java
Lines 383-393 (patched)
<https://reviews.apache.org/r/62523/#comment295059>

These configurations could be extracted to a separate method having a 
self-explanatory name.


- Boglarka Egyed


On Oct. 31, 2018, 5:40 a.m., Chris Teoh wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/62523/
> ---
> 
> (Updated Oct. 31, 2018, 5:40 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3237
> https://issues.apache.org/jira/browse/SQOOP-3237
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Added --ftpcmds command to allow comma separated list of FTP commands to send.
> 
> 
> Diffs
> -
> 
>   src/docs/user/import-mainframe.txt 3ecfb7e4 
>   src/java/org/apache/sqoop/SqoopOptions.java f06872f9 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java 
> 9842daa6 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java 
> 90dc2ddd 
>   src/java/org/apache/sqoop/tool/MainframeImportTool.java fbc8c3db 
>   src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java e7c48a6b 
>   src/test/org/apache/sqoop/tool/TestMainframeImportTool.java 00e57bd0 
>   src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java 0714bdcf 
> 
> 
> Diff: https://reviews.apache.org/r/62523/diff/8/
> 
> 
> Testing
> ---
> 
> Unit tests.
> 
> 
> File Attachments
> 
> 
> SQOOP-3237-1.patch
>   
> https://reviews.apache.org/media/uploaded/files/2017/09/26/56041556-e355-4372-83ab-1bcc01680201__SQOOP-3237-1.patch
> 
> 
> Thanks,
> 
> Chris Teoh
> 
>



[ANNOUNCE] New Sqoop PMC member - Szabolcs Vasas

2018-11-06 Thread Boglarka Egyed
On behalf of the Apache Sqoop PMC, I am pleased to welcome Szabolcs Vasas as
a new Sqoop PMC Member. Please join me in congratulating him!

Szabolcs has countless code contributions[1] and constantly provides
thorough reviews for others[2]. He continuously offers help to new
contributors, enabling the project to grow, and he has also demonstrated
a huge interest in shaping the project by helping scope previous and
upcoming releases.

Szabolcs's hard work is much appreciated and we look forward to his
continued contributions!

1: https://s.apache.org/nzgU
2: https://reviews.apache.org/users/vasas/reviews/

Kind Regards,
Bogi


Re: Review Request 69199: Create tests for SQOOP-2949, quote escaping in split-by

2018-10-31 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69199/#review210223
---


Ship it!




Hi Fero,

Thanks for this fix! I ran unit tests successfully with your patch. Change LGTM.

Cheers,
Bogi

- Boglarka Egyed


On Oct. 31, 2018, 1:51 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69199/
> ---
> 
> (Updated Oct. 31, 2018, 1:51 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3400
> https://issues.apache.org/jira/browse/SQOOP-3400
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Integration tests for SQOOP-2949.
> 
> 
> Diffs
> -
> 
>   src/java/org/apache/sqoop/mapreduce/db/TextSplitter.java 22bbfe68 
>   src/test/org/apache/sqoop/importjob/SplitByImportTest.java PRE-CREATION 
>   
> src/test/org/apache/sqoop/importjob/configuration/GenericImportJobSplitByTestConfiguration.java
>  PRE-CREATION 
> 
> 
> Diff: https://reviews.apache.org/r/69199/diff/6/
> 
> 
> Testing
> ---
> 
> This is the testing part for a fix that lacked testing. 
> gradle test and gradle 3rdpartytests.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



[jira] [Resolved] (SQOOP-3345) Make Sqoop work with S3: test and document current capabilities

2018-10-25 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3345?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed resolved SQOOP-3345.
---
   Resolution: Done
Fix Version/s: no-release

> Make Sqoop work with S3: test and document current capabilities
> ---
>
> Key: SQOOP-3345
> URL: https://issues.apache.org/jira/browse/SQOOP-3345
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Fix For: no-release
>
>
> The 
> {{[hadoop-aws|https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html]}}
>  module provides support for AWS integration which could be used by Sqoop to 
> work with S3.
> The scope of this task is to explore and document the capabilities of the 
> current Sqoop implementation regarding the "from RDBMS to S3" use case by 
> adding the {{hadoop-aws}} dependency.
> This is an umbrella ticket to collect all the related tasks and also the 
> first step to enable further improvements to support all the required use 
> cases.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (SQOOP-3395) Document Hadoop CredentialProvider usage in case of import into S3

2018-10-25 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3395:
--
Affects Version/s: 1.4.7

> Document Hadoop CredentialProvider usage in case of import into S3
> --
>
> Key: SQOOP-3395
> URL: https://issues.apache.org/jira/browse/SQOOP-3395
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3395.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (SQOOP-3395) Document Hadoop CredentialProvider usage in case of import into S3

2018-10-25 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3395:
--
Attachment: SQOOP-3395.patch

> Document Hadoop CredentialProvider usage in case of import into S3
> --
>
> Key: SQOOP-3395
> URL: https://issues.apache.org/jira/browse/SQOOP-3395
> Project: Sqoop
>  Issue Type: Sub-task
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3395.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Review Request 69164: SQOOP-3395: Document Hadoop CredentialProvider usage in case of import into S3

2018-10-25 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69164/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3395
https://issues.apache.org/jira/browse/SQOOP-3395


Repository: sqoop-trunk


Description
---

Document Hadoop CredentialProvider usage in case of import into S3


Diffs
-

  src/docs/user/s3.txt 52ab6ac07203494922db7a7aaa991c2ea1fc52c8 


Diff: https://reviews.apache.org/r/69164/diff/1/


Testing
---

ant docs
./gradlew docs


Thanks,

Boglarka Egyed
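
The documentation under review concerns Hadoop's credential provider framework. A hypothetical sequence showing the general idea (the provider path, bucket, and connection details are placeholders, not taken from the patch):

```shell
# Store the S3 credentials in a JCEKS keystore (values are prompted for):
hadoop credential create fs.s3a.access.key \
    -provider jceks://hdfs/user/example/aws.jceks
hadoop credential create fs.s3a.secret.key \
    -provider jceks://hdfs/user/example/aws.jceks

# Point Sqoop at the provider instead of passing keys in clear text:
sqoop import \
  -Dhadoop.security.credential.provider.path=jceks://hdfs/user/example/aws.jceks \
  --connect jdbc:mysql://db.example.com/sales \
  --table orders \
  --target-dir s3a://example-bucket/orders
```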



[jira] [Updated] (SQOOP-3398) Tests using HiveMiniCluster can be unstable on some platforms

2018-10-24 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3398:
--
Fix Version/s: (was: 1.5.0)
   3.0.0

> Tests using HiveMiniCluster can be unstable on some platforms
> -
>
> Key: SQOOP-3398
> URL: https://issues.apache.org/jira/browse/SQOOP-3398
> Project: Sqoop
>  Issue Type: Test
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3398.patch
>
>
> Since the last Hive upgrade TestHiveMiniCluster fails on some platforms 
> because an older version of the ASM library is picked up.
> The task is to exclude the older ASM library in ivy and gradle to make sure 
> the test passes on all platforms.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 68064: SQOOP-3355 Document SQOOP-1905 DB2 --schema option

2018-10-24 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68064/#review209967
---


Ship it!




Ran ant docs and ./gradlew docs successfully, change looks good too.

- Boglarka Egyed


On July 26, 2018, 2:58 p.m., Fero Szabo wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68064/
> ---
> 
> (Updated July 26, 2018, 2:58 p.m.)
> 
> 
> Review request for Sqoop, Boglarka Egyed, daniel voros, and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3355
> https://issues.apache.org/jira/browse/SQOOP-3355
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Adding documentation for a previously implemented feature. This one is quite 
> simple.
> 
> 
> Diffs
> -
> 
>   src/docs/user/connectors.txt 59e3e00b 
> 
> 
> Diff: https://reviews.apache.org/r/68064/diff/1/
> 
> 
> Testing
> ---
> 
> ant docs, 
> + unit and 3rd party tests, though these shouldn't be affected.
> 
> 
> Thanks,
> 
> Fero Szabo
> 
>



Re: Review Request 69139: TestS3ImportWithHadoopCredProvider fails if credential generator command is not provided

2018-10-24 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69139/#review209961
---


Ship it!




Good catch, thanks for the fix! I tested both with and without the AWS 
credentials, seems good to me.

- Boglarka Egyed


On Oct. 24, 2018, 10:58 a.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69139/
> ---
> 
> (Updated Oct. 24, 2018, 10:58 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3399
> https://issues.apache.org/jira/browse/SQOOP-3399
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> BeforeClass method of TestS3ImportWithHadoopCredProvider should not throw 
> NullPointerException when the credential generator command is not provided 
> since it fails the test with Gradle.
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java 
> e03eb64ef 
> 
> 
> Diff: https://reviews.apache.org/r/69139/diff/1/
> 
> 
> Testing
> ---
> 
> Executed the test with both ant and gradle, with and without the S3 credential 
> generator provided.
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>



Re: Review Request 69141: Tests using HiveMiniCluster can be unstable on some platforms

2018-10-24 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69141/#review209958
---


Ship it!




Thanks for the fix! Ran unit and 3rd party tests successfully with your patch.

- Boglarka Egyed


On Oct. 24, 2018, 12:25 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/69141/
> ---
> 
> (Updated Oct. 24, 2018, 12:25 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3398
> https://issues.apache.org/jira/browse/SQOOP-3398
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Since the last Hive upgrade TestHiveMiniCluster fails on some platforms 
> because an older version of the ASM library is picked up.
> 
> The task is to exclude the older ASM library in ivy and gradle to make sure 
> the test passes on all platforms.
> 
> 
> Diffs
> -
> 
>   build.gradle 2340bce7519a46b203a287a4b5160c62e0c09509 
>   ivy.xml 6805fc329d44bcc0707e7cab67f3749a42e6f769 
> 
> 
> Diff: https://reviews.apache.org/r/69141/diff/1/
> 
> 
> Testing
> ---
> 
> Executed unit and third party tests with both ant and gradle.
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>
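For readers unfamiliar with the technique discussed in this review, a dependency exclusion of this kind typically looks like the following in a Gradle build file (a generic sketch; the exact group/module coordinates of the conflicting ASM artifact in Sqoop's build are not shown in this thread):

```groovy
// Keep the older ASM artifact off every classpath, including tests.
configurations.all {
    exclude group: 'asm', module: 'asm'
}
```

The Ivy equivalent is an `<exclude org="asm" module="asm"/>` element inside the relevant `<dependency>` entry of ivy.xml.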



[jira] [Updated] (SQOOP-3394) External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3394:
--
Attachment: SQOOP-3394.patch

> External Hive table tests should use unique external dir names
> --
>
> Key: SQOOP-3394
> URL: https://issues.apache.org/jira/browse/SQOOP-3394
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3394.patch, SQOOP-3394.patch
>
>
> Current external Hive table tests on S3 use the same external directory name 
> in every unit test case, which can cause problems when running them in an 
> automated environment. These names should be unique in every test case.





Re: Review Request 69070: SQOOP-3394: External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69070/
---

(Updated Oct. 18, 2018, 5:49 p.m.)


Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Changes
---

Rationalized unique name generation for temp directories


Bugs: SQOOP-3394
https://issues.apache.org/jira/browse/SQOOP-3394


Repository: sqoop-trunk


Description
---

Current external Hive table tests on S3 use the same external directory name 
in every unit test case, which can cause problems when running them in an 
automated environment. These names should be unique in every test case.


Diffs (updated)
-

  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
97d53bbaa7c72d2ad1b890d7a8367c45a3e2b95c 


Diff: https://reviews.apache.org/r/69070/diff/2/

Changes: https://reviews.apache.org/r/69070/diff/1-2/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed
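The idea behind this change, giving each test case its own external directory, can be sketched like this (illustrative Python, not the actual Java S3TestUtils code; all names here are invented):

```python
import uuid

def unique_external_dir(base_dir, test_name):
    # Append a random suffix so repeated or concurrent test runs never
    # collide on the same S3 external directory.
    return f"{base_dir}/{test_name}-{uuid.uuid4().hex}"

dir_a = unique_external_dir("s3a://example-bucket/external", "TextImportTest")
dir_b = unique_external_dir("s3a://example-bucket/external", "TextImportTest")
assert dir_a != dir_b  # two runs of the same test get distinct directories
```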



[jira] [Updated] (SQOOP-3394) External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3394:
--
Attachment: SQOOP-3394.patch

> External Hive table tests should use unique external dir names
> --
>
> Key: SQOOP-3394
> URL: https://issues.apache.org/jira/browse/SQOOP-3394
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3394.patch
>
>
> Current external Hive table tests on S3 use the same external directory name 
> in every unit test case, which can cause problems when running them in an 
> automated environment. These names should be unique in every test case.





Re: Review Request 69070: SQOOP-3394: External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69070/
---

(Updated Oct. 18, 2018, 3:11 p.m.)


Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Summary (updated)
-

SQOOP-3394: External Hive table tests should use unique external dir names


Bugs: SQOOP-3394
https://issues.apache.org/jira/browse/SQOOP-3394


Repository: sqoop-trunk


Description
---

Current external Hive table tests on S3 use the same external directory name 
in every unit test case, which can cause problems when running them in an 
automated environment. These names should be unique in every test case.


Diffs
-

  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
97d53bbaa7c72d2ad1b890d7a8367c45a3e2b95c 


Diff: https://reviews.apache.org/r/69070/diff/1/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



Review Request 69070: SQOOP-3345: External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69070/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3394
https://issues.apache.org/jira/browse/SQOOP-3394


Repository: sqoop-trunk


Description
---

Current external Hive table tests on S3 use the same external directory name 
in every unit test case, which can cause problems when running them in an 
automated environment. These names should be unique in every test case.


Diffs
-

  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
97d53bbaa7c72d2ad1b890d7a8367c45a3e2b95c 


Diff: https://reviews.apache.org/r/69070/diff/1/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



[jira] [Created] (SQOOP-3395) Document Hadoop CredentialProvider usage in case of import into S3

2018-10-18 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3395:
-

 Summary: Document Hadoop CredentialProvider usage in case of 
import into S3
 Key: SQOOP-3395
 URL: https://issues.apache.org/jira/browse/SQOOP-3395
 Project: Sqoop
  Issue Type: Sub-task
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed








[jira] [Created] (SQOOP-3394) External Hive table tests should use unique external dir names

2018-10-18 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3394:
-

 Summary: External Hive table tests should use unique external dir 
names
 Key: SQOOP-3394
 URL: https://issues.apache.org/jira/browse/SQOOP-3394
 Project: Sqoop
  Issue Type: Sub-task
Affects Versions: 1.4.7
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed


Current external Hive table tests on S3 use the same external directory name 
in every unit test case, which can cause problems when running them in an 
automated environment. These names should be unique in every test case.





[jira] [Updated] (SQOOP-3391) Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-18 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3391?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3391:
--
Attachment: SQOOP-3391.patch

> Test storing AWS credentials in Hadoop CredentialProvider during import
> ---
>
> Key: SQOOP-3391
> URL: https://issues.apache.org/jira/browse/SQOOP-3391
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3391.patch, SQOOP-3391.patch
>
>






Re: Review Request 69063: SQOOP-3391: Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-18 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69063/
---

(Updated Oct. 18, 2018, 11:45 a.m.)


Review request for Sqoop, Fero Szabo, Ferenc Szabo, and Szabolcs Vasas.


Changes
---

Clean up only HADOOP_CREDSTORE_PASSWORD environment variable in tests.


Bugs: SQOOP-3391
https://issues.apache.org/jira/browse/SQOOP-3391


Repository: sqoop-trunk


Description
---

Test storing AWS credentials in a Hadoop CredentialProvider during import, in the 
case of:
- CredentialProvider with the default password
- CredentialProvider with a password stored in an environment variable
- CredentialProvider with a password file
Added test cases for both happy and sad paths.

Added a new test dependency both in Ant and Gradle for setting environment 
variables in tests easily.
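The setup exercised by these tests can be sketched with Hadoop's standard credential CLI (all paths, hosts, and bucket names below are invented placeholders; the tests themselves drive this programmatically):

```shell
# Store the AWS keys in a JCEKS keystore (values are prompted for interactively)
hadoop credential create fs.s3a.access.key -provider jceks://hdfs/user/example/aws.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/user/example/aws.jceks

# Point Sqoop at the keystore instead of passing keys on the command line
sqoop import \
  -Dhadoop.security.credential.provider.path=jceks://hdfs/user/example/aws.jceks \
  --connect jdbc:mysql://db.example.com/sales --table orders \
  --target-dir s3a://example-bucket/orders
```

The keystore password itself can come from the default, from the HADOOP_CREDSTORE_PASSWORD environment variable, or from a password file, which is exactly the set of scenarios listed above.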


Diffs (updated)
-

  build.gradle 7a0712e3242e31ef2593c34f469f9136cf5dc85d 
  build.xml f3975317140e66c700d85231669ccb2b70367f80 
  conf/password-file.txt PRE-CREATION 
  conf/wrong-password-file.txt PRE-CREATION 
  gradle.properties 4808ec7d090b9732f9246f21e44bd736adf6efd0 
  ivy.xml 91157ca74bee3b50269564ddb747638946e45a7e 
  ivy/libraries.properties 2ca95ee99c09fe1aaff6797a6ee0958ac1977663 
  src/java/org/apache/sqoop/util/password/CredentialProviderHelper.java 
1d6481a0697db2fc0ffeb1b012bb143beb615bc0 
  src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java 
PRE-CREATION 
  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
c9d17bc728d6a229e32c157b56268d6418b3de94 


Diff: https://reviews.apache.org/r/69063/diff/2/

Changes: https://reviews.apache.org/r/69063/diff/1-2/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=
ant clean test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



[jira] [Updated] (SQOOP-3390) Document S3Guard usage with Sqoop

2018-10-17 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3390?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3390:
--
Attachment: SQOOP-3390.patch

> Document S3Guard usage with Sqoop
> -
>
> Key: SQOOP-3390
> URL: https://issues.apache.org/jira/browse/SQOOP-3390
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3390.patch
>
>






Review Request 69066: SQOOP-3390: Document S3Guard usage with Sqoop

2018-10-17 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69066/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3390
https://issues.apache.org/jira/browse/SQOOP-3390


Repository: sqoop-trunk


Description
---

Document Hadoop's S3Guard usage with Sqoop to overcome Amazon S3's eventual 
consistency.
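Enabling S3Guard, the feature this documentation change covers, is done through Hadoop configuration rather than Sqoop options; a minimal sketch (property values here are examples only, not taken from the patch):

```xml
<!-- Illustrative core-site.xml fragment: back s3a metadata with DynamoDB -->
<property>
  <name>fs.s3a.metadatastore.impl</name>
  <value>org.apache.hadoop.fs.s3a.s3guard.DynamoDBMetadataStore</value>
</property>
<property>
  <name>fs.s3a.s3guard.ddb.region</name>
  <value>us-west-2</value>
</property>
```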


Diffs
-

  src/docs/user/s3.txt c54b26bc5ef71f8cd7d18ce6eb98a296dbffed92 


Diff: https://reviews.apache.org/r/69066/diff/1/


Testing
---

ant docs
./gradlew docs


Thanks,

Boglarka Egyed



[jira] [Updated] (SQOOP-3391) Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-17 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3391?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3391:
--
Attachment: SQOOP-3391.patch

> Test storing AWS credentials in Hadoop CredentialProvider during import
> ---
>
> Key: SQOOP-3391
> URL: https://issues.apache.org/jira/browse/SQOOP-3391
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3391.patch
>
>






Review Request 69063: SQOOP-3391: Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-17 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/69063/
---

Review request for Sqoop, Fero Szabo, Ferenc Szabo, and Szabolcs Vasas.


Bugs: SQOOP-3391
https://issues.apache.org/jira/browse/SQOOP-3391


Repository: sqoop-trunk


Description
---

Test storing AWS credentials in a Hadoop CredentialProvider during import, in the 
case of:
- CredentialProvider with the default password
- CredentialProvider with a password stored in an environment variable
- CredentialProvider with a password file
Added test cases for both happy and sad paths.

Added a new test dependency both in Ant and Gradle for setting environment 
variables in tests easily.


Diffs
-

  build.gradle 7a0712e3242e31ef2593c34f469f9136cf5dc85d 
  build.xml f3975317140e66c700d85231669ccb2b70367f80 
  conf/password-file.txt PRE-CREATION 
  conf/wrong-password-file.txt PRE-CREATION 
  gradle.properties 4808ec7d090b9732f9246f21e44bd736adf6efd0 
  ivy.xml 91157ca74bee3b50269564ddb747638946e45a7e 
  ivy/libraries.properties 2ca95ee99c09fe1aaff6797a6ee0958ac1977663 
  src/java/org/apache/sqoop/util/password/CredentialProviderHelper.java 
1d6481a0697db2fc0ffeb1b012bb143beb615bc0 
  src/test/org/apache/sqoop/s3/TestS3ImportWithHadoopCredProvider.java 
PRE-CREATION 
  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
c9d17bc728d6a229e32c157b56268d6418b3de94 


Diff: https://reviews.apache.org/r/69063/diff/1/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=
ant clean test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



Re: [ANNOUNCE] New Sqoop PMC member - Boglarka Egyed

2018-10-17 Thread Boglarka Egyed
Thank you all!

On Tue, Oct 16, 2018 at 3:58 PM Venkat  wrote:

> Congratulations  Bogi!   Well deserved
>
> Venkat
> On Tue, Oct 16, 2018 at 6:12 AM Dániel Vörös 
> wrote:
> >
> > Congrats Bogi, this was well deserved! Keep it up!
> >
> > Regards,
> > Daniel
> > On Tue, Oct 16, 2018 at 3:09 PM Szabolcs Vasas
> >  wrote:
> > >
> > > Congratulations, keep up the good work!
> > >
> > > On Tue, Oct 16, 2018 at 2:46 PM Fero Szabo 
> > > wrote:
> > >
> > > > Hi Bogi,
> > > >
> > > > Well earned, congratulations!
> > > >
> > > > Live long and keep contributing! :)
> > > >
> > > > Cheers,
> > > > Fero
> > > >
> > > >
> > > > On Tue, Oct 16, 2018 at 10:28 AM Jarek Jarcec Cecho <
> jar...@apache.org>
> > > > wrote:
> > > >
> > > > > On behalf of the Apache Sqoop PMC, I am excited to welcome
> Boglarka Egyed
> > > > > as new Sqoop PMC Member. Please join me in congratulating her!
> > > > >
> > > > > Jarcec
> > > > >
> > > > >
> > > >
> > > > --
> > > > *Ferenc Szabo* | Software Engineer
> > > > t. (+361) 701 1201 <+361+701+1201>
> > > > cloudera.com <https://www.cloudera.com>
> > > >
> > > > [image: Cloudera] <https://www.cloudera.com/>
> > > >
> > > > [image: Cloudera on Twitter] <https://twitter.com/cloudera> [image:
> > > > Cloudera on Facebook] <https://www.facebook.com/cloudera> [image:
> Cloudera
> > > > on LinkedIn] <https://www.linkedin.com/company/cloudera>
> > > > --
> > > >
> > >
> > >
> > > --
> > > Szabolcs Vasas
> > > Software Engineer
> > > <http://www.cloudera.com>
>
>
>
> --
> Regards
>
> Venkat
>


[jira] [Updated] (SQOOP-3392) Document metadata-transaction-isolation-level option

2018-10-15 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3392?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3392:
--
Description: Option {{metadata-transaction-isolation-level}} has been added 
in SQOOP-2349 however documentation, example commands are missing.  (was: 
Option {{metadata-transaction-isolation-level}} has been added in SQOOP-2349 
however documentation, example command are missing.)

> Document metadata-transaction-isolation-level option
> 
>
> Key: SQOOP-3392
> URL: https://issues.apache.org/jira/browse/SQOOP-3392
> Project: Sqoop
>  Issue Type: Task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>Priority: Major
>
> Option {{metadata-transaction-isolation-level}} has been added in SQOOP-2349 
> however documentation, example commands are missing.





[jira] [Created] (SQOOP-3392) Document metadata-transaction-isolation-level option

2018-10-15 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3392:
-

 Summary: Document metadata-transaction-isolation-level option
 Key: SQOOP-3392
 URL: https://issues.apache.org/jira/browse/SQOOP-3392
 Project: Sqoop
  Issue Type: Task
Affects Versions: 1.4.7
Reporter: Boglarka Egyed


Option {{metadata-transaction-isolation-level}} has been added in SQOOP-2349 
however documentation, example command are missing.





Re: Review Request 68606: Error during direct Netezza import/export can interrupt process in uncontrolled ways

2018-10-11 Thread Boglarka Egyed


> On Oct. 4, 2018, 12:46 p.m., Boglarka Egyed wrote:
> > Hi Daniel,
> > 
> > Apart from the discussion with Szabolcs about the expected exception 
> > handling I'm OK with your change. All tests passed.
> > 
> > Thanks,
> > Bogi
> 
> daniel voros wrote:
> Hey Bogi,
> 
> Thanks for reviewing! What do you mean by expected exception handling? 
> I'm happy to update the patch if you have concerns!
> 
> Regards,
> Daniel

Hi Daniel,

I apologize, I confused your patch with another one, sorry about that. Please 
ignore my previous comment regarding expected exceptions.
Ship it! :)

Regards,
Bogi


- Boglarka


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68606/#review209221
---


On Sept. 3, 2018, 11:32 a.m., daniel voros wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68606/
> ---
> 
> (Updated Sept. 3, 2018, 11:32 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3378
> https://issues.apache.org/jira/browse/SQOOP-3378
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> `SQLException` during a JDBC operation in direct Netezza import/export signals 
> the parent thread to fail fast by interrupting it.
> We're trying to process the interrupt in the parent (main) thread, but 
> there's no guarantee that we're not in some internal call that will process 
> the interrupted flag and reset it before we're able to check.
> 
> It is also possible that the parent thread has passed the "checking part" 
> when it gets interrupted. In case of `NetezzaExternalTableExportMapper` this 
> can interrupt the upload of log files.
> 
> I'd recommend using some other means of communication between the threads 
> than interrupts.
> 
> 
> Diffs
> -
> 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableExportMapper.java
>  5bf21880 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableImportMapper.java
>  306062aa 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaJDBCStatementRunner.java
>  cedfd235 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableImportMapper.java
>  PRE-CREATION 
> 
> 
> Diff: https://reviews.apache.org/r/68606/diff/2/
> 
> 
> Testing
> ---
> 
> added new UTs and checked manual Netezza tests (NetezzaExportManualTest, 
> NetezzaImportManualTest)
> 
> 
> Thanks,
> 
> daniel voros
> 
>
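The alternative recommended above, communicating failure between threads without relying on interrupts, can be sketched with an explicit shared flag (illustrative Python; the actual fix lives in the Java mappers listed in the diff):

```python
import threading

# A shared flag replaces Thread.interrupt() as the failure signal, so the
# parent cannot lose the signal inside an internal call that clears the
# interrupted status.
failure = threading.Event()

def jdbc_worker():
    try:
        raise RuntimeError("simulated SQLException")
    except RuntimeError:
        failure.set()  # record the failure for the parent to inspect

worker = threading.Thread(target=jdbc_worker)
worker.start()
worker.join()

# The parent checks the flag at a point of its own choosing.
assert failure.is_set()
```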



[jira] [Updated] (SQOOP-3361) Test compressing imported data with S3

2018-10-11 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3361:
--
Issue Type: Improvement  (was: Sub-task)
Parent: (was: SQOOP-3345)

> Test compressing imported data with S3
> --
>
> Key: SQOOP-3361
> URL: https://issues.apache.org/jira/browse/SQOOP-3361
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
>






[jira] [Updated] (SQOOP-3361) Test compressing imported data with S3

2018-10-11 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3361:
--
Priority: Minor  (was: Major)

> Test compressing imported data with S3
> --
>
> Key: SQOOP-3361
> URL: https://issues.apache.org/jira/browse/SQOOP-3361
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Minor
>






[jira] [Updated] (SQOOP-3384) Document import into external Hive table backed by S3

2018-10-11 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3384:
--
Affects Version/s: 1.4.7

> Document import into external Hive table backed by S3
> -
>
> Key: SQOOP-3384
> URL: https://issues.apache.org/jira/browse/SQOOP-3384
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3384.patch
>
>






[jira] [Updated] (SQOOP-3376) Test import into external Hive table backed by S3

2018-10-11 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3376:
--
Affects Version/s: 1.4.7

> Test import into external Hive table backed by S3
> -
>
> Key: SQOOP-3376
> URL: https://issues.apache.org/jira/browse/SQOOP-3376
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3376.patch, SQOOP-3376.patch, SQOOP-3376.patch, 
> SQOOP-3376.patch, SQOOP-3376.patch
>
>






[jira] [Created] (SQOOP-3391) Test storing AWS credentials in Hadoop CredentialProvider during import

2018-10-11 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3391:
-

 Summary: Test storing AWS credentials in Hadoop CredentialProvider 
during import
 Key: SQOOP-3391
 URL: https://issues.apache.org/jira/browse/SQOOP-3391
 Project: Sqoop
  Issue Type: Sub-task
Affects Versions: 1.4.7
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed








[jira] [Updated] (SQOOP-3390) Document S3Guard usage with Sqoop

2018-10-11 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3390?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3390:
--
Affects Version/s: 1.4.7

> Document S3Guard usage with Sqoop
> -
>
> Key: SQOOP-3390
> URL: https://issues.apache.org/jira/browse/SQOOP-3390
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>        Reporter: Boglarka Egyed
>    Assignee: Boglarka Egyed
>Priority: Major
>






[jira] [Updated] (SQOOP-3384) Document import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3384:
--
Attachment: SQOOP-3384.patch

> Document import into external Hive table backed by S3
> -
>
> Key: SQOOP-3384
> URL: https://issues.apache.org/jira/browse/SQOOP-3384
> Project: Sqoop
>  Issue Type: Sub-task
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3384.patch
>
>






Review Request 68979: SQOOP-3384: Document import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68979/
---

Review request for Sqoop, Fero Szabo and Szabolcs Vasas.


Bugs: SQOOP-3384
https://issues.apache.org/jira/browse/SQOOP-3384


Repository: sqoop-trunk


Description
---

Document import into external Hive table backed by S3


Diffs
-

  src/docs/user/s3.txt 3724454d7efda6b390a5984d9be44d20c404f766 


Diff: https://reviews.apache.org/r/68979/diff/1/


Testing
---

ant clean docs
./gradlew docs


Thanks,

Boglarka Egyed



[jira] [Reopened] (SQOOP-3376) Test import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reopened SQOOP-3376:
---

> Test import into external Hive table backed by S3
> -
>
> Key: SQOOP-3376
> URL: https://issues.apache.org/jira/browse/SQOOP-3376
> Project: Sqoop
>  Issue Type: Sub-task
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3376.patch, SQOOP-3376.patch, SQOOP-3376.patch, 
> SQOOP-3376.patch, SQOOP-3376.patch
>
>






[jira] [Updated] (SQOOP-3376) Test import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3376:
--
Attachment: SQOOP-3376.patch

> Test import into external Hive table backed by S3
> -
>
> Key: SQOOP-3376
> URL: https://issues.apache.org/jira/browse/SQOOP-3376
> Project: Sqoop
>  Issue Type: Sub-task
>    Reporter: Boglarka Egyed
>        Assignee: Boglarka Egyed
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3376.patch, SQOOP-3376.patch, SQOOP-3376.patch, 
> SQOOP-3376.patch, SQOOP-3376.patch
>
>






Re: Review Request 68712: SQOOP-3376: Test import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed


> On Oct. 10, 2018, 11:26 a.m., Szabolcs Vasas wrote:
> > Hi Bogi,
> > 
> > Thank you for improving the patch, it is much more concise now.
> > I have run the tests successfully; however, I noticed that when I run the 
> > tests without the S3 generator command the new test fails and does not get 
> > skipped.
> > I am not sure but the ExpectedException could have something to do with the 
> > failure.
> > Can you please check this?

Hi Szabolcs, 

Thanks, I missed checking this scenario with the new test case. The skip logic 
works as expected, but a null check was missing from the After method.

Thank you,
Bogi


- Boglarka


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68712/#review209401
-------


On Oct. 10, 2018, 11:59 a.m., Boglarka Egyed wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68712/
> ---
> 
> (Updated Oct. 10, 2018, 11:59 a.m.)
> 
> 
> Review request for Sqoop, daniel voros, Fero Szabo, and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3376
> https://issues.apache.org/jira/browse/SQOOP-3376
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Testing the Text and Parquet imports into an external Hive table backed by S3.
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/HiveServer2TestUtil.java 
> 799370816cccda7578d7c64add6e283d3123e1c8 
>   src/test/org/apache/sqoop/testutil/S3TestUtils.java 
> 0e6ef5bf001797aa70a7ad50d261c6fd384222fe 
> 
> 
> Diff: https://reviews.apache.org/r/68712/diff/5/
> 
> 
> Testing
> ---
> 
> ./gradlew test -Ds3.bucket.url= 
> -Ds3.generator.command=
> 
> 
> Thanks,
> 
> Boglarka Egyed
> 
>



Re: Review Request 68712: SQOOP-3376: Test import into external Hive table backed by S3

2018-10-10 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68712/
---

(Updated Oct. 10, 2018, 11:59 a.m.)


Review request for Sqoop, daniel voros, Fero Szabo, and Szabolcs Vasas.


Changes
---

Removed unnecessary ExpectedException rule, added null check for 
hiveMiniCluster stopping in After method.


Bugs: SQOOP-3376
https://issues.apache.org/jira/browse/SQOOP-3376


Repository: sqoop-trunk


Description
---

Testing the Text and Parquet imports into an external Hive table backed by S3.


Diffs (updated)
-

  src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java PRE-CREATION 
  src/test/org/apache/sqoop/testutil/HiveServer2TestUtil.java 
799370816cccda7578d7c64add6e283d3123e1c8 
  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
0e6ef5bf001797aa70a7ad50d261c6fd384222fe 


Diff: https://reviews.apache.org/r/68712/diff/5/

Changes: https://reviews.apache.org/r/68712/diff/4-5/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



[jira] [Created] (SQOOP-3390) Document S3Guard usage with Sqoop

2018-10-09 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3390:
-

 Summary: Document S3Guard usage with Sqoop
 Key: SQOOP-3390
 URL: https://issues.apache.org/jira/browse/SQOOP-3390
 Project: Sqoop
  Issue Type: Sub-task
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed








[jira] [Commented] (SQOOP-3389) Unable to build Sqoop 1.4.7 due to upstream TLS 1.2 issue

2018-10-09 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16642951#comment-16642951
 ] 

Boglarka Egyed commented on SQOOP-3389:
---

[~devin.bost] your error log seems very odd to me; all the URLs look like this:
{noformat}
url=https://repository.cloudera.com/content/repositories/releases/x{noformat}
I don't understand why it tries to resolve every dependency from 
{{repository.cloudera.com}}. Do you perhaps use some extra configuration in your 
environment? Where did you get the Sqoop source from?

> Unable to build Sqoop 1.4.7 due to upstream TLS 1.2 issue
> -
>
> Key: SQOOP-3389
> URL: https://issues.apache.org/jira/browse/SQOOP-3389
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Affects Versions: 1.4.7
> Environment: Sqoop 1.4.7
> Java SDK 1.6.0_45 
> Ant 1.7.1
> Windows 10 via MinGW64
>Reporter: Devin G. Bost
>Priority: Blocker
>  Labels: build
> Attachments: sqoop_build_errors.txt
>
>
> When building Sqoop 1.4.7 with Java SDK 1.6.0_45 and Ant 1.7.1 on Windows in 
> MinGW64, I obtain these build errors:
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/releases/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.pom
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/releases/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.jar
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/staging/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.pom
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=[https://repository.cloudera.com/content/repositories/staging/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.jar]
>  
> I experienced similar build errors on a different project, which I traced to 
> this issue: 
> [https://stackoverflow.com/questions/21245796/javax-net-ssl-sslhandshakeexception-remote-host-closed-connection-during-handsh/22629008]
> The problem, however, is that because this particular build of Sqoop requires 
> Java 1.6, TLS v1.2 is unsupported, according to here: 
> [https://stackoverflow.com/questions/33364100/how-to-use-tls-1-2-in-java-6]
> which is a problem because some public repositories have dropped support for 
> TLS versions prior to 1.2, as reported here: 
> [https://github.com/Microsoft/vcpkg/issues/2969]
> and here: 
> [https://blog.github.com/2018-02-23-weak-cryptographic-standards-removed/]
> If it is impossible now to pull the upstream dependencies when building via 
> Ant due to TLS 1.2 being unsupported in Java 1.6, then this is a critical 
> dependency conflict. 
> If additional configuration steps are required to be able to successfully 
> build, they are not documented in any of the Sqoop documentation that I have 
> found. 
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 68541: SQOOP-3104: Create test categories instead of test suites and naming conventions

2018-10-05 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68541/#review209262
---



Hi Nguyen,

Thank you so much for the patch update and sorry for the late review!

I ran ./gradlew test but many tests in the integrationPlainTest category failed 
for me because they were missing the JDBC driver. So should I pass the drivers via 
the -Dsqoop.thirdparty.lib.dir= option? 
This is not clear to me and I couldn't find it in the description either.

Also, where can I find some guidelines (command examples) about how to run 
specific categories?

Thank you in advance,
Bogi

- Boglarka Egyed


On Sept. 23, 2018, 2:01 a.m., Nguyen Truong wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68541/
> ---
> 
> (Updated Sept. 23, 2018, 2:01 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3104
> https://issues.apache.org/jira/browse/SQOOP-3104
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> We are currently using test naming conventions to differentiate between 
> ManualTests, Unit tests and 3rd party tests. Instead of that, I implemented 
> JUnit categories which will allow us to have more categories in the future. 
> This would also remove the reliance on the test class name.
> 
> Test categories skeleton:
>   SqoopTest _ UnitTest
>   |__ IntegrationTest
>   |__ ManualTest
> 
>   ThirdPartyTest _ CubridTest
>|__ Db2Test
>|__ MainFrameTest
>|__ MysqlTest
>|__ NetezzaTest
>|__ OracleTest
>|__ PostgresqlTest
>|__ SqlServerTest
> 
>   KerberizedTest
> 
> Categories explanation:
> * SqoopTest: Group of the big categories, including:
> - UnitTest: It tests one class only, with its dependencies mocked (or 
> kept, if a dependency
> is lightweight). It must not start a minicluster or an 
> hsqldb database.
> It does not need JDBC drivers.
> - IntegrationTest: It usually tests a whole scenario. It may start up 
> miniclusters,
> hsqldb and connect to external resources like RDBMSs.
> - ManualTest: This should be a deprecated category which should not 
> be used in the future.
> It only exists to mark the currently existing manual tests.
> * ThirdPartyTest: An orthogonal hierarchy for tests that need a JDBC 
> driver and/or a docker
> container/external RDBMS instance to run. Subcategories express what kind 
> of external
> resource the test needs. E.g: OracleTest needs an Oracle RDBMS and Oracle 
> driver on the classpath
> * KerberizedTest: Test that needs Kerberos, which needs to be run on a 
> separate JVM.
> 
> Opinions are very welcomed. Thanks!
> 
> 
> Diffs
> -
> 
>   build.gradle fc7fc0c4c 
>   src/test/org/apache/sqoop/TestConnFactory.java fb6c94059 
>   src/test/org/apache/sqoop/TestIncrementalImport.java 29c477954 
>   src/test/org/apache/sqoop/TestSqoopOptions.java e55682edf 
>   src/test/org/apache/sqoop/accumulo/TestAccumuloUtil.java 631eeff5e 
>   src/test/org/apache/sqoop/authentication/TestKerberosAuthenticator.java 
> f5700ce65 
>   src/test/org/apache/sqoop/db/TestDriverManagerJdbcConnectionFactory.java 
> 244831672 
>   
> src/test/org/apache/sqoop/db/decorator/TestKerberizedConnectionFactoryDecorator.java
>  d3e3fb23e 
>   src/test/org/apache/sqoop/hbase/HBaseImportAddRowKeyTest.java c4caafba5 
>   src/test/org/apache/sqoop/hbase/HBaseKerberizedConnectivityTest.java 
> 3bfb39178 
>   src/test/org/apache/sqoop/hbase/HBaseUtilTest.java c6a808c33 
>   src/test/org/apache/sqoop/hbase/TestHBasePutProcessor.java e78a535f4 
>   src/test/org/apache/sqoop/hcat/TestHCatalogBasic.java ba05cabbb 
>   
> src/test/org/apache/sqoop/hive/HiveServer2ConnectionFactoryInitializerTest.java
>  4d2cb2f88 
>   src/test/org/apache/sqoop/hive/TestHiveClientFactory.java a3c2dc939 
>   src/test/org/apache/sqoop/hive/TestHiveMiniCluster.java 419f888c0 
>   src/test/org/apache/sqoop/hive/TestHiveServer2Client.java 02617295e 
>   src/test/org/apache/sqoop/hive/TestHiveServer2ParquetImport.java b55179a4f 
>   src/test/org/apache/sqoop/hive/TestHiveServer2TextImport.java 410724f37 
>   src/test/org/apache/sqoop/hive/TestHiveTypesForAvroTypeMapping.java 
> 276e9eaa4 
>   src/test/org/apache/sqoop/h
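The category skeleton described in the review above boils down to marker interfaces plus an `isAssignableFrom` check, which is what lets a subcategory such as OracleTest also count as a ThirdPartyTest. Below is a minimal, standalone sketch of that idea; the names and the hand-rolled reflection are illustrative only (the actual patch uses JUnit's `@Category` annotation and the Categories runner, not this code):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Stand-in for JUnit's @Category: tags a test method with a marker interface.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface TestCategory { Class<?> value(); }

// Marker interfaces mirroring the skeleton in the review (assumed names).
interface SqoopTest {}
interface UnitTest extends SqoopTest {}
interface ThirdPartyTest {}
interface OracleTest extends ThirdPartyTest {}

public class CategoryFilterSketch {

    @TestCategory(UnitTest.class)
    public void unitCheck() {}

    @TestCategory(OracleTest.class)
    public void oracleCheck() {}

    // Collect the names of methods whose category is a subtype of the
    // requested one -- this subtype check is why selecting ThirdPartyTest
    // also picks up OracleTest, Db2Test, etc.
    static List<String> select(Class<?> testClass, Class<?> wanted) {
        List<String> names = new ArrayList<String>();
        for (Method m : testClass.getDeclaredMethods()) {
            TestCategory c = m.getAnnotation(TestCategory.class);
            if (c != null && wanted.isAssignableFrom(c.value())) {
                names.add(m.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println(select(CategoryFilterSketch.class, ThirdPartyTest.class));
        System.out.println(select(CategoryFilterSketch.class, SqoopTest.class));
    }
}
```

With JUnit itself, the same selection is done by the Categories runner (or a build-tool include/exclude filter) rather than by application code.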

[jira] [Updated] (SQOOP-1312) One of mappers does not load data from mySql if double column is used as split key

2018-10-05 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-1312:
--
Affects Version/s: (was: 1.4.4)
   1.4.7

> One of mappers does not load data from mySql if double column is used as 
> split key
> --
>
> Key: SQOOP-1312
> URL: https://issues.apache.org/jira/browse/SQOOP-1312
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.7
>Reporter: Jong Ho Lee
>Assignee: Jong Ho Lee
>Priority: Major
> Attachments: splitter.patch, splitter.patch
>
>
> When we used Sqoop to load data from MySQL using one double column as 
> split-key in Samsung SDS,
>   the last mapper did not load data from MySQL at all. 
>   The number of mappers was sometimes increased by 1.
>   I think both problems were caused by bugs in FloatSplitter.java.
>   For the last split, lowClausePrefix + Double.toString(curUpper) should perhaps be 
> lowClausePrefix + Double.toString(curLower).
>   In the while (curUpper < maxVal) loop, because of round-off error, 
>   minVal + splitSize * numSplits can be smaller than maxVal.
>   Therefore, using a for-loop would be better.
>   Attached is a proposed new FloatSplitter.java.
> {code}
> /**
>  * Licensed to the Apache Software Foundation (ASF) under one
>  * or more contributor license agreements.  See the NOTICE file
>  * distributed with this work for additional information
>  * regarding copyright ownership.  The ASF licenses this file
>  * to you under the Apache License, Version 2.0 (the
>  * "License"); you may not use this file except in compliance
>  * with the License.  You may obtain a copy of the License at
>  *
>  * http://www.apache.org/licenses/LICENSE-2.0
>  *
>  * Unless required by applicable law or agreed to in writing, software
>  * distributed under the License is distributed on an "AS IS" BASIS,
>  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>  * See the License for the specific language governing permissions and
>  * limitations under the License.
>  */
> // modified by Jongho Lee at Samsung SDS.
> package org.apache.sqoop.mapreduce.db;
> import java.sql.ResultSet;
> import java.sql.SQLException;
> import java.util.ArrayList;
> import java.util.List;
> import org.apache.commons.logging.Log;
> import org.apache.commons.logging.LogFactory;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.mapreduce.InputSplit;
> import com.cloudera.sqoop.config.ConfigurationHelper;
> import com.cloudera.sqoop.mapreduce.db.DBSplitter;
> import com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat;
> /**
>  * Implement DBSplitter over floating-point values.
>  */
> public class FloatSplitter implements DBSplitter  {
>   private static final Log LOG = LogFactory.getLog(FloatSplitter.class);
>   private static final double MIN_INCREMENT = 1 * Double.MIN_VALUE;
>   public List<InputSplit> split(Configuration conf, ResultSet results,
>   String colName) throws SQLException {
> LOG.warn("Generating splits for a floating-point index column. Due to 
> the");
> LOG.warn("imprecise representation of floating-point values in Java, 
> this");
> LOG.warn("may result in an incomplete import.");
> LOG.warn("You are strongly encouraged to choose an integral split 
> column.");
> List<InputSplit> splits = new ArrayList<InputSplit>();
> if (results.getString(1) == null && results.getString(2) == null) {
>   // Range is null to null. Return a null split accordingly.
>   splits.add(new DataDrivenDBInputFormat.DataDrivenDBInputSplit(
>   colName + " IS NULL", colName + " IS NULL"));
>   return splits;
> }
> double minVal = results.getDouble(1);
> double maxVal = results.getDouble(2);
> // Use this as a hint. May need an extra task if the size doesn't
> // divide cleanly.
> int numSplits = ConfigurationHelper.getConfNumMaps(conf);
> double splitSize = (maxVal - minVal) / (double) numSplits;
> if (splitSize < MIN_INCREMENT) {
>   splitSize = MIN_INCREMENT;
> }
> String lowClausePrefix = colName + " >= ";
> String highClausePrefix = colName + " < ";
> double curLower = minVal;
> double curUpper = curLower + splitSize;
> for (int i = 0; i < numSplits - 1; i++) {
>   // while (curUpper < maxVal) {  // changed to for loop
&g
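The round-off problem the report describes can be reproduced outside Sqoop. The following standalone sketch (not Sqoop code; the accumulate helper is illustrative only) shows that adding splitSize to minVal numSplits times need not reach maxVal, which is why a `while (curUpper < maxVal)` loop can emit one extra split and why the patch switches to a bounded for-loop:

```java
public class SplitRoundoffDemo {

    // Mimics how curUpper advances in FloatSplitter: repeated addition
    // of splitSize, each step rounded to the nearest double.
    static double accumulate(double minVal, double splitSize, int numSplits) {
        double acc = minVal;
        for (int i = 0; i < numSplits; i++) {
            acc += splitSize;
        }
        return acc;
    }

    public static void main(String[] args) {
        double minVal = 0.0;
        double maxVal = 1.0;
        int numSplits = 10;
        // 0.1 has no exact binary representation, so each addition rounds.
        double splitSize = (maxVal - minVal) / numSplits;

        double end = accumulate(minVal, splitSize, numSplits);
        System.out.println(end < maxVal);   // prints "true"
        System.out.println(maxVal - end);   // a tiny positive gap remains
    }
}
```

Because `end < maxVal` here, a `while (curUpper < maxVal)` loop would run once more than intended, producing the extra (and possibly empty) final split observed in the report.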

[jira] [Commented] (SQOOP-1312) One of mappers does not load data from mySql if double column is used as split key

2018-10-05 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16639637#comment-16639637
 ] 

Boglarka Egyed commented on SQOOP-1312:
---

I have updated the Affects Version to include 1.4.6 and 1.4.7 too. The patch 
provided in [https://reviews.apache.org/r/25621/] seems to be abandoned; the last 
update was 4 years ago, so I think the fix is up for grabs now.

> One of mappers does not load data from mySql if double column is used as 
> split key
> --
>
> Key: SQOOP-1312
> URL: https://issues.apache.org/jira/browse/SQOOP-1312
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.4, 1.4.6, 1.4.7
>Reporter: Jong Ho Lee
>Assignee: Jong Ho Lee
>Priority: Major
> Attachments: splitter.patch, splitter.patch
>
>

[jira] [Updated] (SQOOP-1312) One of mappers does not load data from mySql if double column is used as split key

2018-10-05 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-1312:
--
Affects Version/s: 1.4.6
   1.4.7

> One of mappers does not load data from mySql if double column is used as 
> split key
> --
>
> Key: SQOOP-1312
> URL: https://issues.apache.org/jira/browse/SQOOP-1312
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.4, 1.4.6, 1.4.7
>Reporter: Jong Ho Lee
>Assignee: Jong Ho Lee
>Priority: Major
> Attachments: splitter.patch, splitter.patch
>
>

[jira] [Updated] (SQOOP-1312) One of mappers does not load data from mySql if double column is used as split key

2018-10-05 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-1312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-1312:
--
Affects Version/s: (was: 1.4.7)
   1.4.4

> One of mappers does not load data from mySql if double column is used as 
> split key
> --
>
> Key: SQOOP-1312
> URL: https://issues.apache.org/jira/browse/SQOOP-1312
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.4
>Reporter: Jong Ho Lee
>Assignee: Jong Ho Lee
>Priority: Major
> Attachments: splitter.patch, splitter.patch
>
>

[jira] [Commented] (SQOOP-3389) Unable to build Sqoop 1.4.7 due to upstream TLS 1.2 issue

2018-10-05 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16639622#comment-16639622
 ] 

Boglarka Egyed commented on SQOOP-3389:
---

Hi [~devin.bost],

The Java target version in Sqoop 1.4.7 is 1.7, see these lines in the build.xml 
file: [https://github.com/apache/sqoop/blob/branch-1.4.7/build.xml#L118-L119]. 
Unfortunately, the COMPILING and README text files contain misleading 
information about using version 1.6. By the way, it will be bumped and corrected 
to 1.8 in the very next release.

Could you please try to build Sqoop 1.4.7 with Java 1.7?

Thank you,
Bogi

> Unable to build Sqoop 1.4.7 due to upstream TLS 1.2 issue
> -
>
> Key: SQOOP-3389
> URL: https://issues.apache.org/jira/browse/SQOOP-3389
> Project: Sqoop
>  Issue Type: Bug
>  Components: build
>Affects Versions: 1.4.7
> Environment: Sqoop 1.4.7
> Java SDK 1.6.0_45 
> Ant 1.7.1
> Windows 10 via MinGW64
>Reporter: Devin G. Bost
>Priority: Blocker
>  Labels: build
>
> When building Sqoop 1.4.7 with Java SDK 1.6.0_45 and Ant 1.7.1 on Windows in 
> MinGW64, I obtain these build errors:
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/releases/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.pom
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/releases/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.jar
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=https://repository.cloudera.com/content/repositories/staging/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.pom
> [ivy:resolve] Server access Error: Remote host closed connection during 
> handshake 
> url=[https://repository.cloudera.com/content/repositories/staging/org/apache/avro/avro-mapred/1.8.1/avro-mapred-1.8.1-hadoop2.jar]
>  
> I experienced similar build errors on a different project, which I traced to 
> this issue: 
> [https://stackoverflow.com/questions/21245796/javax-net-ssl-sslhandshakeexception-remote-host-closed-connection-during-handsh/22629008]
> The problem, however, is that because this particular build of Sqoop requires 
> Java 1.6, TLS v1.2 is unsupported, according to here: 
> [https://stackoverflow.com/questions/33364100/how-to-use-tls-1-2-in-java-6]
> which is a problem because some public repositories have dropped support for 
> TLS versions prior to 1.2, as reported here: 
> [https://github.com/Microsoft/vcpkg/issues/2969]
> and here: 
> [https://blog.github.com/2018-02-23-weak-cryptographic-standards-removed/]
> If it is impossible now to pull the upstream dependencies when building via 
> Ant due to TLS 1.2 being unsupported in Java 1.6, then this is a critical 
> dependency conflict. 
> If additional configuration steps are required to be able to successfully 
> build, they are not documented in any of the Sqoop documentation that I have 
> found. 
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
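A quick way to confirm the TLS limitation discussed in SQOOP-3389 is to print the protocols the running JVM's default SSL context will negotiate; on Java 6 the list stops at TLSv1, which is consistent with repositories that require TLS 1.2 resetting the handshake. This is a generic JSSE check, not part of the Sqoop build:

```java
import java.util.Arrays;
import javax.net.ssl.SSLContext;

public class TlsProtocolCheck {
    public static void main(String[] args) throws Exception {
        SSLContext ctx = SSLContext.getDefault();
        // Protocols the default context enables for client handshakes.
        // If "TLSv1.2" is missing, downloads from TLS-1.2-only hosts
        // will fail during the handshake, as seen in the ivy:resolve errors.
        System.out.println(Arrays.toString(
                ctx.getDefaultSSLParameters().getProtocols()));
    }
}
```

Running this under the JDK used for the build quickly tells whether the failures are a JVM capability problem or something environment-specific.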


[jira] [Assigned] (SQOOP-3387) Include Column-Remarks

2018-10-05 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3387?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed reassigned SQOOP-3387:
-

Assignee: Tomas Sebastian Hätälä

> Include Column-Remarks
> --
>
> Key: SQOOP-3387
> URL: https://issues.apache.org/jira/browse/SQOOP-3387
> Project: Sqoop
>  Issue Type: Wish
>  Components: connectors, metastore
>Affects Versions: 1.4.7
>Reporter: Tomas Sebastian Hätälä
>Assignee: Tomas Sebastian Hätälä
>Priority: Critical
>  Labels: easy-fix, features, pull-request-available
> Fix For: 1.5.0
>
>
> In most RDBMS it is possible to enter comments/ remarks for table and view 
> columns. That way a user can obtain additional information regarding the data 
> and how to use it.
> With the Avro file format it would be possible to store this information in 
> the schema file using the "doc" tag. At the moment this is, however, left 
> blank.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3387) Include Column-Remarks

2018-10-05 Thread Boglarka Egyed (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3387?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16639610#comment-16639610
 ] 

Boglarka Egyed commented on SQOOP-3387:
---

Hi [~hatala91],


Thank you for opening an issue for your contribution!

We usually use the Review Board for reviewing patches.
You could make it easier for your peers to review and accept your change by 
uploading your patch for review there and attach it here. The process at the 
moment is roughly this:

* Upload the patch to Jira by the "Submit Patch" button. Make sure the issue 
has a status of "Patch Available" afterwards.
* Upload the patch to the Review Board 
(https://reviews.apache.org) as well so other 
contributors can get familiar with your change. To keep the review accessible 
to your peers, fill in the following fields:
** Summary: generate your summary using the issue's Jira key + Jira title
** Groups: add the relevant group so everyone on the project will know about 
your patch ( Sqoop )
** Bugs: add the issue's Jira key so it's easy to navigate to the Jira side
** Repository: sqoop-trunk for Sqoop1 and sqoop-sqoop2 for Sqoop2
** (And as soon as the patch gets committed, it's very useful for the community 
if you close the review and mark it as "Submitted" at the Review Board. The 
button to do this is at the top right of your own review requests, right next to 
the Download Diff button.)
* Please add the link of the review as an external/web link in the Jira ticket 
so it's easy to navigate to the reviews side from Jira as well.

Thank you,
Bogi

> Include Column-Remarks
> --
>
> Key: SQOOP-3387
> URL: https://issues.apache.org/jira/browse/SQOOP-3387
> Project: Sqoop
>  Issue Type: Wish
>  Components: connectors, metastore
>Affects Versions: 1.4.7
>Reporter: Tomas Sebastian Hätälä
>Priority: Critical
>  Labels: easy-fix, features, pull-request-available
> Fix For: 1.5.0
>
>
> In most RDBMS it is possible to enter comments/ remarks for table and view 
> columns. That way a user can obtain additional information regarding the data 
> and how to use it.
> With the Avro file format it would be possible to store this information in 
> the schema file using the "doc" tag. At the moment this is, however, left 
> blank.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 67407: SQOOP-3327: Mainframe FTP needs to Include "Migrated" datasets when parsing the FTP list

2018-10-05 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/67407/#review209253
---


Fix it, then Ship it!




Hi Chris,

Thanks for this patch! Unit and 3rd party tests passed for me but I have a 
minor finding regarding file formatting, please take a look.

Thank you,
Bogi


src/java/org/apache/sqoop/mapreduce/mainframe/MainframeFTPFileEntryParser.java
Lines 32-35 (original), 32-45 (patched)
<https://reviews.apache.org/r/67407/#comment293556>

Indentation is not aligned with the style used in this file; this applies to 
the following lines too.


- Boglarka Egyed


On Sept. 21, 2018, 10:40 a.m., Chris Teoh wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/67407/
> ---
> 
> (Updated Sept. 21, 2018, 10:40 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Mainframe FTP needs to Include "Migrated" datasets when parsing the FTP list. 
> Initially, these were excluded out of the regular expression.
> 
> 
> Diffs
> -
> 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeFTPFileEntryParser.java
>  f0b87868 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileEntryParser.java
>  eb0f8c00 
> 
> 
> Diff: https://reviews.apache.org/r/67407/diff/5/
> 
> 
> Testing
> ---
> 
> Unit testing.
> 
> 
> Thanks,
> 
> Chris Teoh
> 
>



[jira] [Updated] (SQOOP-3376) Test import into external Hive table backed by S3

2018-10-05 Thread Boglarka Egyed (JIRA)


 [ https://issues.apache.org/jira/browse/SQOOP-3376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Boglarka Egyed updated SQOOP-3376:
--
Attachment: SQOOP-3376.patch

> Test import into external Hive table backed by S3
> -
>
> Key: SQOOP-3376
> URL: https://issues.apache.org/jira/browse/SQOOP-3376
> Project: Sqoop
>  Issue Type: Sub-task
> Reporter: Boglarka Egyed
> Assignee: Boglarka Egyed
> Priority: Major
> Attachments: SQOOP-3376.patch, SQOOP-3376.patch, SQOOP-3376.patch, 
> SQOOP-3376.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 68712: SQOOP-3376: Test import into external Hive table backed by S3

2018-10-04 Thread Boglarka Egyed


> On Oct. 4, 2018, 3:42 p.m., Fero Szabo wrote:
> > Lgtm!
> > 
> > It is interesting to see that you ran into the problem that parameterized 
> > tests don't support multiple dimensions!
> > 
> > In any case, I like the tests as they are now, they are concise enough and 
> > descriptive enough.
> > 
> > My only concern is documentation, and that it should also cover the kinks 
> > and quirks. But I see you've filed a separate Jira for that.

Thanks Fero for taking a look! Yes, all documentation update will be covered in 
SQOOP-3384.


- Boglarka


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68712/#review209226
---


On Oct. 4, 2018, 4:13 p.m., Boglarka Egyed wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68712/
> ---
> 
> (Updated Oct. 4, 2018, 4:13 p.m.)
> 
> 
> Review request for Sqoop, daniel voros, Fero Szabo, and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3376
> https://issues.apache.org/jira/browse/SQOOP-3376
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Testing the Text and Parquet imports into an external Hive table backed by S3.
> 
> 
> Diffs
> -
> 
>   src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java 
> PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/HiveServer2TestUtil.java 
> 799370816cccda7578d7c64add6e283d3123e1c8 
>   src/test/org/apache/sqoop/testutil/S3TestUtils.java 
> 0e6ef5bf001797aa70a7ad50d261c6fd384222fe 
> 
> 
> Diff: https://reviews.apache.org/r/68712/diff/4/
> 
> 
> Testing
> ---
> 
> ./gradlew test -Ds3.bucket.url= 
> -Ds3.generator.command=
> 
> 
> Thanks,
> 
> Boglarka Egyed
> 
>



Re: Review Request 68712: SQOOP-3376: Test import into external Hive table backed by S3

2018-10-04 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68712/
---

(Updated Oct. 4, 2018, 4:13 p.m.)


Review request for Sqoop, daniel voros, Fero Szabo, and Szabolcs Vasas.


Changes
---

Fixed typo


Bugs: SQOOP-3376
https://issues.apache.org/jira/browse/SQOOP-3376


Repository: sqoop-trunk


Description
---

Testing the Text and Parquet imports into an external Hive table backed by S3.


Diffs (updated)
-

  src/test/org/apache/sqoop/s3/TestS3ExternalHiveTableImport.java PRE-CREATION 
  src/test/org/apache/sqoop/testutil/HiveServer2TestUtil.java 
799370816cccda7578d7c64add6e283d3123e1c8 
  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
0e6ef5bf001797aa70a7ad50d261c6fd384222fe 


Diff: https://reviews.apache.org/r/68712/diff/4/

Changes: https://reviews.apache.org/r/68712/diff/3-4/


Testing
---

./gradlew test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



Re: Review Request 67408: SQOOP-3326: Mainframe FTP listing for GDG should filter out non-GDG datasets in a heterogeneous listing

2018-10-04 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/67408/#review209229
---


Fix it, then Ship it!




Hi Chris,

Thank you for your efforts on Mainframe support front!

Your change generally looks good to me and the unit, 3rd party tests passed 
with your patch. I have one minor finding, could you please take a look?

Thanks,
Bogi


src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileGdgEntryParser.java
Lines 54-73 (patched)
<https://reviews.apache.org/r/67408/#comment293544>

"H19761 Tape" and "G0034V00" appear in this code part more than once as 
raw strings. This could become difficult to maintain in the future; I would 
suggest using constants instead.
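
A small sketch of the refactoring suggested above: pull the repeated raw
strings into named constants and build fixture lines from them. The constant
and method names here are illustrative, not taken from the actual test class:

```java
public class ParserTestConstantsSketch {
    // Named constants replacing the raw strings repeated through the test,
    // so a future format change only needs to be made in one place.
    private static final String TAPE_UNIT = "H19761 Tape";
    private static final String GENERATION = "G0034V00";

    // Builds a fixture listing line from the shared constants.
    static String listingLine(String unit, String dataset) {
        return String.format("%s  %s", unit, dataset);
    }

    public static void main(String[] args) {
        System.out.println(listingLine(TAPE_UNIT, GENERATION));
        // prints H19761 Tape  G0034V00
    }
}
```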


- Boglarka Egyed


On Sept. 21, 2018, 10:22 a.m., Chris Teoh wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/67408/
> ---
> 
> (Updated Sept. 21, 2018, 10:22 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Mainframe FTP listing for GDG should filter out non-GDG datasets in a 
> heterogeneous listing: the parser grabs the last file, and when other 
> datasets are mixed in, the latest file may not be the desired dataset.
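
The filtering described above can be sketched as follows, assuming GDG
generations follow the usual GnnnnVnn naming convention and that lexicographic
order tracks the generation number (a simplification of what the real
MainframeFTPFileGdgEntryParser does):

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;
import java.util.regex.Pattern;

public class GdgFilterSketch {
    // Assumed GDG generation name pattern, e.g. "G0034V00".
    private static final Pattern GENERATION = Pattern.compile("G\\d{4}V\\d{2}");

    // Keep only GDG generation entries, then take the latest one, so an
    // unrelated dataset that happens to sort last cannot be picked up.
    static Optional<String> latestGeneration(List<String> datasets) {
        return datasets.stream()
            .filter(d -> GENERATION.matcher(d).matches())
            .max(Comparator.naturalOrder());
    }

    public static void main(String[] args) {
        List<String> mixed = List.of("G0033V00", "G0034V00", "SOME.OTHER.DS");
        System.out.println(latestGeneration(mixed).orElse("none"));
        // prints G0034V00
    }
}
```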
> 
> 
> Diffs
> -
> 
>   build.xml cd2e9e29 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java 
> 9d6a2fe7 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeFTPFileGdgEntryParser.java
>  PRE-CREATION 
>   src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java 654721e3 
>   
> src/scripts/thirdpartytest/docker-compose/sqoop-thirdpartytest-db-services.yml
>  4648f545 
>   src/test/org/apache/sqoop/manager/mainframe/MainframeManagerImportTest.java 
> 3b8ed236 
>   src/test/org/apache/sqoop/manager/mainframe/MainframeTestUtil.java 9f86f6cd 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeFTPFileGdgEntryParser.java
>  PRE-CREATION 
>   src/test/org/apache/sqoop/util/TestMainframeFTPClientUtils.java 90a85194 
> 
> 
> Diff: https://reviews.apache.org/r/67408/diff/3/
> 
> 
> Testing
> ---
> 
> Unit tests. Integration testing locally on developer machine.
> 
> 
> Thanks,
> 
> Chris Teoh
> 
>



Re: Review Request 68606: Error during direct Netezza import/export can interrupt process in uncontrolled ways

2018-10-04 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68606/#review209221
---


Ship it!




Hi Daniel,

Apart from the discussion with Szabolcs about the expected exception handling 
I'm OK with your change. All tests passed.

Thanks,
Bogi

- Boglarka Egyed


On Sept. 3, 2018, 11:32 a.m., daniel voros wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68606/
> ---
> 
> (Updated Sept. 3, 2018, 11:32 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3378
> https://issues.apache.org/jira/browse/SQOOP-3378
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> `SQLException` during a JDBC operation in direct Netezza import/export signals 
> the parent thread to fail fast by interrupting it.
> We're trying to process the interrupt in the parent (main) thread, but 
> there's no guarantee that we're not in some internal call that will process 
> the interrupted flag and reset it before we're able to check.
> 
> It is also possible that the parent thread has passed the "checking part" 
> when it gets interrupted. In case of `NetezzaExternalTableExportMapper` this 
> can interrupt the upload of log files.
> 
> I'd recommend using some other means of communication between the threads 
> than interrupts.
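
One way to realize the recommendation above is explicit shared state instead of
interrupts: the worker records its failure and releases a latch, and the parent
waits on the latch rather than checking `Thread.interrupted()`. This is a
hedged sketch of the idea, not the actual Sqoop fix; all names here are
hypothetical:

```java
import java.sql.SQLException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicReference;

public class StatementRunnerSketch {
    // Explicit cross-thread signalling: the failure (if any) and a latch,
    // instead of Thread.interrupt(), which internal library calls may
    // consume and reset before the parent gets to check it.
    private final AtomicReference<Exception> failure = new AtomicReference<>();
    private final CountDownLatch done = new CountDownLatch(1);

    void runStatement() {
        Thread worker = new Thread(() -> {
            try {
                // Stand-in for the real JDBC work that can fail.
                throw new SQLException("simulated JDBC failure");
            } catch (Exception e) {
                failure.set(e);       // communicate the error explicitly
            } finally {
                done.countDown();     // wake the parent deterministically
            }
        });
        worker.start();
    }

    // Parent-side check: blocks until the worker finished, then inspects
    // the recorded failure; no interrupt state involved.
    Exception awaitResult() throws InterruptedException {
        done.await();
        return failure.get();
    }

    public static void main(String[] args) throws InterruptedException {
        StatementRunnerSketch runner = new StatementRunnerSketch();
        runner.runStatement();
        Exception e = runner.awaitResult();
        System.out.println(e == null ? "ok" : e.getMessage());
        // prints simulated JDBC failure
    }
}
```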
> 
> 
> Diffs
> -
> 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableExportMapper.java
>  5bf21880 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaExternalTableImportMapper.java
>  306062aa 
>   
> src/java/org/apache/sqoop/mapreduce/db/netezza/NetezzaJDBCStatementRunner.java
>  cedfd235 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableExportMapper.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/mapreduce/db/netezza/TestNetezzaExternalTableImportMapper.java
>  PRE-CREATION 
> 
> 
> Diff: https://reviews.apache.org/r/68606/diff/2/
> 
> 
> Testing
> ---
> 
> added new UTs and checked manual Netezza tests (NetezzaExportManualTest, 
> NetezzaImportManualTest)
> 
> 
> Thanks,
> 
> daniel voros
> 
>


