[jira] [Commented] (SQOOP-3077) Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type TIMESTAMP

2018-08-22 Thread Eric Lin (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16589678#comment-16589678
 ] 

Eric Lin commented on SQOOP-3077:
-

I will see if I can add this feature.

> Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type 
> TIMESTAMP
> ---
>
> Key: SQOOP-3077
> URL: https://issues.apache.org/jira/browse/SQOOP-3077
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Markus Kemper
>Assignee: Eric Lin
>Priority: Major
>
> Please consider adding support for --hcatalog import with TIMESTAMP columns; the Avro 
> Specification suggests that Logical Types support TIMESTAMP.
> Avro Doc:
> https://avro.apache.org/docs/1.8.1/spec.html#Logical+Types
> {noformat}
> #
> # STEP 01 - Setup Table and Data
> #
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "drop table t1_dates"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "create table t1_dates (c1_int integer, c2_date date, c3_timestamp timestamp)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "select dbms_metadata.get_ddl('TABLE', 'T1_DATES', 'SQOOP') from dual"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "insert into t1_dates values (1, current_date, current_timestamp)"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query 
> "select * from t1_dates"
> Output:
> ----------------------------------------------------
> | DBMS_METADATA.GET_DDL('TABLE','T1_DATES','SQOOP') |
> ----------------------------------------------------
> |   CREATE TABLE "SQOOP"."T1_DATES"
> |    (  "C1_INT" NUMBER(*,0),
> |       "C2_DATE" DATE,
> |       "C3_TIMESTAMP" TIMESTAMP (6)
> |    ) SEGMENT CREATION DEFERRED
> |   PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
> |  NOCOMPRESS LOGGING
> |   TABLESPACE "SQOOP"                              |
> ----------------------------------------------------
> ------------------------------------------------------------------------
> | C1_INT    | C2_DATE               | C3_TIMESTAMP                     |
> ------------------------------------------------------------------------
> | 1         | 2016-12-10 15:48:23.0 | 2016-12-10 15:48:23.707327       |
> ------------------------------------------------------------------------
> #
> # STEP 02 - Import with Text Format
> #
> beeline -u jdbc:hive2:// -e "use default; drop table t1_dates_text;"
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table 
> T1_DATES --hcatalog-database default --hcatalog-table t1_dates_text 
> --create-hcatalog-table --hcatalog-storage-stanza 'stored as textfile' 
> --num-mappers 1 --map-column-hive c2_date=date,c3_timestamp=timestamp
> beeline -u jdbc:hive2:// -e "use default; describe t1_dates_text; select * 
> from t1_dates_text;"
> +----------------------------------------------------------------+
> | createtab_stmt                                                 |
> +----------------------------------------------------------------+
> | CREATE TABLE `t1_dates_text`(                                  |
> |   `c1_int` decimal(38,0),                                      |
> |   `c2_date` date,                                              |
> |   `c3_timestamp` timestamp)                                    |
> | ROW FORMAT SERDE                                               |
> |   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'         |
> | STORED AS INPUTFORMAT                                          |
> |   'org.apache.hadoop.mapred.TextInputFormat'                   |
> | OUTPUTFORMAT                                                   |
> |   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' |
> | LOCATION                                                       |
> |   'hdfs://nameservice1/user/hive/warehouse/t1_dates_text'      |
> | TBLPROPERTIES (                                                |
> |   'transient_lastDdlTime'='1481386391')                        |
> +----------------------------------------------------------------+
> +-----------------------+------------------------+-----------------------------+
> | t1_dates_text.c1_int  | t1_dates_text.c2_date  | t1_dates_text.c3_timestamp  |
> +-----------------------+------------------------+-----------------------------+
> | 1                     | 2016-12-10             | 2016-12-10 15:48:23.707327  |
> +-----------------------+------------------------+-----------------------------+
> #
> # STEP 03 - Import with Avro Format (default)
> #
> beeline -u jdbc:hive2:// -e "use default; drop table t1_dates_text;"
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table 
> T1_DATES --hcatalog-database default --hcatalog-table t1_dates_avro 
> --create-hcatalog-table 

[jira] [Assigned] (SQOOP-3077) Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type TIMESTAMP

2018-08-22 Thread Eric Lin (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3077?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Lin reassigned SQOOP-3077:
---

Assignee: Eric Lin

> Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type 
> TIMESTAMP
> ---
>
> Key: SQOOP-3077
> URL: https://issues.apache.org/jira/browse/SQOOP-3077
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Markus Kemper
>Assignee: Eric Lin
>Priority: Major
>
> Please consider adding support for --hcatalog import with TIMESTAMP columns; the Avro 
> Specification suggests that Logical Types support TIMESTAMP.
> Avro Doc:
> https://avro.apache.org/docs/1.8.1/spec.html#Logical+Types

[jira] [Commented] (SQOOP-3077) Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type TIMESTAMP

2018-08-22 Thread Eric Lin (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16589565#comment-16589565
 ] 

Eric Lin commented on SQOOP-3077:
-

This is fixed upstream via https://issues.apache.org/jira/browse/HIVE-8131, but it 
is not available in CDH yet, so I think the upstream Sqoop change should be able 
to go ahead.

> Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type 
> TIMESTAMP
> ---
>
> Key: SQOOP-3077
> URL: https://issues.apache.org/jira/browse/SQOOP-3077
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Markus Kemper
>Priority: Major
>
> Please consider adding support for --hcatalog import with TIMESTAMP columns; the Avro 
> Specification suggests that Logical Types support TIMESTAMP.
> Avro Doc:
> https://avro.apache.org/docs/1.8.1/spec.html#Logical+Types

[jira] [Commented] (SQOOP-3077) Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type TIMESTAMP

2018-08-22 Thread Eric Lin (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16589549#comment-16589549
 ] 

Eric Lin commented on SQOOP-3077:
-

It looks like Hive itself does not support the Avro timestamp logical type yet:

https://github.com/cloudera/hive/blob/cdh5-1.1.0_5.15.0/serde/src/java/org/apache/hadoop/hive/serde2/avro/TypeInfoToSchema.java#L102-L166
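For reference, the Avro 1.8 specification models timestamps as long values annotated with a logical type, which is what this support would need to emit for a column such as C3_TIMESTAMP in the repro. A minimal sketch (the field name is taken from the repro above; choosing timestamp-micros over timestamp-millis is an assumption based on Oracle's TIMESTAMP(6) microsecond precision):

```shell
# Sketch of an Avro schema field carrying a timestamp as a logical type.
# The field name mirrors the repro; timestamp-micros is an assumption.
avro_field='{"name": "C3_TIMESTAMP", "type": {"type": "long", "logicalType": "timestamp-micros"}}'
echo "$avro_field"
```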

> Add support for (import + --hcatalog + --as-avrodatafile) with RDBMS type 
> TIMESTAMP
> ---
>
> Key: SQOOP-3077
> URL: https://issues.apache.org/jira/browse/SQOOP-3077
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: Markus Kemper
>Priority: Major
>
> Please consider adding support for --hcatalog import with TIMESTAMP columns; the Avro 
> Specification suggests that Logical Types support TIMESTAMP.
> Avro Doc:
> https://avro.apache.org/docs/1.8.1/spec.html#Logical+Types

[jira] [Updated] (SQOOP-3368) Add fail-fast scenarios to S3 incremental import use cases without --temporary-rootdir option

2018-08-22 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3368?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3368:
--
Summary: Add fail-fast scenarios to S3 incremental import use cases without 
--temporary-rootdir option  (was: Add fail-fast scenarios to S3 incremental 
import use cases without --temporary-rootdir oprion)

> Add fail-fast scenarios to S3 incremental import use cases without 
> --temporary-rootdir option
> -
>
> Key: SQOOP-3368
> URL: https://issues.apache.org/jira/browse/SQOOP-3368
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
>
> The current implementation of Sqoop treats HDFS as the default filesystem, 
> i.e. it creates temporary directories on HDFS in the case of incremental append 
> or merge imports. To make these incremental import use cases work with S3, the 
> user needs to set {{--temporary-rootdir}} to an S3 location properly.
> There should be fail-fast scenarios without the {{--temporary-rootdir}} 
> option, as well as documentation for this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (SQOOP-3368) Add fail-fast scenarios to S3 incremental import use cases without --temporary-rootdir

2018-08-22 Thread Boglarka Egyed (JIRA)
Boglarka Egyed created SQOOP-3368:
-

 Summary: Add fail-fast scenarios to S3 incremental import use 
cases without --temporary-rootdir
 Key: SQOOP-3368
 URL: https://issues.apache.org/jira/browse/SQOOP-3368
 Project: Sqoop
  Issue Type: Sub-task
Affects Versions: 1.4.7
Reporter: Boglarka Egyed
Assignee: Boglarka Egyed


The current implementation of Sqoop treats HDFS as the default filesystem, i.e. 
it creates temporary directories on HDFS in the case of incremental append or merge 
imports. To make these incremental import use cases work with S3, the user needs 
to set {{--temporary-rootdir}} to an S3 location properly.

There should be fail-fast scenarios without the {{--temporary-rootdir}} option, 
as well as documentation for this.
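For illustration, the kind of command the fail-fast check would guard is sketched below (bucket and table names are hypothetical, and the s3a:// scheme is assumed; all flags shown are standard Sqoop options):

```shell
# Hypothetical incremental append import into S3. Without the final
# --temporary-rootdir option, Sqoop would try to create its temporary
# directories on the default filesystem (HDFS) instead of S3.
sqoop_cmd="sqoop import --connect \$MYCONN --username \$MYUSER --password \$MYPSWD \
  --table T1_DATES --target-dir s3a://example-bucket/t1_dates \
  --incremental append --check-column c1_int --last-value 0 \
  --temporary-rootdir s3a://example-bucket/tmp"
echo "$sqoop_cmd"
```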



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (SQOOP-3368) Add fail-fast scenarios to S3 incremental import use cases without --temporary-rootdir oprion

2018-08-22 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3368?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3368:
--
Summary: Add fail-fast scenarios to S3 incremental import use cases without 
--temporary-rootdir oprion  (was: Add fail-fast scenarios to S3 incremental 
import use cases without --temporary-rootdir)

> Add fail-fast scenarios to S3 incremental import use cases without 
> --temporary-rootdir oprion
> -
>
> Key: SQOOP-3368
> URL: https://issues.apache.org/jira/browse/SQOOP-3368
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
>
> The current implementation of Sqoop treats HDFS as the default filesystem, 
> i.e. it creates temporary directories on HDFS in the case of incremental append 
> or merge imports. To make these incremental import use cases work with S3, the 
> user needs to set {{--temporary-rootdir}} to an S3 location properly.
> There should be fail-fast scenarios without the {{--temporary-rootdir}} 
> option, as well as documentation for this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (SQOOP-3363) Test incremental import with S3

2018-08-22 Thread Boglarka Egyed (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3363?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Boglarka Egyed updated SQOOP-3363:
--
Attachment: SQOOP-3363.patch

> Test incremental import with S3
> ---
>
> Key: SQOOP-3363
> URL: https://issues.apache.org/jira/browse/SQOOP-3363
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Boglarka Egyed
>Assignee: Boglarka Egyed
>Priority: Major
> Attachments: SQOOP-3363.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Review Request 68475: SQOOP-3363: Test incremental import with S3

2018-08-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68475/
---

Review request for Sqoop, daniel voros, Fero Szabo, Nguyen Truong, and Szabolcs 
Vasas.


Bugs: SQOOP-3363
https://issues.apache.org/jira/browse/SQOOP-3363


Repository: sqoop-trunk


Description
---

* Added new test cases for Parquet import into S3 as it was still missing
* Added new test cases for incremental append import into S3 in Text, Avro, 
Sequence and Parquet file format
* Added new test cases for incremental merge import into S3 in Text and Parquet 
file format
* Updated some previously added logic in S3 util and test classes


Diffs
-

  src/test/org/apache/sqoop/s3/TestS3AvroImport.java 
e130c42104b86e854d45babc009a5f1409a74a48 
  src/test/org/apache/sqoop/s3/TestS3IncrementalAppendAvroImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3IncrementalAppendParquetImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3IncrementalAppendSequenceFileImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3IncrementalAppendTextImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3IncrementalMergeParquetImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3IncrementalMergeTextImport.java 
PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3ParquetImport.java PRE-CREATION 
  src/test/org/apache/sqoop/s3/TestS3SequenceFileImport.java 
c17c1c54918df0b4d1ecbaef4e381975d72756ae 
  src/test/org/apache/sqoop/s3/TestS3TextImport.java 
60e2cd3025e67ecd43bdfb6b30d1b8d69a50da86 
  src/test/org/apache/sqoop/testutil/AvroTestUtils.java 
04a8494a5d1d8a5020d5a3b629bbab62d3c09ffd 
  src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java 
1730698e80cc77395f8a296b7bf01c104533e10b 
  src/test/org/apache/sqoop/testutil/ParquetFileTestUtils.java PRE-CREATION 
  src/test/org/apache/sqoop/testutil/S3TestUtils.java 
ceaff3b3a2bfd031b9772c9b43afdfa670c23718 
  src/test/org/apache/sqoop/testutil/SequenceFileTestUtils.java 
ad7576dbb2447423c677429f24163031a9d39b5f 
  src/test/org/apache/sqoop/testutil/TextFileTestUtils.java 
df19cb8be7a633a6f1e1e3f9bc7d0dbc268aa90a 


Diff: https://reviews.apache.org/r/68475/diff/1/


Testing
---

ant clean test -Ds3.bucket.url= 
-Ds3.generator.command=


Thanks,

Boglarka Egyed



Re: Review Request 62492: SQOOP-3224: Mainframe FTP transfer should have an option to use binary mode for transfer

2018-08-22 Thread Boglarka Egyed

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/62492/#review207667
---


Fix it, then Ship it!




Hi Chris,

Thank you for improving your patch!

I have some very minor findings regarding the latest version otherwise it LGTM 
in general, the tests passed too.

Thanks,
Bogi


src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
Line 40 (original)


This seems to be an unnecessary line deletion; this file doesn't contain any 
other relevant change. Please revert.



src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
Line 51 (original)


This seems to be an unnecessary line deletion; this file doesn't contain any 
other relevant change. Please revert.



src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetFTPRecordReader.java
Lines 46-47 (patched)


Unnecessary new lines; there is no other change in this file. Please revert.


- Boglarka Egyed


On Aug. 17, 2018, 1:39 p.m., Chris Teoh wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/62492/
> ---
> 
> (Updated Aug. 17, 2018, 1:39 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3224
> https://issues.apache.org/jira/browse/SQOOP-3224
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Added --as-binaryfile and --buffersize to support FTP transfer mode switching.
> 
> 
> Diffs
> -
> 
>   build.xml 084823cf 
>   src/docs/user/import-mainframe.txt abeb7cde 
>   src/java/org/apache/sqoop/SqoopOptions.java f97dbfdf 
>   src/java/org/apache/sqoop/mapreduce/ByteKeyOutputFormat.java PRE-CREATION 
>   src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java 349ca8d8 
>   src/java/org/apache/sqoop/mapreduce/KeyRecordWriters.java PRE-CREATION 
>   src/java/org/apache/sqoop/mapreduce/RawKeyTextOutputFormat.java fec34f21 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/AbstractMainframeDatasetImportMapper.java
>  PRE-CREATION 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeConfiguration.java 
> ea54b07f 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetBinaryImportMapper.java
>  PRE-CREATION 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetBinaryRecord.java
>  PRE-CREATION 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetFTPRecordReader.java
>  1f78384b 
>   
> src/java/org/apache/sqoop/mapreduce/mainframe/MainframeDatasetImportMapper.java
>  0b7b5b85 
>   src/java/org/apache/sqoop/mapreduce/mainframe/MainframeImportJob.java 
> 8ef30d38 
>   src/java/org/apache/sqoop/tool/BaseSqoopTool.java 9dcbdd59 
>   src/java/org/apache/sqoop/tool/ImportTool.java 478f1748 
>   src/java/org/apache/sqoop/tool/MainframeImportTool.java cdd9d6d0 
>   src/java/org/apache/sqoop/util/MainframeFTPClientUtils.java 95bc0ecb 
>   src/test/org/apache/sqoop/manager/mainframe/MainframeManagerImportTest.java 
> 041dfb78 
>   src/test/org/apache/sqoop/manager/mainframe/MainframeTestUtil.java f28ff36c 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetBinaryRecord.java
>  PRE-CREATION 
>   
> src/test/org/apache/sqoop/mapreduce/mainframe/TestMainframeDatasetFTPRecordReader.java
>  3547294f 
>   src/test/org/apache/sqoop/tool/TestMainframeImportTool.java 0b0c6c34 
> 
> 
> Diff: https://reviews.apache.org/r/62492/diff/21/
> 
> 
> Testing
> ---
> 
> Unit tests.
> 
> Functional testing on mainframe.
> 
> 
> Thanks,
> 
> Chris Teoh
> 
>
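For context, a usage sketch of the options this patch adds (the host, dataset name, and buffer size below are illustrative, not taken from the patch; import-mainframe and --dataset are existing Sqoop options):

```shell
# Hypothetical invocation using the new options from this patch:
# --as-binaryfile switches the FTP transfer to binary mode, and
# --buffersize controls the read buffer (value illustrative).
mainframe_cmd="sqoop import-mainframe --connect mainframe.example.com \
  --dataset MY.PDS.DATA --as-binaryfile --buffersize 65536 \
  --target-dir /user/example/mainframe_binary --num-mappers 1"
echo "$mainframe_cmd"
```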



[jira] [Commented] (SQOOP-3364) Upgrade Gradle version to 4.9

2018-08-22 Thread Hudson (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16588980#comment-16588980
 ] 

Hudson commented on SQOOP-3364:
---

SUCCESS: Integrated in Jenkins build Sqoop-hadoop200 #1198 (See 
[https://builds.apache.org/job/Sqoop-hadoop200/1198/])
SQOOP-3364: Upgrade Gradle version to 4.9 (vasas: 
[https://git-wip-us.apache.org/repos/asf?p=sqoop.git=commit=f5eda1e208fba9058c250dd1b142a97eb118ced9])
* (edit) gradle/wrapper/gradle-wrapper.properties
* (edit) src/test/org/apache/sqoop/hbase/HBaseTestCase.java
* (edit) build.gradle
* (edit) gradle/wrapper/gradle-wrapper.jar
* (edit) settings.gradle


> Upgrade Gradle version to 4.9
> -
>
> Key: SQOOP-3364
> URL: https://issues.apache.org/jira/browse/SQOOP-3364
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Fix For: 3.0.0
>
> Attachments: SQOOP-3364.patch
>
>
> Sqoop currently uses Gradle 3.5.1, which is a pretty old version; let's 
> upgrade it to the newest 4.9 release.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (SQOOP-3364) Upgrade Gradle version to 4.9

2018-08-22 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16588920#comment-16588920
 ] 

ASF subversion and git services commented on SQOOP-3364:


Commit f5eda1e208fba9058c250dd1b142a97eb118ced9 in sqoop's branch 
refs/heads/trunk from [~vasas]
[ https://git-wip-us.apache.org/repos/asf?p=sqoop.git;h=f5eda1e ]

SQOOP-3364: Upgrade Gradle version to 4.9

(Szabolcs Vasas)


> Upgrade Gradle version to 4.9
> -
>
> Key: SQOOP-3364
> URL: https://issues.apache.org/jira/browse/SQOOP-3364
> Project: Sqoop
>  Issue Type: Improvement
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
> Attachments: SQOOP-3364.patch
>
>
> Sqoop currently uses Gradle 3.5.1, which is a pretty old version; let's 
> upgrade it to the newest 4.9 release.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: Review Request 68382: Upgrade Gradle version to 4.9

2018-08-22 Thread Szabolcs Vasas


> On Aug. 22, 2018, 2:20 p.m., Nguyen Truong wrote:
> > Hi Szabolcs,
> > 
> > Thanks for taking care of this.
> > 
> > The test PostgresMetaConnectIncrementalImportTest failed when I ran the 
> > third party test suite but succeeded when I ran it again alone. I guess it 
> > was not because of your patch though.
> > 
> > Best,
> > Nguyen

Thank you for reviewing my patch!
Yes, PostgresMetaConnectIncrementalImportTest seems to be flaky; I have seen it 
failing earlier with Gradle as well.
This is something we will probably need to investigate in another JIRA.


- Szabolcs


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68382/#review207747
---


On Aug. 16, 2018, 2:37 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68382/
> ---
> 
> (Updated Aug. 16, 2018, 2:37 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3364
> https://issues.apache.org/jira/browse/SQOOP-3364
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Apart from the Gradle version bump the change contains the following:
> - html.destination's type is modified to File to avoid a deprecation warning
> - the task wrapper is replaced with wrapper {} to avoid a deprecation warning
> - enableFeaturePreview('STABLE_PUBLISHING') is added to settings.gradle to 
> avoid a deprecation warning. This is a change I could not test since we cannot 
> publish to a Maven repo now. However, in the case of a future release we should 
> test it as described here: 
> https://docs.gradle.org/4.9/userguide/publishing_maven.html#publishing_maven:deferred_configuration
> - The HBase test cases failed at first because the regionserver web UI was 
> not able to start up, most probably because of a bad version of a Jetty class 
> on the classpath. However, we do not need the regionserver web UI for the 
> Sqoop tests, so instead of playing around with libraries I disabled it, just 
> like we have already disabled the master web UI.
> 
> 
> Diffs
> -
> 
>   build.gradle 709172cc0 
>   gradle/wrapper/gradle-wrapper.jar 99340b4ad18d3c7e764794d300ffd35017036793 
>   gradle/wrapper/gradle-wrapper.properties 90a06cec7 
>   settings.gradle 7d64af500 
>   src/test/org/apache/sqoop/hbase/HBaseTestCase.java 87fce34a8 
> 
> 
> Diff: https://reviews.apache.org/r/68382/diff/1/
> 
> 
> Testing
> ---
> 
> Executed unit and third party test suite successfully.
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>
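The deprecation fixes listed in the quoted description can be sketched as follows (standard Gradle 4.9 syntax; the checkstyleMain report is an illustrative example, not necessarily the exact task the patch touches):

```shell
# Print a sketch of the Gradle 4.9 changes described in the review above.
gradle_sketch=$(cat <<'EOF'
// settings.gradle: opt in to stable publishing to silence the warning
enableFeaturePreview('STABLE_PUBLISHING')

// build.gradle: the old `task wrapper(type: Wrapper)` form is deprecated
wrapper {
    gradleVersion = '4.9'
}

// html.destination must now be a File rather than a String
checkstyleMain.reports.html.destination = file("$buildDir/reports/checkstyle/main.html")
EOF
)
echo "$gradle_sketch"
```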



Re: Review Request 68382: Upgrade Gradle version to 4.9

2018-08-22 Thread Nguyen Truong

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68382/#review207747
---


Ship it!




Hi Szabolcs,

Thanks for taking care of this.

The test PostgresMetaConnectIncrementalImportTest failed when I ran the third 
party test suite but succeeded when I ran it again alone. I guess it was not 
because of your patch though.

Best,
Nguyen

- Nguyen Truong


On Aug. 16, 2018, 2:37 p.m., Szabolcs Vasas wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/68382/
> ---
> 
> (Updated Aug. 16, 2018, 2:37 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-3364
> https://issues.apache.org/jira/browse/SQOOP-3364
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> ---
> 
> Apart from the Gradle version bump the change contains the following:
> - html.destination's type is modified to File to avoid a deprecation warning
> - the task wrapper is replaced with wrapper {} to avoid a deprecation warning
> - enableFeaturePreview('STABLE_PUBLISHING') is added to settings.gradle to 
> avoid a deprecation warning. This is a change I could not test since we cannot 
> publish to a Maven repo now. However, in the case of a future release we should 
> test it as described here: 
> https://docs.gradle.org/4.9/userguide/publishing_maven.html#publishing_maven:deferred_configuration
> - The HBase test cases failed at first because the regionserver web UI was 
> not able to start up, most probably because of a bad version of a Jetty class 
> on the classpath. However, we do not need the regionserver web UI for the 
> Sqoop tests, so instead of playing around with libraries I disabled it, just 
> like we have already disabled the master web UI.
> 
> 
> Diffs
> -
> 
>   build.gradle 709172cc0 
>   gradle/wrapper/gradle-wrapper.jar 99340b4ad18d3c7e764794d300ffd35017036793 
>   gradle/wrapper/gradle-wrapper.properties 90a06cec7 
>   settings.gradle 7d64af500 
>   src/test/org/apache/sqoop/hbase/HBaseTestCase.java 87fce34a8 
> 
> 
> Diff: https://reviews.apache.org/r/68382/diff/1/
> 
> 
> Testing
> ---
> 
> Executed unit and third party test suite successfully.
> 
> 
> Thanks,
> 
> Szabolcs Vasas
> 
>



Review Request 68470: SQOOP-3366 Improve unit tests to be able to execute them in a single JVM

2018-08-22 Thread Nguyen Truong

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68470/
---

Review request for Sqoop.


Bugs: SQOOP-3366
https://issues.apache.org/jira/browse/SQOOP-3366


Repository: sqoop-trunk


Description
---

Fix the unit tests that change the state of the JVM, so that other tests are 
not affected when they run in a single JVM.
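
One common pattern for this kind of cleanup (a sketch only; the class and 
method names below are illustrative, not taken from the actual patch) is to 
snapshot mutable JVM state before a test and restore it afterwards:

```java
import java.util.Properties;

/**
 * Illustrative sketch: snapshot and restore java.lang.System properties so
 * that a test cannot leak state into other tests sharing the same JVM.
 */
public class SystemPropertySnapshot {
    private Properties saved;

    /** Call from a @Before method: copy the current system properties. */
    public void snapshot() {
        saved = new Properties();
        saved.putAll(System.getProperties());
    }

    /** Call from an @After method: restore the copy taken in snapshot(). */
    public void restore() {
        System.setProperties(saved);
    }

    public static void main(String[] args) {
        SystemPropertySnapshot snap = new SystemPropertySnapshot();
        snap.snapshot();
        System.setProperty("sqoop.test.leak", "true"); // simulate a leaky test
        snap.restore();
        // After restore, the property set by the "test" is gone again.
        System.out.println(System.getProperty("sqoop.test.leak")); // prints "null"
    }
}
```

The same idea applies to static fields: record the old value in setUp and put 
it back in tearDown.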


Diffs
-

  src/test/org/apache/sqoop/TestFreeFormQueryImport.java d39faee55 
  src/test/org/apache/sqoop/TestIncrementalImport.java e1faf351f 
  src/test/org/apache/sqoop/TestSqoopOptions.java d0591ad0d 
  src/test/org/apache/sqoop/metastore/TestMetastoreConfigurationParameters.java 
0f1eb890b 
  src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java 1730698e8 
  src/test/org/apache/sqoop/testutil/HsqldbTestServer.java c63a8f2dc 


Diff: https://reviews.apache.org/r/68470/diff/1/


Testing
---

No test has been added.


Thanks,

Nguyen Truong



Re: Review Request 68382: Upgrade Gradle version to 4.9

2018-08-22 Thread Szabolcs Vasas


> On Aug. 21, 2018, 3:49 p.m., daniel voros wrote:
> > Thank you for picking this up! I've checked the following:
> >  - tar.gz contents (and lib/ in particular) are the same when generated 
> > with `./gradlew tar -x test`
> >  - publishing of snapshot and released artifacts works with local and 
> > remote repositories
> > 
> > I couldn't get the ant way of publishing to work with remote repositories, 
> > but comparing against Maven Central I've noticed that we only released 
> > 1.4.7 with the classifier `hadoop260`. This is something we might need to 
> > revisit when deploying the next release: whether it makes sense to add a 
> > classifier if we're only releasing a single version. (For 1.4.6 there were 
> > multiple versions: 
> > http://central.maven.org/maven2/org/apache/sqoop/sqoop/1.4.6/)

Great, thank you for looking at it!
Yes, I agree, we should not use classifiers if there is only one version.


- Szabolcs


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/68382/#review207668
---





[jira] [Assigned] (SQOOP-3366) Improve unit tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3366?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Szabolcs Vasas reassigned SQOOP-3366:
-

Assignee: Nguyen Truong

> Improve unit tests to be able to execute them in a single JVM
> -
>
> Key: SQOOP-3366
> URL: https://issues.apache.org/jira/browse/SQOOP-3366
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Nguyen Truong
>Priority: Major
>
> The goal of this JIRA is to improve the unit tests to be able to execute them 
> in a single JVM. See the parent JIRA for the details.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (SQOOP-3367) Improve third party tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3367?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Szabolcs Vasas reassigned SQOOP-3367:
-

Assignee: Szabolcs Vasas

> Improve third party tests to be able to execute them in a single JVM
> 
>
> Key: SQOOP-3367
> URL: https://issues.apache.org/jira/browse/SQOOP-3367
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Assignee: Szabolcs Vasas
>Priority: Major
>
> The goal of this JIRA is to improve the third party tests to be able to 
> execute them in a single JVM. See the parent JIRA for the details.





[jira] [Updated] (SQOOP-3367) Improve third party tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3367?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Szabolcs Vasas updated SQOOP-3367:
--
Description: The goal of this JIRA is to improve the third party tests to 
be able to execute them in a single JVM. See the parent JIRA for the details.

> Improve third party tests to be able to execute them in a single JVM
> 
>
> Key: SQOOP-3367
> URL: https://issues.apache.org/jira/browse/SQOOP-3367
> Project: Sqoop
>  Issue Type: Sub-task
>Affects Versions: 1.4.7
>Reporter: Szabolcs Vasas
>Priority: Major
>
> The goal of this JIRA is to improve the third party tests to be able to 
> execute them in a single JVM. See the parent JIRA for the details.





[jira] [Created] (SQOOP-3367) Improve third party tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)
Szabolcs Vasas created SQOOP-3367:
-

 Summary: Improve third party tests to be able to execute them in a 
single JVM
 Key: SQOOP-3367
 URL: https://issues.apache.org/jira/browse/SQOOP-3367
 Project: Sqoop
  Issue Type: Sub-task
Affects Versions: 1.4.7
Reporter: Szabolcs Vasas








[jira] [Created] (SQOOP-3366) Improve unit tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)
Szabolcs Vasas created SQOOP-3366:
-

 Summary: Improve unit tests to be able to execute them in a single 
JVM
 Key: SQOOP-3366
 URL: https://issues.apache.org/jira/browse/SQOOP-3366
 Project: Sqoop
  Issue Type: Sub-task
Affects Versions: 1.4.7
Reporter: Szabolcs Vasas


The goal of this JIRA is to improve the unit tests to be able to execute them 
in a single JVM. See the parent JIRA for the details.





[jira] [Created] (SQOOP-3365) Improve Sqoop tests to be able to execute them in a single JVM

2018-08-22 Thread Szabolcs Vasas (JIRA)
Szabolcs Vasas created SQOOP-3365:
-

 Summary: Improve Sqoop tests to be able to execute them in a 
single JVM
 Key: SQOOP-3365
 URL: https://issues.apache.org/jira/browse/SQOOP-3365
 Project: Sqoop
  Issue Type: Improvement
Affects Versions: 1.4.7
Reporter: Szabolcs Vasas


Gradle currently creates a new JVM for every test case it executes (the 
forkEvery parameter is set to 1, see: 
https://docs.gradle.org/current/dsl/org.gradle.api.tasks.testing.Test.html#org.gradle.api.tasks.testing.Test:forkEvery).
 This provides better isolation for the tests but it has huge performance 
overhead as well since creating a JVM is an expensive operation. It would be 
great if we could execute all our tests in a single JVM so we could save that 
big cost but unfortunately it does not work out of the box because:
 * There are test classes which unnecessarily change the state of the JVM (for 
example by setting static fields and system properties) and can cause other 
tests to fail if they run in the same JVM.
 * There are test classes which rely on a "clean JVM state" and fail if the 
state differs from what they implicitly expect.
 * There are test classes which create a Kerberos KDC, which sets a lot of 
static fields in the JVM that are really hard, if not impossible, to restore.

This JIRA addresses the first two of the above issues.
I expect the third category to be addressed by SQOOP-3104. It should introduce 
a category for kerberized tests and Gradle should execute this category with 
the forkEvery parameter set to 1.
However, Gradle should then be able to execute the rest of the tests with 
forkEvery set to 0.
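
The split described above could be sketched in the Gradle build script like 
this (a sketch only; the JUnit category class name and task name are 
assumptions, not taken from the actual build):

```groovy
test {
    // Reuse a single JVM for the bulk of the tests (0 = never fork a new one).
    forkEvery = 0
    useJUnit {
        // Hypothetical JUnit category for tests that start a Kerberos KDC.
        excludeCategories 'org.apache.sqoop.testcategories.KerberizedTest'
    }
}

task kerberizedTest(type: Test) {
    // KDC tests mutate static JVM state that is hard to restore,
    // so keep isolating each of them in a fresh JVM.
    forkEvery = 1
    useJUnit {
        includeCategories 'org.apache.sqoop.testcategories.KerberizedTest'
    }
}
```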

Please note that the test execution order with Gradle is not strictly defined 
but platform dependent, so running the tests on a new platform may produce an 
execution order that unveils a test interdependency not resolved by the 
subtasks of this JIRA.
If you encounter such a situation, please feel free to create a new subtask 
here.


