This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/65607/

Lines 88 (patched)

    I think this if statement is redundant: if schemaContainingScale is 
null, the method will return null anyway.
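A minimal sketch of the point above (the names and the return value are assumptions, since the patched code isn't quoted here): when the callee already handles null, a caller-side guard adds nothing.

```java
public class ScaleLookup {
    // Hypothetical stand-in for the patched method: it handles null itself.
    public static Integer getScale(Object schemaContainingScale) {
        if (schemaContainingScale == null) {
            return null;         // null-safe already
        }
        return 5;                // placeholder for the real scale lookup
    }

    public static Integer lookup(Object schema) {
        // Redundant guard, can be removed:
        // if (schema == null) { return null; }
        return getScale(schema); // delegates null handling to getScale
    }
}
```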

Lines 93 (patched)

    Can we extract this string value to a constant?
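For illustration, a hedged sketch of the extraction (the property name below is made up, since the literal from the patch isn't quoted here):

```java
public class AvroPaddingConfig {
    // Extracted constant instead of a repeated inline string literal.
    static final String PADDING_PROPERTY = "example.decimal.padding.enable";

    public static String propertyName() {
        // Callers reference the constant, so a rename touches one place.
        return PADDING_PROPERTY;
    }
}
```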

Lines 101 (patched)

    Do we really need this file? It seems we do not use dates in this test case.

Lines 108 (patched)

    I think we can delete these lines.

Lines 93 (patched)

    Similar logic is already present in 
org.apache.sqoop.testutil.BaseSqoopTestCase, can we somehow reuse it?
    For example, it already has a ConnManager field which can be initialized in 

Lines 99 (patched)

    Do we really need this property to be set? It seems to be initialized to 
the default factory classes, which are used by default anyway, aren't they?

Lines 145 (patched)

    Can we make these test method names similar to the ones defined in 
TestHsqldbAvroPadding (testAvroImportWithoutPadding, testAvroImportWithPadding)?

Lines 81 (patched)

    Could we introduce a parameter-less version of this which would set 
withHadoopCommons to true? I think most of the time this would be called to set 
the flag to true, so it could simplify the code on the client side.
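A sketch of the suggested convenience overload (the class and field names here are assumptions modeled on the review comments, not the actual patch): the parameter-less version delegates to the boolean one with true.

```java
public class ArgumentArrayBuilderSketch {
    private boolean commonHadoopFlags;

    // Parameter-less convenience overload: defaults the flag to true,
    // so most call sites don't need to pass a literal.
    public ArgumentArrayBuilderSketch withCommonHadoopFlags() {
        return withCommonHadoopFlags(true);
    }

    public ArgumentArrayBuilderSketch withCommonHadoopFlags(boolean value) {
        this.commonHadoopFlags = value;
        return this; // fluent builder style
    }

    public boolean hasCommonHadoopFlags() {
        return commonHadoopFlags;
    }
}
```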

Lines 82 (patched)

    I think withHadoopFlags or withCommonHadoopFlags would be a more 
descriptive name for this field.

Lines 87 (patched)

    typo: Transforms

Lines 102 (patched)

    I think this logic handling the tool options could also be moved to 
ArgumentUtils since it already contains similar logic.
    After that, this method could be simplified and the notEmpty checks could 
be removed.
    We could also think about moving all the logic from ArgumentUtils to the 
builder, since it would be better practice to use the builder instead of the 
static methods.

Lines 60 (patched)

    This variable does not seem to be used.

Lines 64 (patched)

    I think the reader 'r' should be closed in a finally block. Can we use a 
try-with-resources statement here instead?
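A minimal sketch of the try-with-resources suggestion (the method and reader source are hypothetical): the reader is closed automatically even when an exception is thrown, replacing a manual close in a finally block.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ReaderExample {
    // try-with-resources closes r on every exit path, including exceptions.
    public static String readFirstLine(Reader source) throws IOException {
        try (BufferedReader r = new BufferedReader(source)) {
            return r.readLine(); // r.close() is called automatically
        }
    }
}
```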

Lines 67 (patched)

    We could pass the comment message as a parameter of fail() so it's more 
obvious from the test output why the test has failed.
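For illustration (using a self-contained stand-in for org.junit.Assert.fail, and a made-up message, since the original comment isn't quoted here):

```java
public class FailWithMessage {
    // Stand-in for org.junit.Assert.fail to keep this sketch self-contained;
    // JUnit's version also throws an AssertionError carrying the message.
    static void fail(String message) {
        throw new AssertionError(message);
    }

    public static String explainFailure() {
        try {
            // Instead of a nearby comment explaining the failure, pass the
            // explanation to fail() so the test runner reports it directly:
            fail("Avro record could not be read");
            return null;
        } catch (AssertionError reported) {
            return reported.getMessage(); // what the runner would display
        }
    }
}
```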

Lines 73 (patched)

    Please remove System.out.println

Lines 77 (patched)

    I think we could just let this method throw an IOException (or rethrow it 
in a RuntimeException) and JUnit will fail the test anyway and log the stack 
trace.
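A hedged sketch of the rethrow variant (the method and error message are hypothetical): wrapping the checked IOException in an unchecked exception lets the test fail with the full stack trace, with no catch-and-print needed in the test body.

```java
import java.io.IOException;
import java.io.UncheckedIOException;

public class RethrowExample {
    // Alternative to catching and printing: either declare "throws IOException"
    // on the test method, or wrap the exception so it propagates unchecked.
    public static void readOrFail(boolean simulateError) {
        try {
            if (simulateError) {
                throw new IOException("simulated read failure");
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e); // surfaces in the test report
        }
    }
}
```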

Lines 94 (patched)

    This seems to be a duplicate of org.apache.sqoop.TestAvroImport#read, can 
we resolve it somehow?

Lines 434 (patched)

    This documentation seems to be duplicated now; add the extra parameter or 
just delete it :)

- Szabolcs Vasas

On Feb. 12, 2018, 2:31 p.m., Fero Szabo wrote:
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/65607/
> -----------------------------------------------------------
> (Updated Feb. 12, 2018, 2:31 p.m.)
> Review request for Sqoop, Boglarka Egyed and Szabolcs Vasas.
> Bugs: SQOOP-2976
>     https://issues.apache.org/jira/browse/SQOOP-2976
> Repository: sqoop-trunk
> Description
> -------
> **Summary:**
> Certain databases, such as SQL Server and Postgres, store decimal values 
> padded with 0s should the user insert them with fewer digits than the given 
> scale. 
> Other databases, however, such as Oracle and HSQLDB, store these numbers 
> without trailing 0s. Then, when the JDBC driver returns these as BigDecimals, 
> they won't match the scale in the avro schema.
> Take the following SQL commands for an example: 
> ```
> create table salary (id int, amount number (10,5));
> insert into salary (id, amount) values (1, 10.5);
> insert into salary (id, amount) values (2, 10.50);
> select * from salary;
> ```
> Records in an Oracle database:
> 1     10.5
> 2     10.5
> Records in SQL Server (using decimal instead of number in the create 
> statement):
> 1     10.50000
> 2     10.50000
> **Solution:**
> The fix simply checks the scale of the returned BigDecimals against 
> what's in the avro schema and recreates the objects in case of a mismatch. 
> I've introduced a new property to enable this new feature, so existing 
> behavior is not affected. 
> **Concerns:**
> - trimmings can happen silently, should we rather raise an exception? 
> Enabling trimming adds a new feature, but it also adds the possibility of 
> silently losing scale during import. The latter could be mitigated by 
> thorough documentation.
> - The flag's current name () doesn't really match the behavior, should I 
> change it to something else? (avro.decimal_scale_harmonization.enable)
> - How / where to document this new flag?
> **Other notable changes:**
> - Introduced ArgumentArrayBuilder that reuses the existing Argument class and 
> introduces a useful builder pattern for creating commandline arguments for 
> tests.
> - Slightly modified BaseSqoopTestCase to fit my needs. *(However, further 
> refactoring would be required in this class to enable better reuse. For 
> example: the current implementation can't be used with SQL Server, because 
> one also needs to specify the schema besides the table name in the create and 
> insert statements. There are also code duplications.)*
> Diffs
> -----
>   src/java/org/apache/sqoop/avro/AvroUtil.java 1aae8df2 
>   src/java/org/apache/sqoop/config/ConfigurationConstants.java 7a19a62c 
>   src/java/org/apache/sqoop/mapreduce/AvroImportMapper.java a5e5bf5a 
>   src/test/org/apache/sqoop/manager/hsqldb/TestHsqldbAvroPadding.java 
>   src/test/org/apache/sqoop/manager/oracle/OracleAvroPaddingImportTest.java 
>   src/test/org/apache/sqoop/manager/sqlserver/MSSQLTestUtils.java 2220b7d5 
> src/test/org/apache/sqoop/manager/sqlserver/SQLServerAvroPaddingImportTest.java
>   src/test/org/apache/sqoop/testutil/ArgumentArrayBuilder.java PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/AvroTestUtils.java PRE-CREATION 
>   src/test/org/apache/sqoop/testutil/BaseSqoopTestCase.java 588f439c 
> Diff: https://reviews.apache.org/r/65607/diff/2/
> Testing
> -------
> See the 3 new test classes (HSQLDB, Oracle, SQL Server).
> Thanks,
> Fero Szabo
