[ 
https://issues.apache.org/jira/browse/SQOOP-2567?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16442172#comment-16442172
 ] 

Szabolcs Vasas commented on SQOOP-2567:
---------------------------------------

Hi [~fero],

Thank you for your contribution, you can now close the review board.

Szabolcs

> SQOOP import for Oracle fails with invalid precision/scale for decimal
> ----------------------------------------------------------------------
>
>                 Key: SQOOP-2567
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2567
>             Project: Sqoop
>          Issue Type: Bug
>          Components: connectors
>    Affects Versions: 1.4.5
>         Environment: CDH5.3
>            Reporter: Suresh Deoda
>            Assignee: Fero Szabo
>            Priority: Major
>              Labels: AVRO, ORACLE
>
> Sqoop import fails when creating an Avro data file from an Oracle source with 
> decimal data. If a column in Oracle is defined as, say,
> Col1 DECIMAL(12,11), but some of the data has fewer digits in the scale, the 
> import fails with the error:
> Error: org.apache.avro.file.DataFileWriter$AppendWriteException: 
> org.apache.avro.AvroTypeException: Cannot encode decimal with scale 10 as 
> scale 11
>         at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:296)
>         at 
> org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:112)
>         at 
> org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:108)
>         at 
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
>         at 
> org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
>         at 
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
>         at 
> org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:73)
>         at 
> org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:39)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>         at 
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> Caused by: org.apache.avro.AvroTypeException: Cannot encode decimal with 
> scale 10 as scale 11
>         at 
> org.apache.avro.Conversions$DecimalConversion.toBytes(Conversions.java:68)
>         at 
> org.apache.avro.Conversions$DecimalConversion.toBytes(Conversions.java:39)
>         at 
> org.apache.avro.generic.GenericDatumWriter.convert(GenericDatumWriter.java:90)
>         at 
> org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:70)
>         at 
> org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:143)
>         at 
> org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:112)
>         at 
> org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
>         at 
> org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:143)
>         at 
> org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:153)
>         at 
> org.apache.avro.reflect.ReflectDatumWriter.writeField(ReflectDatumWriter.java:175)
> Also, when precision is not defined in Oracle (in which case it takes the 
> default (38,0), I guess), it gives the error:
> ERROR tool.ImportTool: Imported Failed: Invalid decimal precision: 0 (must be 
> positive)
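
The stack trace points at Avro's DecimalConversion.toBytes, which rejects a BigDecimal whose scale differs from the scale declared in the Avro schema's decimal logical type. A minimal sketch of the underlying issue, using a hypothetical padScale helper (not Sqoop's actual fix): widening a BigDecimal to the schema's scale is exact and only appends trailing zeros, so it is always safe.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalScaleSketch {

    // Hypothetical helper: pad a value read from the database up to the
    // scale declared in the Avro schema before encoding it.
    static BigDecimal padScale(BigDecimal value, int schemaScale) {
        // RoundingMode.UNNECESSARY throws if rounding would lose digits;
        // widening the scale (e.g. 10 -> 11) only appends zeros, so no
        // exception is thrown here.
        return value.setScale(schemaScale, RoundingMode.UNNECESSARY);
    }

    public static void main(String[] args) {
        // A value like those from an Oracle DECIMAL(12,11) column where the
        // actual data happens to carry only 10 fractional digits.
        BigDecimal fromOracle = new BigDecimal("1.0123456789"); // scale 10

        BigDecimal padded = padScale(fromOracle, 11);
        System.out.println(padded.scale());  // now matches the schema scale
        System.out.println(padded);          // numerically unchanged
    }
}
```

The second error in the report is the precision-0 case: Avro requires a positive precision for the decimal logical type, so an Oracle NUMBER with no declared precision cannot be mapped directly and needs an explicit precision/scale override on import.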



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
