Hi all, 

I am using QueryDatabaseTable to extract records from MySQL, with the Use
Avro Logical Types property set to true. I am using the PutParquet
processor to write the records to HDFS, but it fails to convert the
logical decimal type.
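For reference, I believe the Avro schema that QueryDatabaseTable generates
for the column looks roughly like this (the field name comes from the error
below; the precision and scale here are hypothetical, standing in for
whatever the MySQL DECIMAL column declares):

{
  "name" : "dectype",
  "type" : [ "null", {
    "type" : "bytes",
    "logicalType" : "decimal",
    "precision" : 10,
    "scale" : 2
  } ]
}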

It throws the following exception:

2018-02-15 17:59:05,189 ERROR [Timer-Driven Process Thread-10] o.a.nifi.processors.parquet.PutParquet PutParquet[id=01611011-e4a8-106a-f933-eb66d923cfd1] Failed to write due to org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Cannot convert value 1234567.2 of type class java.lang.Double because no compatible types exist in the UNION for field dectype: {}

org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Cannot convert value 1234567.2 of type class java.lang.Double because no compatible types exist in the UNION for field dectype
    at org.apache.nifi.avro.AvroTypeUtil.convertUnionFieldValue(AvroTypeUtil.java:667)
    at org.apache.nifi.avro.AvroTypeUtil.convertToAvroObject(AvroTypeUtil.java:572)
    at org.apache.nifi.avro.AvroTypeUtil.createAvroRecord(AvroTypeUtil.java:432)
    at org.apache.nifi.processors.parquet.record.AvroParquetHDFSRecordWriter.write(AvroParquetHDFSRecordWriter.java:43)
    at org.apache.nifi.processors.hadoop.record.HDFSRecordWriter.write(HDFSRecordWriter.java:48)
    at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.lambda$null$0(AbstractPutHDFSRecord.java:324)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
    at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.lambda$onTrigger$1(AbstractPutHDFSRecord.java:305)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:360)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1678)
    at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.onTrigger(AbstractPutHDFSRecord.java:272)
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
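
To check my understanding of what the writer expects, here is a minimal
standalone sketch (plain Avro, not NiFi code; the class name and the
precision/scale are hypothetical) of how a java.lang.Double would
apparently have to be coerced into a BigDecimal with the declared scale
before it can match the decimal branch of the union:

import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;

import org.apache.avro.Conversions;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

public class DecimalCoercionSketch {
    public static void main(String[] args) {
        // Hypothetical precision/scale; substitute whatever the MySQL
        // DECIMAL column declares.
        Schema decimalSchema = LogicalTypes.decimal(10, 2)
                .addToSchema(Schema.create(Schema.Type.BYTES));

        // The field value arrives from the record reader as a plain double.
        double raw = 1234567.2;

        // An Avro decimal logical type expects a BigDecimal whose scale
        // matches the schema, encoded to bytes.
        BigDecimal scaled = BigDecimal.valueOf(raw)
                .setScale(2, RoundingMode.HALF_UP);
        ByteBuffer bytes = new Conversions.DecimalConversion()
                .toBytes(scaled, decimalSchema, decimalSchema.getLogicalType());

        System.out.println("Encoded " + scaled + " in " + bytes.remaining() + " bytes");
    }
}

If PutParquet is supposed to perform this coercion itself, is there some
configuration I am missing, or do I need to convert the field upstream in
the flow?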

Please let me know if I'm doing anything wrong.

Regards,

Mohit Jain
