Are you using NiFi 1.5.0? If not, try with 1.5.0 first; there are known
bugs in older versions related to Record/Avro handling.
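
For what it's worth, that UNION error means the record writer was handed a
java.lang.Double for a field whose Avro schema is the decimal logical type
(bytes annotated with logicalType=decimal), and no conversion between the
two was found. As a rough sketch of the conversion that has to happen,
using the plain Avro API (illustrative only, not NiFi's exact code path in
AvroTypeUtil; the class name and column shape here are made up):

    import java.math.BigDecimal;
    import java.nio.ByteBuffer;

    import org.apache.avro.Conversions;
    import org.apache.avro.LogicalTypes;
    import org.apache.avro.Schema;

    public class DecimalLogicalTypeSketch {
        public static void main(String[] args) {
            // Hypothetical schema for a MySQL DECIMAL(10,1) column: Avro
            // bytes annotated with the decimal(10,1) logical type.
            Schema decimalSchema = LogicalTypes.decimal(10, 1)
                    .addToSchema(Schema.create(Schema.Type.BYTES));

            // A raw java.lang.Double never matches that union branch; the
            // value must first become a BigDecimal at the schema's scale
            // and then be encoded as Avro bytes.
            BigDecimal value = BigDecimal.valueOf(1234567.2).setScale(1);
            ByteBuffer encoded = new Conversions.DecimalConversion()
                    .toBytes(value, decimalSchema,
                            decimalSchema.getLogicalType());

            System.out.println("Encoded " + value + " in "
                    + encoded.remaining() + " bytes");
        }
    }

If 1.5.0 still fails for you, please share the table DDL so this can be
reproduced and filed as a JIRA.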

On Fri, 16 Feb 2018 at 02:46 <[email protected]> wrote:

> Hi all,
>
> I am using QueryDatabaseTable to extract records from MySQL. I have set
> Use Avro Logical Types to true. I am using the PutParquet processor to
> write to HDFS. It is not able to convert the logical decimal type.
>
> It throws the following exception:
>
> 2018-02-15 17:59:05,189 ERROR [Timer-Driven Process Thread-10]
> o.a.nifi.processors.parquet.PutParquet
> PutParquet[id=01611011-e4a8-106a-f933-eb66d923cfd1] Failed to write due to
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException:
> Cannot convert value 1234567.2 of type class java.lang.Double because no
> compatible types exist in the UNION for field dectype: {}
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException:
> Cannot convert value 1234567.2 of type class java.lang.Double because no
> compatible types exist in the UNION for field dectype
>     at org.apache.nifi.avro.AvroTypeUtil.convertUnionFieldValue(AvroTypeUtil.java:667)
>     at org.apache.nifi.avro.AvroTypeUtil.convertToAvroObject(AvroTypeUtil.java:572)
>     at org.apache.nifi.avro.AvroTypeUtil.createAvroRecord(AvroTypeUtil.java:432)
>     at org.apache.nifi.processors.parquet.record.AvroParquetHDFSRecordWriter.write(AvroParquetHDFSRecordWriter.java:43)
>     at org.apache.nifi.processors.hadoop.record.HDFSRecordWriter.write(HDFSRecordWriter.java:48)
>     at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.lambda$null$0(AbstractPutHDFSRecord.java:324)
>     at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
>     at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
>     at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.lambda$onTrigger$1(AbstractPutHDFSRecord.java:305)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:360)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1678)
>     at org.apache.nifi.processors.hadoop.AbstractPutHDFSRecord.onTrigger(AbstractPutHDFSRecord.java:272)
>     at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>     at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>     at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
>
> Please let me know if I’m doing anything wrong.
>
> Regards,
>
> Mohit Jain
>
