[ https://issues.apache.org/jira/browse/NIFI-11621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17728137#comment-17728137 ]

ASF subversion and git services commented on NIFI-11621:
--------------------------------------------------------

Commit ac810671c5ad4d5b6a1d4b996d3b9a0da929105f in nifi's branch refs/heads/main from Mark Payne
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=ac810671c5 ]

NIFI-11621: Handle the case of CHOICE fields when inferring the type of ARRAY 
elements. E.g., support ARRAY<CHOICE<STRING, NULL>>

Signed-off-by: Matt Burgess <[email protected]>

This closes #7322
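
The commit above makes array-type inference tolerate CHOICE element types. As a purely illustrative sketch of the kind of merge it describes (the class and method names below are hypothetical, not NiFi's actual AvroTypeUtil API): an empty array infers a NULL element type, a populated one infers STRING, and merging the two should produce ARRAY<CHOICE<STRING, NULL>> rather than a null type that later trips up the Avro writer.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of element-type merging during schema inference.
// Not NiFi's real implementation; types are modeled as plain strings.
public class TypeMergeSketch {

    // Merge two inferred element types into one; differing types become a CHOICE.
    static String mergeElementTypes(String a, String b) {
        if (a.equals(b)) {
            return a;
        }
        // Build a CHOICE with deterministic ordering: non-NULL types first.
        List<String> choices = new ArrayList<>(List.of(a, b));
        choices.sort(Comparator.comparing(t -> t.equals("NULL") ? 1 : 0));
        return "CHOICE<" + String.join(", ", choices) + ">";
    }

    // Merge two inferred array types by merging their element types.
    static String mergeArrayTypes(String elemA, String elemB) {
        return "ARRAY<" + mergeElementTypes(elemA, elemB) + ">";
    }

    public static void main(String[] args) {
        // [] infers a NULL element type; ["test"] infers STRING.
        System.out.println(mergeArrayTypes("NULL", "STRING"));
    }
}
```

The point of the sketch is only that the merged result is always a well-defined type, so a downstream schema writer never sees a null element type for the combined records.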


> Inferring schema for JSON fails when there's a CHOICE of different ARRAY types
> ------------------------------------------------------------------------------
>
>                 Key: NIFI-11621
>                 URL: https://issues.apache.org/jira/browse/NIFI-11621
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Extensions
>            Reporter: Mark Payne
>            Assignee: Mark Payne
>            Priority: Major
>             Fix For: 1.latest, 2.latest
>
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> From Apache Slack: 
> https://apachenifi.slack.com/archives/C0L9VCD47/p1685553667778359?thread_ts=1685461745.470939&cid=C0L9VCD47
> When using ConvertRecord with a JSON Reader and an Avro Writer to infer the
> JSON schema, each of the following two records works properly on its own:
> {code}
> {"test_record":{"array_test_record":{"test_array":[]}}}
> {code}
> {code}
> {"test_record":{"array_test_record":{"test_array":["test"]}}}
> {code}
> However, when combined into a single FlowFile:
> {code}
> {"test_record":{"array_test_record":{"test_array":[]}}}
> {"test_record":{"array_test_record":{"test_array":["test"]}}}
> {code}
> It fails with a NullPointerException:
> {code}
> 2023-05-31 13:51:35,632 ERROR [Timer-Driven Process Thread-8] o.a.n.processors.standard.ConvertRecord ConvertRecord[id=72e564dc-0188-1000-360a-9f86b50ec8ac] Failed to process StandardFlowFileRecord[uuid=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1685554864966-1, container=default, section=1], offset=3278, length=117],offset=0,name=9bf4f0fb-0942-48ba-8a16-0dbd98db3f97,size=117]; will route to failure
> org.apache.nifi.processor.exception.ProcessException: Could not determine the Avro Schema to use for writing the content
>         at org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:154)
>         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>         at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
>         at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:105)
>         at com.sun.proxy.$Proxy177.createWriter(Unknown Source)
>         at org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:150)
>         at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3441)
>         at org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122)
>         at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>         at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1360)
>         at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:243)
>         at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:102)
>         at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
>         at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
>         at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
>         at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
>         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>         at java.base/java.lang.Thread.run(Thread.java:829)
> Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Failed to compile Avro Schema
>         at org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:145)
>         ... 21 common frames omitted
> Caused by: java.lang.NullPointerException: null
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:208)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:241)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
>         at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
>         at org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
>         at org.apache.nifi.avro.AvroRecordSetWriter.createWriter(AvroRecordSetWriter.java:142)
>         ... 21 common frames omitted
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
