[jira] [Updated] (NIFI-10993) PublishKafkaRecord should write key record (when configured) using correct schema

2023-01-23 Thread Peter Turcsanyi (Jira)


 [ https://issues.apache.org/jira/browse/NIFI-10993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Peter Turcsanyi updated NIFI-10993:
---
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> PublishKafkaRecord should write key record (when configured) using correct 
> schema
> -
>
> Key: NIFI-10993
> URL: https://issues.apache.org/jira/browse/NIFI-10993
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Paul Grey
>Assignee: Paul Grey
>Priority: Minor
> Fix For: 1.20.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> community report via 
> (https://www.mail-archive.com/users@nifi.apache.org/msg15668.html)
> to us...@nifi.apache.org
> To whom it may concern,
> Hello, I would like to report an issue for NiFi. Following the new Jira 
> guidelines, I would like to request an account for ASF Jira in order to 
> create a ticket.
> Regarding the bug: using NiFi 1.19.1, I would like to send a tombstone 
> message (null payload) to Kafka and use the Confluent JDBC sink connector 
> (with delete.enabled=true) to delete a record in our Postgres database. I 
> believe that as of NiFi 1.19, PublishKafkaRecord_2_6 supports the 'Publish 
> Strategy: Use Wrapper' feature, which allows setting the Kafka message key and 
> value (primary key as the Kafka key, null for the Kafka value). For the 
> Record Key Writer, I'm using an AvroRecordSetWriter to validate and serialize 
> the key against the Confluent schema registry (Schema Write Strategy: 
> Confluent Schema Registry Reference, Schema Access Strategy: Use 'Schema 
> Name' Property), but when sending the message I encounter the error:
> PublishKafkaRecord_2_6[id=XXX] Failed to send FlowFile[filename=XXX] to 
> Kafka: org.apache.nifi.processor.exception.ProcessException: Could not 
> determine the Avro Schema to use for writing the content
> - Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Cannot 
> write Confluent Schema Registry Reference because the Schema Identifier is 
> not known
> I can confirm that the AvroRecordSetWriter and ConfluentSchemaRegistry 
> controller services and the PublishKafkaRecord processor are all configured 
> correctly, as I can send the Kafka message just fine using the default 
> Publish Strategy (Use Content as Record Value). The error only occurs when 
> using Use Wrapper with the ConfluentSchemaRegistry.
> A workaround that has worked is using JsonRecordSetWriter with embedded 
> JSON schemas, but it would be nice to continue using our Avro Schema Registry 
> for this.
> I'd appreciate it if someone has any advice or experience with this issue; 
> otherwise I'd like to log an issue in JIRA.
> Thank you,
> Austin Tao
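
The "Schema Identifier is not known" error is consistent with how the 
Confluent Schema Registry Reference write strategy frames messages: each 
serialized key or value is prefixed with a 5-byte header carrying the 
registry-assigned schema ID, so the writer cannot emit anything when no ID 
has been resolved for the key's schema. A minimal sketch of that framing 
(the function names here are hypothetical; only the wire layout follows 
Confluent's documented format):

```python
import struct

# Confluent Schema Registry wire format: 1 magic byte (0), then the
# 4-byte big-endian schema ID, then the Avro-encoded payload.
MAGIC_BYTE = 0

def encode_confluent(schema_id: int, avro_payload: bytes) -> bytes:
    """Prefix an Avro-encoded record with the Confluent framing header."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload

def decode_confluent(message: bytes) -> tuple[int, bytes]:
    """Split a Confluent-framed message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("message is not Confluent-framed")
    return schema_id, message[5:]
```

This is why the JsonRecordSetWriter workaround succeeds: embedding the 
schema in the payload removes the need to resolve a registry schema ID at 
write time.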



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-10993) PublishKafkaRecord should write key record (when configured) using correct schema

2023-01-23 Thread Joe Witt (Jira)


 [ https://issues.apache.org/jira/browse/NIFI-10993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Joe Witt updated NIFI-10993:

Fix Version/s: 1.20.0



[jira] [Updated] (NIFI-10993) PublishKafkaRecord should write key record (when configured) using correct schema

2023-01-10 Thread Paul Grey (Jira)


 [ https://issues.apache.org/jira/browse/NIFI-10993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Paul Grey updated NIFI-10993:
-
Status: Patch Available  (was: In Progress)
