exceptionfactory commented on code in PR #10829:
URL: https://github.com/apache/nifi/pull/10829#discussion_r2743556475


##########
nifi-extension-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/avro/AvroReader.java:
##########
@@ -60,12 +60,27 @@ public class AvroReader extends SchemaRegistryService implements RecordReaderFac
             .required(true)
             .build();
 
+    static final PropertyDescriptor FAST_READER_ENABLED = new PropertyDescriptor.Builder()
+            .name("Fast Reader Enabled")
+            .description("""
+                    When enabled, the Avro library uses an optimized reader implementation that improves read performance
+                    by creating a detailed execution plan at initialization. However, this optimization can lead to
+                    significantly higher memory consumption, especially when using schema inference. If OutOfMemory errors
+                    occur during Avro processing, consider disabling this option.""")
+            .addValidator(StandardValidators.BOOLEAN_VALIDATOR)
+            .allowableValues("true", "false")
+            .defaultValue("true")

Review Comment:
   Keeping the default and current behavior as `true` seems better, but I'm open to changing it depending on further investigation. The Maximum Heap size setting is an open question: it would be helpful to understand whether there is any correlation between Max Heap, or other memory constraints, and the OutOfMemory errors observed with the fast reader.
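   To help answer the Max Heap question, one option would be to record heap figures alongside any OutOfMemory reports so they can be compared against the configured `-Xmx`. A minimal sketch using only the JDK's `java.lang.management` API is below; the class and method names are hypothetical and not part of this PR:

   ```java
   import java.lang.management.ManagementFactory;
   import java.lang.management.MemoryMXBean;
   import java.lang.management.MemoryUsage;

   // Hypothetical helper: capture current heap usage and the configured maximum
   // so that OOM reports during Avro reading can be correlated with Max Heap.
   public class HeapSnapshot {
       public static String describeHeap() {
           MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
           MemoryUsage heap = memoryBean.getHeapMemoryUsage();
           // Runtime.maxMemory() reflects -Xmx, or Long.MAX_VALUE when no explicit limit is set
           long maxHeap = Runtime.getRuntime().maxMemory();
           return String.format("used=%d committed=%d max=%d",
                   heap.getUsed(), heap.getCommitted(), maxHeap);
       }

       public static void main(String[] args) {
           System.out.println(describeHeap());
       }
   }
   ```

   Logging this once at reader initialization and again when an OutOfMemoryError surfaces would give concrete data points for deciding whether the default should stay `true`.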



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
