exceptionfactory commented on code in PR #9196:
URL: https://github.com/apache/nifi/pull/9196#discussion_r1737278405


##########
nifi-extension-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestValidateRecord.java:
##########
@@ -593,6 +599,47 @@ public void testValidateMaps() throws IOException, InitializationException, Malf
         }
     }
 
+    @ParameterizedTest
+    @MethodSource("getLocales")
+    void testTimeZoneWithTimeStamp(Locale locale) throws Exception {
+        Locale originalLocale = Locale.getDefault();
+        Locale.setDefault(locale);
+        final String timestampWithTimeZonePattern = "dd/MM/yyyy HH:mm:ssZ";
+        final String schema = new String(Files.readAllBytes(Paths.get("src/test/resources/TestValidateRecord/timestampWithTimeZone.avsc")), "UTF-8");
+        final CSVReader csvReader = new CSVReader();
+        runner.addControllerService("reader", csvReader);
+        runner.setProperty(csvReader, SchemaAccessUtils.SCHEMA_ACCESS_STRATEGY, SchemaAccessUtils.SCHEMA_TEXT_PROPERTY);
+        runner.setProperty(csvReader, SchemaAccessUtils.SCHEMA_TEXT, schema);
+        runner.setProperty(csvReader, CSVUtils.FIRST_LINE_IS_HEADER, "true");
+        runner.setProperty(csvReader, CSVUtils.VALUE_SEPARATOR, "◆");
+        runner.setProperty(csvReader, DateTimeUtils.TIMESTAMP_FORMAT, timestampWithTimeZonePattern);
+        runner.enableControllerService(csvReader);
+
+        final CSVRecordSetWriter csvWriter = new CSVRecordSetWriter();
+        runner.addControllerService("writer", csvWriter);
+        runner.setProperty(csvWriter, "Schema Write Strategy", "full-schema-attribute");
+        runner.setProperty(csvWriter, DateTimeUtils.TIMESTAMP_FORMAT, timestampWithTimeZonePattern);
+        runner.enableControllerService(csvWriter);
+
+        runner.setProperty(ValidateRecord.RECORD_READER, "reader");
+        runner.setProperty(ValidateRecord.RECORD_WRITER, "writer");
+        runner.setProperty(ValidateRecord.ALLOW_EXTRA_FIELDS, "false");
+        runner.setProperty(ValidateRecord.MAX_VALIDATION_DETAILS_LENGTH, "4000");
+        runner.setProperty(ValidateRecord.VALIDATION_DETAILS_ATTRIBUTE_NAME, "valDetails");
+        final String timestampWithTimezone = "24/09/2024 15:04:23+0630";
+        final String content = "apache_date◆apache_ip_source◆apache_method◆apache_path◆apache_query_string◆apache_response_code◆apache_referer◆apache_user_agent\n" +
+                timestampWithTimezone + "◆10.4.3.20◆GET◆/path◆?test=toto◆200◆-◆";
+
+        runner.enqueue(content);
+        runner.run();
+
+        runner.assertTransferCount(ValidateRecord.REL_VALID, 1);
+        final MockFlowFile validFlowFile = runner.getFlowFilesForRelationship(ValidateRecord.REL_VALID).getFirst();
+        // Validate that the time zone is taken into account
+        assertFalse(validFlowFile.getContent().contains(timestampWithTimezone));

Review Comment:
   Thanks for confirming. That highlights the core concern. The current 
behavior of the `ObjectTimestampFieldConverter` expects the input and output 
values to be the same, which aligns with the understanding that 
`java.sql.Timestamp` does not carry a zone offset. Converting from a "zoned" 
timestamp, as in this test, raises the question of whether the output should 
reflect that offset when the output timestamp is computed.
   
   Perhaps one way to make the test both more specific and workable across 
zones is to build the input string from the system default zone offset. That 
way, the input and output hours should remain unchanged.
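   As an illustrative sketch (not code from the PR; the class name is 
hypothetical), the input string could be built with `java.time` so that its 
offset always matches whatever zone the test happens to run in:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

class ZoneAwareInputSketch {
    public static void main(String[] args) {
        // Fixed local wall-clock time matching the test's sample record
        final LocalDateTime localTime = LocalDateTime.of(2024, 9, 24, 15, 4, 23);

        // Attach the system default zone, so the offset in the input string
        // is the one the zone-naive java.sql.Timestamp conversion will assume
        final ZonedDateTime zoned = localTime.atZone(ZoneId.systemDefault());

        // Same pattern as the test: "dd/MM/yyyy HH:mm:ssZ"
        final DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy HH:mm:ssZ");
        final String timestampWithTimezone = zoned.format(formatter);

        // The hour portion stays "15:04:23" regardless of the runner's zone;
        // only the trailing offset (e.g. "+0200") varies
        System.out.println(timestampWithTimezone);
    }
}
```

   With an input built this way, the parsed and re-written hours should match 
in any zone, so the assertion can compare the full timestamp rather than only 
checking that the offset-bearing string is absent.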



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
