[ https://issues.apache.org/jira/browse/BEAM-9977?focusedWorklogId=465914&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-465914 ]

ASF GitHub Bot logged work on BEAM-9977:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 03/Aug/20 21:17
            Start Date: 03/Aug/20 21:17
    Worklog Time Spent: 10m 
      Work Description: boyuanzz commented on a change in pull request #11749:
URL: https://github.com/apache/beam/pull/11749#discussion_r464667015



##########
File path: sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
##########
@@ -906,19 +926,89 @@ public void setValueDeserializer(String valueDeserializer) {
       Coder<K> keyCoder = getKeyCoder(coderRegistry);
       Coder<V> valueCoder = getValueCoder(coderRegistry);
 
-      // Handles unbounded source to bounded conversion if maxNumRecords or maxReadTime is set.
-      Unbounded<KafkaRecord<K, V>> unbounded =
-          org.apache.beam.sdk.io.Read.from(
-              toBuilder().setKeyCoder(keyCoder).setValueCoder(valueCoder).build().makeSource());
+      // The Read will be expanded into SDF transform when "beam_fn_api" is enabled and
+      // "beam_fn_api_use_deprecated_read" is not enabled.
+      if (!ExperimentalOptions.hasExperiment(input.getPipeline().getOptions(), "beam_fn_api")
+          || ExperimentalOptions.hasExperiment(
+              input.getPipeline().getOptions(), "beam_fn_api_use_deprecated_read")) {
+        // Handles unbounded source to bounded conversion if maxNumRecords or maxReadTime is set.
+        Unbounded<KafkaRecord<K, V>> unbounded =
+            org.apache.beam.sdk.io.Read.from(
+                toBuilder().setKeyCoder(keyCoder).setValueCoder(valueCoder).build().makeSource());
+
+        PTransform<PBegin, PCollection<KafkaRecord<K, V>>> transform = unbounded;
+
+        if (getMaxNumRecords() < Long.MAX_VALUE || getMaxReadTime() != null) {
+          transform =
+              unbounded.withMaxReadTime(getMaxReadTime()).withMaxNumRecords(getMaxNumRecords());
+        }
 
-      PTransform<PBegin, PCollection<KafkaRecord<K, V>>> transform = unbounded;
+        return input.getPipeline().apply(transform);
+      } else {
+        ReadViaSDF<K, V, Manual> readTransform =
+            ReadViaSDF.<K, V, Manual>read()
+                .withConsumerConfigOverrides(getConsumerConfig())
+                .withOffsetConsumerConfigOverrides(getOffsetConsumerConfig())
+                .withConsumerFactoryFn(getConsumerFactoryFn())
+                .withKeyDeserializerProvider(getKeyDeserializerProvider())
+                .withValueDeserializerProvider(getValueDeserializerProvider())
+                .withManualWatermarkEstimator()
+                .withTimestampPolicyFactory(getTimestampPolicyFactory());
+        if (isCommitOffsetsInFinalizeEnabled()) {
+          readTransform = readTransform.commitOffsets();
+        }
 
-      if (getMaxNumRecords() < Long.MAX_VALUE || getMaxReadTime() != null) {
-        transform =
-            unbounded.withMaxReadTime(getMaxReadTime()).withMaxNumRecords(getMaxNumRecords());
+        return input
+            .getPipeline()
+            .apply(Impulse.create())
+            .apply(ParDo.of(new GenerateKafkaSourceDescription(this)))
+            .setCoder(SerializableCoder.of(KafkaSourceDescription.class))
+            .apply(readTransform)
+            .setCoder(KafkaRecordCoder.of(keyCoder, valueCoder));

Review comment:
       The coder needs to be set in the x-lang case. It seems like there is something not correct when x-lang expands the transform.
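
Editor's note: for context, here is a minimal sketch, not part of the PR, of how a user could choose between the two expansion paths gated in the diff above. The class name is illustrative; the experiment flag names are taken from the diff, and the ExperimentalOptions/PipelineOptionsFactory calls are standard Beam Java SDK APIs.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.ExperimentalOptions;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class KafkaReadExpansionToggle {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();

    // Opt into the portable path, under which KafkaIO.read() expands to the
    // SDF-based ReadViaSDF transform shown in the diff.
    ExperimentalOptions.addExperiment(options.as(ExperimentalOptions.class), "beam_fn_api");

    // Uncommenting the next lines keeps the legacy UnboundedSource expansion
    // even when "beam_fn_api" is set.
    // ExperimentalOptions.addExperiment(
    //     options.as(ExperimentalOptions.class), "beam_fn_api_use_deprecated_read");

    Pipeline pipeline = Pipeline.create(options);
    // ... apply KafkaIO.read() and other transforms, then pipeline.run() ...
  }
}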




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 465914)
    Time Spent: 19h 40m  (was: 19.5h)

> Build Kafka Read on top of Java SplittableDoFn
> ----------------------------------------------
>
>                 Key: BEAM-9977
>                 URL: https://issues.apache.org/jira/browse/BEAM-9977
>             Project: Beam
>          Issue Type: New Feature
>          Components: io-java-kafka
>            Reporter: Boyuan Zhang
>            Assignee: Boyuan Zhang
>            Priority: P2
>          Time Spent: 19h 40m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)
