[ https://issues.apache.org/jira/browse/BEAM-7310?focusedWorklogId=375666&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-375666 ]

ASF GitHub Bot logged work on BEAM-7310:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 22/Jan/20 15:50
            Start Date: 22/Jan/20 15:50
    Worklog Time Spent: 10m 
      Work Description: iemejia commented on pull request #10563: [BEAM-7310] Add support of Confluent Schema Registry for KafkaIO
URL: https://github.com/apache/beam/pull/10563#discussion_r369641188
 
 

 ##########
 File path: sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java
 ##########
 @@ -994,34 +1044,17 @@ private Schema fetchAvroSchema(String schemaRegistryUrl, String subject) {
       Coder<V> valueCoder;
       if (avroValueSchema != null) {
         valueCoder = (Coder<V>) AvroCoder.of(avroValueSchema);
-        checkArgument(
-            valueCoder != null,
-            "Value coder could not be inferred from value Avro schema. Please provide"
-                + "value coder explicitly using withValueDeserializerAndCoder()");
       } else {
         valueCoder =
             getValueCoder() != null
                 ? getValueCoder()
-                : inferCoder(registry, getValueDeserializer());
-        checkArgument(
-            valueCoder != null,
-            "Value coder could not be inferred from value deserializer. Please provide"
-                + "value coder explicitly using withValueDeserializerAndCoder()");
+                : inferCoder(coderRegistry, getValueDeserializer());
       }
-
-      // Handles unbounded source to bounded conversion if maxNumRecords or maxReadTime is set.
-      Unbounded<KafkaRecord<K, V>> unbounded =
-          org.apache.beam.sdk.io.Read.from(
-              toBuilder().setKeyCoder(keyCoder).setValueCoder(valueCoder).build().makeSource());
-
-      PTransform<PBegin, PCollection<KafkaRecord<K, V>>> transform = unbounded;
-
-      if (getMaxNumRecords() < Long.MAX_VALUE || getMaxReadTime() != null) {
-        transform =
-            unbounded.withMaxReadTime(getMaxReadTime()).withMaxNumRecords(getMaxNumRecords());
-      }
-
-      return input.getPipeline().apply(transform);
+      checkArgument(
 
 Review comment:
   `checkState` is better here; we should use `checkArgument` only for validating method arguments.
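   A minimal, dependency-free sketch of the distinction the reviewer draws. Guava's `Preconditions.checkArgument` throws `IllegalArgumentException` (the caller passed something invalid), while `Preconditions.checkState` throws `IllegalStateException` (the object's own derived state is inconsistent). The coder in the diff is inferred internally rather than passed in, hence the suggestion to use `checkState`. The helper methods below mirror Guava's contracts but are hypothetical, for illustration only:

```java
// Standalone sketch: plain-Java stand-ins with the same contracts as Guava's
// Preconditions.checkArgument and Preconditions.checkState.
public class PreconditionsDemo {

  // Same contract as Guava's checkArgument: invalid *caller input*.
  static void checkArgument(boolean condition, String message) {
    if (!condition) throw new IllegalArgumentException(message);
  }

  // Same contract as Guava's checkState: inconsistent *internal state*.
  static void checkState(boolean condition, String message) {
    if (!condition) throw new IllegalStateException(message);
  }

  // Validating a caller-supplied value: an argument check.
  static void setMaxNumRecords(long maxNumRecords) {
    checkArgument(maxNumRecords > 0, "maxNumRecords must be positive");
  }

  // Validating something derived internally (like an inferred coder): a state check.
  static void requireInferredCoder(Object inferredCoder) {
    checkState(inferredCoder != null,
        "Value coder could not be inferred. Please provide a value coder explicitly.");
  }

  public static void main(String[] args) {
    boolean gotIllegalArgument = false;
    try {
      setMaxNumRecords(-1);
    } catch (IllegalArgumentException e) {
      gotIllegalArgument = true;
    }

    boolean gotIllegalState = false;
    try {
      requireInferredCoder(null);
    } catch (IllegalStateException e) {
      gotIllegalState = true;
    }

    System.out.println(gotIllegalArgument && gotIllegalState);
  }
}
```

   The practical payoff is in the exception type surfaced to users: `IllegalArgumentException` tells the pipeline author to fix what they passed in, while `IllegalStateException` points at how the transform was configured or what it could not derive.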
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 375666)
    Time Spent: 7h  (was: 6h 50m)

> Confluent Schema Registry support in KafkaIO
> --------------------------------------------
>
>                 Key: BEAM-7310
>                 URL: https://issues.apache.org/jira/browse/BEAM-7310
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-kafka
>    Affects Versions: 2.12.0
>            Reporter: Yohei Shimomae
>            Assignee: Alexey Romanenko
>            Priority: Minor
>          Time Spent: 7h
>  Remaining Estimate: 0h
>
> Confluent Schema Registry is useful for managing Avro schemas, but KafkaIO 
> does not support Confluent Schema Registry, as discussed here:
> https://stackoverflow.com/questions/56035121/unable-to-connect-from-dataflow-job-to-schema-registry-when-schema-registry-requ
> https://lists.apache.org/thread.html/7695fccddebd08733b80ae1e43b79b636b63cd5fe583a2bdeecda6c4@%3Cuser.beam.apache.org%3E



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
