[ https://issues.apache.org/jira/browse/BEAM-11482?focusedWorklogId=526535&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-526535 ]

ASF GitHub Bot logged work on BEAM-11482:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 20/Dec/20 14:53
            Start Date: 20/Dec/20 14:53
    Worklog Time Spent: 10m 
      Work Description: ccciudatu commented on a change in pull request #13572:
URL: https://github.com/apache/beam/pull/13572#discussion_r546387329



##########
File path: sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/kafka/BeamKafkaThriftTable.java
##########
@@ -74,55 +92,81 @@ private static Schema thriftSchema(
 
   @Override
   protected PTransform<PCollection<KV<byte[], byte[]>>, PCollection<Row>> getPTransformForInput() {
-    final @NonNull SchemaProvider schemaProvider = ThriftSchema.provider();
-    return new PTransform<PCollection<KV<byte[], byte[]>>, PCollection<Row>>() {
-      @Override
-      @SuppressWarnings("nullness")
-      public PCollection<Row> expand(PCollection<KV<byte[], byte[]>> input) {
-        return input
-            .apply(Values.create())
-            .apply(MapElements.into(typeDescriptor).via(BeamKafkaThriftTable.this::decode))
-            .setSchema(
-                schema,
-                typeDescriptor,
-                schemaProvider.toRowFunction(typeDescriptor),
-                schemaProvider.fromRowFunction(typeDescriptor))
-            .apply(Convert.toRows());
-      }
-    };
+    return new InputTransformer(typeDescriptor, coder, schema);
   }
 
-  private T decode(byte[] bytes) {
-    try {
-      return thriftCoder.decode(new ByteArrayInputStream(bytes));
-    } catch (IOException e) {
-      throw new IllegalStateException(e);
+  private static class InputTransformer<T extends TBase<?, ?>>

Review comment:
       Got it, makes sense. I tried to solve it locally for Kafka, since making this truly reusable/composable requires a larger refactoring that should come as a follow-up. But those functions do seem to belong in `ThriftSchema` anyway, so I'll fix this.
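For reference, the shape of the helper being moved can be sketched outside of Beam. This is a minimal standalone sketch, with hypothetical names (`IoDecoder`, `DecodeFns` are illustrative only, not part of `ThriftSchema` or the Beam API): the removed `decode(byte[])` method wraps a stream-based, checked-IO coder into an unchecked `byte[] -> T` function, and that is the same shape a `ThriftSchema`-hosted helper could keep.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Hypothetical stand-in for a stream-based coder (e.g. a ThriftCoder's decode side).
interface IoDecoder<T> {
  T decode(InputStream in) throws IOException;
}

final class DecodeFns {
  // Adapt a stream-based decoder into a byte[] -> T function, surfacing IO
  // failures as IllegalStateException, as the original decode helper did.
  static <T> Function<byte[], T> fromCoder(IoDecoder<T> coder) {
    return bytes -> {
      try {
        return coder.decode(new ByteArrayInputStream(bytes));
      } catch (IOException e) {
        throw new IllegalStateException(e);
      }
    };
  }

  public static void main(String[] args) {
    // Stand-in decoder: in the real table this would decode a Thrift payload.
    Function<byte[], String> utf8 =
        fromCoder(in -> new String(in.readAllBytes(), StandardCharsets.UTF_8));
    System.out.println(utf8.apply("payload".getBytes(StandardCharsets.UTF_8)));
  }
}
```

Hosting the adapter next to the schema provider (rather than inside each Kafka table) is what keeps the byte-level decoding reusable across table implementations.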




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 526535)
    Time Spent: 7h  (was: 6h 50m)

> Thrift support for KafkaTableProvider
> -------------------------------------
>
>                 Key: BEAM-11482
>                 URL: https://issues.apache.org/jira/browse/BEAM-11482
>             Project: Beam
>          Issue Type: New Feature
>          Components: dsl-sql, io-java-kafka
>            Reporter: Costi Ciudatu
>            Assignee: Costi Ciudatu
>            Priority: P2
>          Time Spent: 7h
>  Remaining Estimate: 0h
>
> Kafka table provider can leverage the Thrift coder and schema provider 
> defined in the IO package to handle thrift input/output.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
