[
https://issues.apache.org/jira/browse/FLINK-3871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15673942#comment-15673942
]
ASF GitHub Bot commented on FLINK-3871:
---------------------------------------
Github user fhueske commented on a diff in the pull request:
https://github.com/apache/flink/pull/2762#discussion_r88461076
--- Diff:
flink-streaming-connectors/flink-connector-kafka-base/src/main/java/org/apache/flink/streaming/util/serialization/AvroRowSerializationSchema.java
---
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.flink.streaming.util.serialization;
+
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericData;
+import org.apache.avro.generic.GenericDatumWriter;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.table.Row;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+
+import static org.apache.flink.streaming.connectors.kafka.internals.TypeUtil.createRowAvroSchema;
+
+/**
+ * Serialization schema that serializes an object into Avro bytes.
+ */
+public class AvroRowSerializationSchema implements SerializationSchema<Row> {
+
+ /** Field names in a Row */
+ private final String[] fieldNames;
+ /** Avro serialization schema */
+ private final Schema schema;
+ /** Writer to serialize an Avro GenericRecord into a byte array */
+ private final DatumWriter<GenericRecord> datumWriter;
+ /** Output stream to serialize records into a byte array */
+ private final ByteArrayOutputStream arrayOutputStream = new ByteArrayOutputStream();
+ /** Low level class for serialization of Avro values */
+ private final Encoder encoder = EncoderFactory.get().directBinaryEncoder(arrayOutputStream, null);
--- End diff --
use `binaryEncoder` instead of `directBinaryEncoder` to get a buffering encoder
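A minimal sketch of the suggested change (the standalone `serializeRecord` helper and the example schema below are illustrative only, not part of the PR): `binaryEncoder` returns a buffering `BinaryEncoder`, so the encoder has to be flushed before the serialized bytes can be read back out of the output stream.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;

public class BufferingEncoderSketch {

	/** Serializes a single GenericRecord into Avro binary form using a buffering encoder. */
	public static byte[] serializeRecord(Schema schema, GenericRecord record) throws IOException {
		ByteArrayOutputStream out = new ByteArrayOutputStream();
		DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
		// buffering encoder as suggested in the review; passing null means no encoder is reused
		BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
		writer.write(record, encoder);
		// unlike directBinaryEncoder, the buffering encoder must be flushed
		// before the serialized bytes become visible in the stream
		encoder.flush();
		return out.toByteArray();
	}

	public static void main(String[] args) throws IOException {
		// illustrative schema, not the one produced by createRowAvroSchema
		Schema schema = new Schema.Parser().parse(
			"{\"type\":\"record\",\"name\":\"Example\",\"fields\":["
			+ "{\"name\":\"name\",\"type\":\"string\"},"
			+ "{\"name\":\"count\",\"type\":\"int\"}]}");
		GenericRecord record = new GenericData.Record(schema);
		record.put("name", "flink");
		record.put("count", 1);
		System.out.println("serialized " + serializeRecord(schema, record).length + " bytes");
	}
}
```

For a per-record SerializationSchema it would also be reasonable to keep one `BinaryEncoder` as a field and pass it as the second argument to `binaryEncoder`, so its internal buffer is reused across records, mirroring how the PR already reuses `arrayOutputStream`.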
> Add Kafka TableSource with Avro serialization
> ---------------------------------------------
>
> Key: FLINK-3871
> URL: https://issues.apache.org/jira/browse/FLINK-3871
> Project: Flink
> Issue Type: New Feature
> Components: Table API & SQL
> Reporter: Fabian Hueske
>
> Add a Kafka TableSource which supports Avro serialized data.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)