[ https://issues.apache.org/jira/browse/FLINK-9444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16531189#comment-16531189 ]

ASF GitHub Bot commented on FLINK-9444:
---------------------------------------

Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6218#discussion_r199761646
  
    --- Diff: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverterTest.java ---
    @@ -0,0 +1,71 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one
    + * or more contributor license agreements.  See the NOTICE file
    + * distributed with this work for additional information
    + * regarding copyright ownership.  The ASF licenses this file
    + * to you under the Apache License, Version 2.0 (the
    + * "License"); you may not use this file except in compliance
    + * with the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.flink.formats.avro.typeutils;
    +
    +import org.apache.flink.api.common.typeinfo.TypeInformation;
    +import org.apache.flink.api.common.typeinfo.Types;
    +import org.apache.flink.api.java.typeutils.RowTypeInfo;
    +import org.apache.flink.formats.avro.generated.User;
    +import org.apache.flink.types.Row;
    +
    +import org.junit.Test;
    +
    +import static org.junit.Assert.assertEquals;
    +import static org.junit.Assert.assertTrue;
    +
    +/**
    + * Tests for {@link AvroSchemaConverter}.
    + */
    +public class AvroSchemaConverterTest {
    +
    +   @Test
    +   public void testAvroClassConversion() {
    +           validateUserSchema(AvroSchemaConverter.convert(User.class));
    +   }
    +
    +   @Test
    +   public void testAvroSchemaConversion() {
    +           final String schema = User.getClassSchema().toString(true);
    +           validateUserSchema(AvroSchemaConverter.convert(schema));
    +   }
    +
    +   private void validateUserSchema(TypeInformation<?> actual) {
    +           final TypeInformation<Row> address = Types.ROW_NAMED(
    +                   new String[]{"num", "street", "city", "state", "zip"},
    +                   Types.INT, Types.STRING, Types.STRING, Types.STRING, Types.STRING);
    +
    +           final TypeInformation<Row> user = Types.ROW_NAMED(
    +                   new String[] {"name", "favorite_number", "favorite_color", "type_long_test",
    --- End diff --
    
    Actually, I'm a big fan of per-line fields, but it also blows up the code.
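
For illustration, a per-line layout of the same row type would put one type argument on each line so it visually pairs with its field name. This is only a sketch of the formatting trade-off being discussed, not code from the pull request:

    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.types.Row;

    public class PerLineFieldsExample {

        public static void main(String[] args) {
            // Per-line layout: one type argument per line, easier to match
            // against the field names, but it roughly doubles the line count.
            final TypeInformation<Row> address = Types.ROW_NAMED(
                new String[] {"num", "street", "city", "state", "zip"},
                Types.INT,
                Types.STRING,
                Types.STRING,
                Types.STRING,
                Types.STRING);

            System.out.println(address);
        }
    }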


> KafkaAvroTableSource failed to work for map and array fields
> ------------------------------------------------------------
>
>                 Key: FLINK-9444
>                 URL: https://issues.apache.org/jira/browse/FLINK-9444
>             Project: Flink
>          Issue Type: Bug
>          Components: Kafka Connector, Table API & SQL
>    Affects Versions: 1.6.0
>            Reporter: Jun Zhang
>            Assignee: Jun Zhang
>            Priority: Blocker
>              Labels: patch, pull-request-available
>             Fix For: 1.6.0
>
>         Attachments: flink-9444.patch
>
>
> When an Avro schema has map/array fields and the corresponding TableSchema
> declares *MapTypeInfo/ListTypeInfo* for these fields, an exception is thrown
> when registering the *KafkaAvroTableSource*, with a message like:
> Exception in thread "main" org.apache.flink.table.api.ValidationException:
> Type Map<String, Integer> of table field 'event' does not match with type
> GenericType<java.util.Map> of the field 'event' of the TableSource return
> type.
>  at org.apache.flink.table.api.ValidationException$.apply(exceptions.scala:74)
>  at org.apache.flink.table.sources.TableSourceUtil$$anonfun$validateTableSource$1.apply(TableSourceUtil.scala:92)
>  at org.apache.flink.table.sources.TableSourceUtil$$anonfun$validateTableSource$1.apply(TableSourceUtil.scala:71)
>  at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>  at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>  at org.apache.flink.table.sources.TableSourceUtil$.validateTableSource(TableSourceUtil.scala:71)
>  at org.apache.flink.table.plan.schema.StreamTableSourceTable.<init>(StreamTableSourceTable.scala:33)
>  at org.apache.flink.table.api.StreamTableEnvironment.registerTableSourceInternal(StreamTableEnvironment.scala:124)
>  at org.apache.flink.table.api.TableEnvironment.registerTableSource(TableEnvironment.scala:438)
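
The validation fails because the TableSource's Avro-derived return type represents map/array fields as a GenericTypeInfo, while the declared TableSchema uses MapTypeInfo/ListTypeInfo, and the two do not compare equal. A minimal sketch of the direction the conversion needs to take, producing concrete map/array type information from an Avro Schema, is shown below; the class and method names are illustrative only and not the actual AvroSchemaConverter code, and only a few primitive types are handled:

    import org.apache.avro.Schema;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.common.typeinfo.Types;

    /** Illustrative sketch: map Avro map/array schemas to concrete Flink types. */
    public class AvroCollectionTypeSketch {

        static TypeInformation<?> convertField(Schema schema) {
            switch (schema.getType()) {
                case MAP:
                    // Avro map keys are always strings; convert the value schema
                    // recursively so the result is a MapTypeInfo, not a GenericType.
                    return Types.MAP(Types.STRING, convertField(schema.getValueType()));
                case ARRAY:
                    // Keep the element type so the result is an object-array type info.
                    return Types.OBJECT_ARRAY(convertField(schema.getElementType()));
                case INT:
                    return Types.INT;
                case LONG:
                    return Types.LONG;
                case STRING:
                    return Types.STRING;
                default:
                    throw new IllegalArgumentException("Unsupported type: " + schema.getType());
            }
        }

        public static void main(String[] args) {
            Schema map = Schema.createMap(Schema.create(Schema.Type.INT));
            // Prints a Map<String, Integer> type info instead of GenericType<java.util.Map>,
            // which is what the declared MapTypeInfo in the TableSchema expects to match.
            System.out.println(convertField(map));
        }
    }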



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
