Hi, can this test case[1] help you? You can use SQL like 'INSERT INTO ...
SELECT ..., ARRAY[...], MAP[...], ARRAY[MAP[...]] FROM ...'. If this does not
meet your requirement, what about using a UDF[2]?
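
For the nested-record case you describe below, here is a minimal sketch. The table names, the 'datagen'/'print' connectors, and the column name 'payload' are my assumptions, not your actual setup; swap in your real connectors:

-- Source with the two flat fields from your input schema
-- ('datagen' is used here only so the sketch is self-contained).
CREATE TABLE source (
  intField INT,
  stringField STRING
) WITH (
  'connector' = 'datagen'
);

-- Sink whose single column is a nested ROW, matching your target schema
-- ('print' stands in for your real output connector).
CREATE TABLE sink (
  payload ROW<intField INT, stringField STRING>
) WITH (
  'connector' = 'print'
);

-- ROW(...) wraps the flat fields into the nested record;
-- ARRAY[...] and MAP[...] are built the same way for collection types.
INSERT INTO sink
SELECT ROW(intField, stringField)
FROM source;

If you prefer the UDF route[2] instead, you could register a scalar function from SQL like this (the class name com.example.WrapAsPayload is hypothetical; you would implement it yourself as a ScalarFunction, see [2]):

-- Hypothetical: assumes a Java ScalarFunction packaged on the classpath.
CREATE TEMPORARY FUNCTION wrap_payload AS 'com.example.WrapAsPayload';

INSERT INTO sink
SELECT wrap_payload(intField, stringField)
FROM source;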




[1]https://github.com/apache/flink/blob/65907bc5470bc43f0227ab287d2a6f150ba0bc29/flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/table/KafkaTableITCase.java#L310

[2]https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/functions/udfs/

--

    Best!
    Xuyang




At 2022-09-11 16:54:26, "Lior Liviev" <lior.liv...@earnix.com> wrote:

Hello,
I want to make the entire input record the content of a single field in the output record.
Can this be achieved with the Flink SQL DSL?
Let's say I'm getting an input that looks like this:

{
      ...
      "fields": [{
            "name": "intField",
            "type": "int"
      }, {
            "name": "stringField",
            "type": "string"
      }]
}



Is it possible to transform it, using SQL, to something like this:
{
      ...
      "fields": [{
            "name": "payload",
            "type": {
                  "type": "record",
                  "name": "Schema",
                  "namespace": "line4.read.iw.iw",
                  "fields": [{
                        "name": "intField",
                        "type": "int"
                  }, {
                        "name": "stringField",
                        "type": "string"
                  }]
            }
      }]
}



If not, are there any alternatives?
