[ https://issues.apache.org/jira/browse/FLINK-20989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
godfrey he updated FLINK-20989:
-------------------------------
Description:
The following test case encounters an NPE:
{code:scala}
val t = tEnv.fromValues(
  DataTypes.ROW(
    DataTypes.FIELD("a", DataTypes.INT()),
    DataTypes.FIELD("b", DataTypes.ARRAY(DataTypes.STRING()))
  ),
  row(1, Array("aa", "bb", "cc")),
  row(2, null),
  row(3, Array("dd"))
)
tEnv.registerTable("T", t)
tEnv.executeSql("SELECT a, s FROM T, UNNEST(T.b) as A (s)").print()
{code}
The exception is:
{code:java}
Caused by: java.lang.NullPointerException
    at scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:192)
    at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:192)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:32)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.flink.table.planner.plan.utils.ObjectExplodeTableFunc.eval(ExplodeFunctionUtil.scala:34)
{code}
The reason is that the functions in ExplodeFunctionUtil do not handle null data. The bug is already fixed in 1.12 and later releases.
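
A minimal sketch of the kind of null guard that avoids the NPE follows. The class and method names below are illustrative only, not the actual Flink fix or the real ExplodeFunctionUtil code:
{code:scala}
import org.apache.flink.table.functions.TableFunction

// Illustrative sketch: a table function that explodes an array column,
// skipping null arrays instead of iterating over them.
class NullSafeObjectExplodeTableFunc extends TableFunction[Object] {
  def eval(arr: Array[Object]): Unit = {
    // row(2, null) produces a null array; without this guard,
    // arr.foreach throws the NullPointerException shown above.
    if (arr != null) {
      arr.foreach(collect)
    }
  }
}
{code}
With such a guard, UNNEST simply emits no rows for an input row whose array is null.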
was:
The following test case encounters an NPE:
{code:scala}
val t = tEnv.fromValues(
  DataTypes.ROW(
    DataTypes.FIELD("a", DataTypes.INT()),
    DataTypes.FIELD("b", DataTypes.ARRAY(DataTypes.STRING()))
  ),
  row(1, Array("aa", "bb", "cc")),
  row(2, null),
  row(3, Array("dd"))
)
tEnv.registerTable("T", t)
tEnv.executeSql("SELECT a, s FROM T, UNNEST(T.b) as A (s)").print()
{code}
The exception is:
{code:java}
Caused by: java.lang.NullPointerException
    at scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:192)
    at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:192)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:32)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.flink.table.planner.plan.utils.ObjectExplodeTableFunc.eval(ExplodeFunctionUtil.scala:34)
{code}
The reason is that the functions in ExplodeFunctionUtil do not handle null data.
> Functions in ExplodeFunctionUtil should handle null data to avoid NPE
> ---------------------------------------------------------------------
>
> Key: FLINK-20989
> URL: https://issues.apache.org/jira/browse/FLINK-20989
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Planner
> Affects Versions: 1.11.3
> Reporter: godfrey he
> Assignee: godfrey he
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.11.4
>
>
> The following test case encounters an NPE:
> {code:scala}
> val t = tEnv.fromValues(
>   DataTypes.ROW(
>     DataTypes.FIELD("a", DataTypes.INT()),
>     DataTypes.FIELD("b", DataTypes.ARRAY(DataTypes.STRING()))
>   ),
>   row(1, Array("aa", "bb", "cc")),
>   row(2, null),
>   row(3, Array("dd"))
> )
> tEnv.registerTable("T", t)
> tEnv.executeSql("SELECT a, s FROM T, UNNEST(T.b) as A (s)").print()
> {code}
> The exception is:
> {code:java}
> Caused by: java.lang.NullPointerException
>     at scala.collection.mutable.ArrayOps$ofRef$.length$extension(ArrayOps.scala:192)
>     at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:192)
>     at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:32)
>     at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>     at org.apache.flink.table.planner.plan.utils.ObjectExplodeTableFunc.eval(ExplodeFunctionUtil.scala:34)
> {code}
> The reason is that the functions in ExplodeFunctionUtil do not handle null data. The bug is already fixed in 1.12 and later releases.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)