[ 
https://issues.apache.org/jira/browse/HUDI-7119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

hehuiyuan updated HUDI-7119:
----------------------------
    Description: 
Don't write {{hoodie.table.precombine.field=ts}} to hoodie.properties when using
Flink to create a Hudi table whose schema does not contain a `ts` field.
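
For illustration, a sketch of the hoodie.properties such a table ends up with (table name and type are hypothetical; only the precombine entry is the point):

```properties
# Hypothetical hoodie.properties for a Flink-created table with no "ts" column
hoodie.table.name=t1
hoodie.table.type=COPY_ON_WRITE
# The offending entry: "ts" appears to be the Flink-side default value,
# not an actual column of the table schema
hoodie.table.precombine.field=ts
```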

 

There is an error when reading the Hudi table with Spark:

```

java.util.NoSuchElementException: key not found: ts
    at scala.collection.MapLike.default(MapLike.scala:235)
    at scala.collection.MapLike.default$(MapLike.scala:234)
    at scala.collection.AbstractMap.default(Map.scala:63)
    at scala.collection.MapLike.apply(MapLike.scala:144)
    at scala.collection.MapLike.apply$(MapLike.scala:143)
    at scala.collection.AbstractMap.apply(Map.scala:63)
    at org.apache.hudi.HoodieBaseRelation$.$anonfun$projectSchema$2(HoodieBaseRelation.scala:694)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
    at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
    at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
    at scala.collection.TraversableLike.map(TraversableLike.scala:238)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
    at org.apache.hudi.HoodieBaseRelation$.projectSchema(HoodieBaseRelation.scala:693)
    at org.apache.hudi.HoodieBaseRelation.buildScan(HoodieBaseRelation.scala:348)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$apply$4(DataSourceStrategy.scala:298)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$pruneFilterProject$1(DataSourceStrategy.scala:331)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProjectRaw(DataSourceStrategy.scala:386)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProject(DataSourceStrategy.scala:330)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.apply(DataSourceStrategy.scala:298)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)

```
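
The failure mode behind the trace can be sketched as follows. This is a hypothetical Java analogue of the strict Scala `Map.apply` lookup in `HoodieBaseRelation.projectSchema`, not the actual Hudi code; field names and types are illustrative. The precombine field name comes from hoodie.properties, but the schema's field map has no such key.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;

// Sketch (hypothetical names) of why a stale precombine field surfaces as
// NoSuchElementException: schema projection resolves each required field
// name with a strict lookup, and "ts" from hoodie.properties is absent.
public class PrecombineLookupSketch {
    static String resolveField(Map<String, String> schemaFields, String name) {
        String type = schemaFields.get(name);
        if (type == null) {
            // Mirrors Scala's Map.apply behavior seen in the stack trace.
            throw new NoSuchElementException("key not found: " + name);
        }
        return type;
    }

    public static void main(String[] args) {
        Map<String, String> schemaFields = new HashMap<>();
        schemaFields.put("id", "int");
        schemaFields.put("name", "string");

        try {
            // "ts" is the precombine field recorded in hoodie.properties.
            resolveField(schemaFields, "ts");
        } catch (NoSuchElementException e) {
            System.out.println(e.getMessage()); // key not found: ts
        }
    }
}
```

Not persisting the precombine field when the column is absent, as this issue proposes, avoids ever reaching this lookup with a nonexistent name.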

  was:
Don't write {{hoodie.table.precombine.field=ts}} to hoodie.properties when
creating an insert table using Flink.

 

There is an error when reading the Hudi table with Spark:

```

java.util.NoSuchElementException: key not found: ts
    at scala.collection.MapLike.default(MapLike.scala:235)
    at scala.collection.MapLike.default$(MapLike.scala:234)
    at scala.collection.AbstractMap.default(Map.scala:63)
    at scala.collection.MapLike.apply(MapLike.scala:144)
    at scala.collection.MapLike.apply$(MapLike.scala:143)
    at scala.collection.AbstractMap.apply(Map.scala:63)
    at org.apache.hudi.HoodieBaseRelation$.$anonfun$projectSchema$2(HoodieBaseRelation.scala:694)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
    at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
    at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
    at scala.collection.TraversableLike.map(TraversableLike.scala:238)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
    at org.apache.hudi.HoodieBaseRelation$.projectSchema(HoodieBaseRelation.scala:693)
    at org.apache.hudi.HoodieBaseRelation.buildScan(HoodieBaseRelation.scala:348)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$apply$4(DataSourceStrategy.scala:298)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$pruneFilterProject$1(DataSourceStrategy.scala:331)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProjectRaw(DataSourceStrategy.scala:386)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProject(DataSourceStrategy.scala:330)
    at org.apache.spark.sql.execution.datasources.DataSourceStrategy.apply(DataSourceStrategy.scala:298)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)

```


> Don't write hoodie.table.precombine.field=ts to hoodie.properties when 
> creating an insert table using Flink.
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: HUDI-7119
>                 URL: https://issues.apache.org/jira/browse/HUDI-7119
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: hehuiyuan
>            Priority: Major
>
> Don't write {{hoodie.table.precombine.field=ts}} to hoodie.properties when 
> using Flink to create a Hudi table whose schema does not contain a `ts` field.
>  
> There is an error when reading the Hudi table with Spark:
> ```
> java.util.NoSuchElementException: key not found: ts
>     at scala.collection.MapLike.default(MapLike.scala:235)
>     at scala.collection.MapLike.default$(MapLike.scala:234)
>     at scala.collection.AbstractMap.default(Map.scala:63)
>     at scala.collection.MapLike.apply(MapLike.scala:144)
>     at scala.collection.MapLike.apply$(MapLike.scala:143)
>     at scala.collection.AbstractMap.apply(Map.scala:63)
>     at org.apache.hudi.HoodieBaseRelation$.$anonfun$projectSchema$2(HoodieBaseRelation.scala:694)
>     at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
>     at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
>     at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
>     at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
>     at scala.collection.TraversableLike.map(TraversableLike.scala:238)
>     at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
>     at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
>     at org.apache.hudi.HoodieBaseRelation$.projectSchema(HoodieBaseRelation.scala:693)
>     at org.apache.hudi.HoodieBaseRelation.buildScan(HoodieBaseRelation.scala:348)
>     at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$apply$4(DataSourceStrategy.scala:298)
>     at org.apache.spark.sql.execution.datasources.DataSourceStrategy.$anonfun$pruneFilterProject$1(DataSourceStrategy.scala:331)
>     at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProjectRaw(DataSourceStrategy.scala:386)
>     at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProject(DataSourceStrategy.scala:330)
>     at org.apache.spark.sql.execution.datasources.DataSourceStrategy.apply(DataSourceStrategy.scala:298)
>     at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)
> ```



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
