sivabalan narayanan created HUDI-3164:
-----------------------------------------

             Summary: CTAS fails w/ UnsupportedOperationException when trying to modify immutable map in DataSourceUtils.mayBeOverwriteParquetWriteLegacyFormatProp
                 Key: HUDI-3164
                 URL: https://issues.apache.org/jira/browse/HUDI-3164
             Project: Apache Hudi
          Issue Type: Task
            Reporter: sivabalan narayanan


CTAS fails with an UnsupportedOperationException when DataSourceUtils.mayBeOverwriteParquetWriteLegacyFormatProp attempts to modify an immutable map.

Seen on master with Spark 3.2.
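Likely cause (my read of the trace, needs confirmation): Spark's DataSourceV2Utils hands the provider its options via CaseInsensitiveStringMap.asCaseSensitiveMap(), which is an unmodifiable view, so the put() inside mayBeOverwriteParquetWriteLegacyFormatProp throws. A minimal standalone illustration (class name and key below are made up for the demo):
{code:java}
import java.util.HashMap;
import java.util.Map;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

public class ImmutableOptionsDemo {
  public static void main(String[] args) {
    Map<String, String> raw = new HashMap<>();
    raw.put("type", "cow");

    // Spark wraps datasource options in a CaseInsensitiveStringMap;
    // asCaseSensitiveMap() returns an unmodifiable view of them.
    CaseInsensitiveStringMap options = new CaseInsensitiveStringMap(raw);
    Map<String, String> view = options.asCaseSensitiveMap();

    // Any write to the view fails exactly like the trace below.
    view.put("some.key", "true");
    // -> java.lang.UnsupportedOperationException at Collections$UnmodifiableMap.put
  }
}
{code}
CTAS reproducer: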

 
{code:java}
val s = """
create table catalog_sales
USING HUDI
options (
  type = 'cow',
  primaryKey = 'cs_item_sk,cs_order_number'
)
LOCATION 'file:///tmp/catalog_sales_hudi'
PARTITIONED BY (cs_sold_date_sk)
AS SELECT * FROM catalog_sales_ext2
"""
spark.sql(s)
{code}
Stack trace:
{code:java}
java.lang.UnsupportedOperationException
  at java.util.Collections$UnmodifiableMap.put(Collections.java:1459)
  at org.apache.hudi.DataSourceUtils.mayBeOverwriteParquetWriteLegacyFormatProp(DataSourceUtils.java:323)
  at org.apache.hudi.spark3.internal.DefaultSource.getTable(DefaultSource.java:59)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:83)
  at org.apache.spark.sql.DataFrameWriter.getTable$1(DataFrameWriter.scala:280)
  at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:296)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
  at org.apache.hudi.HoodieSparkSqlWriter$.bulkInsertAsRow(HoodieSparkSqlWriter.scala:478)
  at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:159)
  at org.apache.spark.sql.hudi.command.InsertIntoHoodieTableCommand$.run(InsertIntoHoodieTableCommand.scala:109)
  at org.apache.spark.sql.hudi.command.CreateHoodieTableAsSelectCommand.run(CreateHoodieTableAsSelectCommand.scala:91)
{code}
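One possible direction for a fix, sketched under assumptions (the signature, parameter names, and property key below are illustrative, not the actual Hudi code): copy the incoming options into a mutable HashMap before overwriting, and have callers use the returned copy instead of relying on in-place mutation.
{code:java}
import java.util.HashMap;
import java.util.Map;

public class DataSourceUtilsFixSketch {

  // Illustrative key; the real property name lives in Hudi's config classes.
  private static final String PARQUET_WRITE_LEGACY_FORMAT_PROP =
      "hoodie.parquet.writelegacyformat.enabled";

  /**
   * Sketch only: never mutate the caller's map (it may be an unmodifiable
   * view, as the trace shows). Copy it into a mutable HashMap, apply the
   * overwrite there, and return the copy.
   */
  public static Map<String, String> mayBeOverwriteParquetWriteLegacyFormatProp(
      Map<String, String> options, boolean useLegacyFormat) {
    Map<String, String> mutable = new HashMap<>(options);
    mutable.putIfAbsent(PARQUET_WRITE_LEGACY_FORMAT_PROP,
        String.valueOf(useLegacyFormat));
    return mutable;
  }
}
{code}
Callers such as DefaultSource.getTable would then need to pass the returned map along rather than expecting the argument to be updated in place.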
 


