melin opened a new issue #4202:
URL: https://github.com/apache/hudi/issues/4202


   ```
   [INFO] Compiling 67 source files to /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/target/classes at 1638515566715
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/hudi/HoodieFileIndex.scala:562: error: value literals is not a member of org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
   [ERROR]           partitionValues.map(_.literals.map(_.value))
   [ERROR]                                 ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/hudi/HoodieFileIndex.scala:563: error: missing argument list for method fromSeq in object InternalRow
   [ERROR] Unapplied methods are only converted to functions when a function type is expected.
   [ERROR] You can make this conversion explicit by writing `fromSeq _` or `fromSeq(_)` instead of `fromSeq`.
   [ERROR]             .map(InternalRow.fromSeq)
   [ERROR]                              ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/avro/HoodieAvroDeserializer.scala:28: error: overloaded method constructor AvroDeserializer with alternatives:
   [ERROR]   (rootAvroType: org.apache.avro.Schema,rootCatalystType: org.apache.spark.sql.types.DataType,datetimeRebaseMode: String)org.apache.spark.sql.avro.AvroDeserializer <and>
   [ERROR]   (rootAvroType: org.apache.avro.Schema,rootCatalystType: org.apache.spark.sql.types.DataType,positionalFieldMatch: Boolean,datetimeRebaseMode: org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy.Value,filters: org.apache.spark.sql.catalyst.StructFilters)org.apache.spark.sql.avro.AvroDeserializer
   [ERROR]  cannot be applied to (org.apache.avro.Schema, org.apache.spark.sql.types.DataType)
   [ERROR]   extends AvroDeserializer(rootAvroType, rootCatalystType) {
   [ERROR]           ^
   [WARNING] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/DataSkippingUtils.scala:169: warning: non-variable type argument org.apache.spark.sql.catalyst.expressions.Literal in type pattern Seq[org.apache.spark.sql.catalyst.expressions.Literal] (the underlying of Seq[org.apache.spark.sql.catalyst.expressions.Literal]) is unchecked since it is eliminated by erasure
   [WARNING]       case In(attribute: AttributeReference, list: Seq[Literal]) =>
   [WARNING]                                                    ^
   [WARNING] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/DataSkippingUtils.scala:178: warning: non-variable type argument org.apache.spark.sql.catalyst.expressions.Literal in type pattern Seq[org.apache.spark.sql.catalyst.expressions.Literal] (the underlying of Seq[org.apache.spark.sql.catalyst.expressions.Literal]) is unchecked since it is eliminated by erasure
   [WARNING]       case Not(In(attribute: AttributeReference, list: Seq[Literal])) =>
   [WARNING]                                                        ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/analysis/HoodieAnalysis.scala:427: error: wrong number of arguments for pattern org.apache.spark.sql.execution.command.ShowPartitionsCommand(tableName: org.apache.spark.sql.catalyst.TableIdentifier,output: Seq[org.apache.spark.sql.catalyst.expressions.Attribute],spec: Option[org.apache.spark.sql.catalyst.catalog.CatalogTypes.TablePartitionSpec])
   [ERROR]       case ShowPartitionsCommand(tableName, specOpt)
   [ERROR]                                 ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/AlterHoodieTableAddColumnsCommand.scala:90: error: overloaded method value checkDataColNames with alternatives:
   [ERROR]   (provider: String,schema: org.apache.spark.sql.types.StructType)Unit <and>
   [ERROR]   (table: org.apache.spark.sql.catalyst.catalog.CatalogTable,schema: org.apache.spark.sql.types.StructType)Unit
   [ERROR]  cannot be applied to (org.apache.spark.sql.catalyst.catalog.CatalogTable, Seq[String])
   [ERROR]     DDLUtils.checkDataColNames(table, colsToAdd.map(_.name))
   [ERROR]              ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/CreateHoodieTableCommand.scala:201: error: not found: value DATASOURCE_SCHEMA_NUMPARTS
   [ERROR]     properties.put(DATASOURCE_SCHEMA_NUMPARTS, parts.size.toString)
   [ERROR]                    ^
   [ERROR] /Users/huaixin/Documents/codes/bigdata/hudi/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/MergeIntoHoodieTableCommand.scala:206: error: wrong number of arguments for pattern org.apache.spark.sql.catalyst.expressions.Cast(child: org.apache.spark.sql.catalyst.expressions.Expression,dataType: org.apache.spark.sql.types.DataType,timeZoneId: Option[String],ansiEnabled: Boolean)
   [ERROR]       case Cast(attr: AttributeReference, _, _) if sourceColumnName.find(resolver(_, attr.name)).get.equals(targetColumnName) => true
   [ERROR]                ^
   [WARNING] two warnings found
   [ERROR] 7 errors found
   
   ```
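   A note for anyone hitting the same failures: judging by the signatures the compiler prints (e.g. `Cast` carrying an `ansiEnabled: Boolean` field and `ShowPartitionsCommand` carrying an `output` field), these look like API mismatches from compiling Hudi's Spark code against a newer Spark release than the sources expect. The `wrong number of arguments for pattern` errors in particular come from case-class patterns: when a Spark case class gains a field, its `unapply` arity changes and every existing pattern stops compiling. A minimal sketch of the failure mode and one common workaround, using toy stand-ins (not the real `org.apache.spark.sql` classes):

   ```scala
   // Toy stand-in for Spark's Cast, which (per the log) now has four fields:
   // Cast(child, dataType, timeZoneId, ansiEnabled).
   case class CastLike(child: String,
                       dataType: String,
                       timeZoneId: Option[String],
                       ansiEnabled: Boolean)

   object PatternArityDemo {
     // After the field was added, patterns must name all four components;
     // a three-element pattern like `case CastLike(attr, _, _)` no longer compiles.
     def describe(e: CastLike): String = e match {
       case CastLike(child, _, _, _) => s"cast of $child"
     }

     // One version-compatibility trick: a hand-written extractor that exposes a
     // stable three-element shape, so call sites compile against either arity.
     object CastCompat {
       def unapply(c: CastLike): Option[(String, String, Option[String])] =
         Some((c.child, c.dataType, c.timeZoneId))
     }

     def describeCompat(e: CastLike): String = e match {
       case CastCompat(child, _, _) => s"cast of $child"
     }

     def main(args: Array[String]): Unit = {
       val c = CastLike("price", "double", None, ansiEnabled = false)
       println(describe(c))       // cast of price
       println(describeCompat(c)) // cast of price
     }
   }
   ```

   The constructor errors (`AvroDeserializer`, `checkDataColNames`) are the same class of problem on the call side: the overloads available on the current classpath no longer include the old two-argument shapes, so the calls need to be routed through a version-specific adapter rather than invoked directly.
   
   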


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
