n3nash commented on a change in pull request #2374:
URL: https://github.com/apache/hudi/pull/2374#discussion_r550972255
##########
File path: hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/cluster/SparkExecuteClusteringCommitActionExecutor.java
##########
@@ -118,7 +118,7 @@ public SparkExecuteClusteringCommitActionExecutor(HoodieEngineContext context,
     Schema readerSchema = HoodieAvroUtils.addMetadataFields(new Schema.Parser().parse(config.getSchema()));
     return ((ClusteringExecutionStrategy<T, JavaRDD<HoodieRecord<? extends HoodieRecordPayload>>, JavaRDD<HoodieKey>, JavaRDD<WriteStatus>>)
         ReflectionUtils.loadClass(config.getClusteringExecutionStrategyClass(), table, context, config))
-        .performClustering(inputRecords, clusteringGroup.getNumOutputFileGroups(), instantTime, strategyParams, readerSchema);
+        .performClustering(inputRecords, 0, instantTime, strategyParams, readerSchema);
Review comment:
The build was failing and could not find this symbol, most likely because I hadn't built the Avro classes. These changes were just a workaround and have since been reverted.
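For context on the pattern in the diff above: the clustering execution strategy class is configurable and instantiated by name via reflection (`ReflectionUtils.loadClass`). Below is a minimal, hedged sketch of that reflection-based loading in plain Java. The names `StrategyLoader`, `GreetingStrategy`, and `PoliteGreeting` are illustrative stand-ins, not Hudi APIs.

```java
import java.lang.reflect.Constructor;

public class StrategyLoader {

  // Illustrative strategy interface, standing in for ClusteringExecutionStrategy.
  public interface GreetingStrategy {
    String greet(String name);
  }

  // One concrete implementation that could be named in configuration.
  public static class PoliteGreeting implements GreetingStrategy {
    public String greet(String name) {
      return "Hello, " + name;
    }
  }

  // Instantiate a class by its fully-qualified name via its no-arg constructor,
  // roughly what ReflectionUtils.loadClass(className, ...) does (Hudi's version
  // also forwards constructor arguments such as table, context, and config).
  public static GreetingStrategy load(String className) throws Exception {
    Class<?> clazz = Class.forName(className);
    Constructor<?> ctor = clazz.getDeclaredConstructor();
    return (GreetingStrategy) ctor.newInstance();
  }

  public static void main(String[] args) throws Exception {
    // "StrategyLoader$PoliteGreeting" is the binary name of the nested class.
    GreetingStrategy strategy = load("StrategyLoader$PoliteGreeting");
    System.out.println(strategy.greet("Hudi"));
  }
}
```

The design point is that swapping the configured class name changes the strategy without touching the executor code, which is why a missing or unbuilt class (as in the Avro case described above) only surfaces at build or run time.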
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]