Abacn commented on code in PR #31915:
URL: https://github.com/apache/beam/pull/31915#discussion_r1683562697
##########
sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Redistribute.java:
##########
@@ -160,6 +161,7 @@ public PCollection<T> expand(PCollection<T> input) {
     return input
         .apply("Pair with random key", ParDo.of(new AssignShardFn<>(numBuckets)))
         .apply(Redistribute.<Integer, T>byKey().withAllowDuplicates(this.allowDuplicates))
+        .apply("Redistribute", ParDo.of(new RedistributeFn<>()))
Review Comment:
Thanks for proposing the fix. Since this is a Spark runner issue, the proper fix
should live in the runners/spark code path; Java core is a common code path that
affects all runners.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]