[
https://issues.apache.org/jira/browse/BEAM-3737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17131740#comment-17131740
]
Beam JIRA Bot commented on BEAM-3737:
-------------------------------------
This issue is P2 but has been unassigned without any comment for 60 days so it
has been labeled "stale-P2". If this issue is still affecting you, we care!
Please comment and remove the label. Otherwise, in 14 days the issue will be
moved to P3.
Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed
explanation of what these priorities mean.
> Key-aware batching function
> ---------------------------
>
> Key: BEAM-3737
> URL: https://issues.apache.org/jira/browse/BEAM-3737
> Project: Beam
> Issue Type: New Feature
> Components: sdk-py-core
> Reporter: Chuan Yu Foo
> Priority: P2
> Labels: stale-P2
>
> I have a CombineFn for which add_input has very large overhead. I would like
> to batch the incoming elements into a large batch before each call to
> add_input to reduce this overhead. In other words, I would like to do
> something like:
> {{elements | GroupByKey() | BatchElements() | CombineValues(MyCombineFn())}}
> Unfortunately, BatchElements is not key-aware, and can't be used after a
> GroupByKey to batch elements per key. I'm working around this by doing the
> batching within CombineValues, which makes the CombineFn rather messy. It
> would be nice if there were a key-aware BatchElements transform which could
> be used in this context.
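The per-key batching described above can be sketched in plain Python. This is a hypothetical illustration, not an existing Beam API: the function name `batch_per_key` and the `batch_size` default are assumptions. In a real pipeline this logic would live in a `DoFn` applied via `ParDo` between the `GroupByKey` and the combine step, so each `(key, values)` pair is split into `(key, batch)` chunks before `add_input` is called.

```python
# Hypothetical sketch of the per-key batching a key-aware BatchElements
# would perform. Names and defaults are illustrative, not Beam API.

def batch_per_key(element, batch_size=1000):
    """Split one (key, values) pair into (key, batch) pairs,
    each batch holding at most batch_size values."""
    key, values = element
    batch = []
    for value in values:
        batch.append(value)
        if len(batch) >= batch_size:
            yield key, batch
            batch = []
    if batch:  # emit any trailing partial batch
        yield key, batch
```

Wrapped in a `beam.DoFn`, this would let each `add_input` call receive a whole batch rather than a single element, which is the overhead reduction the issue asks for; note that downstream combining would then need to merge the multiple `(key, batch)` pairs produced per key.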
--
This message was sent by Atlassian Jira
(v8.3.4#803005)