[https://issues.apache.org/jira/browse/BEAM-3709?focusedWorklogId=132493&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-132493]
ASF GitHub Bot logged work on BEAM-3709:
----------------------------------------
Author: ASF GitHub Bot
Created on: 08/Aug/18 17:36
Start Date: 08/Aug/18 17:36
Worklog Time Spent: 10m
Work Description: aaltay closed pull request #6161: [BEAM-3709] Portable
combiner lifting support in py Dataflow Runner.
URL: https://github.com/apache/beam/pull/6161
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:
diff --git a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
index 87982a1c9f1..c36ae8ccd1e 100644
--- a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
+++ b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
@@ -714,15 +714,6 @@ def _pardo_fn_data(transform_node, get_label):
         transform_node.inputs[0].windowing)
 
   def apply_CombineValues(self, transform, pcoll):
-    # TODO(BEAM-2937): Disable combiner lifting for fnapi. Remove this
-    # restrictions once this feature is supported in the dataflow runner
-    # harness.
-    # Import here to avoid adding the dependency for local running scenarios.
-    # pylint: disable=wrong-import-order, wrong-import-position
-    from apache_beam.runners.dataflow.internal import apiclient
-    if apiclient._use_fnapi(pcoll.pipeline._options):
-      return self.apply_PTransform(transform, pcoll)
-
     return pvalue.PCollection(pcoll.pipeline)
 
   def run_CombineValues(self, transform_node):
@@ -731,13 +722,24 @@ def run_CombineValues(self, transform_node):
     input_step = self._cache.get_pvalue(transform_node.inputs[0])
     step = self._add_step(
         TransformNames.COMBINE, transform_node.full_label, transform_node)
-    # Combiner functions do not take deferred side-inputs (i.e. PValues) and
-    # therefore the code to handle extra args/kwargs is simpler than for the
-    # DoFn's of the ParDo transform. In the last, empty argument is where
-    # side inputs information would go.
-    fn_data = (transform.fn, transform.args, transform.kwargs, ())
-    step.add_property(PropertyNames.SERIALIZED_FN,
-                      pickler.dumps(fn_data))
+
+    # The data transmitted in SERIALIZED_FN is different depending on whether
+    # this is a fnapi pipeline or not.
+    from apache_beam.runners.dataflow.internal import apiclient
+    if apiclient._use_fnapi(transform_node.inputs[0].pipeline._options):
+      # Fnapi pipelines send the transform ID of the CombineValues transform's
+      # parent composite because Dataflow expects the ID of a CombinePerKey
+      # transform.
+      serialized_data = self.proto_context.transforms.get_id(
+          transform_node.parent)
+    else:
+      # Combiner functions do not take deferred side-inputs (i.e. PValues) and
+      # therefore the code to handle extra args/kwargs is simpler than for the
+      # DoFn's of the ParDo transform. In the last, empty argument is where
+      # side inputs information would go.
+      serialized_data = pickler.dumps((transform.fn, transform.args,
+                                       transform.kwargs, ()))
+    step.add_property(PropertyNames.SERIALIZED_FN, serialized_data)
     step.add_property(
         PropertyNames.PARALLEL_INPUT,
         {'@type': 'OutputReference',
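The branch the PR adds to run_CombineValues can be sketched in isolation as follows. This is a minimal sketch, not the real runner code: the function name, the flattened signature, and the use of stdlib pickle in place of Beam's pickler module are all illustrative assumptions.

```python
import pickle


def serialized_fn_payload(use_fnapi, fn, args, kwargs, parent_transform_id):
    """Sketch of the SERIALIZED_FN choice made in run_CombineValues above.

    Illustrative only: stdlib pickle stands in for Beam's pickler, and the
    signature is flattened relative to the real runner code.
    """
    if use_fnapi:
        # Fn API pipelines send the id of the enclosing composite, because
        # Dataflow expects the id of a CombinePerKey transform.
        return parent_transform_id
    # Legacy pipelines pickle (fn, args, kwargs, side_inputs); the
    # side-input slot is always empty for CombineValues.
    return pickle.dumps((fn, args, kwargs, ()))
```

Either way, the result is attached to the step as a single SERIALIZED_FN property, so the receiving service distinguishes the two cases by pipeline mode rather than by property name.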
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 132493)
Time Spent: 2h (was: 1h 50m)
> Implement the portable lifted Combiner transforms in Python SDK
> ---------------------------------------------------------------
>
> Key: BEAM-3709
> URL: https://issues.apache.org/jira/browse/BEAM-3709
> Project: Beam
> Issue Type: Sub-task
> Components: sdk-py-core, sdk-py-harness
> Reporter: Daniel Oliveira
> Assignee: Daniel Oliveira
> Priority: Major
> Labels: portability
> Time Spent: 2h
> Remaining Estimate: 0h
>
> Lifted combines are split into separate parts with different URNs. These
> parts need to be implemented in the Python SDK harness so that the SDK can
> actually execute them when receiving Combine transforms with the
> corresponding URNs.
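The split described in the issue can be illustrated with a plain-Python sketch of the three lifted-combine phases (precombine, merge accumulators, extract outputs). The function names here are hypothetical; the real harness implements these phases against Beam's CombineFn interface and dispatches on the component URNs.

```python
def precombine(elements, create_accumulator, add_input):
    """Phase 1: fold raw (key, value) pairs into per-key accumulators,
    before any shuffle, on each worker independently."""
    accumulators = {}
    for key, value in elements:
        acc = accumulators.setdefault(key, create_accumulator())
        accumulators[key] = add_input(acc, value)
    return accumulators


def merge(accumulator_dicts, merge_accumulators):
    """Phase 2: after the shuffle, merge the accumulators that different
    workers produced for the same key."""
    grouped = {}
    for accs in accumulator_dicts:
        for key, acc in accs.items():
            grouped.setdefault(key, []).append(acc)
    return {key: merge_accumulators(accs) for key, accs in grouped.items()}


def extract(accumulators, extract_output):
    """Phase 3: turn each final accumulator into an output value."""
    return {key: extract_output(acc) for key, acc in accumulators.items()}


# A sum combiner expressed in these terms, over two simulated workers:
worker1 = precombine([('a', 1), ('b', 2), ('a', 3)], lambda: 0,
                     lambda acc, v: acc + v)
worker2 = precombine([('a', 10)], lambda: 0, lambda acc, v: acc + v)
result = extract(merge([worker1, worker2], sum), lambda acc: acc)
# result == {'a': 14, 'b': 2}
```

The point of the split is that phase 1 shrinks the data before the shuffle; the harness only needs to recognize each phase's URN and run the matching piece of the user's combiner.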
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)