[
https://issues.apache.org/jira/browse/BEAM-6894?focusedWorklogId=226616&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-226616
]
ASF GitHub Bot logged work on BEAM-6894:
----------------------------------------
Author: ASF GitHub Bot
Created on: 12/Apr/19 12:07
Start Date: 12/Apr/19 12:07
Worklog Time Spent: 10m
Work Description: robertwb commented on pull request #8270: [BEAM-6894]
Updates Dataflow runner to support external ParDos.
URL: https://github.com/apache/beam/pull/8270#discussion_r274873682
##########
File path: sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
##########
@@ -783,15 +819,19 @@ def run_ParDo(self, transform_node, options):
step.add_property(PropertyNames.OUTPUT_INFO, outputs)
- # Add the restriction encoding if we are a splittable DoFn
- # and are using the Fn API on the unified worker.
- from apache_beam.runners.common import DoFnSignature
- signature = DoFnSignature(transform_node.transform.fn)
- if (use_fnapi and use_unified_worker and signature.is_splittable_dofn()):
- restriction_coder = (
- signature.get_restriction_provider().restriction_coder())
- step.add_property(PropertyNames.RESTRICTION_ENCODING,
- self._get_cloud_encoding(restriction_coder, use_fnapi))
+ # Proto holder ParDos contain serialized DoFns from remote SDKs that cannot
+ # be examined by Python DoFnSignature.
+ if not proto_holder:
+ # Add the restriction encoding if we are a splittable DoFn
+ # and are using the Fn API on the unified worker.
+ from apache_beam.runners.common import DoFnSignature
+ signature = DoFnSignature(transform_node.transform.fn)
+ if (use_fnapi and use_unified_worker and signature.is_splittable_dofn()):
+ restriction_coder = (
+ signature.get_restriction_provider().restriction_coder())
+ step.add_property(PropertyNames.RESTRICTION_ENCODING,
+ self._get_cloud_encoding(restriction_coder, use_fnapi))
Review comment:
We'll need to set this if the external transform was itself an SDF too
(which will be quite an important use case).
It seems this would be better implemented as a get_restriction_coder method
on the transform, where the wrapping one has this information in the proto
itself. Alternatively, this could be stored as a Coder object reconstructed
from the coder id in the proto, passed into the constructor using the
context available in ParDo.from_runner_api_parameter. Then we could get rid
of the proto_holder flag altogether.
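To illustrate the suggestion above, here is a minimal, self-contained sketch
of the polymorphic design the reviewer describes. The class names
(NativeParDo, ExternalParDo) and the Coder stub are hypothetical stand-ins,
not Beam's actual API: a locally constructed ParDo inspects its DoFn, while
a ParDo rebuilt from a runner-api proto returns a coder reconstructed at
from_runner_api time, so the runner never needs a proto_holder branch.

```python
class Coder:
    """Stand-in for a Beam Coder; only carries a name for illustration."""
    def __init__(self, name):
        self.name = name


class NativeParDo:
    """A locally constructed ParDo: inspect the DoFn directly."""
    def __init__(self, fn):
        self.fn = fn

    def get_restriction_coder(self):
        # Mirrors the DoFnSignature-based check in the diff: only a
        # splittable DoFn (one with a restriction provider) has a coder.
        provider = getattr(self.fn, 'restriction_provider', None)
        return provider.restriction_coder() if provider else None


class ExternalParDo:
    """A ParDo rebuilt from a runner-api proto: the coder was already
    reconstructed from the coder id in the proto at construction time."""
    def __init__(self, restriction_coder=None):
        self._restriction_coder = restriction_coder

    def get_restriction_coder(self):
        return self._restriction_coder


def restriction_encoding(transform):
    # The runner asks every transform uniformly; no proto_holder flag.
    coder = transform.get_restriction_coder()
    return None if coder is None else coder.name
```

With this shape, run_ParDo would call get_restriction_coder() on any
transform, native or external, and add the RESTRICTION_ENCODING property
only when a coder is returned.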
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 226616)
Time Spent: 3h (was: 2h 50m)
> ExternalTransform.expand() does not create the proper AppliedPTransform
> sub-graph
> ---------------------------------------------------------------------------------
>
> Key: BEAM-6894
> URL: https://issues.apache.org/jira/browse/BEAM-6894
> Project: Beam
> Issue Type: Bug
> Components: sdk-py-core
> Reporter: Chamikara Jayalath
> Assignee: Chamikara Jayalath
> Priority: Major
> Time Spent: 3h
> Remaining Estimate: 0h
>
> 'ExternalTransform.expand()' can be used to expand a remote transform and
> build the correct runner-api subgraph for that transform. However, we
> currently do not modify the AppliedPTransform sub-graph correctly during
> this process. The relevant code location is here:
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/transforms/external.py#L135]
>
> Without this, DataflowRunner, which relies on this object graph (not just
> the runner API proto) to build the job submission request to the Dataflow
> service, cannot construct the request properly.
>
> cc: [~robertwb]
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)