[
https://issues.apache.org/jira/browse/BEAM-4826?focusedWorklogId=134286&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-134286
]
ASF GitHub Bot logged work on BEAM-4826:
----------------------------------------
Author: ASF GitHub Bot
Created on: 13/Aug/18 20:38
Start Date: 13/Aug/18 20:38
Worklog Time Spent: 10m
Work Description: lukecwik commented on a change in pull request #6132:
[BEAM-4826] Sanitize pCollections before sending to SDKHarness
URL: https://github.com/apache/beam/pull/6132#discussion_r209748219
##########
File path:
runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/graph/GreedyPipelineFuserTest.java
##########
@@ -1144,4 +1144,135 @@ public void compositesIgnored() {
.withNoOutputs()
.withTransforms("goTransform")));
}
+
+ @Test
+ public void sanitizedTransforms() throws Exception {
+
+ PCollection flattenOutput = pc("flatten.out");
Review comment:
I was going to suggest building the pipeline using the pattern we are
hitting right now, where Flatten has dangling inputs, instead of creating
the PTransforms manually.
```
Pipeline p = Pipeline.create();
PCollection<String> pc1 = ...
PCollection<String> pc2 = ...
PCollection<String> flatten = Flatten...
...
```
This would exercise a real pipeline instead of a hand-built fake.
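For context on what "sanitize" means in the PR title, here is a minimal, self-contained sketch using plain Java maps rather than Beam's actual proto classes or the PR's real implementation (the class and method names below are hypothetical): drop the flatten's inputs whose PCollection ids are not present in the bundle descriptor, matching the snippet in the issue where the flatten lists n2, n4, and n6 but only n4 actually exists.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Hypothetical illustration only; Beam's runner code operates on
// RunnerApi.PTransform protos, not raw maps.
public class SanitizeFlatten {

  // Keep only the inputs whose PCollection id exists in the descriptor.
  static Map<String, String> sanitizeInputs(
      Map<String, String> inputs, Set<String> pcollectionsInDescriptor) {
    Map<String, String> kept = new LinkedHashMap<>();
    for (Map.Entry<String, String> e : inputs.entrySet()) {
      if (pcollectionsInDescriptor.contains(e.getValue())) {
        kept.put(e.getKey(), e.getValue());
      }
    }
    return kept;
  }

  public static void main(String[] args) {
    // Mirrors the snippet in the issue: the flatten declares n2, n4, n6,
    // but only n4 is materialized in this bundle descriptor.
    Map<String, String> inputs = new LinkedHashMap<>();
    inputs.put("i0", "n2");
    inputs.put("i1", "n4");
    inputs.put("i2", "n6");
    Map<String, String> sane = sanitizeInputs(inputs, Set.of("n3", "n4", "n7"));
    System.out.println(sane); // prints {i1=n4}
  }
}
```

The dangling inputs disappear, so the SDK harness never sees references to PCollections it cannot resolve.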
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 134286)
Time Spent: 1h 20m (was: 1h 10m)
> Flink runner sends bad flatten to SDK
> -------------------------------------
>
> Key: BEAM-4826
> URL: https://issues.apache.org/jira/browse/BEAM-4826
> Project: Beam
> Issue Type: Bug
> Components: runner-flink
> Reporter: Henning Rohde
> Assignee: Ankur Goenka
> Priority: Major
> Labels: portability
> Time Spent: 1h 20m
> Remaining Estimate: 0h
>
> For a Go flatten test w/ 3 inputs, the Flink runner splits the pipeline into
> 3 bundle descriptors. It then sends the original 3-input flatten, but with
> only 1 actual input present in each bundle descriptor. This is inconsistent,
> and the SDK shouldn't have to expect dangling PCollections. In contrast,
> Dataflow removes the flatten when it does the same split.
> Snippet:
> register: <
> process_bundle_descriptor: <
> id: "3"
> transforms: <
> key: "e4"
> value: <
> unique_name: "github.com/apache/beam/sdks/go/pkg/beam.createFn'1"
> spec: <
> urn: "urn:beam:transform:pardo:v1"
> payload: [...]
> >
> inputs: <
> key: "i0"
> value: "n3"
> >
> outputs: <
> key: "i0"
> value: "n4"
> >
> >
> >
> transforms: <
> key: "e7"
> value: <
> unique_name: "Flatten"
> spec: <
> urn: "beam:transform:flatten:v1"
> >
> inputs: <
> key: "i0"
> value: "n2"
> >
> inputs: <
> key: "i1"
> value: "n4"  // <----------- only one present.
> >
> inputs: <
> key: "i2"
> value: "n6"
> >
> outputs: <
> key: "i0"
> value: "n7"
> >
> >
> >
> [...]
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)