kennknowles commented on a change in pull request #14408:
URL: https://github.com/apache/beam/pull/14408#discussion_r607365007
##########
File path:
runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
##########
@@ -1276,37 +1275,60 @@ public DataflowPipelineJob run(Pipeline pipeline) {
return dataflowPipelineJob;
}
+ private static String getContainerImageFromEnvironmentId(
+ String environmentId, RunnerApi.Pipeline pipelineProto) {
+ RunnerApi.Environment environment =
+ pipelineProto.getComponents().getEnvironmentsMap().get(environmentId);
+ if (!BeamUrns.getUrn(RunnerApi.StandardEnvironments.Environments.DOCKER)
+ .equals(environment.getUrn())) {
+ throw new RuntimeException(
+ "Dataflow can only execute pipeline steps in Docker environments: "
+ + environment.getUrn());
+ }
+ RunnerApi.DockerPayload dockerPayload;
+ try {
+      dockerPayload = RunnerApi.DockerPayload.parseFrom(environment.getPayload());
+ } catch (InvalidProtocolBufferException e) {
+ throw new RuntimeException("Error parsing docker payload.", e);
+ }
+ return dockerPayload.getContainerImage();
+ }
+
+ @AutoValue
+ abstract static class EnvironmentInfo {
+ static EnvironmentInfo create(String environmentId, String containerUrl) {
+      return new AutoValue_DataflowRunner_EnvironmentInfo(environmentId, containerUrl);
+ }
+
+ abstract String environmentId();
+
+ abstract String containerUrl();
+ }
+
+  private static List<EnvironmentInfo> getAllEnvironmentInfo(RunnerApi.Pipeline pipelineProto) {
Review comment:
Noting that this is generally useful and could be moved out of
`DataflowRunner` into runners-core-construction or a similar shared module.
(non-blocking comment)
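The deduplication the new `getAllEnvironmentInfo` helper performs can be illustrated with a self-contained sketch. Plain `java.util` maps stand in for the `RunnerApi` protos here; the class name and sample data are hypothetical, not Beam API:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EnvironmentIds {
  // Mirrors the stream in getAllEnvironmentInfo: transform values ->
  // environment IDs -> drop empties -> dedupe, keeping first-seen order.
  static List<String> distinctEnvironmentIds(Map<String, String> transformToEnv) {
    return transformToEnv.values().stream()
        .filter(envId -> !envId.isEmpty())
        .distinct()
        .collect(Collectors.toList());
  }

  public static void main(String[] args) {
    Map<String, String> transforms = new LinkedHashMap<>();
    transforms.put("read", "env-java");
    transforms.put("map", "env-python");
    transforms.put("composite", ""); // composites may carry no environment ID
    transforms.put("write", "env-java"); // duplicate environment, dropped by distinct()
    System.out.println(distinctEnvironmentIds(transforms)); // prints [env-java, env-python]
  }
}
```

Note that the first-seen ordering above comes from the `LinkedHashMap`; the proto map in the PR makes no ordering guarantee, which affects only the order, not the set, of environments returned.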
##########
File path:
runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
##########
@@ -1276,37 +1275,60 @@ public DataflowPipelineJob run(Pipeline pipeline) {
return dataflowPipelineJob;
}
+ private static String getContainerImageFromEnvironmentId(
+ String environmentId, RunnerApi.Pipeline pipelineProto) {
+ RunnerApi.Environment environment =
+ pipelineProto.getComponents().getEnvironmentsMap().get(environmentId);
+ if (!BeamUrns.getUrn(RunnerApi.StandardEnvironments.Environments.DOCKER)
+ .equals(environment.getUrn())) {
+ throw new RuntimeException(
+ "Dataflow can only execute pipeline steps in Docker environments: "
+ + environment.getUrn());
+ }
+ RunnerApi.DockerPayload dockerPayload;
+ try {
+      dockerPayload = RunnerApi.DockerPayload.parseFrom(environment.getPayload());
+ } catch (InvalidProtocolBufferException e) {
+ throw new RuntimeException("Error parsing docker payload.", e);
+ }
+ return dockerPayload.getContainerImage();
+ }
+
+ @AutoValue
+ abstract static class EnvironmentInfo {
+ static EnvironmentInfo create(String environmentId, String containerUrl) {
+      return new AutoValue_DataflowRunner_EnvironmentInfo(environmentId, containerUrl);
+ }
+
+ abstract String environmentId();
+
+ abstract String containerUrl();
+ }
+
+  private static List<EnvironmentInfo> getAllEnvironmentInfo(RunnerApi.Pipeline pipelineProto) {
+ return pipelineProto.getComponents().getTransformsMap().values().stream()
+ .map(transform -> transform.getEnvironmentId())
+ .filter(environmentId -> !environmentId.isEmpty())
+ .distinct()
+ .map(
+ environmentId ->
+ EnvironmentInfo.create(
+ environmentId,
+                getContainerImageFromEnvironmentId(environmentId, pipelineProto)))
+ .collect(Collectors.toList());
+ }
+
static void configureSdkHarnessContainerImages(
- DataflowPipelineOptions options,
- RunnerApi.Pipeline pipelineProto,
- Job newJob,
- String workerHarnessContainerImage) {
+      DataflowPipelineOptions options, RunnerApi.Pipeline pipelineProto, Job newJob) {
if (useUnifiedWorker(options)) {
-      ImmutableSet.Builder<String> sdkContainerUrlSetBuilder = ImmutableSet.builder();
- sdkContainerUrlSetBuilder.add(workerHarnessContainerImage);
- for (Map.Entry<String, RunnerApi.Environment> entry :
- pipelineProto.getComponents().getEnvironmentsMap().entrySet()) {
-        if (!BeamUrns.getUrn(RunnerApi.StandardEnvironments.Environments.DOCKER)
- .equals(entry.getValue().getUrn())) {
- throw new RuntimeException(
-              "Dataflow can only execute pipeline steps in Docker environments: "
- + entry.getValue().getUrn());
- }
- RunnerApi.DockerPayload dockerPayload;
- try {
-          dockerPayload = RunnerApi.DockerPayload.parseFrom(entry.getValue().getPayload());
- } catch (InvalidProtocolBufferException e) {
- throw new RuntimeException("Error parsing docker payload.", e);
- }
- sdkContainerUrlSetBuilder.add(dockerPayload.getContainerImage());
- }
List<SdkHarnessContainerImage> sdkContainerList =
- sdkContainerUrlSetBuilder.build().stream()
+ getAllEnvironmentInfo(pipelineProto).stream()
.map(
- (String url) -> {
+ environmentInfo -> {
                      SdkHarnessContainerImage image = new SdkHarnessContainerImage();
- image.setContainerImage(url);
- if (url.toLowerCase().contains("python")) {
+ image.setEnvironmentId(environmentInfo.environmentId());
+ image.setContainerImage(environmentInfo.containerUrl());
Review comment:
Is this a temporary adapter? I would guess that if there is code that
downloads the portable pipeline, it could extract the container image on the
service side based on the environment ID. (non-blocking comment)
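For context on the URN validation both hunks perform, here is a minimal, self-contained sketch of the check in `getContainerImageFromEnvironmentId`. The URN constant below matches Beam's Docker environment URN, but the class and method themselves are illustrative, not Beam API:

```java
// Hypothetical stand-in for getContainerImageFromEnvironmentId, with plain
// strings replacing the RunnerApi.Environment proto and DockerPayload parsing.
public class DockerEnvCheck {
  static final String DOCKER_URN = "beam:env:docker:v1";

  // Rejects non-Docker environments, mirroring the RuntimeException thrown
  // by the PR's helper, and otherwise hands back the container image.
  static String containerImageFor(String urn, String containerImage) {
    if (!DOCKER_URN.equals(urn)) {
      throw new RuntimeException(
          "Dataflow can only execute pipeline steps in Docker environments: " + urn);
    }
    return containerImage;
  }

  public static void main(String[] args) {
    System.out.println(containerImageFor(DOCKER_URN, "gcr.io/example/sdk:latest"));
  }
}
```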
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]