See 
<https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1577/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2939, BEAM-9458] Add deduplication transform for SplittableDoFns


------------------------------------------
[...truncated 78.64 KB...]
from apache_beam.testing.util import equal_to
from apache_beam.transforms import Create
from apache_beam.transforms import Map

logging.basicConfig(level=logging.INFO)

# To test that our main session is getting plumbed through artifact staging
# correctly, create a global variable. If the main session is not plumbed
# through properly, global_var will be undefined and the pipeline will fail.
global_var = 1

pipeline_options = PipelineOptions()
pipeline_options.view_as(SetupOptions).save_main_session = True
pipeline = beam.Pipeline(options=pipeline_options)
pcoll = (pipeline
         | Create([0, 1, 2])
         | Map(lambda x: x + global_var))
assert_that(pcoll, equal_to([1, 2, 3]))

result = pipeline.run()
result.wait_until_finish()
"

(python -c "$PIPELINE_PY" \
  --runner FlinkRunner \
  --flink_job_server_jar "$FLINK_JOB_SERVER_JAR" \
  --parallelism 1 \
  --environment_type DOCKER \
  --environment_config "$PYTHON_CONTAINER_IMAGE" \
  --flink_master "localhost:$FLINK_PORT" \
  --flink_submit_uber_jar \
) || TEST_EXIT_CODE=$? # don't fail fast here; clean up before exiting
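
For reference, a self-contained version of the test pipeline above is sketched below. The log is truncated just before the script's first import lines, so the imports of apache_beam, PipelineOptions, SetupOptions and assert_that are reconstructed here from the names the visible code uses; everything else mirrors the logged script.

import logging

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.options.pipeline_options import SetupOptions
from apache_beam.testing.util import assert_that
from apache_beam.testing.util import equal_to
from apache_beam.transforms import Create
from apache_beam.transforms import Map

logging.basicConfig(level=logging.INFO)

# Module-level global: if save_main_session is honored, the staged main
# session makes global_var visible on the workers; if not, the Map lambda
# raises a NameError and the pipeline fails.
global_var = 1

pipeline_options = PipelineOptions()  # picks up the runner flags from sys.argv
pipeline_options.view_as(SetupOptions).save_main_session = True

pipeline = beam.Pipeline(options=pipeline_options)
pcoll = (pipeline
         | Create([0, 1, 2])
         | Map(lambda x: x + global_var))
assert_that(pcoll, equal_to([1, 2, 3]))

result = pipeline.run()
result.wait_until_finish()
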
Starting Flink mini cluster listening on port 42583
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function lift_combiners at 0x7f4189c29950> ====================
INFO:apache_beam.runners.portability.flink_runner:Adding HTTP protocol scheme 
to flink_master parameter: http://localhost:42583
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism', '1']
[main] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file 
environment variable 'log.file' is not set.
[main] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager 
log files are unavailable in the web dashboard. Log file location not found in 
environment variable 'log.file' or configuration key 'Key: 'web.log.path' , 
default: null (fallback keys: [{key=jobmanager.web.log.path, 
isDeprecated=true}])'.
Started Flink mini cluster (1 TaskManagers with 1 task slots) with Rest API at 
java.util.concurrent.CompletableFuture@47a64f7d[Not completed]
INFO:apache_beam.runners.portability.abstract_job_service:Artifact server 
started on port 42193
INFO:apache_beam.runners.portability.abstract_job_service:Running job 
'job-e15f67c4-180f-4e6b-9390-e9099f10cd4f'
[flink-rest-server-netty-worker-thread-2] WARN 
org.apache.flink.runtime.webmonitor.handlers.JarRunHandler - Configuring the 
job submission via query parameters is deprecated. Please migrate to submitting 
a JSON request instead.
INFO:apache_beam.runners.portability.flink_uber_jar_job_server:Started Flink 
job as 0469eef285c92b4d35f0e98693d5f656
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull 
docker image apache/beam_python3.6_sdk:2.21.0.dev, cause: Received exit code 1 
for command 'docker pull apache/beam_python3.6_sdk:2.21.0.dev'. stderr: Error 
response from daemon: manifest for apache/beam_python3.6_sdk:2.21.0.dev not 
found
[grpc-default-executor-1] WARN 
/usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:290
 - Discarding unparseable args: ['--app_name=None', 
'--direct_runner_use_stacked_bundle', '--job_server_timeout=60', 
'--options_id=1', '--pipeline_type_check', 
'--retrieval_service_type=CLASSLOADER'] 
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), 
assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.flink.metrics.MetricGroup - The operator name MapPartition 
(MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) 
exceeded the 80 characters length limit and was truncated.
[grpc-default-executor-1] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown 
endpoint.
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown 
endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), 
assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull 
docker image apache/beam_python3.6_sdk:2.21.0.dev, cause: Received exit code 1 
for command 'docker pull apache/beam_python3.6_sdk:2.21.0.dev'. stderr: Error 
response from daemon: manifest for apache/beam_python3.6_sdk:2.21.0.dev not 
found
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Error 
cleaning up servers urn: "beam:env:docker:v1"
payload: "\n$apache/beam_python3.6_sdk:2.21.0.dev"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"

java.io.IOException: Received exit code 1 for command 'docker kill 
312d71f77f18dfc02ffc827fa182ce853aa3f178e1ada25a5c424c49d0c367f4'. stderr: 
Error response from daemon: Cannot kill container: 
312d71f77f18dfc02ffc827fa182ce853aa3f178e1ada25a5c424c49d0c367f4: Container 
312d71f77f18dfc02ffc827fa182ce853aa3f178e1ada25a5c424c49d0c367f4 is not running
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:148)
        at 
org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:93)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:494)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1600(DefaultJobBundleFactory.java:432)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:169)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:259)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:291)
        at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
        at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        at java.lang.Thread.run(Thread.java:748)
[grpc-default-executor-1] WARN 
/usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:290
 - Discarding unparseable args: ['--app_name=None', 
'--direct_runner_use_stacked_bundle', '--job_server_timeout=60', 
'--options_id=1', '--pipeline_type_check', 
'--retrieval_service_type=CLASSLOADER'] 
[grpc-default-executor-0] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown 
endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), 
assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown 
endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), 
assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Error 
cleaning up servers urn: "beam:env:docker:v1"
payload: "\n$apache/beam_python3.6_sdk:2.21.0.dev"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"

java.io.IOException: Received exit code 1 for command 'docker logs 
372d27fac90fe50c51b7b4e8ed1fe51d8b5c2f80217b5a2fe583efac1edb0435'. stdout and 
stderr: Error: No such container: 
372d27fac90fe50c51b7b4e8ed1fe51d8b5c2f80217b5a2fe583efac1edb0435
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.getContainerLogs(DockerCommand.java:134)
        at 
org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:91)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:494)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1600(DefaultJobBundleFactory.java:432)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:169)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:259)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:291)
        at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
        at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        at java.lang.Thread.run(Thread.java:748)
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] 
WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to 
pull docker image apache/beam_python3.6_sdk:2.21.0.dev, cause: Received exit 
code 1 for command 'docker pull apache/beam_python3.6_sdk:2.21.0.dev'. stderr: 
Error response from daemon: manifest for apache/beam_python3.6_sdk:2.21.0.dev 
not found
[grpc-default-executor-0] WARN 
/usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:290
 - Discarding unparseable args: ['--app_name=None', 
'--direct_runner_use_stacked_bundle', '--job_server_timeout=60', 
'--options_id=1', '--pipeline_type_check', 
'--retrieval_service_type=CLASSLOADER'] 
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] 
WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for 
unknown endpoint.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] 
WARN org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - 
Error cleaning up servers urn: "beam:env:docker:v1"
payload: "\n$apache/beam_python3.6_sdk:2.21.0.dev"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"

java.io.IOException: Received exit code 1 for command 'docker kill 
1b6e30b4d5196b165847aa2b2355ed7e9a826729b3678e42cbb434399f8fd0b9'. stderr: 
Error response from daemon: Cannot kill container: 
1b6e30b4d5196b165847aa2b2355ed7e9a826729b3678e42cbb434399f8fd0b9: No such 
container: 1b6e30b4d5196b165847aa2b2355ed7e9a826729b3678e42cbb434399f8fd0b9
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:148)
        at 
org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:93)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:479)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:494)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1600(DefaultJobBundleFactory.java:432)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:169)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:259)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:291)
        at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
        at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

kill %1 || echo "Failed to shut down Flink mini cluster"

rm -rf "$ENV_DIR"
>>> SUCCESS

if [[ "$TEST_EXIT_CODE" -eq 0 ]]; then
  echo ">>> SUCCESS"
else
  echo ">>> FAILURE"
fi
exit $TEST_EXIT_CODE

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:container:py37:docker'.
> Process 'command 'docker'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
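
Note that the portable jar test itself reached >>> SUCCESS; the build failure reported here comes from the separate :sdks:python:container:py37:docker task, while the earlier pull warnings refer to the py36 dev image. As a minimal triage sketch (not part of the log), one could check locally whether that image exists before re-running the test; the image tag is copied from the warnings above, and the use of 'docker image inspect' is this sketch's assumption rather than anything Beam provides.

import subprocess

# Image tag copied from the pull warnings in the log above.
IMAGE = "apache/beam_python3.6_sdk:2.21.0.dev"

# 'docker image inspect' exits non-zero when the image is not present locally.
result = subprocess.run(
    ["docker", "image", "inspect", IMAGE],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
if result.returncode == 0:
    print("%s is available locally" % IMAGE)
else:
    print("%s is missing; build it first (e.g. via the Gradle container task "
          "for the matching Python version)" % IMAGE)
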

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 33s
83 actionable tasks: 63 executed, 18 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/m4pmqcemdecdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
