See <https://ci-beam.apache.org/job/beam_PostCommit_PortableJar_Flink/2564/display/redirect?page=changes>

Changes:

[Udi Meiri] Color Jenkins logs support

[ningk] Added a whitespace lint precommit job.

[ningk] Removed trailing whitespaces from all markdown and gradle build files.

[ningk] Updated jenkins precommit job README, added comments of what needs to be

[ningk] Removed trailing whitespaces from HEAD changes

[Alan Myrvold] [BEAM-10049] Add licenses for go dependencies in python, java, and go

[noreply] Propagate BigQuery streaming insert throttled time to Dataflow worker in


------------------------------------------
[...truncated 94.80 KB...]
basename "$VIRTUAL_ENV"

# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc || true

pydoc () {
    python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
    hash -r 2>/dev/null
fi
pip install --retries 10 -e "$PYTHON_ROOT_DIR"

# Hacky python script to find a free port. Note there is a small chance the
# chosen port could get taken before being claimed.
SOCKET_SCRIPT="
from __future__ import print_function
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 0))
print(s.getsockname()[1])
s.close()
"
FLINK_PORT=$(python -c "$SOCKET_SCRIPT")
python -c "$SOCKET_SCRIPT"
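
# The SOCKET_SCRIPT above binds to port 0 so the OS hands back a free
# ephemeral port; as the comment notes, the port is released before the
# Flink cluster binds it, so it can be stolen in between. A minimal
# Python sketch of the same idea plus a retry loop (pick_free_port and
# start_with_retry are illustrative names, not part of the Beam script):

```python
import socket

def pick_free_port():
    """Bind to port 0 so the OS assigns a free ephemeral port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("localhost", 0))
        return s.getsockname()[1]

def start_with_retry(start_fn, attempts=3):
    """Retry with a fresh port: the picked port is released before the
    server binds it, so another process may grab it in between."""
    last_err = None
    for _ in range(attempts):
        port = pick_free_port()
        try:
            return port, start_fn(port)  # e.g. launch the server on `port`
        except OSError as err:           # bind failed: port was taken
            last_err = err
    raise RuntimeError("no usable port found") from last_err
```

# Here start_fn stands in for whatever actually binds the port (in this
# script, the Flink mini cluster), so a lost race surfaces as OSError
# and simply triggers another pick.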

echo "Starting Flink mini cluster listening on port $FLINK_PORT"
java -Dorg.slf4j.simpleLogger.defaultLogLevel=warn -jar "$FLINK_MINI_CLUSTER_JAR" --rest-port "$FLINK_PORT" --rest-bind-address localhost &

PIPELINE_PY="
import apache_beam as beam
import logging
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.options.pipeline_options import SetupOptions
from apache_beam.testing.util import assert_that
from apache_beam.testing.util import equal_to
from apache_beam.transforms import Create
from apache_beam.transforms import Map

logging.basicConfig(level=logging.INFO)

# To test that our main session is getting plumbed through artifact staging
# correctly, create a global variable. If the main session is not plumbed
# through properly, global_var will be undefined and the pipeline will fail.
global_var = 1

pipeline_options = PipelineOptions()
pipeline_options.view_as(SetupOptions).save_main_session = True
pipeline = beam.Pipeline(options=pipeline_options)
pcoll = (pipeline
         | Create([0, 1, 2])
         | Map(lambda x: x + global_var))
assert_that(pcoll, equal_to([1, 2, 3]))

result = pipeline.run()
result.wait_until_finish()
"

(python -c "$PIPELINE_PY" \
  --runner FlinkRunner \
  --flink_job_server_jar "$FLINK_JOB_SERVER_JAR" \
  --parallelism 1 \
  --environment_type DOCKER \
  --environment_config "$PYTHON_CONTAINER_IMAGE" \
  --flink_master "localhost:$FLINK_PORT" \
  --flink_submit_uber_jar \
) || TEST_EXIT_CODE=$? # don't fail fast here; clean up before exiting
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.25.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f87e02be2f0> ====================
INFO:apache_beam.runners.portability.flink_runner:Adding HTTP protocol scheme to flink_master parameter: http://localhost:53625
[main] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
[main] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (fallback keys: [{key=jobmanager.web.log.path, isDeprecated=true}])'.
INFO:apache_beam.runners.portability.abstract_job_service:Artifact server started on port 41091
INFO:apache_beam.runners.portability.abstract_job_service:Running job 'job-0dc595ba-5b3d-40ee-a44d-f1672f2d7f46'
[flink-rest-server-netty-worker-thread-2] WARN org.apache.flink.runtime.webmonitor.handlers.JarRunHandler - Configuring the job submission via query parameters is deprecated. Please migrate to submitting a JSON request instead.
INFO:apache_beam.runners.portability.flink_uber_jar_job_server:Started Flink job as 6140a3d0d5c99cd01df951efe31c1f05
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image apache/beam_python3.6_sdk:2.25.0.dev, cause: Received exit code 1 for command 'docker pull apache/beam_python3.6_sdk:2.25.0.dev'. stderr: Error response from daemon: manifest for apache/beam_python3.6_sdk:2.25.0.dev not found
[grpc-default-executor-0] WARN /usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:311 - Discarding unparseable args: ['--app_name=None', '--direct_runner_use_stacked_bundle', '--job_server_timeout=60', '--options_id=1', '--parallelism=1', '--pipeline_type_check', '--retrieval_service_type=CLASSLOADER']
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image apache/beam_python3.6_sdk:2.25.0.dev, cause: Received exit code 1 for command 'docker pull apache/beam_python3.6_sdk:2.25.0.dev'. stderr: Error response from daemon: manifest for apache/beam_python3.6_sdk:2.25.0.dev not found
[CHAIN MapPartition (MapPartition at [4]assert_that/{Create, Group}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Error cleaning up servers urn: "beam:env:docker:v1"
payload: "\n$apache/beam_python3.6_sdk:2.25.0.dev"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python3.6_sdk:2.25.0.dev"
dependencies {
  type_urn: "beam:artifact:type:file:v1"
  type_payload: "\n\253\001classpath://BEAM-PIPELINE/pipeline/artifacts/job-0dc595ba-5b3d-40ee-a44d-f1672f2d7f46/95d66c1b23881c873f78edfdcfcc136a922035db1cb87c88d496865a9de90187-pickled_main_session"
  role_urn: "beam:artifact:role:staging_to:v1"
  role_payload: "\n\024pickled_main_session"
}

java.io.IOException: Received exit code 1 for command 'docker kill d328544192ea54b81159f9def164de2b929356e0169b3aab82534d3cf52f42d3'. stderr: Error response from daemon: Cannot kill container: d328544192ea54b81159f9def164de2b929356e0169b3aab82534d3cf52f42d3: Container d328544192ea54b81159f9def164de2b929356e0169b3aab82534d3cf52f42d3 is not running
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:148)
        at org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:93)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:629)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:629)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:645)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$400(DefaultJobBundleFactory.java:576)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:208)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:315)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:209)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:185)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:174)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:133)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:45)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:205)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:282)
        at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
        at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:708)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:533)
        at java.lang.Thread.run(Thread.java:748)
[grpc-default-executor-0] WARN /usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:311 - Discarding unparseable args: ['--app_name=None', '--direct_runner_use_stacked_bundle', '--job_server_timeout=60', '--options_id=1', '--parallelism=1', '--pipeline_type_check', '--retrieval_service_type=CLASSLOADER']
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[CHAIN MapPartition (MapPartition at [6]{Create, Map(<lambda at <string>:23>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0]) (1/1)] WARN org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Error cleaning up servers urn: "beam:env:docker:v1"
payload: "\n$apache/beam_python3.6_sdk:2.25.0.dev"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python3.6_sdk:2.25.0.dev"
dependencies {
  type_urn: "beam:artifact:type:file:v1"
  type_payload: "\n\253\001classpath://BEAM-PIPELINE/pipeline/artifacts/job-0dc595ba-5b3d-40ee-a44d-f1672f2d7f46/95d66c1b23881c873f78edfdcfcc136a922035db1cb87c88d496865a9de90187-pickled_main_session"
  role_urn: "beam:artifact:role:staging_to:v1"
  role_payload: "\n\024pickled_main_session"
}

java.io.IOException: Received exit code 1 for command 'docker kill 14bd5ff8eea6a4eae33916ae758ac1d897d4d7102dc1a52d46c8f8f7d002061f'. stderr: Error response from daemon: Cannot kill container: 14bd5ff8eea6a4eae33916ae758ac1d897d4d7102dc1a52d46c8f8f7d002061f: Container 14bd5ff8eea6a4eae33916ae758ac1d897d4d7102dc1a52d46c8f8f7d002061f is not running
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:148)
        at org.apache.beam.runners.fnexecution.environment.DockerContainerEnvironment.close(DockerContainerEnvironment.java:93)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:629)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:629)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:645)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$400(DefaultJobBundleFactory.java:576)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:208)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:315)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:209)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:185)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:174)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:133)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:45)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:205)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.close(FlinkExecutableStageFunction.java:282)
        at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:508)
        at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:708)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:533)
        at java.lang.Thread.run(Thread.java:748)
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image apache/beam_python3.6_sdk:2.25.0.dev, cause: Received exit code 1 for command 'docker pull apache/beam_python3.6_sdk:2.25.0.dev'. stderr: Error response from daemon: manifest for apache/beam_python3.6_sdk:2.25.0.dev not found
[grpc-default-executor-0] WARN /usr/local/lib/python3.6/site-packages/apache_beam/options/pipeline_options.py:311 - Discarding unparseable args: ['--app_name=None', '--direct_runner_use_stacked_bundle', '--job_server_timeout=60', '--options_id=1', '--parallelism=1', '--pipeline_type_check', '--retrieval_service_type=CLASSLOADER']
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/1)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

kill %1 || echo "Failed to shut down Flink mini cluster"

rm -rf "$ENV_DIR"

if [[ "$TEST_EXIT_CODE" -eq 0 ]]; then
  echo ">>> SUCCESS"
else
  echo ">>> FAILURE"
fi
exit $TEST_EXIT_CODE
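
# The tail of the script records the pipeline's exit status instead of
# failing fast, so teardown (killing the mini cluster, removing the env
# dir) always runs before the status is reported. A reduced sketch of
# that pattern, with illustrative names (run_test and cleanup_ran are
# not part of the real script):

```shell
#!/bin/bash
# Deferred-failure pattern: capture the test's exit code, let cleanup
# run, then exit with the recorded code at the very end.
TEST_EXIT_CODE=0
run_test() { return 3; }        # stand-in for the real pipeline invocation
run_test || TEST_EXIT_CODE=$?   # record the failure instead of aborting here
cleanup_ran=yes                 # teardown (kill %1, rm -rf ...) still runs
```

# The `|| TEST_EXIT_CODE=$?` keeps a `set -e` shell (or a fail-fast CI
# wrapper) from exiting before cleanup, while still propagating failure.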

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:container:py37:docker'.
> Process 'command 'docker'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 18m 37s
124 actionable tasks: 104 executed, 17 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/tzojnwd3hb2fo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
