See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/210/display/redirect?page=changes>
Changes:
[egalpin] [BEAM-14003] Adds compat for Elasticsearch 8.0.0
[egalpin] [BEAM-13136] Removes support for Elasticsearch 2.x
[noreply] Merge pull request #17149 from [BEAM-13883] [Playground] Increase test
------------------------------------------
[...truncated 313.99 KB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/parameterized-0.7.5.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/parameterized-0.7.5.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/beautifulsoup4-4.10.0.tar.gz...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/mock-2.0.0-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/seaborn-0.11.2-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/PyHamcrest-1.10.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/beautifulsoup4-4.10.0-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/beautifulsoup4-4.10.0-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/parameterized-0.7.5-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.3.4-cp36-cp36m-manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.3.4-cp36-cp36m-manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:710 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:726 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0330193233-795860-cp1k6p94.1648668753.796036/pipeline.pb in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:335 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:335 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 Create job:
<Job
 clientRequestId: '20220330193233796984-7366'
 createTime: '2022-03-30T19:32:40.709840Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-30_12_32_40-11720943193291923378'
 location: 'us-central1'
 name: 'beamapp-jenkins-0330193233-795860-cp1k6p94'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-30T19:32:40.709840Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:892 Created job with id: [2022-03-30_12_32_40-11720943193291923378]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:893 Submitted job: 2022-03-30_12_32_40-11720943193291923378
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:894 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_12_32_40-11720943193291923378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_12_32_40-11720943193291923378?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-03-30_12_32_40-11720943193291923378 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:41.394Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-03-30_12_32_40-11720943193291923378. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:44.156Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-03-30_12_32_40-11720943193291923378.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:47.003Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.643Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.670Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.734Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.769Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.838Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.865Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.900Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.934Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.955Z: JOB_MESSAGE_DETAILED: Fusing consumer read/Read/Map(<lambda at iobase.py:898>) into read/Read/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:48.998Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction into read/Read/Map(<lambda at iobase.py:898>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.019Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing into ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.049Z: JOB_MESSAGE_DETAILED: Fusing consumer points into ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.071Z: JOB_MESSAGE_DETAILED: Fusing consumer CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial into points
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.092Z: JOB_MESSAGE_DETAILED: Fusing consumer CombinePerKey(sum)/GroupByKey/Write into CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.140Z: JOB_MESSAGE_DETAILED: Fusing consumer CombinePerKey(sum)/Combine into CombinePerKey(sum)/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.175Z: JOB_MESSAGE_DETAILED: Fusing consumer CombinePerKey(sum)/Combine/Extract into CombinePerKey(sum)/Combine
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.210Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into CombinePerKey(sum)/Combine/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.276Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>) into write/Write/WriteImpl/DoOnce/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.310Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/DoOnce/Map(decode) into write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.344Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles into write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.379Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.411Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/Pair
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.468Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.505Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.537Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.569Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.603Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.744Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.812Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.834Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.848Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.856Z: JOB_MESSAGE_BASIC: Executing operation CombinePerKey(sum)/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.877Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.877Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.943Z: JOB_MESSAGE_BASIC: Finished operation CombinePerKey(sum)/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:49.944Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:50.012Z: JOB_MESSAGE_DEBUG: Value "CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:32:50.050Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:33:07.925Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:33:34.311Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:33:52.877Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:10.855Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:10.938Z: JOB_MESSAGE_BASIC: Finished operation read/Read/Impulse+read/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:10.953Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:10.988Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.021Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.065Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.100Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.135Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.161Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.167Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.188Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.195Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.215Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.275Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:11.320Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:17.996Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:18.098Z: JOB_MESSAGE_BASIC: Executing operation CombinePerKey(sum)/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:18.156Z: JOB_MESSAGE_BASIC: Finished operation CombinePerKey(sum)/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:18.258Z: JOB_MESSAGE_BASIC: Executing operation CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:20.122Z: JOB_MESSAGE_BASIC: Finished operation CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:20.182Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:20.237Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:20.329Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.638Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.709Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/Extract.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.776Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.807Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.827Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.865Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.898Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.934Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:22.992Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:25.822Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:25.884Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:25.956Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:26.011Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:26.081Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:26.170Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:28.547Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:28.609Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:28.688Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:28.809Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:41:28.845Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:43:48.338Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:43:48.386Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-03-30T19:43:48.418Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-03-30_12_32_40-11720943193291923378 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.03780341148376465 seconds.
PASSED
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2143: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2149: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2443: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2445: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2469: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py38-no-xdist.xml> -
========== 6 passed, 5251 deselected, 12 warnings in 3988.56 seconds ===========
> Task :sdks:python:test-suites:dataflow:examplesPostCommit
BUILD SUCCESSFUL in 1h 38m 34s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/zt6icx7vdqjby
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@791c3297:apache-beam-jenkins-13": Remote call on apache-beam-jenkins-13 failed. The channel is closing down or has closed down
    at hudson.remoting.Channel.call(Channel.java:994)
    at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
    at com.sun.proxy.$Proxy132.isAlive(Unknown Source)
    at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1213)
    at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1205)
    at hudson.Launcher$ProcStarter.join(Launcher.java:522)
    at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:806)
    at hudson.model.Build$BuildExecution.build(Build.java:198)
    at hudson.model.Build$BuildExecution.doRun(Build.java:163)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
    at hudson.model.Run.execute(Run.java:1888)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:99)
    at hudson.model.Executor.run(Executor.java:432)
Caused by: java.io.IOException
    at hudson.remoting.Channel.close(Channel.java:1477)
    at hudson.remoting.Channel.close(Channel.java:1454)
    at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:894)
    at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:108)
    at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:774)
    at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
    at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
FATAL: Channel "hudson.remoting.Channel@791c3297:apache-beam-jenkins-13": Remote call on apache-beam-jenkins-13 failed. The channel is closing down or has closed down
java.io.IOException
    at hudson.remoting.Channel.close(Channel.java:1477)
    at hudson.remoting.Channel.close(Channel.java:1454)
    at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:894)
    at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:108)
    at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:774)
    at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
    at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused: hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@791c3297:apache-beam-jenkins-13": Remote call on apache-beam-jenkins-13 failed. The channel is closing down or has closed down
    at hudson.remoting.Channel.call(Channel.java:994)
    at hudson.Launcher$RemoteLauncher.kill(Launcher.java:1148)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:526)
    at hudson.model.Run.execute(Run.java:1888)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:99)
    at hudson.model.Executor.run(Executor.java:432)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]