See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1223/display/redirect>

Changes:


------------------------------------------
[...truncated 303.57 KB...]
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f817947a860>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely.')
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E       apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers 
have failed to start up in a row, and no worker has successfully started up for 
this job. Last error reported: Unable to pull container image due to error: 
image pull request failed with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1642: DataflowRuntimeException
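For context on the traceback above: wait_until_finish() polls the Dataflow job and raises DataflowRuntimeException once the job reaches a terminal state other than DONE. A minimal sketch of how a caller typically hits this path; `pipeline` is a hypothetical apache_beam.Pipeline configured for the DataflowRunner, not code from this test suite:

# Minimal sketch, not part of the failing test.
from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

result = pipeline.run()
try:
    # Raises DataflowRuntimeException when the job ends in a terminal state
    # other than DONE (here: FAILED), as shown in the traceback above.
    result.wait_until_finish()
except DataflowRuntimeException as exc:
    print('Dataflow job ended in state %s: %s' % (result.state, exc))
    raise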
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK 
docker image has Python 3.6 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is 
apache/beam_python3.6_sdk:2.34.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO     root:environments.py:304 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:650 
==================== <function pack_combiners at 0x7f817b609d90> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:650 
==================== <function sort_stages at 0x7f817b610598> 
====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:454 
Defaulting to the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/beam-sdks-java-testing-expansion-service-testExpansionService-2.34.0-SNAPSHOT-HkaXHdXPJLj4gnqrUaqXne37d0ZSwtxa5dHOoQep0KY.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/beam-sdks-java-testing-expansion-service-testExpansionService-2.34.0-SNAPSHOT-HkaXHdXPJLj4gnqrUaqXne37d0ZSwtxa5dHOoQep0KY.jar
 in 3 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dataflow_python_sdk.tar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/pipeline.pb
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:639 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dataflow_graph.json...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:658 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dataflow_graph.json
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:819 
Create job: <Job
 clientRequestId: '20210921121845131793-9101'
 createTime: '2021-09-21T12:18:51.231389Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-09-21_05_18_50-14061490285826146752'
 location: 'us-central1'
 name: 'beamapp-jenkins-0921121845-129634'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-09-21T12:18:51.231389Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 
Created job with id: [2021-09-21_05_18_50-14061490285826146752]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:822 
Submitted job: 2021-09-21_05_18_50-14061490285826146752
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 To 
access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_05_18_50-14061490285826146752?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2021-09-21_05_18_50-14061490285826146752 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:53.893Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2021-09-21_05_18_50-14061490285826146752. The number of workers will be between 
1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:54.019Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2021-09-21_05_18_50-14061490285826146752.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:56.532Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-a.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.567Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.587Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.643Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.681Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.717Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.742Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.799Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.841Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.877Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.916Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15 for input 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_13.None-post13
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.960Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:57.990Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.026Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/RestoreTags into 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.051Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.073Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.105Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.141Z: JOB_MESSAGE_DETAILED: Fusing consumer 
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
 into 
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.163Z: JOB_MESSAGE_DETAILED: Fusing consumer 
external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/PairWithRestriction
 into 
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.189Z: JOB_MESSAGE_DETAILED: Fusing consumer 
external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/SplitWithSizing
 into 
external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/PairWithRestriction
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.220Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into 
external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/ProcessElementAndRestrictionWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.268Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/FlatMap(<lambda at core.py:2965>) into 
assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.294Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at 
core.py:2965>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.327Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.360Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.385Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.419Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.451Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.494Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.523Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.555Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.588Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.747Z: JOB_MESSAGE_DEBUG: Executing wait step start23
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.826Z: JOB_MESSAGE_BASIC: Executing operation 
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse+GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/PairWithRestriction+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.865Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.888Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.904Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:58.970Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:59.033Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:18:59.114Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:19:22.269Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:19:48.828Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:20:31.384Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:21:08.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:21:53.854Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:22:21.093Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:23:06.316Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:23:34.036Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:24:21.056Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:24:46.851Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.746Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.796Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job 
appears to be stuck. Several workers have failed to start up in a row, and no 
worker has successfully started up for this job. Last error reported: Unable to 
pull container image due to error: image pull request failed with error: Error 
response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 not found: 
manifest unknown: Failed to fetch "20210921120103" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210921120103".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.851Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:2965>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.931Z: JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0921121845-129634.1632226725.130416/dax-tmp-2021-09-21_05_18_50-14061490285826146752-S03-0-fa10c9cfe8577dcb/[email protected]."
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.960Z: JOB_MESSAGE_WARNING: 
S03:GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse+GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/PairWithRestriction+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/SplitWithSizing
 failed.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:31.991Z: JOB_MESSAGE_BASIC: Finished operation 
GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/Impulse+GenerateSequence(beam:external:java:generate_sequence:v1)/Read(BoundedCountingSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/PairWithRestriction+external_1GenerateSequence-beam-external-java-generate_sequence-v1--Read-BoundedCountingSource--ParDo-BoundedS/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:32.064Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:32.125Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:32.171Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:25:32.447Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:26:14.733Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-09-21T12:26:14.761Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2021-09-21_05_18_50-14061490285826146752 is in state JOB_STATE_FAILED
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
============== 7 failed, 1 passed, 10 warnings in 608.04 seconds ===============

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 5521.
Stopping expansion service pid: 5524.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:83a7e45e14a769c28b78b7ab48b2162b015978f3814f2622b6bf5fc4ba9b5f63
Deleted: sha256:1c99a336152b48538924438a632e0ff76e3eeb97f2658f21aa55a6a8ce1b5d29
Deleted: sha256:b8d22a5c55c5a42e63c68ee6d3ff67c90ff584daa3ba5bc2d2acdcc39882c5ac
Deleted: sha256:b9d7c940da34dfa50e730ff0f4e3dfb437c878b1a966e63998769e3881339065
Deleted: sha256:08138dec72f6e8dc6193116b86da2cc4c26ae2e32895f087331751aeafa01a43
Deleted: sha256:670fd99fe3bf1b2a05d671cad9883da7ffd079e9391d1df96f2dd3e8b7d0edec
Deleted: sha256:700e3a952720087df178648b2194eea9fe49e7b13e90d91f35d120edc4af31d2
Deleted: sha256:4e5f028925d0b63c70a6162c267f03c93d7a4c454e307c0e01b56e7bfa6f1291
Deleted: sha256:47d04d92ef4d40d4d9a2ae88e5a0cffe05d191c13d87d245931b35661e36cdc5
Deleted: sha256:526702b9edb8cfabfd3f5012e0da371543e3e06826687769a978b9ffbf186f1b
Deleted: sha256:a225eab18e427a82160581272ea284e4bebd8b73c2b266d5c01488e365bcd67b
Deleted: sha256:2b6f61f31ec08cee53dbe9b482918fcc3572f48a868fb6cfe12c15c743f7f992
Deleted: sha256:830299da70c22a772d5d4d645ca47fd10eece28e4b9930291f73d1a514fed455
Deleted: sha256:d8121b3267547d87e7fbdac028885deebdb561a06b06d3220b7c427b3484d8c3
Deleted: sha256:3a76485f5dc422fd754d6538e64b382eaa82be4a5078730039d09431010aafc7
Deleted: sha256:170000f3ea1c4262852d2c016b2c0530b5fb49db05f1d15eba9f476d9f38c432
Deleted: sha256:f5d9e26192d843662e5bac6d8c003f49bc430d540c1169c0f99c14274b8b0f29
Deleted: sha256:bf549fcd5aad16a5b4f4b4d72d52906c3e62f6fbe58023422c5adecef2c4db58
Deleted: sha256:ad4d766fbbb2b924ca6455b9fffc0c000abb84ef82437e6ac16ae3f0a4795edf
Deleted: sha256:ec0dc46058de8a3d329cb1e41e6ad21ef6f2cd613157abb5bf7210fd84d53fcb
Deleted: sha256:e467ce955505f2a3afec3dc7afefac519bbf80a670450aedbf6f04d7f77fad16
Deleted: sha256:262bad530627d825bd0d49601d6bbcc782d45690e191f8c9ab553b116f5be296
Deleted: sha256:44fb8d7e9376bf71a228f70b07e359515d673e835fbd397bac0e63bdef30dd1e
Deleted: sha256:a93bc7fc0a00bbccd6d811194e4e40212d989c1eb3238edee5d32710d4aa86d2
Deleted: sha256:ae7cbcd48a8a7c07fb98267caf7c33aaeeccca520160d6595def893d7b3c2263
Deleted: sha256:9aa6deef6928ff9cdbb566b028644a326d502ca85f117a25393a18274005b3c9
Deleted: sha256:c74d16212c034fdf50c4c221da6cc2d12a239c4945c47c4ee4b59c51683a6b99
Deleted: sha256:c9303cda25e89feaad90630c458f08b716ba2207700f9d38eff41d7af18938ad
Deleted: sha256:6d5f2901d340d5225802c321e9a21f12d8e1891a4336f91561857f58c12413b8
ERROR: (gcloud.container.images.untag) Image could not be found: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103]

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
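The cleanup failure above appears to happen because the tag was already removed earlier in the same task run (see the "Untagged:" line), so a subsequent `gcloud container images untag` call exits non-zero. A hedged sketch of an idempotent variant; this is not the actual build.gradle logic, and it assumes gcloud is installed and authenticated:

# Hypothetical, more tolerant cleanup step (sketch only).
import subprocess

IMAGE = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103'

# Only attempt the untag if the tag still resolves; `describe` exits non-zero
# when the tag is already gone, which is the ERROR seen above.
probe = subprocess.run(
    ['gcloud', 'container', 'images', 'describe', IMAGE],
    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
if probe.returncode == 0:
    subprocess.run(
        ['gcloud', 'container', 'images', 'untag', IMAGE, '--quiet'],
        check=True)
else:
    print('Tag already removed; skipping untag for %s' % IMAGE)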

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 279

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 49s
105 actionable tasks: 76 executed, 25 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gb45z6yfkun5k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
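The underlying failure in this run is that the worker image tag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210921120103 could not be resolved in the registry, so no worker ever started. A small diagnostic sketch, assuming gcloud credentials with read access to the apache-beam-testing project, that checks whether the tag was ever published:

# Hypothetical diagnostic for the root cause above (sketch only).
import json
import subprocess

REPO = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java'
TAG = '20210921120103'

out = subprocess.run(
    ['gcloud', 'container', 'images', 'list-tags', REPO, '--format=json'],
    stdout=subprocess.PIPE, check=True)
tags = {t for entry in json.loads(out.stdout) for t in entry.get('tags', [])}
if TAG in tags:
    print('Tag %s exists; the pull failure was transient or permissions-related.' % TAG)
else:
    print('Tag %s is missing: it was never pushed or was deleted before the job ran.' % TAG)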

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
