See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1595/display/redirect?page=changes>

Changes:

[arietis27] [BEAM-13517] Unable to write nulls to columns with logical types


------------------------------------------
[...truncated 727.42 KB...]
...021-12-23T12:20:32.925571Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7fa9d5574898>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach a terminal state after waiting indefinitely.')
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E       apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers 
have failed to start up in a row, and no worker has successfully started up for 
this job. Last error reported: Unable to pull container image due to error: 
image pull request failed with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1638: DataflowRuntimeException
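The `wait_until_finish` frame in the traceback above polls a daemon thread with `is_alive()` instead of calling `thread.join()`, so a keyboard interrupt on the main thread can still terminate everything. A minimal standalone sketch of that polling pattern (the `poll_job` and `is_terminal` callables here are hypothetical stand-ins, not the Beam API):

```python
import threading
import time


def wait_until_finish(poll_job, is_terminal, interval=5.0):
    """Run poll_job on a daemon thread and wait for a terminal state.

    The thread is marked as a daemon so a KeyboardInterrupt on the main
    thread tears down the whole process; for the same reason the main
    thread sleeps in a loop rather than blocking in thread.join().
    """
    thread = threading.Thread(target=poll_job)
    thread.daemon = True  # do not block interpreter exit
    thread.start()
    while thread.is_alive():
        time.sleep(interval)
    if not is_terminal():
        raise RuntimeError('Polling stopped before a terminal state.')
```

In the real runner the polling target is `DataflowRunner.poll_for_job_completion` and the terminal check consults the Dataflow job state; here both are reduced to injected callables so the control flow is visible on its own.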
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:302 Copying Beam SDK 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>
 to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK 
docker image has Python 3.6 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is 
apache/beam_python3.6_sdk:2.36.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211222
INFO     root:environments.py:304 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211222" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function pack_combiners at 0x7fa9d782ca60> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sort_stages at 0x7fa9d782f268> 
====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:454 
Defaulting to the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/beam-sdks-java-testing-expansion-service-testExpansionService-2.36.0-SNAPSHOT-daMt7XZNnUsk5OxC22Ms7yfV97Eb9Uf2ZvVAdd4tFLI.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/beam-sdks-java-testing-expansion-service-testExpansionService-2.36.0-SNAPSHOT-daMt7XZNnUsk5OxC22Ms7yfV97Eb9Uf2ZvVAdd4tFLI.jar
 in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/dataflow_python_sdk.tar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1223122025-166320.1640262025.166760/pipeline.pb
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828 
Create job: <Job
 clientRequestId: '20211223122025167657-3474'
 createTime: '2021-12-23T12:20:32.925571Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-12-23_04_20_32-11118803557263164090'
 location: 'us-central1'
 name: 'beamapp-jenkins-1223122025-166320'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-12-23T12:20:32.925571Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:830 
Created job with id: [2021-12-23_04_20_32-11118803557263164090]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:831 
Submitted job: 2021-12-23_04_20_32-11118803557263164090
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:837 To 
access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-23_04_20_32-11118803557263164090?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2021-12-23_04_20_32-11118803557263164090 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:35.593Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2021-12-23_04_20_32-11118803557263164090. The number of workers will be between 
1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:35.946Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2021-12-23_04_20_32-11118803557263164090.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:38.490Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.165Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.199Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.260Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.298Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.323Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.367Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.397Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.439Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.476Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.511Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.548Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27 for input 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_25.None-post14
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.581Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.616Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.641Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/RestoreTags into 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.674Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.708Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.737Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.775Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/FlatMap(<lambda at core.py:3224>) into Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.807Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at 
core.py:3224>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.842Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.904Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.936Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:39.968Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.003Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) 
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.040Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.073Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.107Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous) 
into Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.153Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into 
ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.180Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/FlatMap(<lambda at core.py:3224>) into 
assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.228Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at 
core.py:3224>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.261Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.296Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.325Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.354Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.381Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.405Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.426Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.452Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.475Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.619Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.698Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.725Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.739Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.772Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.815Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.832Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.881Z: JOB_MESSAGE_DEBUG: Value 
"Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" 
materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.909Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.952Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3224>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:20:40.973Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:3224>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:21:01.057Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:21:25.923Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:22:09.432Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:22:33.655Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:23:19.827Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:23:48.532Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:24:33.105Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:24:56.259Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:25:43.641Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:09.208Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.234Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.266Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job 
appears to be stuck. Several workers have failed to start up in a row, and no 
worker has successfully started up for this job. Last error reported: Unable to 
pull container image due to error: image pull request failed with error: Error 
response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119 not found: 
manifest unknown: Failed to fetch "20211223120119" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211223120119".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.326Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3224>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.326Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:3224>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.413Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.483Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.516Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:26:56.725Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:27:43.261Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2021-12-23T12:27:43.313Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2021-12-23_04_20_32-11118803557263164090 is in state JOB_STATE_FAILED
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string
-- Docs: https://docs.pytest.org/en/latest/warnings.html
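The two DeprecationWarnings above come from Windows-style paths written as plain string literals, where `\c` and `\d` are not valid Python escape sequences. A minimal sketch of the fix (the path value here is illustrative, not from the test suite): use a raw string or double every backslash.

```python
# '\c' is an unrecognized escape sequence: Python keeps the backslash but
# emits a DeprecationWarning (a SyntaxError in a future Python release).
bad = 'c:\\abc\cdf'

good_raw = r'c:\abc\cdf'        # raw string: backslashes are literal
good_escaped = 'c:\\abc\\cdf'   # every backslash explicitly doubled

# All three spell the same path; only the last two are warning-free.
assert bad == good_raw == good_escaped
```

Either form silences the pylint `anomalous-backslash-in-string` complaint as well, without the inline `disable` pragma.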
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=============== 9 failed, 1 passed, 2 warnings in 993.49 seconds ===============

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 21526.
Stopping expansion service pid: 21527.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f47afad92e24ac0bf660e95ab2248097cd68c4fb494b5397a458ab4277fca5f8
Deleted: sha256:90795e6fc4b5338829943ae1cfc3bc09b09a7d51f25f84c852fa228de43c3f79
Deleted: sha256:4eeaa568f77fa960880f24c4a947895be9908a5656fb105caab27b821afae576
Deleted: sha256:1cb9746a0de52eaa9c7e9d5d693b201da1f29c6364ff2c878739fda240c21aa8
Deleted: sha256:834bde0358231b9a260772bf1512beade7a6ebbf624663d24bea75578f7a97f2
Deleted: sha256:be40365d50e54d07563570c3286a417199aef7c8458520dcba6ff21397531c8a
Deleted: sha256:0cd83fd32dbdf78e22de547097b22e7851ce602810f0d302999dd26464c2e955
Deleted: sha256:e72ec9d48744e9f1baefe7470d570469672ca985d8c796e50809e12d5da871db
Deleted: sha256:bb72b73da57dda06167e64390125959ddef0ae1fe9caf7f42390b8b8593fff00
Deleted: sha256:094348c32755d50cf1694a82c9e622cf2dacae930aa24d279c0ee75e43350360
Deleted: sha256:9fb26f6efe5a9c76ccf8d07942938e41ea5290356f83837ec6faa7ec3406c97f
Deleted: sha256:9bb36a56e26f8a83acf3c324e3f7fbae17064d49e7953f36bc72bad5cf6b64eb
Deleted: sha256:279804858fdccce22f2fe0fc0faa34c0824463f65373688d69a156faa65dec02
Deleted: sha256:7a03a616ccfedcf66368b1e70e978b73f395e1ee0749c699f8b4a68f94da4dda
Deleted: sha256:bdb43c2bc0776accc4d2d1fb77ea6cabd7f2d69ab13459cc60eac22e0481a6b7
Deleted: sha256:e7dafda54d3e5ef4d0cc5c5874691e1056dd5d0d9a61e58aa9c8100ea556cff3
Deleted: sha256:c9102ff9ad6040697b9ef71de57d47ee089f2cd65d33473402c9ebb42d2060eb
Deleted: sha256:487b3713d168e11c464d12388fe9edcf809b2b1177dfdbb79f9177998fe73af7
Deleted: sha256:473ca06cbe652d4c35217ecd1169bd4c4563fe50d0a4a6aabeba00e6893b12c8
Deleted: sha256:202032e06ec88dc73798350f192562a8f6a6dead3d5d787bcf74cb6d9d39e4c3
Deleted: sha256:24e1fb0e814f36a40042d0e0feda86003c2fced1369f43d6fd6ac8bb65a1e6db
Deleted: sha256:632c085658db2ee40a4a7be20ed2d228cd8c525fb0e36ea8854b9e3be99174cb
Deleted: sha256:9aba4c9997f1ab1208bf7374f77a383c49a062289e5a7c15cd4ca053f72f8987
Deleted: sha256:fbbd19da90ec7d68a53e63b6aed280db6d5ace41c9b40e1e57237d32fea75831
Deleted: sha256:d8090c3315b643935c45a8f46d287a79c62972420b279a3f940b8a906db6833a
Deleted: sha256:32dcf024a888efc37e62db93a16ccd864d603b71de1f14c8f81e1db66727412b
Deleted: sha256:588235237cdf12f405b4db045b5a8cdb22083b2ee0a87da11dc92f8752c3f6ff
Deleted: sha256:6d7bb056a316715a4ab8c9bb7bc007924ec81db00c7aed2b7eb705b8805265f5
Deleted: sha256:670ad0e290d6df49c9fb15b9412163434a298a629e6117092def9afcd2487625
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211223120119]
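The untag error above fires because the tag was already removed earlier in the same cleanup task, so the second `gcloud container images untag` call fails the build. A hedged sketch of an idempotent cleanup wrapper (the helper name and control flow are hypothetical, not Beam's actual Gradle cleanup logic):

```python
import shutil
import subprocess

def untag_if_present(image: str) -> bool:
    """Untag a container image only if gcloud can still see it, so a
    repeated cleanup run cannot fail the build. Returns True on untag."""
    if shutil.which("gcloud") is None:
        return False  # nothing to do without the gcloud CLI
    # 'describe' exits non-zero when the image/tag no longer exists.
    describe = subprocess.run(
        ["gcloud", "container", "images", "describe", image],
        capture_output=True, text=True)
    if describe.returncode != 0:
        return False  # already removed: treat as a successful no-op
    subprocess.run(
        ["gcloud", "container", "images", "untag", "--quiet", image],
        check=True)
    return True
```

An equivalent fix at the Gradle level would be to mark the untag step as non-fatal (ignore its exit value) rather than probing first.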

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 278

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 27s
106 actionable tasks: 68 executed, 34 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/egahvqm7o7mkw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
