See 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/106/display/redirect?page=changes>

Changes:

[github] Fixing apache_beam.io.gcp.bigquery_test:PubSubBigQueryIT. at head


------------------------------------------
[...truncated 5.44 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T17:56:56.170498Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_10_56_55-11149883925904266893'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312175638-294033'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T17:56:56.170498Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2020-03-12_10_56_55-11149883925904266893]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_56_55-11149883925904266893?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely 
for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-03-12_10_56_55-11149883925904266893 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-03-12_10_56_55-11149883925904266893.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-03-12_10_56_55-11149883925904266893. The number of workers will be between 
1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: 
JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. 
Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:01.949Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:02.555Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.081Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.155Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.229Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.258Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.283Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.328Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.368Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.479Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.599Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.651Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.682Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.716Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/WriteStream, through flatten 
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.748Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream 
into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.778Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) 
into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.800Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at 
core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.839Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into 
assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.875Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into 
assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.914Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at 
core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.942Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.975Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.009Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.044Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) 
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.074Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.110Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into 
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.149Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.181Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into 
Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.225Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into 
assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.258Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into 
assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.292Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets 
into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.317Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.361Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into 
assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.394Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.433Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.463Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.497Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.531Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:21.934Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.399Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.446Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.493Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:04.719Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that 
the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:30.500Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:30.534Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:03:05.741Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.419Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.475Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.500Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.541Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.566Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.319Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on 
low average worker CPU utilization, and the pipeline having sufficiently low 
backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.528Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.599Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-03-12_10_56_55-11149883925904266893 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py>", line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py>", line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py>", line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py>", line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py>", line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py>", line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py>", line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py>", line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py>", line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default 
to us-central1. Future releases of Beam will require the user to set --region 
explicitly, or else have a default set via the gcloud tool. 
https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
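A note on the AttributeError above: at dataflow_runner.py line 957, run_ParDo fills in the step's STEP_NAME property via input_step.proto.name, and the traceback together with the "Error while visiting Main windowInto" line suggests that, for this multi-triggered GBK side-input pipeline, input_step resolved to a plain Python list of steps rather than a single step object. A minimal, self-contained sketch of that failure pattern follows; the _Proto/_Step classes and the 's20' step name are hypothetical stand-ins for illustration only, not Beam's actual internals:

    # Hypothetical stand-ins for the runner's internal step objects; this only
    # illustrates the error pattern, it is not Beam's real implementation.
    class _Proto(object):
        def __init__(self, name):
            self.name = name

    class _Step(object):
        def __init__(self, name):
            self.proto = _Proto(name)

    single_step = _Step('s20')          # 's20' is an illustrative step name
    print(single_step.proto.name)       # the case run_ParDo expects: prints 's20'

    input_step = [single_step]          # a list, as the runner apparently received here
    try:
        print(input_step.proto.name)    # the same attribute access on a list...
    except AttributeError as err:
        print(err)                      # ...raises: 'list' object has no attribute 'proto'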

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2182.606s

FAILED (errors=1)
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_38-12902448343493459045?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_39_04-13755921482621391046?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_48_14-14199041188818676489?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_56_55-11149883925904266893?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-7496866972695785782?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_17-9223809253366558464?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_47_14-12556406680748801646?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-11322849103390080251?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_09-5106433121096340210?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_44-974645006389504310?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_38-10300893707782139206?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_37_40-18132827848236419716?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_31-7061421517999968259?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_39-14167326813525540096?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_39_14-11936083830124716780?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_36-10261436357990232592?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_04-3068100152836446232?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_40-2442995916299268955?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_39-16080148491383674731?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_16-14295870522617893294?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_47_50-18366059412314013485?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-17267459930739234312?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_18-7890722251611419122?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_59-12366168715647852489?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 40s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/fpjwrrqvthc7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
