See 
<https://builds.apache.org/job/beam_PostCommit_Python37/2443/display/redirect>

Changes:


------------------------------------------
[...truncated 10.93 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']
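
The matcher description in this AssertionError can be reproduced with plain PyHamcrest. The following is an illustrative reconstruction, not code from this build; the expected value is assumed to be a single output row containing a string with 'bicycle':

    from hamcrest import assert_that, contains_inanyorder, contains_string, has_item

    # Reconstructed expectation, matching the description printed above:
    # one element that is itself a sequence containing a string with 'bicycle'.
    expected_list = [has_item(contains_string('bicycle'))]
    actual = []  # the step produced no output, shown above as <[]>
    assert_that(actual, contains_inanyorder(*expected_list))  # raises the AssertionError

An empty actual sequence can never satisfy contains_inanyorder against a non-empty expected list, so the 'assert_that/Match' step fails regardless of what the matcher looks for.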

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.231Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.7/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.259Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.339Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure40
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.372Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: 
S10:assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
 failed., The job failed because a work item has failed 4 times. Look in 
previous log entries for the cause of each one of the 4 failures. For more 
information, see https://cloud.google.com/dataflow/docs/guides/common-errors. 
The work item was attempted on these workers: 
  beamapp-jenkins-060718572-06071157-no27-harness-sdns
      Root cause: Work item failed.,
  beamapp-jenkins-060718572-06071157-no27-harness-sdns
      Root cause: Work item failed.,
  beamapp-jenkins-060718572-06071157-no27-harness-sdns
      Root cause: Work item failed.,
  beamapp-jenkins-060718572-06071157-no27-harness-sdns
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.487Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.547Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:04:00.580Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:05:27.519Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:05:27.566Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-07T19:05:27.607Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-06-07_11_57_36-1662331383088912114 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
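
The util.py frames in the traceback above (line 218, _matches, contains_inanyorder) come from the hamcrest-backed matcher helpers in apache_beam.testing.util, such as matches_all. As a rough, hypothetical sketch of a test with that shape (the pipeline, data, and matcher below are illustrative, not the test that failed in this run):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, matches_all
    from hamcrest import contains_string, has_item

    with TestPipeline() as p:
        # Illustrative input: one output row, represented as a list of strings.
        rows = p | beam.Create([['mountain bicycle', 'road bike']])
        # Passes only if some row contains a string with 'bicycle'; an empty
        # PCollection, as in this run, fails with the AssertionError shown above.
        assert_that(rows, matches_all([has_item(contains_string('bicycle'))]))
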
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_34-1874073569396527731?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_20_16-16184606906964844052?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_28_05-10053554150396782424?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_35_44-11737880187777303703?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_44_28-9822790286768024889?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_51_32-8354929218965868733?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_58_17-14761848947768980714?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_32-15740854562121038449?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_27_57-14562539819082222478?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_35_38-10380296488833623120?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_43_06-14305550701004185645?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_50_42-14126715957010073583?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_58_23-5836322844554005779?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_33-5952410854064269477?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_17_25-13041561123358225601?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_25_04-5238965003966389534?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_33_07-9812508416512973522?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_41_40-5275071233404638064?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_49_32-7316808310672464803?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_57_36-1662331383088912114?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_32-3024169620525072802?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_25_16-60598183634201909?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_34_04-10855331005408117461?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_42_21-10758589432786818674?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_59_38-17581304990659638071?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_32-16180264101668194229?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_14_02-7070310161070440056?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_22_56-12623855582522263237?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_32_49-5496549865572064279?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_40_54-5326298391934482664?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_49_25-5654362267241508276?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_58_18-17165354108217206594?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_12_05_56-11618878765627293587?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_32-12963066144376508622?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_14_02-12046167397703741417?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_22_51-13115462413900676025?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_31_40-9719612649346673853?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_39_19-16549923052754635630?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_46_46-17100283342773616696?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_54_59-4719588554478421346?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_31-2187079598521006989?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_13_39-1708762432260555666?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_22_20-8872032300487582042?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_30_18-9991340794488720616?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_37_46-4064812798431479456?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_45_26-16642368025737433746?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_53_15-12118914333736957819?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_12_00_59-9777246202072667791?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_05_31-8925294316602518026?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_15_55-16045219722916889234?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_26_10-10058723366734189960?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_34_28-9413641559119806724?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_42_14-4929721913793963151?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_50_15-17283236028197559333?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-07_11_58_08-15882991552198927504?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 63 tests in 4128.403s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 27s
87 actionable tasks: 64 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/kuvmwvvnkmrde

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
