See 
<https://builds.apache.org/job/beam_PostCommit_Python36/2526/display/redirect>

Changes:


------------------------------------------
[...truncated 10.83 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']
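
For reference, the _matches frame at testing/util.py line 218 is the matcher
factory that apache_beam.testing.util.matches_all builds on top of hamcrest's
contains_inanyorder; "not matched: <[]>" means the PCollection reaching
assert_that/Match was empty, so nothing could satisfy the "string containing
'bicycle'" expectation. A minimal sketch of that pattern follows, using an
invented in-memory pipeline and expected value purely for illustration (the
failing test itself runs as a Dataflow postCommit IT):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, matches_all
    from hamcrest import contains_string

    with TestPipeline() as p:
        output = p | beam.Create(['road bicycle'])
        # matches_all wraps the expected matchers in contains_inanyorder, the
        # call raised from testing/util.py line 218 above; an empty
        # PCollection (<[]>) can never satisfy it, which is exactly the
        # AssertionError reported by assert_that/Match.
        assert_that(output, matches_all([contains_string('bicycle')]))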

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.653Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/transforms/core.py>", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/util.py>", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.6/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.677Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.758Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure40
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.798Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: 
S10:assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
 failed., The job failed because a work item has failed 4 times. Look in 
previous log entries for the cause of each one of the 4 failures. For more 
information, see https://cloud.google.com/dataflow/docs/guides/common-errors. 
The work item was attempted on these workers: 
  beamapp-jenkins-060612545-06060554-ejx5-harness-vh50
      Root cause: Work item failed.,
  beamapp-jenkins-060612545-06060554-ejx5-harness-vh50
      Root cause: Work item failed.,
  beamapp-jenkins-060612545-06060554-ejx5-harness-vh50
      Root cause: Work item failed.,
  beamapp-jenkins-060612545-06060554-ejx5-harness-vh50
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.922Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:08.976Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:01:09.027Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:02:57.374Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:02:57.452Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-06T13:02:57.487Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-06-06_05_54_59-5277234232627803653 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_28-8743101677824616578?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_17_41-3928553551191137210?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_24_54-14775531475813090456?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_33_05-237179161854587931?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_41_58-16682568752052684293?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_49_34-7008124565125638081?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_57_28-3045663713089209935?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_06_05_17-3941697432767991595?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_24-16337931217102228733?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_22_25-6623475602639554188?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_31_44-17203554699797660823?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_39_21-14190286123819120830?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_48_19-2592744699022765161?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_56_11-2240718138004318906?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_27-7686813540681807607?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_15_45-16140964812098352066?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_23_11-17960590713872940232?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_31_36-243467094939682029?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_39_32-339654125893881207?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_47_19-899532780387138540?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_54_59-5277234232627803653?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_24-18340286408535900774?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_21_20-9587379527271069481?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_30_34-15558123543731263002?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_39_33-7586911412318385154?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_58_14-639167162508969545?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_25-5248569815272530115?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_11_32-12956399208588398843?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_20_21-2319624917281022999?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_28_25-7151158257278100041?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_35_52-9473625098076744578?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_43_50-3474287416498316895?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_52_13-3466139318296272173?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_25-943139651968555768?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_12_16-3609043167296153523?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_21_32-1587053388443014466?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_29_57-15879854641205879775?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_37_56-8972092303736065479?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_46_40-4733336803547352060?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_55_18-4865813585084556482?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_24-15656804239520286901?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_11_35-4037200458801047269?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_20_25-1358860958148034220?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_28_01-17348755079524206407?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_35_56-7038137401709845874?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_43_43-14216604810983143785?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_51_59-334936286000020751?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_06_00_01-15409665263850500672?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_03_25-18404344932709762345?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_13_52-8829814924256300624?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_24_19-12024436320735889699?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_32_28-12969657386825978749?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_40_30-15390945286460198979?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_47_52-2712545837979903618?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-06_05_55_45-8763609575965840259?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 63 tests in 4211.858s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 9s
86 actionable tasks: 63 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/sasqpbb6yk65w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
