See <https://ci-beam.apache.org/job/beam_PostCommit_Python35/1/display/redirect>

Changes:


------------------------------------------
[...truncated 10.84 MB...]
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/util.py", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/transforms/core.py", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/util.py", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.660Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/transforms/core.py", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/util.py", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in 
dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in 
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 332, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 553, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1122, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 554, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/transforms/core.py", line 1511, in <lambda>
    wrapper = lambda x: [fn(x)]
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/util.py", line 218, in _matches
    hamcrest_assert(actual, contains_inanyorder(*expected_list))
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "/usr/local/lib/python3.5/site-packages/hamcrest/core/assert_that.py", 
line 60, in _assert_match
    raise AssertionError(description)
RuntimeError: AssertionError: 
Expected: a sequence over [a sequence containing a string containing 'bicycle'] 
in any order
     but: not matched: <[]> [while running 'assert_that/Match']
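
Note: the matcher failure above can be reproduced locally with PyHamcrest alone, which is
what apache_beam.testing.util delegates to. A minimal sketch, with the expected matcher
inferred from the error text ("a sequence containing a string containing 'bicycle'") rather
than taken from the failing test itself:

    # Local reproduction of the AssertionError above: the failing step produced
    # no elements, so an empty sequence is checked against the expected matcher.
    from hamcrest import assert_that, contains_inanyorder, contains_string, has_item

    actual = []  # the step under assert_that/Match emitted nothing
    expected_list = [has_item(contains_string('bicycle'))]  # inferred, illustrative

    # Raises: AssertionError:
    # Expected: a sequence over [a sequence containing a string containing 'bicycle'] in any order
    #      but: not matched: <[]>
    assert_that(actual, contains_inanyorder(*expected_list))
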

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.685Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.765Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure40
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.803Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: 
S10:assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
 failed., The job failed because a work item has failed 4 times. Look in 
previous log entries for the cause of each one of the 4 failures. For more 
information, see https://cloud.google.com/dataflow/docs/guides/common-errors. 
The work item was attempted on these workers: 
  beamapp-jenkins-060823073-06081607-lprc-harness-4mrc
      Root cause: Work item failed.,
  beamapp-jenkins-060823073-06081607-lprc-harness-4mrc
      Root cause: Work item failed.,
  beamapp-jenkins-060823073-06081607-lprc-harness-4mrc
      Root cause: Work item failed.,
  beamapp-jenkins-060823073-06081607-lprc-harness-4mrc
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.925Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:58.982Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:13:59.015Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:15:20.867Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:15:20.910Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-08T23:15:20.943Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-06-08_16_07_37-6289300478907430074 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
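
For context on where the assert_that/Group/GroupByKey/.../Unkey/Match steps named above come
from: apache_beam.testing.util.assert_that expands into that group-and-match sub-pipeline,
and the matcher runs inside the final Map step, which is where the AssertionError was raised
on the Dataflow worker. A hedged sketch of that pattern (the data and expected values here
are illustrative, not from the failing test):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
        output = p | beam.Create(['red bicycle', 'blue car'])
        # assert_that adds the Group/GroupByKey/Unkey/Match steps seen in the
        # log; if the pipeline produces no output, Match fails as shown above.
        assert_that(output, equal_to(['red bicycle', 'blue car']))
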
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_58-2698642520008781250?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_28_38-3089280541969730396?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_36_15-12412516357735198637?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_44_19-6931119327400067595?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_53_19-11932533405989730492?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_01_21-11945358476539912088?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_08_56-9479698330428141329?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_55-14357357818172037496?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_34_45-2096269219901334165?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_43_37-2214443020279901525?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_50_47-1339164163656959717?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_59_20-13342731278000600624?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_09_22-2695501961593154780?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_56-8741841674228849615?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_26_31-278564030836903182?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_34_04-2666936056846281597?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_43_01-9383275766099629797?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_51_22-4166628236942110644?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_59_24-8484136675242625441?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_07_37-6289300478907430074?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_56-15671525666010014827?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_34_15-13961097309129923424?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_42_22-14677562876721238041?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_50_53-4653567905053773767?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_08_23-5910591855406461805?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_16_33-5173078455721752039?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_55-454959637506739322?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_22_23-11003081898156292563?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_30_43-9117052841202948989?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_38_55-11909918688844586296?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_46_56-1375308875848849000?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_55_10-23677562870375054?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_02_47-8143118917041750445?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_11_10-13007840245834963324?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_54-1591042389247644165?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_21_44-3647669011244066329?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_31_01-12109766236025225592?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_39_01-16473466569755978770?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_47_00-6843197281491483771?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_55_29-4693372964893036176?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_03_57-12774866453461081687?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_56-1163234828674458634?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_22_39-3817049475219045726?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_31_36-2532929481119429685?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_41_24-6557962027806756840?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_50_25-16594202267118346141?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_58_44-12554104637669799844?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_07_36-8734076158281963453?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_13_56-6630562833816218193?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_24_15-2437131731288662595?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_34_43-13734920718378473702?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_43_00-2121033034619739654?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_51_08-14377511185390324040?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_15_59_22-6036608006963351518?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-08_16_07_37-12982047291098646023?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 63 tests in 4285.898s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 55s
86 actionable tasks: 63 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/3qgd2ktpvgqjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
