See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/8018/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7070] JOIN condition should accept field access

[amaliujia] [sql] ignore Nexmark SQL queries that have non-equal joins.

[amaliujia] [sql] generalize RexInputRef and RexFieldAccess in JOIN.

------------------------------------------
[...truncated 1.06 MB...]
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 560, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 561, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 747, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 753, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 807, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 751, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 563, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 635, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 432, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']",
 <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15562123295473.beam_load_2019_04_25_171613_93_886b2f8e1c6f82f1b8474506df599b7a_0 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15562123295473.beam_load_2019_04_25_171613_93_886b2f8e1c6f82f1b8474506df599b7a_0 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

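Aside from the missing table, the exception text itself is malformed: the '%s' is never interpolated, because line 432 of bigquery_file_loads.py passes the error object to Exception as a second positional argument, logging-style, so both values end up stored as the exception's args tuple instead of being formatted into the message. A minimal sketch of the difference (the latest_error dict below is an illustrative stand-in, not the SDK's actual JobStatus object):

    # Logging-style arguments work for logging.error(), but Exception just
    # stores them as an args tuple and never substitutes into the string.
    latest_error = {'reason': 'notFound'}  # hypothetical stand-in for JobStatus

    broken = Exception('BigQuery jobs failed. BQ error: %s', latest_error)
    print(broken)
    # ('BigQuery jobs failed. BQ error: %s', {'reason': 'notFound'})

    # Interpolating before constructing the exception gives a readable message:
    fixed = Exception('BigQuery jobs failed. BQ error: %s' % (latest_error,))
    print(fixed)
    # BigQuery jobs failed. BQ error: {'reason': 'notFound'}

This is why the Exception line above shows a (u"... %s ...", <JobStatus ...>) tuple: the underlying notFound failure is real, but the formatting bug makes it harder to read.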
root: INFO: 2019-04-25T17:17:32.724Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 246, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 142, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 560, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 561, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 747, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 753, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 807, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 751, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 563, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 635, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 432, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']",
 <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15562123295473.beam_load_2019_04_25_171613_93_886b2f8e1c6f82f1b8474506df599b7a_0 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15562123295473.beam_load_2019_04_25_171613_93_886b2f8e1c6f82f1b8474506df599b7a_0 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-04-25T17:17:33.085Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
root: INFO: 2019-04-25T17:17:33.208Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-04-25T17:17:33.348Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
root: INFO: 2019-04-25T17:17:33.544Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-04-25T17:17:36.914Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-04-25T17:17:37.031Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-04-25T17:17:39.338Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
[...identical traceback and BigQuery notFound error as at 17:17:32.724Z above...]

root: INFO: 2019-04-25T17:17:42.994Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
[...identical traceback and BigQuery notFound error as at 17:17:32.724Z above...]

root: INFO: 2019-04-25T17:17:43.073Z: JOB_MESSAGE_DEBUG: Executing failure step failure94
root: INFO: 2019-04-25T17:17:43.122Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S42:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on:
  beamapp-jenkins-042517121-04251012-jqpw-harness-jzb5,
  beamapp-jenkins-042517121-04251012-jqpw-harness-jzb5,
  beamapp-jenkins-042517121-04251012-jqpw-harness-jzb5,
  beamapp-jenkins-042517121-04251012-jqpw-harness-jzb5
root: INFO: 2019-04-25T17:17:43.359Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-25T17:17:43.476Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-25T17:17:43.539Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-25T17:19:49.768Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-25T17:19:49.836Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-25T17:19:49.883Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-25_10_12_23-14741857671608466321 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3408.598s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_03-13328933435624970047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_05_15-8112512816825224513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_14_46-10569347671549043479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_22_07-9913640404679897356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_30_25-17683572630237434903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_38_45-489718807083641785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_45_50-340219442152907856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_06-2740132071486808429?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_12_23-14741857671608466321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_20_14-4293151911701251494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_03-11465989641438890846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_19_35-6292884523584715640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_06-5602205856620064356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_10_12-10613251398723753055?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_17_27-4175882456746785646?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_24_27-14049841991549493758?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_33_18-10693184125035187163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_03-200460623589664304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_15_58-5371455237988680639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_24_27-8710064244220506876?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_04-16789389289234084284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_05_34-5486490284265203740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_13_45-12571649541029847320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_21_45-3530920725297092656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_03-7461021660598557668?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_05_53-2947096528146868416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_13_38-8053831112598630090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_21_13-4389891439387440881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_27_16-14318489903278975065?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_09_57_03-14714655176602881132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_07_57-10826445061678406792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_18_31-8384409448886081935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-25_10_26_04-5602365773881647022?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 211

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 47s
104 actionable tasks: 80 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/zwc7sazp6zcyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
