See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/491/display/redirect?page=changes>

Changes:

[Pablo Estrada] Updating BigQuery client for Python

[Andrew Pilloud] [BEAM-11165] ZetaSQL Calc only convert referenced columns

[Robin Qiu] Support read/write ZetaSQL DATETIME/NUMERIC types from/to BigQuery

[Robin Qiu] Address comments

[noreply] Merge pull request #13164 from Refactoring BigQuery Read utilities into

[Robert Burke] Moving to 2.27.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 24.54 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.947Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.968Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.971Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.007Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.023Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.044Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.098Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.149Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:51.451Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

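The repeated failure above originates in the call `self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)`: the wrapper polls the load job and, with `max_retries=0`, raises on the first error result it sees. A minimal sketch of that polling shape (hypothetical helper, not Beam's actual `BigQueryWrapper` implementation) shows why a single transient `internalError` becomes a hard failure:

```python
import time

def wait_for_job(poll_status, sleep_duration_sec=10, max_retries=0):
    """Poll a job until it finishes; raise if it ends in error.

    poll_status() returns ('RUNNING', None) while the job runs, then
    ('DONE', None) on success or ('DONE', error_result) on failure.
    With max_retries=0, the first error result is fatal -- there is
    no second chance for a transient backend error.
    """
    retries = 0
    while True:
        state, error_result = poll_status()
        if state == 'DONE':
            if error_result is None:
                return
            if retries >= max_retries:
                raise RuntimeError(
                    'BigQuery job failed. Error Result: %r' % (error_result,))
            retries += 1  # budget one more look; a retried job may still succeed
        time.sleep(sleep_duration_sec)
```

Under that reading, the DoFn at `bigquery_file_loads.py:605` surfaces the backend's `internalError` immediately, leaving the runner's own work-item retries as the only retry layer.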
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:56.708Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:57.588Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:01.517Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.180Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.259Z: JOB_MESSAGE_DEBUG: Executing failure step failure47
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.298Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.378Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.638Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.669Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.139Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.206Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.243Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-11-05_17_00_36-3082242840281892995 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
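The "failed 4 times" policy in the captured log above (each attempt on a possibly different worker, with every attempt's cause left in earlier log entries) is a bounded-attempt loop. A generic sketch of that shape (hypothetical helper, not Dataflow's actual scheduler):

```python
def run_with_attempts(work_item, max_attempts=4):
    """Run work_item() until it succeeds or max_attempts failures accrue.

    Mirrors the policy described in the log: after the 4th failed
    attempt the whole job is failed, and each attempt's error is kept
    so earlier log entries show the cause of every failure.
    """
    errors = []
    for attempt in range(1, max_attempts + 1):
        try:
            return work_item()
        except Exception as exc:  # a real runner would narrow this
            errors.append((attempt, exc))
    raise RuntimeError(
        'The job failed because a work item has failed %d times: %r'
        % (len(errors), errors))
```

Under this policy a transient error can still succeed on a later attempt, but the `internalError` above was raised deterministically on every worker, so all four attempts failed and the job was torn down.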

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 67 tests in 4469.211s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 144

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 23m 10s
205 actionable tasks: 168 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wlixxrxrbzto6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
