See
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/866/display/redirect>
Changes:
------------------------------------------
[...truncated 12.06 MB...]
self.run()
File "/usr/lib/python3.11/threading.py", line 975, in run
self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 171, in poll_for_job_completion
    time.sleep(sleep_secs)
~~~~~~~~~~~~~~~~~~~~~ Stack of <unknown> (140014445790976) ~~~~~~~~~~~~~~~~~~~~~
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/execnet/gateway_base.py", line 361, in _perform_spawn
    reply.run()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/execnet/gateway_base.py", line 296, in run
    self._result = func(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/execnet/gateway_base.py", line 1049, in _thread_receiver
    msg = Message.from_io(io)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/execnet/gateway_base.py", line 507, in from_io
    header = io.read(9)  # type 1, channel 4, payload 4
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/execnet/gateway_base.py", line 474, in read
    data = self._read(numbytes - len(buf))
+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++
[gw0] PASSED apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
[gw7] FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write
apache_beam/examples/inference/tfx_bsl/tfx_bsl_inference_it_test.py::TFXRunInferenceTests::test_tfx_run_inference_mobilenetv2
[gw7] SKIPPED apache_beam/examples/inference/tfx_bsl/tfx_bsl_inference_it_test.py::TFXRunInferenceTests::test_tfx_run_inference_mobilenetv2
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
[gw3] PASSED apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
apache_beam/ml/gcp/naturallanguageml_test_it.py::NaturalLanguageMlTestIT::test_analyzing_syntax
[gw3] SKIPPED apache_beam/ml/gcp/naturallanguageml_test_it.py::NaturalLanguageMlTestIT::test_analyzing_syntax
apache_beam/ml/inference/base_test.py::RunInferenceBaseTest::test_run_inference_with_side_inputin_streaming
[gw3] SKIPPED apache_beam/ml/inference/base_test.py::RunInferenceBaseTest::test_run_inference_with_side_inputin_streaming
[gw0] PASSED apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
=================================== FAILURES ===================================
___________________ BigtableIOWriteTest.test_bigtable_write ____________________
[gw7] linux -- Python 3.11.2 https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/bin/python3.11

self = <apache_beam.examples.cookbook.bigtableio_it_test.BigtableIOWriteTest testMethod=test_bigtable_write>

    @pytest.mark.it_postcommit
    def test_bigtable_write(self):
      number = self.number
      pipeline_args = self.test_pipeline.options_list
      pipeline_options = PipelineOptions(pipeline_args)
>     with beam.Pipeline(options=pipeline_options) as pipeline:

apache_beam/examples/cookbook/bigtableio_it_test.py:183:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:608: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:561: in run
    self._options).run(False)
apache_beam/pipeline.py:585: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <DataflowPipelineResult <Job
 clientRequestId: '20231104230131547707-3550'
 createTime: '2023-11-04T23:02:11.824402Z'
 ...023-11-04T23:02:11.824402Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f57344f6510>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/"
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E       Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:758: Failed
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpzamxxcbj/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp311', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/build/apache_beam-2.53.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.11_sdk:beam-master-20231023
INFO     root:environments.py:320 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.11_sdk:beam-master-20231023" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f5769519bc0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f576951a480> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/seaborn-0.13.0-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/seaborn-0.13.0-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/transformers-4.34.0-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/transformers-4.34.0-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/inflection-0.5.1-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/inflection-0.5.1-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/beautifulsoup4-4.12.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/torch-2.1.0-cp38-cp38-manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/torch-2.1.0-cp38-cp38-manylinux1_x86_64.whl in 33 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/torchvision-0.16.0-cp38-cp38-manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/torchvision-0.16.0-cp38-cp38-manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/Pillow-10.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/Pillow-10.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/matplotlib-3.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/matplotlib-3.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/matplotlib-3.8.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/matplotlib-3.8.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/scikit_learn-1.2.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/scikit_learn-1.2.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/apache_beam-2.53.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/apache_beam-2.53.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104230131-546585-23eerbmy.1699138891.546823/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job:
 <Job
 clientRequestId: '20231104230131547707-3550'
 createTime: '2023-11-04T23:02:11.824402Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-11-04_16_02_11-9286710928131373702'
 location: 'us-central1'
 name: 'beamapp-jenkins-1104230131-546585-23eerbmy'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-11-04T23:02:11.824402Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-11-04_16_02_11-9286710928131373702]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-11-04_16_02_11-9286710928131373702
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-04_16_02_11-9286710928131373702?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-04_16_02_11-9286710928131373702?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-04_16_02_11-9286710928131373702 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:15.767Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:18.328Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:18.388Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:18.850Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:27.540Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:02:32.521Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:05:23.246Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:05:23.798Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:05:23.890Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:05:25.566Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T23:05:25.624Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-05T00:16:17.097Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-11-04_16_02_11-9286710928131373702.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-05T00:16:17.150Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-05T00:16:17.407Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-04_16_02_11-9286710928131373702 is in state JOB_STATE_CANCELLING
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/pytest_postCommitIT-df-py311.xml -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write - Failed: Timeout >4500.0s
====== 1 failed, 86 passed, 51 skipped, 9 warnings in 8022.22s (2:13:42) =======
> Task :sdks:python:test-suites:dataflow:py311:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
For more on this, please refer to
https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings
in the Gradle documentation.
BUILD FAILED in 2h 20m 3s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/cg7vl3mor4fzw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]