See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2481/display/redirect>
Changes:
------------------------------------------
[...truncated 12.15 MB...]
File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
self.run()
File "/usr/lib/python3.9/threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 171, in poll_for_job_completion
time.sleep(sleep_secs)
~~~~~~~~~~~~~~~~~~~~~ Stack of <unknown> (139758740125440) ~~~~~~~~~~~~~~~~~~~~~
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 361, in _perform_spawn
reply.run()
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 296, in run
self._result = func(*args, **kwargs)
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 1049, in _thread_receiver
msg = Message.from_io(io)
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 507, in from_io
header = io.read(9)  # type 1, channel 4, payload 4
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 474, in read
data = self._read(numbytes - len(buf))
+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
[gw4] FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write
apache_beam/io/fileio_test.py::MatchIntegrationTest::test_transform_on_gcs
[gw5] PASSED apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
apache_beam/ml/gcp/naturallanguageml_test_it.py::NaturalLanguageMlTestIT::test_analyzing_syntax
[gw5] SKIPPED apache_beam/ml/gcp/naturallanguageml_test_it.py::NaturalLanguageMlTestIT::test_analyzing_syntax
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
[gw5] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
[gw5] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
[gw5] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw5] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw4] PASSED apache_beam/io/fileio_test.py::MatchIntegrationTest::test_transform_on_gcs
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
=================================== FAILURES ===================================
___________________ BigtableIOWriteTest.test_bigtable_write ____________________
[gw4] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>
self = <apache_beam.examples.cookbook.bigtableio_it_test.BigtableIOWriteTest testMethod=test_bigtable_write>
    @pytest.mark.it_postcommit
    def test_bigtable_write(self):
      number = self.number
      pipeline_args = self.test_pipeline.options_list
      pipeline_options = PipelineOptions(pipeline_args)
      with beam.Pipeline(options=pipeline_options) as pipeline:
        config_data = {
            'project_id': self.project,
            'instance_id': self.instance_id,
            'table_id': self.table_id
        }
>       _ = (
            pipeline
            | 'Generate Direct Rows' >> GenerateTestRows(number, **config_data))
apache_beam/examples/cookbook/bigtableio_it_test.py:189:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/pipeline.py:608: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:558: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:585: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <DataflowPipelineResult <Job
clientRequestId: '20231027122351480737-9154'
createTime: '2023-10-27T12:23:59.652551Z'
...023-10-27T12:23:59.652551Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f1bb412fb50>
duration = None
    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/"
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s
apache_beam/runners/dataflow/dataflow_runner.py:758: Failed
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpd9doecrm/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231023
INFO     root:environments.py:320 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231023" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f1c0d583f70> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f1c0d584790> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/seaborn-0.12.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/seaborn-0.12.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/seaborn-0.13.0-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/seaborn-0.13.0-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/inflection-0.5.1-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/inflection-0.5.1-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/beautifulsoup4-4.12.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/matplotlib-3.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/matplotlib-3.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 1 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 1 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1027122351-478403-durm23ee.1698409431.478737/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job:
<Job
clientRequestId: '20231027122351480737-9154'
createTime: '2023-10-27T12:23:59.652551Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-10-27_05_23_59-4301342114237121416'
location: 'us-central1'
name: 'beamapp-jenkins-1027122351-478403-durm23ee'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-10-27T12:23:59.652551Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-10-27_05_23_59-4301342114237121416]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-10-27_05_23_59-4301342114237121416
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-27_05_23_59-4301342114237121416?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-27_05_23_59-4301342114237121416?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-27_05_23_59-4301342114237121416 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:03.431Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:06.057Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:06.106Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:15.378Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:15.507Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:24:27.266Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:26:58.285Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:26:58.745Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:26:58.798Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:27:00.410Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T12:27:00.457Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T13:38:41.704Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-10-27_05_23_59-4301342114237121416.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T13:38:41.726Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-27T13:38:41.897Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-27_05_23_59-4301342114237121416 is in state JOB_STATE_CANCELLING
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write - Failed: Timeout >4500.0s
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7381.84s (2:03:01) =======
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
For more on this, please refer to
https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings
in the Gradle documentation.
BUILD FAILED in 2h 9m 22s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/mxnigmfy62hbw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure