See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/874/display/redirect?page=changes>
Changes:
[noreply] Remove cred rotation jobs from jenkins (#29243)
------------------------------------------
[...truncated 649.91 KB...]
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
____ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_row_mutation _____

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
________ ERROR at setup of TestWriteToBigtableXlangIT.test_set_mutation ________

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
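The DeprecationWarning above originates in a third-party dependency (`google.api_core`), not in Beam itself, but it illustrates the PEP 632 migration: `distutils` is removed in Python 3.12, and for small helpers PEP 632 recommends inlining them. As a sketch (assuming the commonly needed member is `distutils.util.strtobool`; the original returns `1`/`0` rather than `bool`):

```python
def strtobool(val: str) -> bool:
    """Inline replacement for distutils.util.strtobool (returns bool, not int)."""
    val = val.lower()
    if val in ("y", "yes", "t", "true", "on", "1"):
        return True
    if val in ("n", "no", "f", "false", "off", "0"):
        return False
    raise ValueError(f"invalid truth value {val!r}")

print(strtobool("yes"), strtobool("0"))  # True False
```

For warnings coming from dependencies, the usual remedy is upgrading the dependency to a release that no longer imports `distutils`.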
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
= 8 passed, 19 skipped, 7215 deselected, 1 warning, 5 errors in 2690.88s (0:44:50) =
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 2357901.
Skipping invalid pid: 2357916.
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava
INFO     apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:88 Created table [test-table]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7fbefdc91280> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7fbefdc91a60> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-Av4U5stpxP4XJU67ahh3VHPN2cnRVHvhx1kYWBo4x5o.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-Av4U5stpxP4XJU67ahh3VHPN2cnRVHvhx1kYWBo4x5o.jar in 6 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1101145354-876548-525qnntb.1698850434.876774/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job:
<Job
clientRequestId: '20231101145354878172-2485'
createTime: '2023-11-01T14:54:03.195488Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-11-01_07_54_02-13815666169983333389'
location: 'us-central1'
name: 'beamapp-jenkins-1101145354-876548-525qnntb'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-11-01T14:54:03.195488Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-11-01_07_54_02-13815666169983333389]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-11-01_07_54_02-13815666169983333389
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-01_07_54_02-13815666169983333389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-01_07_54_02-13815666169983333389?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-01_07_54_02-13815666169983333389?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-01_07_54_02-13815666169983333389 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:54:06.997Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:54:09.942Z: JOB_MESSAGE_BASIC: Executing operation ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/Impulse+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/PairWithRestriction+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:54:10.067Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:54:39.807Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:11.975Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:22.351Z: JOB_MESSAGE_BASIC: Finished operation ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/Impulse+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/PairWithRestriction+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:22.492Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:23.206Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:23.361Z: JOB_MESSAGE_BASIC: Executing operation external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/ProcessElementAndRestrictionWithSizing+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/MapElements/Map/ParMultiDo(Anonymous)+ReadFromBigtable/ParDo(_BeamRowToPartialRowData)+Extract cells+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:23.387Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3774>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:23.814Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3774>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:42.716Z: JOB_MESSAGE_BASIC: Finished operation external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/ProcessElementAndRestrictionWithSizing+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/MapElements/Map/ParMultiDo(Anonymous)+ReadFromBigtable/ParDo(_BeamRowToPartialRowData)+Extract cells+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:42.805Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:42.901Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:42.972Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:43.555Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:57:43.894Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-01T14:59:47.623Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-01_07_54_02-13815666169983333389 is in state JOB_STATE_DONE
INFO     apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:92 Deleting table [test-table] and instance [bt-read-tests-1698850417-920523]
PASSED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
==================================== ERRORS ====================================
___ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_cells_mutation ____

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
_ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_cells_with_timerange_mutation _

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
_ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_column_family_mutation _

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
____ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_row_mutation _____

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
________ ERROR at setup of TestWriteToBigtableXlangIT.test_set_mutation ________

cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>

    @classmethod
    def setUpClass(cls):
        cls.test_pipeline = TestPipeline(is_integration_test=True)
        cls.project = cls.test_pipeline.get_option('project')
        cls.args = cls.test_pipeline.get_full_options_as_args()
        cls.expansion_service = ('localhost:%s' % os.environ.get('EXPANSION_PORT'))
>       timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E       AttributeError: type object 'datetime.datetime' has no attribute 'datetime'

apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
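All five setup errors trace to the same line of bigtableio_it_test.py. The message `type object 'datetime.datetime' has no attribute 'datetime'` indicates that the name `datetime` in the test module is already bound to the class (i.e. the module uses `from datetime import datetime`), so qualifying it a second time fails. A minimal reproduction and the matching fix, sketched outside the Beam test harness:

```python
from datetime import datetime  # binds the *class*, not the module

# Reproduce the failure mode from setUpClass:
try:
    datetime.datetime.utcnow()  # the class has no `.datetime` attribute
except AttributeError as err:
    print(err)

# With this import style, the extra qualifier must be dropped:
timestr = "".join(filter(str.isdigit, str(datetime.utcnow())))
print(timestr)  # digits of the current UTC timestamp
```

The alternative fix is to change the import to `import datetime` and keep `datetime.datetime.utcnow()`; either way the import style and the call site must agree.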
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
==== 8 passed, 19 skipped, 7215 deselected, 5 errors in 2799.53s (0:46:39) =====
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava FAILED
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguageCleanup
Stopping expansion service pid: 2359368.
Skipping invalid pid: 2359369.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2357282
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 1h 1m 32s
124 actionable tasks: 89 executed, 33 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/qcmnekoh6qp5q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]