See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/871/display/redirect>
Changes:
------------------------------------------
[...truncated 509.93 KB...]
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
[1m[31mE AttributeError: type object 'datetime.datetime' has no attribute
'datetime'[0m
[1m[31mapache_beam/io/gcp/bigtableio_it_test.py[0m:163: AttributeError
[31m[1m________ ERROR at setup of
TestWriteToBigtableXlangIT.test_set_mutation ________[0m
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
[1m[31mE AttributeError: type object 'datetime.datetime' has no attribute
'datetime'[0m
[1m[31mapache_beam/io/gcp/bigtableio_it_test.py[0m:163: AttributeError
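The repeated AttributeError above is consistent with a name-shadowing bug rather than an environment problem: if bigtableio_it_test.py imports the class directly (from datetime import datetime), then the name `datetime` inside the test module is the class itself, and `datetime.datetime.utcnow()` looks up a non-existent `datetime` attribute on that class. A minimal sketch of the failure mode and two possible fixes, assuming the direct class import (the module's actual import statement is not shown in this log):

    from datetime import datetime

    # With the class imported directly, this reproduces the error above:
    #   AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
    # timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))

    # Fix 1: call utcnow() on the imported class.
    timestr = "".join(filter(str.isdigit, str(datetime.utcnow())))

    # Fix 2: import the module instead and keep the fully qualified call.
    # import datetime
    # timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
    print(timestr)  # e.g. '20231031210559038000'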
=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
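The DeprecationWarning above comes from a third-party dependency (google-api-core), not from Beam itself, and only accounts for the "1 warning" count in the summary below. For code that still needs distutils.util behaviour on Python 3.12+, PEP 632 suggests vendoring the small pieces that are actually used; a sketch of a dependency-free replacement for distutils.util.strtobool, assuming that is the function being pulled in (this log does not show which util member is used):

    def strtobool(value: str) -> bool:
        """Mimics distutils.util.strtobool, returning a bool instead of 0/1."""
        truthy = {"y", "yes", "t", "true", "on", "1"}
        falsy = {"n", "no", "f", "false", "off", "0"}
        value = value.strip().lower()
        if value in truthy:
            return True
        if value in falsy:
            return False
        raise ValueError(f"invalid truth value {value!r}")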
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
= 8 passed, 19 skipped, 7213 deselected, 1 warning, 5 errors in 2737.02s (0:45:37) =
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava FAILED
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 636510.
Skipping invalid pid: 636511.
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:05:59.038Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job
2023-10-31_14_00_32-1128791376174754339 is in state JOB_STATE_DONE
INFO
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to
perform query SELECT * FROM
python_xlang_storage_write_1698786018_ed4be5.write_with_beam_rows to BQ
INFO
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of
query is: [(2, 0.2, Decimal('2.22'), 'b', False, b'b', datetime.datetime(1970,
1, 1, 0, 33, 20, 200, tzinfo=datetime.timezone.utc)), (4, 0.4, Decimal('4.44'),
'd', False, b'd', datetime.datetime(1970, 1, 1, 1, 6, 40, 400,
tzinfo=datetime.timezone.utc)), (1, 0.1, Decimal('1.11'), 'a', True, b'a',
datetime.datetime(1970, 1, 1, 0, 16, 40, 100, tzinfo=datetime.timezone.utc)),
(3, 0.3, Decimal('3.33'), 'c', True, b'd', datetime.datetime(1970, 1, 1, 0, 50,
0, 300, tzinfo=datetime.timezone.utc))]
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:122
Deleting dataset python_xlang_storage_write_1698786018_ed4be5 in project
apache-beam-testing
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestReadFromBigTableIT::test_read_xlang
-------------------------------- live log call ---------------------------------
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:80 Created instance
[bt-read-tests-1698786367-e703c3] in project [apache-beam-testing]
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:88 Created table
[test-table]
INFO apache_beam.runners.portability.stager:stager.py:322 Copying
Beam SDK
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">
to staging location.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline
has additional dependencies to be installed in SDK worker container, consider
using the SDK container image pre-building workflow to avoid repetitive
installations. Learn more on
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:313 Using provided Python SDK container
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO root:environments.py:320 Python SDK container image set to
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker
environment
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712
==================== <function pack_combiners at 0x7ff906732ee0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712
==================== <function sort_stages at 0x7ff90672e700>
====================
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-pkdZ9XO8zDFu9Z8Um45-BXb4Xry6j7el9yLTqCIGffM.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-pkdZ9XO8zDFu9Z8Um45-BXb4Xry6j7el9yLTqCIGffM.jar
in 6 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/pipeline.pb...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1031210623-873398-1q525qnn.1698786383.873634/pipeline.pb
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job:
<Job
clientRequestId: '20231031210623874914-9680'
createTime: '2023-10-31T21:06:31.880540Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-10-31_14_06_31-4708059875289394473'
location: 'us-central1'
name: 'beamapp-jenkins-1031210623-873398-1q525qnn'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-10-31T21:06:31.880540Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job
with id: [2023-10-31_14_06_31-4708059875289394473]
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job:
2023-10-31_14_06_31-4708059875289394473
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the
Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-31_14_06_31-4708059875289394473?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-31_14_06_31-4708059875289394473?project=apache-beam-testing
INFO
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58
Console log:
INFO
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-31_14_06_31-4708059875289394473?project=apache-beam-testing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job
2023-10-31_14_06_31-4708059875289394473 is in state JOB_STATE_RUNNING
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:06:34.905Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-b.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:06:37.265Z: JOB_MESSAGE_BASIC: Executing operation
ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/Impulse+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/PairWithRestriction+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/SplitWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:06:37.319Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-b...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:06:44.211Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:16.518Z: JOB_MESSAGE_BASIC: All workers have finished the
startup processes and began to receive work requests.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:23.336Z: JOB_MESSAGE_BASIC: Finished operation
ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/Impulse+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/BigtableIO.Read/Read(BigtableSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/PairWithRestriction+external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/SplitWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:23.431Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:23.763Z: JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:23.847Z: JOB_MESSAGE_BASIC: Executing operation
external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/ProcessElementAndRestrictionWithSizing+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/MapElements/Map/ParMultiDo(Anonymous)+ReadFromBigtable/ParDo(_BeamRowToPartialRowData)+Extract
cells+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:23.874Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at
core.py:3774>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:24.221Z: JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at
core.py:3774>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:43.948Z: JOB_MESSAGE_BASIC: Finished operation
external_9ReadFromBigtable-SchemaAwareExternalTransform-beam-schematransform-org-apache-beam-bigtable_read-v1-3/ProcessElementAndRestrictionWithSizing+ReadFromBigtable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_read:v1/MapElements/Map/ParMultiDo(Anonymous)+ReadFromBigtable/ParDo(_BeamRowToPartialRowData)+Extract
cells+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:43.990Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:45.115Z: JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:45.178Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:45.496Z: JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:10:45.642Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200
2023-10-31T21:12:49.998Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job
2023-10-31_14_06_31-4708059875289394473 is in state JOB_STATE_DONE
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:92 Deleting table
[test-table] and instance [bt-read-tests-1698786367-e703c3]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation
ERROR
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation
ERROR
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation
ERROR
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
ERROR
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
ERROR
==================================== ERRORS ====================================
___ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_cells_mutation ____
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
_ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_cells_with_timerange_mutation _
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
_ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_column_family_mutation _
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
____ ERROR at setup of TestWriteToBigtableXlangIT.test_delete_row_mutation _____
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
________ ERROR at setup of TestWriteToBigtableXlangIT.test_set_mutation ________
cls = <class 'apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT'>
@classmethod
def setUpClass(cls):
cls.test_pipeline = TestPipeline(is_integration_test=True)
cls.project = cls.test_pipeline.get_option('project')
cls.args = cls.test_pipeline.get_full_options_as_args()
cls.expansion_service = ('localhost:%s' %
os.environ.get('EXPANSION_PORT'))
> timestr = "".join(filter(str.isdigit, str(datetime.datetime.utcnow())))
E AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
apache_beam/io/gcp/bigtableio_it_test.py:163: AttributeError
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
ERROR apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation - AttributeError: type object 'datetime.datetime' has no attribute 'datetime'
==== 8 passed, 19 skipped, 7213 deselected, 5 errors in 2855.44s (0:47:35) =====
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava FAILED
> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguageCleanup
Stopping expansion service pid: 659419.
Skipping invalid pid: 659420.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 626585
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
For more on this, please refer to
https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings
in the Gradle documentation.
BUILD FAILED in 1h 14m 43s
124 actionable tasks: 89 executed, 33 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/aot53xlpx6pok
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]