See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/32/display/redirect>
Changes:
------------------------------------------
[...truncated 147.88 KB...]
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:47.697Z: JOB_MESSAGE_BASIC: Finished operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:3574>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:47.757Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:47.808Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:47.880Z: JOB_MESSAGE_BASIC: Executing operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:48.030Z: JOB_MESSAGE_BASIC: Finished operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:48.087Z: JOB_MESSAGE_DEBUG: Value
"StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:48.145Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to
message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:55.279Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to
message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:55.348Z: JOB_MESSAGE_BASIC: Executing operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:55.396Z: JOB_MESSAGE_BASIC: Finished operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:55.462Z: JOB_MESSAGE_BASIC: Executing operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
writes/ParMultiDo(StorageApiFinalizeWrites)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:57.430Z: JOB_MESSAGE_BASIC: Finished operation
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
writes/ParMultiDo(StorageApiFinalizeWrites)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:57.490Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:57.555Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:57.623Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:47:57.654Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:50:15.576Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker
pool from 1 to 0.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:50:15.621Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-04T08:50:15.659Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job
2023-04-04_01_40_37-136985785876813818 is in state JOB_STATE_DONE
INFO
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to
perform query SELECT * FROM
python_xlang_storage_write168059762751c82a.python_xlang_storage_write_beam_rows
to BQ
INFO
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of
query is: [(1, 0.1, Decimal('1.11'), 'a', True, b'a', datetime.datetime(1970,
1, 1, 0, 16, 40, 100, tzinfo=datetime.timezone.utc)), (4, 0.4, Decimal('4.44'),
'd', False, b'd', datetime.datetime(1970, 1, 1, 1, 6, 40, 400,
tzinfo=datetime.timezone.utc)), (3, 0.3, Decimal('3.33'), 'c', True, b'd',
datetime.datetime(1970, 1, 1, 0, 50, 0, 300, tzinfo=datetime.timezone.utc)),
(2, 0.2, Decimal('2.22'), 'b', False, b'b', datetime.datetime(1970, 1, 1, 0,
33, 20, 200, tzinfo=datetime.timezone.utc))]
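For reference, the matcher check above is just a plain SELECT against the temporary dataset. A minimal, hypothetical sketch of reproducing that query outside the test harness (assuming the google-cloud-bigquery client, default credentials with access to apache-beam-testing, and noting that the dataset is deleted right after the test):

```python
# Hypothetical re-run of the matcher's verification query; only works while
# the temporary dataset still exists.
from google.cloud import bigquery

client = bigquery.Client(project="apache-beam-testing")
query = (
    "SELECT * FROM "
    "python_xlang_storage_write168059762751c82a"
    ".python_xlang_storage_write_beam_rows")
rows = [tuple(row.values()) for row in client.query(query).result()]
print(rows)  # should match the four records logged above
```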
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
Deleting dataset python_xlang_storage_write168059762751c82a in project
apache-beam-testing
PASSED
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816
Dataset apache-beam-testing:python_xlang_storage_write1680598229ab2112 does not
exist so we will create it as temporary with location=None
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107
Created dataset python_xlang_storage_write1680598229ab2112 in project
apache-beam-testing
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110
expansion port: 33201
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
Deleting dataset python_xlang_storage_write1680598229ab2112 in project
apache-beam-testing
FAILED
=================================== FAILURES ===================================
___ BigQueryXlangStorageWriteIT.test_storage_write_nested_records_and_lists ____
self =
<apache_beam.io.external.xlang_bigqueryio_it_test.BigQueryXlangStorageWriteIT
testMethod=test_storage_write_nested_records_and_lists>
    def test_storage_write_nested_records_and_lists(self):
        table_name = "python_storage_write_nested_records_and_lists"
        schema = {
            "fields": [{
                "name": "repeated_int", "type": "INTEGER", "mode": "REPEATED"
            }, {
                "name": "struct",
                "type": "STRUCT",
                "fields": [{
                    "name": "nested_int", "type": "INTEGER"
                }, {
                    "name": "nested_str", "type": "STRING"
                }]
            }, {
                "name": "repeated_struct",
                "type": "STRUCT",
                "mode": "REPEATED",
                "fields": [{
                    "name": "nested_numeric", "type": "NUMERIC"
                }, {
                    "name": "nested_bytes", "type": "BYTES"
                }]
            }]
        }
        items = [{
            "repeated_int": [1, 2, 3],
            "struct": {
                "nested_int": 1, "nested_str": "a"
            },
            "repeated_struct": [{
                "nested_numeric": Decimal("1.23"), "nested_bytes": b'a'
            }, {
                "nested_numeric": Decimal("3.21"), "nested_bytes": b'aa'
            }]
        }]
>       self.storage_write_test(table_name, items, schema)
apache_beam/io/external/xlang_bigqueryio_it_test.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/external/xlang_bigqueryio_it_test.py:149: in storage_write_test
p
apache_beam/pvalue.py:137: in __or__
return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:712: in apply
pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:137: in apply
return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
return transform.expand(input)
apache_beam/io/gcp/bigquery.py:2163: in expand
| StorageWriteToBigQuery(
apache_beam/io/gcp/bigquery.py:2463: in __init__
self.schematransform_config = SchemaAwareExternalTransform.discover_config(
apache_beam/transforms/external.py:385: in discover_config
for st in schematransforms:
apache_beam/transforms/external.py:357: in discover
discover_response = service.DiscoverSchemaTransform(
../../build/gradleenv/1922375555/lib/python3.8/site-packages/grpc/_channel.py:1030: in __call__
return _end_unary_response_blocking(state, call, False, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
state = <grpc._channel._RPCState object at 0x7f3f7582d190>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f3f757c1740>
with_call = False, deadline = None
    def _end_unary_response_blocking(
            state: _RPCState, call: cygrpc.SegregatedCall, with_call: bool,
            deadline: Optional[float]
    ) -> Union[ResponseType, Tuple[ResponseType, grpc.Call]]:
        if state.code is grpc.StatusCode.OK:
            if with_call:
                rendezvous = _MultiThreadedRendezvous(state, call, None,
                                                      deadline)
                return state.response, rendezvous
            else:
                return state.response
        else:
>           raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
E       grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E               status = StatusCode.UNAVAILABLE
E               details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:33201: Failed to connect to remote host: Connection refused"
E               debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:33201: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-04-04T08:50:30.348762596+00:00"}"
E       >

../../build/gradleenv/1922375555/lib/python3.8/site-packages/grpc/_channel.py:910: _InactiveRpcError
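The failure is not in the pipeline itself: `SchemaAwareExternalTransform.discover_config` issues a `DiscoverSchemaTransform` RPC to the Java expansion service the test expects on 127.0.0.1:33201, and gRPC reports UNAVAILABLE with "Connection refused", i.e. nothing was listening on that port by the time discovery ran. A minimal sketch (hypothetical helper, not part of the test) for checking that the expansion-service port is actually serving before attempting discovery, using `grpc.channel_ready_future`:

```python
# Hypothetical readiness probe for a local expansion service; 33201 is the
# port logged by this test run.
import grpc

def expansion_service_ready(port: int, timeout: float = 10.0) -> bool:
    channel = grpc.insecure_channel(f"localhost:{port}")
    try:
        # Blocks until the channel is connected or the timeout elapses.
        grpc.channel_ready_future(channel).result(timeout=timeout)
        return True
    except grpc.FutureTimeoutError:
        return False
    finally:
        channel.close()

print(expansion_service_ready(33201))
```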
------------------------------ Captured log call -------------------------------
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816
Dataset apache-beam-testing:python_xlang_storage_write1680598229ab2112 does not
exist so we will create it as temporary with location=None
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107
Created dataset python_xlang_storage_write1680598229ab2112 in project
apache-beam-testing
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110
expansion port: 33201
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
Deleting dataset python_xlang_storage_write1680598229ab2112 in project
apache-beam-testing
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121:
DeprecationWarning: pkg_resources is deprecated as an API
warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
18 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
13 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.iam')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.rpc')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
pkg_resources.declare_namespace(__name__)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42:
DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use
"async def" instead
def call(self, fn, *args, **kwargs):
apache_beam/typehints/pandas_type_compatibility_test.py:67
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
}).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(475, 575), name='another_index'),
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2030:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2036:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
- grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "failed to connect to all addresses; last error: UNKNOWN:
ipv4:127.0.0.1:33201: Failed to connect to remote host: Connection refused"
debug_error_string = "UNKNOWN:failed to connect to all addresses; last
error: UNKNOWN: ipv4:127.0.0.1:33201: Failed to connect to remote host:
Connection refused {grpc_status:14,
created_time:"2023-04-04T08:50:30.348762596+00:00"}"
>
= 1 failed, 2 passed, 9 skipped, 6873 deselected, 48 warnings in 1220.73s (0:20:20) =
> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava FAILED
> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguageCleanup
Stopping expansion service pid: 196195.
Skipping invalid pid: 196196.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 156205
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 22m 8s
108 actionable tasks: 72 executed, 32 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/j7e3bvwz6q5zy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]