See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/19/display/redirect?page=changes>
Changes:
[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk
[Pablo Estrada] Output successful rows from BQ Streaming Inserts
[schapman] BEAM-13439 Type annotation for ptransform_fn
[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)
------------------------------------------
[...truncated 272.89 KB...]
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 916, in create_execution_tree
    return collections.OrderedDict([(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 919, in <listcomp>
    get_operation(transform_id))) for transform_id in sorted(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 898, in get_operation
    transform_consumers = {
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <dictcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 899, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 806, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 903, in get_operation
    return transform_factory.create_operation(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1198, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1545, in create_par_do
    return _create_pardo_operation(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1588, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/internal/pickler.py", line 51, in loads
    return desired_pickle_lib.loads(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/internal/dill_pickler.py", line 289, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.8/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.8/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.8/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python3.8/site-packages/dill/_dill.py", line 462, in find_class
    return StockUnpickler.find_class(self, module, name)
AttributeError: Can't get attribute '_unpickle_block' on <module 'pandas._libs.internals' from '/usr/local/lib/python3.8/site-packages/pandas/_libs/internals.cpython-38-x86_64-linux-gnu.so'>
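Note: this AttributeError is characteristic of a pandas version skew between the environment that pickled the DoFn at pipeline-construction time and the pandas installed on the Dataflow worker; pandas._libs.internals._unpickle_block is only present in some pandas releases, so an older pandas on the unpickling side cannot resolve it. A minimal sketch of that hypothesis (illustrative only, not taken from this job):

    # Sketch of the suspected version skew (assumption, not from this log):
    # a DataFrame pickled under a newer pandas references
    # pandas._libs.internals._unpickle_block, which an older pandas lacks.
    import dill
    import pandas as pd

    payload = dill.dumps(pd.DataFrame({"a": [1, 2, 3]}))  # pickled under a newer pandas

    # Unpickling the same payload in an environment with an older pandas fails with:
    #   AttributeError: Can't get attribute '_unpickle_block' on
    #   <module 'pandas._libs.internals' ...>
    df = dill.loads(payload)  # succeeds only if both sides agree on pandas internals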
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:32:05.952Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:32:06.210Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:32:06.277Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:34:25.590Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:34:25.714Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-08T18:34:25.748Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-02-08_10_23_25-6390891360735482824 is in state JOB_STATE_FAILED
INFO  apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO  apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 0 files in 0.053968191146850586 seconds.
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string
apache_beam/io/gcp/bigquery.py:2437
apache_beam/io/gcp/bigquery.py:2437
apache_beam/io/gcp/bigquery.py:2437
apache_beam/io/gcp/bigquery.py:2437
apache_beam/io/gcp/bigquery.py:2437
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/io/gcp/bigquery.py:2439
apache_beam/io/gcp/bigquery.py:2439
apache_beam/io/gcp/bigquery.py:2439
apache_beam/io/gcp/bigquery.py:2439
apache_beam/io/gcp/bigquery.py:2439
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2463
apache_beam/io/gcp/bigquery.py:2463
apache_beam/io/gcp/bigquery.py:2463
apache_beam/io/gcp/bigquery.py:2463
apache_beam/io/gcp/bigquery.py:2463
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
apache_beam/dataframe/io.py:629
apache_beam/dataframe/io.py:629
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/dataframe/io.py>:629: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/io/gcp/tests/utils.py:63
apache_beam/io/gcp/tests/utils.py:63
apache_beam/io/gcp/tests/utils.py:63
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
apache_beam/io/gcp/bigquery.py:2138
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
apache_beam/io/gcp/bigquery.py:2144
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/io/gcp/bigquery_file_loads.py:1128
apache_beam/io/gcp/bigquery_file_loads.py:1128
apache_beam/io/gcp/bigquery_file_loads.py:1128
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1130
apache_beam/io/gcp/bigquery_file_loads.py:1130
apache_beam/io/gcp/bigquery_file_loads.py:1130
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/bigquery.py:2134
apache_beam/io/gcp/bigquery.py:2134
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/io/gcp/tests/utils.py:100
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays.py:45
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
======== 8 failed, 16 passed, 1 skipped, 61 warnings in 1539.45 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:examples FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 28m 32s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/w44ov6lm573yk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]