See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/15/display/redirect?page=changes>
Changes:
[Kyle Weaver] Update Dataflow Python dev container images.
[Kiley Sok] Add java 17 to changes
[Daniel Oliveira] [BEAM-13732] Switch x-lang BigQueryIO expansion service to
GCP one.
[noreply] [BEAM-13858] Fix broken github action on :sdks:go:examples:wordCount
[Kiley Sok] add jira for runner v2
[noreply] [BEAM-13732] Go SDK BigQuery IO wrapper. Initial implementation.
[noreply] [BEAM-13732] Add example for Go BigQuery IO wrapper. (#16786)
[noreply] Update CHANGES.md with Go SDK milestones. (#16787)
[noreply] [BEAM-13193] Allow BeamFnDataOutboundObserver to flush elements.
------------------------------------------
[...truncated 98.33 MB...]
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Closing components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM
org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess
closeInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopping SessionDispatcherLeaderProcess.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.dispatcher.Dispatcher
terminateRunningJobs'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopping all currently running jobs of dispatcher
akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM
org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager
close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Closing the slot manager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService
stop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stop job leader service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM
org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager
suspend'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Suspending the slot manager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Shutting down TaskExecutorLocalStateStoresManager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl
lambda$getFileCloser$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
FileChannelManager removed spill file directory
/tmp/flink-io-5e8b1453-dfdb-44ee-b2eb-fade797fe040'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment
close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Shutting down the network environment and its components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl
lambda$getFileCloser$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
FileChannelManager removed spill file directory
/tmp/flink-netty-shuffle-0c33b776-81a9-4985-951c-5dd84cde8b21'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Shutting down the kvState service and its components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService
stop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stop job leader service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
removed file cache directory
/tmp/flink-dist-cache-6f5b1f0b-1cd0-452f-8b68-72d023f3305c'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.taskexecutor.TaskExecutor
handleOnStopException'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopping Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopping Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService
lambda$stopService$7'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopped Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Shutting down BLOB cache'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Shutting down BLOB cache'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.blob.BlobServer close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopped BLOB server at 0.0.0.0:44773'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 09,
2022 6:25:00 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService
lambda$stopService$7'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO:
Stopped Akka RPC service.'
INFO apache_beam.runners.portability.portable_runner:portable_runner.py:576
Job state changed to DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 11 files in
0.04628324508666992 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 11 files in
0.08922600746154785 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 11 files in
0.04170370101928711 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 33 files in
0.05970168113708496 seconds.
PASSED [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:188 Missing pipeline option (runner). Executing
pipeline using the default runner: DirectRunner.
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[set_column_DataFrame_140484368752400],
ComputedExpression[set_index_DataFrame_140484368750736],
ComputedExpression[pre_combine_sum_DataFrame_140485402533648]]:140484355528848]>
for
Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140483956168976]},
partitioning=Arbitrary,
ops=[ComputedExpression[set_column_DataFrame_140484368752400],
ComputedExpression[set_index_DataFrame_140484368750736],
ComputedExpression[pre_combine_sum_DataFrame_140485402533648]],
outputs={ComputedExpression[pre_combine_sum_DataFrame_140485402533648],
PlaceholderExpression[placeholder_DataFrame_140483956168976]}]
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[post_combine_sum_DataFrame_140485401602320]]:140484376920208]>
for
Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_140485402533648]},
partitioning=Index,
ops=[ComputedExpression[post_combine_sum_DataFrame_140485401602320]],
outputs={ComputedExpression[post_combine_sum_DataFrame_140485401602320]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory
/tmp/.temp5623238c-076a-4745-a57f-062dd9427495
WARNING root:environments.py:374 Make sure that locally built Python SDK
docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is
apache/beam_python3.7_sdk:2.37.0.dev
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function annotate_downstream_side_inputs at
0x7fc5248fd710> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function fix_side_input_pcoll_coders at 0x7fc5248fd830>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function pack_combiners at 0x7fc5248fdd40>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function lift_combiners at 0x7fc5248fddd0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_sdf at 0x7fc5248fdf80>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_gbk at 0x7fc5248fb050>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sink_flattens at 0x7fc5248fb170>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function greedily_fuse at 0x7fc5248fb200>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function read_to_impulse at 0x7fc5248fb290>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function impulse_to_input at 0x7fc5248fb320>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sort_stages at 0x7fc5248fb560>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function setup_timer_mapping at 0x7fc5248fb4d0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function populate_data_channel_coders at 0x7fc5248fb5f0>
====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state
cache with size 100
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
object at 0x7fc4f86e97d0> for environment
ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df)
-
/tmp/tmps_y5o9_z.result/[ComputedExpression[post_combine_sum_DataFrame_140485401602320]]:140484376920208/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df)
-
/tmp/tmps_y5o9_z.result/[ComputedExpression[post_combine_sum_DataFrame_140485401602320]]:140484376920208/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((((((((((((ToPCollection(df) -
/tmp/tmps_y5o9_z.result/[ComputedExpression[post_combine_sum_DataFrame_140485401602320]]:140484376920208/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmps_y5o9_z-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmps_y5o9_z-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_140485401602320-with-indexes-ParDo-_Unbatch_46))+(WriteToPandas(df)
-
/tmp/tmps_y5o9_z.result/WriteToFiles/Flatten/Write/0))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmps_y5o9_z-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df)
-
/tmp/tmps_y5o9_z.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
/tmp/tmps_y5o9_z.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmps_y5o9_z-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df)
- /tmp/tmps_y5o9_z.result/WriteToFiles/Flatten/Write/1)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
/tmp/tmps_y5o9_z.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmps_y5o9_z-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df)
- /tmp/tmps_y5o9_z.result/WriteToFiles/GroupTempFilesByDestination/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running (WriteToPandas(df) -
/tmp/tmps_y5o9_z.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmps_y5o9_z-result-WriteToFiles-ParDo-_MoveTempFilesInt_44)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file
/tmp/.temp5623238c-076a-4745-a57f-062dd9427495/4797183535214225774_851e6a98-68a3-487b-b713-938fad2659c2
to dir: /tmp as tmps_y5o9_z.result-00000-of-00001. Res:
FileResult(file_name='/tmp/.temp5623238c-076a-4745-a57f-062dd9427495/4797183535214225774_851e6a98-68a3-487b-b713-938fad2659c2',
shard_index=-1, total_shards=0, window=GlobalWindow, pane=None,
destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files
for destination None and window GlobalWindow
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in
the temporary folder: []
PASSED [100%]
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
DeprecationWarning: invalid escape sequence \c
self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint:
disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
DeprecationWarning: invalid escape sequence \d
self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), #
pylint: disable=anomalous-backslash-in-string
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
| _PassThroughThenCleanup(files_to_remove_pcoll))
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45:
FutureWarning: Dropping of nuisance columns in DataFrame reductions (with
'numeric_only=None') is deprecated; in a future version this will raise
TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/dataframe/io.py>:632:
FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/pytest_postCommitExamples-flink-py37.xml> -
===== 22 passed, 1 skipped, 5181 deselected, 40 warnings in 612.87 seconds =====
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
self.run()
File "/usr/lib/python3.7/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 651, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 634, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string =
"{"created":"@1644387490.517618558","description":"Error received from peer
ipv6:[::1]:45653","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer
hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
self.run()
File "/usr/lib/python3.7/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 651, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 634, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string =
"{"created":"@1644387807.032361336","description":"Error received from peer
ipv6:[::1]:33973","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer
hanging up","grpc_status":1}"
>
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 217
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py38:flinkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 24m 50s
133 actionable tasks: 99 executed, 32 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/i5r7rmufcg3qk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure