See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/18/display/redirect?page=changes>

Changes:

[david.prieto.rivera] Missing contribution

[noreply] Merge pull request #15848 from [BEAM-13835] An any-type implementation

[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define 
simple generators.

[noreply] [release-2.36.0][website] Fix github release notes script, header for

[noreply] Use shell to run python for setupVirtualenv (#16796)

[Daniel Oliveira] [BEAM-13830] Properly shut down Debezium expansion service in 
IT script.

[noreply] Merge pull request #16659 from [BEAM-13774][Playground] Add user to

[noreply] [BEAM-13776][Playground] (#16731)


------------------------------------------
[...truncated 98.69 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:start_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-RemoveExtractedFiles-_22"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:start_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/ref_PCollection_PCollection_12:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:process_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-RemoveExtractedFiles-_22"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ8PDw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_12"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AAD//////////3+AgICAgICAgIAB",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_13"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
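The metric payloads above are base64 strings. Assuming they follow Beam's MonitoringInfo convention (a single varint for sum_int64 counters; count, sum, min, max varints for distribution_int64), they can be decoded with a short sketch; the helper name is mine, not from the build:

```python
import base64

def decode_varints(data: bytes):
    """Decode a sequence of unsigned LEB128 varints from raw bytes."""
    values, shift, current = [], 0, 0
    for byte in data:
        current |= (byte & 0x7F) << shift
        if byte & 0x80:
            shift += 7
        else:
            values.append(current)
            current, shift = 0, 0
    return values

# sum_int64 payload from the log above: a single counter value.
print(decode_varints(base64.b64decode("AA==")))      # [0]
# distribution_int64 payload: count, sum, min, max.
print(decode_varints(base64.b64decode("AQ8PDw==")))  # [1, 15, 15, 15]
```

Read that way, the sampled_byte_size distribution for ref_PCollection_PCollection_12 records one sampled element of 15 bytes, which is at least internally consistent with the log.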
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-0a593811-3424-430d-95a5-2a32b3c12da7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-80a18606-405f-4c72-916a-003de93add3c'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-c08d122d-566f-4765-a131-f7e031b7d525'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:45157'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Feb 10, 2022 12:24:30 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 11 files in 0.040631771087646484 seconds.
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 11 files in 0.06815433502197266 seconds.
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 33 files in 0.059491872787475586 seconds.
PASSED                                                                   [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics 
-------------------------------- live log call ---------------------------------
INFO     root:pipeline.py:188 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO     root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[set_column_DataFrame_139646032046160], ComputedExpression[set_index_DataFrame_139645055959632], ComputedExpression[pre_combine_sum_DataFrame_139645028357584]]:139644680650512]> for Stage[inputs={PlaceholderExpression[placeholder_DataFrame_139645027559440]}, partitioning=Arbitrary, ops=[ComputedExpression[set_column_DataFrame_139646032046160], ComputedExpression[set_index_DataFrame_139645055959632], ComputedExpression[pre_combine_sum_DataFrame_139645028357584]], outputs={ComputedExpression[pre_combine_sum_DataFrame_139645028357584], PlaceholderExpression[placeholder_DataFrame_139645027559440]}]
INFO     root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[post_combine_sum_DataFrame_139645028358288]]:139644680648528]> for Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_139645028357584]}, partitioning=Index, ops=[ComputedExpression[post_combine_sum_DataFrame_139645028358288]], outputs={ComputedExpression[post_combine_sum_DataFrame_139645028358288]}]
INFO     apache_beam.io.fileio:fileio.py:555 Added temporary directory /tmp/.temp64b54178-fffd-493b-ab55-d376135978ec
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function annotate_downstream_side_inputs at 0x7f01b87797a0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function fix_side_input_pcoll_coders at 0x7f01b87798c0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f01b8779dd0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function lift_combiners at 0x7f01b8779e60> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_sdf at 0x7f01b8778050> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_gbk at 0x7f01b87780e0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sink_flattens at 0x7f01b8778200> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function greedily_fuse at 0x7f01b8778290> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function read_to_impulse at 0x7f01b8778320> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function impulse_to_input at 0x7f01b87783b0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f01b87785f0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function setup_timer_mapping at 0x7f01b8778560> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function populate_data_channel_coders at 0x7f01b8778680> ====================
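The banner lines above show the FnApiRunner applying its graph-optimization phases one after another. A rough, dependency-free sketch of that pattern (the stand-in phase functions below are hypothetical, not Beam's real implementations):

```python
from functools import reduce

# Hypothetical stand-ins for optimization phases named in the log;
# each takes the current list of stages and returns a transformed list.
def pack_combiners(stages):  return stages + ["pack_combiners"]
def lift_combiners(stages):  return stages + ["lift_combiners"]
def sort_stages(stages):     return stages + ["sort_stages"]

# The runner threads the stage graph through each phase in order.
phases = [pack_combiners, lift_combiners, sort_stages]
result = reduce(lambda stages, phase: phase(stages), phases, [])
print(result)  # ['pack_combiners', 'lift_combiners', 'sort_stages']
```

Each "====================" banner in the log marks one such phase being applied to the pipeline graph before execution.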
INFO     apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100
INFO     apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f01b42fdd50> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df) - /tmp/tmp3or4bx0m.result/[ComputedExpression[post_combine_sum_DataFrame_139645028358288]]:139644680648528/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df) - /tmp/tmp3or4bx0m.result/[ComputedExpression[post_combine_sum_DataFrame_139645028358288]]:139644680648528/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((((((ToPCollection(df) - /tmp/tmp3or4bx0m.result/[ComputedExpression[post_combine_sum_DataFrame_139645028358288]]:139644680648528/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3or4bx0m-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3or4bx0m-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_139645028358288-with-indexes-ParDo-_Unbatch_46))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3or4bx0m-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/Flatten/Write/0))+(WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49)
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3or4bx0m-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/Flatten/Write/1)
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3or4bx0m-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/GroupTempFilesByDestination/Write)
INFO     apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (WriteToPandas(df) - /tmp/tmp3or4bx0m.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3or4bx0m-result-WriteToFiles-ParDo-_MoveTempFilesInt_44)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file /tmp/.temp64b54178-fffd-493b-ab55-d376135978ec/2252589742617321183_104bc4b3-0de4-4afb-b83a-884ccc9af849 to dir: /tmp as tmp3or4bx0m.result-00000-of-00001. Res: FileResult(file_name='/tmp/.temp64b54178-fffd-493b-ab55-d376135978ec/2252589742617321183_104bc4b3-0de4-4afb-b83a-884ccc9af849', shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window GlobalWindow
INFO     apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
PASSED                                                                   [100%]

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
 DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: 
disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
 DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # 
pylint: disable=anomalous-backslash-in-string
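Both escape-sequence warnings above come from backslashes in plain string literals, where `\c` and `\d` are not recognized escapes. The usual fix, raw strings or doubled backslashes, can be sketched as:

```python
# The warned-about literal 'c:\\abc\cdf' relies on '\c' not being a real
# escape sequence; the safe spellings are a raw string or explicit
# doubled backslashes, which produce the same value without the warning.
raw = r'c:\abc\cdf'       # raw string: backslashes taken literally
escaped = 'c:\\abc\\cdf'  # every backslash explicitly escaped
print(raw == escaped)     # True
```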

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
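The deprecation message above names its own replacement: a fully qualified ID string (or a DatasetReference). A dependency-free sketch of building the string form it asks for; the project, dataset, and table names below are illustrative, not from the build:

```python
# Instead of client.dataset(dataset_id, project=project), newer
# google-cloud-bigquery accepts fully qualified ID strings.
project, dataset_id, table_id = "my_project", "my_dataset", "my_table"
dataset_str = f"{project}.{dataset_id}"   # e.g. for client.get_dataset(...)
table_str = f"{dataset_str}.{table_id}"   # e.g. for client.get_table(...)
print(table_str)  # my_project.my_dataset.my_table
```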

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45:
 FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 
'numeric_only=None') is deprecated; in a future version this will raise 
TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
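The FutureWarning above flags code that relies on pandas silently dropping non-numeric ("nuisance") columns during a reduction. A minimal sketch of the fix it suggests; the tiny DataFrame is illustrative, not data from the test:

```python
import pandas as pd

df = pd.DataFrame({"delay": [5.0, 7.0], "airline": ["AA", "UA"]})

# df.mean() would have silently dropped the non-numeric 'airline' column
# (the deprecated behavior). Selecting numeric columns first makes the
# intent explicit and avoids the FutureWarning.
means = df.select_dtypes("number").mean()
print(means["delay"])  # 6.0
```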

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/dataframe/io.py>:632:
 FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/pytest_postCommitExamples-flink-py37.xml> -
===== 22 passed, 1 skipped, 5184 deselected, 40 warnings in 542.68 seconds =====


> Task :sdks:python:test-suites:portable:py38:flinkExamples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 217

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:flinkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 21s
133 actionable tasks: 87 executed, 44 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jvgx6zpe2ziv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
