See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/16/display/redirect?page=changes>
Changes:

[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk

[Pablo Estrada] Output successful rows from BQ Streaming Inserts

[schapman] BEAM-13439 Type annotation for ptransform_fn

[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)

------------------------------------------
[...truncated 2.87 MB...]
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.'
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 0
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:164 Creating insecure control channel for localhost:41753.
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:172 Control channel established.
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:215 Initializing SDKHarness with unbounded number of workers.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 131-1'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 131-2'
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:807 Creating insecure state channel for localhost:35105.
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:814 State channel established.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 131-3'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 131-4'
INFO apache_beam.runners.worker.data_plane:data_plane.py:750 Creating client data channel for localhost:38727
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_a891fb01-ac41-4721-8d6e-0764d840174e to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00000-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_a891fb01-ac41-4721-8d6e-0764d840174e', shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_78177a81-2072-4022-b70b-8cbb7034ffcc to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00000-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_78177a81-2072-4022-b70b-8cbb7034ffcc', shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_01932417-e0a2-40f2-9139-227fac3a5b0f to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00000-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_01932417-e0a2-40f2-9139-227fac3a5b0f', shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_bbbfe2a6-7fd2-444e-ab9f-45dd84297c3d to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00001-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_bbbfe2a6-7fd2-444e-ab9f-45dd84297c3d', shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_c5453e23-5e99-459f-8a65-7f3ab3b8f8c1 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00001-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_c5453e23-5e99-459f-8a65-7f3ab3b8f8c1', shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_b65aeea1-226e-4556-bf32-9975786661bc to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00001-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_b65aeea1-226e-4556-bf32-9975786661bc', shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_b560b230-5392-440d-96d2-304130d3df9a to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00002-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_b560b230-5392-440d-96d2-304130d3df9a', shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_93a4795d-a61a-4d60-933f-0d2901c8db7d to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00002-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_93a4795d-a61a-4d60-933f-0d2901c8db7d', shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_cbdc0846-7ee0-4065-b252-7eecc9b2a478 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00002-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_cbdc0846-7ee0-4065-b252-7eecc9b2a478', shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_a7576540-640b-43aa-85a1-388816a5ebc6 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00003-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5896756308365160654_a7576540-640b-43aa-85a1-388816a5ebc6', shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_cc6d5b64-697d-4b2d-b81b-2147e5547b04 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00003-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/5071369618668368078_cc6d5b64-697d-4b2d-b81b-2147e5547b04', shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_b87081a1-09e1-4a4f-857c-63cf941b2b02 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-0ba0c961-7e10-44e5-85c1-59b9b2639b7b as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00003-of-00004. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp0319c52e-3a09-43c2-9119-41a005cf7c13/2384709940919684914_b87081a1-09e1-4a4f-857c-63cf941b2b02', shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356307200.0, 1356393600.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356220800.0, 1356307200.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356393600.0, 1356480000.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.04423856735229492 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.048372745513916016 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.04070782661437988 seconds.
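The window bounds in the FileResult entries above are Unix timestamps (e.g. [1356307200.0, 1356393600.0) is the day 2012-12-24 UTC), and the shard names encode them as ISO timestamps. A small sketch of that mapping — `window_to_name` is a hypothetical helper for illustration, not Beam's actual naming code:

```python
from datetime import datetime, timezone

def window_to_name(start, end, shard, total, prefix="output.csv"):
    """Format Unix-timestamp window bounds the way the shard names above do."""
    def fmt(ts):
        # The bounds are UTC epoch seconds; render them as ISO timestamps.
        return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    return f"{prefix}-{fmt(start)}-{fmt(end)}-{shard:05d}-of-{total:05d}"

# Matches the first move entry in the log: window [1356307200.0, 1356393600.0),
# shard 0 of 4.
print(window_to_name(1356307200.0, 1356393600.0, 0, 4))
```

The destination name is derived entirely from the window and shard index, which is why each of the three daily windows in the flight_delays_it output gets its own set of four shards.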
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:external:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'payload: "\\n\\021\\n\\017localhost:45383"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:bytes:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:string_utf8:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:kv:v1"'
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:244 No more requests from control plane
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:bool:v1"'
ERROR apache_beam.runners.worker.data_plane:data_plane.py:641 Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 634, in _read_inputs
    for elements in elements_iterator:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.CANCELLED
    details = "Multiplexer hanging up"
    debug_error_string = "{"created":"@1644343923.275773340","description":"Error received from peer ipv6:[::1]:38727","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:245 SDK Harness waiting for in-flight requests to complete
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:varint:v1"'
INFO apache_beam.runners.worker.data_plane:data_plane.py:782 Closing all cached grpc data channels.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:double:v1"'
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:826 Closing all cached gRPC state handlers.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:iterable:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:timer:v1"'
INFO apache_beam.runners.worker.sdk_worker:sdk_worker.py:257 Done consuming work.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:interval_window:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:length_prefix:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:global_window:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:windowed_value:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:param_windowed_value:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:state_backed_iterable:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:custom_window:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:row:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:coder:sharded_key:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:protocol:progress_reporting:v0"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:protocol:harness_monitoring_infos:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:protocol:worker_status:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:combinefn:packed_python:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:version:sdk_base:apache/beam_python3.7_sdk:2.37.0.dev"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'capabilities: "beam:transform:to_string:v1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'22/02/08 18:12:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0208181137-3c2d3c4f_da091925-d871-4979-8f69-111f68f2a085 finished.'
INFO apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 0.0433964729309082 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 0.05379056930541992 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 0.029974937438964844 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 12 files in 0.03670692443847656 seconds.
PASSED [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:188 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[set_column_DataFrame_139909278743248], ComputedExpression[set_index_DataFrame_139909278743760], ComputedExpression[pre_combine_sum_DataFrame_139909275999248]]:139908864531152]> for Stage[inputs={PlaceholderExpression[placeholder_DataFrame_139908941668944]}, partitioning=Arbitrary, ops=[ComputedExpression[set_column_DataFrame_139909278743248], ComputedExpression[set_index_DataFrame_139909278743760], ComputedExpression[pre_combine_sum_DataFrame_139909275999248]], outputs={ComputedExpression[pre_combine_sum_DataFrame_139909275999248], PlaceholderExpression[placeholder_DataFrame_139908941668944]}]
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[post_combine_sum_DataFrame_139909275999568]]:139909277633424]> for Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_139909275999248]}, partitioning=Index, ops=[ComputedExpression[post_combine_sum_DataFrame_139909275999568]], outputs={ComputedExpression[post_combine_sum_DataFrame_139909275999568]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory /tmp/.tempc0aee8a7-6649-446b-9317-70b170a29039
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
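The two dataframe stages above show the classic split of a sum into a pre-combine (partial sums computed per bundle, with Arbitrary partitioning) and a post-combine (partials merged after repartitioning by index). A plain-Python sketch of that two-phase combine — hypothetical helper names and toy data, not the Beam DataFrame API itself:

```python
from collections import Counter

def pre_combine(bundle):
    # Per-bundle partial word counts: the pre_combine_sum stage above.
    return Counter(bundle)

def post_combine(partials):
    # Merge partial counts after the shuffle: the post_combine_sum stage.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

# Two "bundles" of words, combined in two phases.
bundles = [["the", "cat"], ["the", "hat"]]
counts = post_combine(pre_combine(b) for b in bundles)
print(dict(counts))
```

Splitting the sum this way lets each worker shrink its bundle to one partial result before the shuffle, so only the small partials cross the network.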
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function annotate_downstream_side_inputs at 0x7f3f16c89710> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function fix_side_input_pcoll_coders at 0x7f3f16c89830> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f3f16c89d40> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function lift_combiners at 0x7f3f16c89dd0> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_sdf at 0x7f3f16c89f80> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_gbk at 0x7f3f16c83050> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sink_flattens at 0x7f3f16c83170> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function greedily_fuse at 0x7f3f16c83200> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function read_to_impulse at 0x7f3f16c83290> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function impulse_to_input at 0x7f3f16c83320> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f3f16c83560> 
==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function setup_timer_mapping at 0x7f3f16c834d0> ==================== INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function populate_data_channel_coders at 0x7f3f16c835f0> ==================== INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100 INFO apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f3eb5995a50> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'') INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write) INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running 
((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df) - /tmp/tmpvohk5v91.result/[ComputedExpression[post_combine_sum_DataFrame_139909275999568]]:139909277633424/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df) - /tmp/tmpvohk5v91.result/[ComputedExpression[post_combine_sum_DataFrame_139909275999568]]:139909277633424/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write) INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((((((ToPCollection(df) - 
/tmp/tmpvohk5v91.result/[ComputedExpression[post_combine_sum_DataFrame_139909275999568]]:139909277633424/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvohk5v91-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvohk5v91-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_139909275999568-with-indexes-ParDo-_Unbatch_46))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvohk5v91-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/Flatten/Write/0))+(WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49) INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvohk5v91-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/Flatten/Write/1) INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvohk5v91-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/GroupTempFilesByDestination/Write) INFO 
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (WriteToPandas(df) - /tmp/tmpvohk5v91.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvohk5v91-result-WriteToFiles-ParDo-_MoveTempFilesInt_44) INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file /tmp/.tempc0aee8a7-6649-446b-9317-70b170a29039/943171195407538307_ee49a162-a000-4ca1-9134-4c208b10e635 to dir: /tmp as tmpvohk5v91.result-00000-of-00001. Res: FileResult(file_name='/tmp/.tempc0aee8a7-6649-446b-9317-70b170a29039/943171195407538307_ee49a162-a000-4ca1-9134-4c208b10e635', shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, destination=None) INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window GlobalWindow INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: [] PASSED [100%] =============================== warnings summary =============================== apache_beam/io/filesystems_test.py:54 <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint: disable=anomalous-backslash-in-string apache_beam/io/filesystems_test.py:62 <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated 
and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead. dataset_ref = client.dataset(unique_dataset_name, project=project) apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported is_streaming_pipeline = p.options.view_as(StandardOptions).streaming apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported experiments = p.options.view_as(DebugOptions).experiments or [] apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported temp_location = p.options.view_as(GoogleCloudOptions).temp_location apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version.
Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py37.xml> -
===== 22 passed, 1 skipped, 5178 deselected, 39 warnings in 225.23 seconds =====

FAILURE: Build failed with an exception.
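The pandas FutureWarning raised from flight_delays.py describes the forward-compatible fix: select numeric columns explicitly (or pass `numeric_only=True`) instead of letting `mean()` silently drop non-numeric "nuisance" columns. A small sketch with a made-up frame loosely shaped like the example's airline data (column names are illustrative, not from the log):

```python
import pandas as pd

# Hypothetical frame mixing numeric and non-numeric columns.
df = pd.DataFrame({
    "airline": ["AA", "DL", "UA"],          # non-numeric "nuisance" column
    "departure_delay": [5.0, 12.0, 3.0],
    "arrival_delay": [7.0, 15.0, 1.0],
})

# Deprecated behavior: df.mean() silently drops the "airline" column.
# Forward-compatible option 1: select the valid columns first.
means = df[["departure_delay", "arrival_delay"]].mean()

# Forward-compatible option 2: make the intent explicit.
means_alt = df.mean(numeric_only=True)
```

Either form keeps the reduction's behavior unchanged while avoiding the TypeError that future pandas versions raise for implicit nuisance-column dropping.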
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 248

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 5m 55s
79 actionable tasks: 51 executed, 26 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ggcrlao7f4xzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
