See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/14/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13848] Update numpy intersphinx link (#16767)

[noreply] [release-2.36.0] Fix JIRA link for 2.36 blog (#16771)

[noreply] [BEAM-13647] Use role for Go worker binary. (#16729)


------------------------------------------
[...truncated 2.74 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 WARN 
software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to 
retrieve the requested metadata.'
INFO     apache_beam.runners.worker.statecache:statecache.py:172 Creating state 
cache with size 0
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:164 Creating 
insecure control channel for localhost:45441.
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:172 Control 
channel established.
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:215 Initializing 
SDKHarness with unbounded number of workers.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 131-1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 131-2'
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:807 Creating 
insecure state channel for localhost:33677.
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:814 State channel 
established.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 131-3'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 131-4'
INFO     apache_beam.runners.worker.data_plane:data_plane.py:750 Creating 
client data channel for localhost:46653
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:12 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.'
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_441bffc2-e5ff-4e63-befe-60945db02eac
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00000-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_441bffc2-e5ff-4e63-befe-60945db02eac',
 shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_2f090387-2b28-4900-8680-8fc55c3b32ec
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00000-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_2f090387-2b28-4900-8680-8fc55c3b32ec',
 shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_9898fa08-f2c2-45b2-9d2f-eabacbef6652
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00000-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_9898fa08-f2c2-45b2-9d2f-eabacbef6652',
 shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_12609429-ca41-4cf3-aef7-7a6f3f3f5312
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00001-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_12609429-ca41-4cf3-aef7-7a6f3f3f5312',
 shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_3df734cb-c2ea-48ef-bc12-d6969471c1db
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00001-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_3df734cb-c2ea-48ef-bc12-d6969471c1db',
 shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_965ee60c-98c2-4b1a-b3dc-0e574fee4c34
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00001-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_965ee60c-98c2-4b1a-b3dc-0e574fee4c34',
 shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_0fbcf48b-a361-4beb-8d08-bc3166f0e2a2
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00002-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_0fbcf48b-a361-4beb-8d08-bc3166f0e2a2',
 shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_78999dcf-3040-4d1c-9f21-23077681f005
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00002-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_78999dcf-3040-4d1c-9f21-23077681f005',
 shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_8b46a52d-be77-4e6a-8f5a-3fe1d8e555a7
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00002-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_8b46a52d-be77-4e6a-8f5a-3fe1d8e555a7',
 shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_e777c774-1612-4699-9cea-7c316d897f65
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00003-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/2296956298787467726_e777c774-1612-4699-9cea-7c316d897f65',
 shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_a56cbb27-172b-43eb-9cf3-21ce96377e26
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00003-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/3458111738400001586_a56cbb27-172b-43eb-9cf3-21ce96377e26',
 shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_68bcee19-5f58-484c-a1f6-ad1d003f39dd
 to dir: 
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-90bed2d4-1817-43b0-9b32-27f42081ef17
 as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00003-of-00004. Res: 
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp2fd192ce-a2d3-44ea-91b1-b3d0fa9724c7/1631671061563887154_68bcee19-5f58-484c-a1f6-ad1d003f39dd',
 shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), 
pane=None, destination=None)
INFO     apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files 
for destination None and window [1356220800.0, 1356307200.0)
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 
0.03402066230773926 seconds.
INFO     apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in 
the temporary folder: []
INFO     apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files 
for destination None and window [1356307200.0, 1356393600.0)
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files 
for destination None and window [1356393600.0, 1356480000.0)
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 
0.02733755111694336 seconds.
INFO     apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in 
the temporary folder: []
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 
0.04362916946411133 seconds.
INFO     apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in 
the temporary folder: []
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:13 INFO 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing 
environment urn: "beam:env:external:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'payload: "\\n\\021\\n\\017localhost:38827"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:bytes:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:string_utf8:v1"'
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:244 No more 
requests from control plane
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:kv:v1"'
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:245 SDK Harness 
waiting for in-flight requests to complete
ERROR    apache_beam.runners.worker.data_plane:data_plane.py:641 Failed to read 
inputs in the data plane.
Traceback (most recent call last):
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>",
 line 634, in _read_inputs
    for elements in elements_iterator:
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>",
 line 426, in __next__
    return self._next()
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>",
 line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
        status = StatusCode.CANCELLED
        details = "Multiplexer hanging up"
        debug_error_string = 
"{"created":"@1644300553.846580955","description":"Error received from peer 
ipv6:[::1]:46653","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer
 hanging up","grpc_status":1}"
>
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:bool:v1"'
INFO     apache_beam.runners.worker.data_plane:data_plane.py:782 Closing all 
cached grpc data channels.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:varint:v1"'
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:826 Closing all 
cached gRPC state handlers.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:double:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:iterable:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:timer:v1"'
INFO     apache_beam.runners.worker.sdk_worker:sdk_worker.py:257 Done consuming 
work.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:interval_window:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:length_prefix:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:global_window:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:windowed_value:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:param_windowed_value:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:state_backed_iterable:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:custom_window:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:row:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:coder:sharded_key:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:protocol:progress_reporting:v0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:protocol:harness_monitoring_infos:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:protocol:worker_status:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:combinefn:packed_python:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:version:sdk_base:apache/beam_python3.7_sdk:2.37.0.dev"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'capabilities: "beam:transform:to_string:v1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: 
Hanged up for unknown endpoint.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'22/02/08 06:09:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-0208060851-7c24d6c1_d4747440-e045-4825-b17c-da879bae3a20 
finished.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 
Job state changed to DONE
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 
0.03216838836669922 seconds.
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 
0.050238609313964844 seconds.
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 4 files in 
0.03373980522155762 seconds.
INFO     apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of 
the input
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 12 files in 
0.0411527156829834 seconds.
PASSED                                                                   [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics 
-------------------------------- live log call ---------------------------------
INFO     root:pipeline.py:188 Missing pipeline option (runner). Executing 
pipeline using the default runner: DirectRunner.
INFO     root:transforms.py:182 Computing dataframe stage 
<ComputeStage(PTransform) 
label=[[ComputedExpression[set_column_DataFrame_140677047418768], 
ComputedExpression[set_index_DataFrame_140677047678544], 
ComputedExpression[pre_combine_sum_DataFrame_140676091688464]]:140677047417680]>
 for 
Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140677451204688]}, 
partitioning=Arbitrary, 
ops=[ComputedExpression[set_column_DataFrame_140677047418768], 
ComputedExpression[set_index_DataFrame_140677047678544], 
ComputedExpression[pre_combine_sum_DataFrame_140676091688464]], 
outputs={ComputedExpression[pre_combine_sum_DataFrame_140676091688464], 
PlaceholderExpression[placeholder_DataFrame_140677451204688]}]
INFO     root:transforms.py:182 Computing dataframe stage 
<ComputeStage(PTransform) 
label=[[ComputedExpression[post_combine_sum_DataFrame_140676091687760]]:140677047418192]>
 for 
Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_140676091688464]}, 
partitioning=Index, 
ops=[ComputedExpression[post_combine_sum_DataFrame_140676091687760]], 
outputs={ComputedExpression[post_combine_sum_DataFrame_140676091687760]}]
INFO     apache_beam.io.fileio:fileio.py:555 Added temporary directory 
/tmp/.tempaa80dd09-908e-4c53-b107-12cc145a4678
WARNING  root:environments.py:374 Make sure that locally built Python SDK 
docker image has Python 3.7 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.37.0.dev
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function annotate_downstream_side_inputs at 
0x7ff1f1e74710> ====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function fix_side_input_pcoll_coders at 0x7ff1f1e74830> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function pack_combiners at 0x7ff1f1e74d40> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function lift_combiners at 0x7ff1f1e74dd0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function expand_sdf at 0x7ff1f1e74f80> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function expand_gbk at 0x7ff1f1e76050> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sink_flattens at 0x7ff1f1e76170> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function greedily_fuse at 0x7ff1f1e76200> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function read_to_impulse at 0x7ff1f1e76290> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function impulse_to_input at 0x7ff1f1e76320> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sort_stages at 0x7ff1f1e76560> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function setup_timer_mapping at 0x7ff1f1e764d0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function populate_data_channel_coders at 0x7ff1f1e765f0> 
====================
INFO     apache_beam.runners.worker.statecache:statecache.py:172 Creating state 
cache with size 100
INFO     
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894
 Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
 object at 0x7ff1d4439310> for environment 
ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running 
((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running 
((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df)
 - 
/tmp/tmpvchc_jfq.result/[ComputedExpression[post_combine_sum_DataFrame_140676091687760]]:140677047418192/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df)
 - 
/tmp/tmpvchc_jfq.result/[ComputedExpression[post_combine_sum_DataFrame_140676091687760]]:140677047418192/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running ((((((((((((ToPCollection(df) - 
/tmp/tmpvchc_jfq.result/[ComputedExpression[post_combine_sum_DataFrame_140676091687760]]:140677047418192/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpvchc_jfq-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvchc_jfq-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_140676091687760-with-indexes-ParDo-_Unbatch_46))+(WriteToPandas(df)
 - 
/tmp/tmpvchc_jfq.result/WriteToFiles/Flatten/Write/0))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvchc_jfq-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df)
 - 
/tmp/tmpvchc_jfq.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49)
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running ((WriteToPandas(df) - 
/tmp/tmpvchc_jfq.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvchc_jfq-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df)
 - /tmp/tmpvchc_jfq.result/WriteToFiles/Flatten/Write/1)
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running ((WriteToPandas(df) - 
/tmp/tmpvchc_jfq.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvchc_jfq-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df)
 - /tmp/tmpvchc_jfq.result/WriteToFiles/GroupTempFilesByDestination/Write)
INFO     
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 
Running (WriteToPandas(df) - 
/tmp/tmpvchc_jfq.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpvchc_jfq-result-WriteToFiles-ParDo-_MoveTempFilesInt_44)
INFO     apache_beam.io.fileio:fileio.py:642 Moving temporary file 
/tmp/.tempaa80dd09-908e-4c53-b107-12cc145a4678/3770419459992350901_cb01a105-a12c-4665-84fd-ff40e3af17a4
 to dir: /tmp as tmpvchc_jfq.result-00000-of-00001. Res: 
FileResult(file_name='/tmp/.tempaa80dd09-908e-4c53-b107-12cc145a4678/3770419459992350901_cb01a105-a12c-4665-84fd-ff40e3af17a4',
 shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, 
destination=None)
INFO     apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files 
for destination None and window GlobalWindow
INFO     apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in 
the temporary folder: []
PASSED                                                                   [100%]

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
 DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: 
disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
 DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # 
pylint: disable=anomalous-backslash-in-string

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45:
 FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 
'numeric_only=None') is deprecated; in a future version this will raise 
TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
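The FutureWarning above concerns pandas silently dropping non-numeric ("nuisance") columns during reductions such as mean(). The fix it asks for is to be explicit about which columns to reduce; a small sketch with made-up flight-delay data (column names are illustrative, not the example's real schema):

```python
import pandas as pd

# Toy stand-in for the flight-delays DataFrame in the warning.
df = pd.DataFrame({
    "airline": ["AA", "DL"],          # non-numeric "nuisance" column
    "departure_delay": [10.0, 5.0],
    "arrival_delay": [12.0, 4.0],
})

# Instead of df.mean() (which silently drops "airline" today and will raise
# TypeError in a future pandas), request numeric columns explicitly.
means = df.mean(numeric_only=True)
```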

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:632:
 FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py37.xml>
 -
===== 22 passed, 1 skipped, 5178 deselected, 39 warnings in 306.14 seconds =====


FAILURE: Build failed with an exception.

* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>'
 line: 248

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py38:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 7m 19s
79 actionable tasks: 50 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mn645mge6nx7w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

