See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/3685/display/redirect>

Changes:


------------------------------------------
[...truncated 2.08 MB...]
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/sdf_utils.py>", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/io/restriction_trackers.py>", line 106, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 6) has 
not been claimed.

        at 
org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:177)
        at 
org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at 
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        ... 3 more
ERROR:root:java.lang.RuntimeException: Error received from SDK harness for 
instruction 3: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 258, in _execute
    response = task()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 315, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 484, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 519, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/bundle_processor.py>", line 985, in process_bundle
    element.data)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/bundle_processor.py>", line 221, in process_encoded
    self.output(decoded_value)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 356, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 218, in receive
    self.consumer.process(windowed_value)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 819, in process
    o)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/common.py>", line 1224, in process_with_sized_restriction
    watermark_estimator_state=estimator_state)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/common.py>", line 723, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/common.py>", line 872, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/sdf_utils.py>", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/io/restriction_trackers.py>", line 106, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 6) has 
not been claimed.

INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
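
Note on the failure above: OffsetRestrictionTracker.check_done() raises this ValueError whenever a splittable DoFn's process() returns while part of the restriction range (here [0, 6)) was never claimed via try_claim() and no deferral was requested. A minimal illustrative sketch of a splittable DoFn that satisfies this contract follows; the class and provider names are hypothetical and are not taken from the failing test, they only show the claim-everything pattern that avoids this error.

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class _EachOffsetProvider(RestrictionProvider):
        # Illustrative provider: one offset per character of the element.
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitEachChar(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(_EachOffsetProvider())):
            position = tracker.current_restriction().start
            # Claim every offset up to the restriction's stop. If process()
            # returned while offsets remained unclaimed (and without deferring
            # the remainder), check_done() would raise the ValueError seen above.
            while tracker.try_claim(position):
                yield element[position]
                position += 1
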
.ssssINFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:36619
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.5 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.5_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f117aa40400> ====================
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging 
artifacts for job_4b6c5f5a-9e0f-4edd-a5f5-24a9a2756a7e.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving 
artifacts for 
job_4b6c5f5a-9e0f-4edd-a5f5-24a9a2756a7e.ref_Environment_default_environment_1.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 
artifacts for job_4b6c5f5a-9e0f-4edd-a5f5-24a9a2756a7e.null.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts 
fully staged for job_4b6c5f5a-9e0f-4edd-a5f5-24a9a2756a7e.
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e
20/09/23 18:09:16 INFO org.apache.beam.runners.jobsubmission.JobInvocation: 
Starting job invocation 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has 
started a component necessary for the execution. Be sure to run the pipeline 
using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e
 on Spark master local
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e
 on Spark master local
20/09/23 18:09:17 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e:
 Pipeline translated successfully. Computing outputs
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:35711.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 32-1
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:38525.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:33187
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-3
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-4
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-5
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-6
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1600884556.4347603_ee1839dc-d85b-465d-99e1-4ab34132f70e
 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:45975
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.5 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.5_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f117aa40400> ====================
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging 
artifacts for job_bc8b9005-7232-4447-972a-ceddc340f359.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving 
artifacts for 
job_bc8b9005-7232-4447-972a-ceddc340f359.ref_Environment_default_environment_1.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 
artifacts for job_bc8b9005-7232-4447-972a-ceddc340f359.null.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts 
fully staged for job_bc8b9005-7232-4447-972a-ceddc340f359.
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6
20/09/23 18:09:17 INFO org.apache.beam.runners.jobsubmission.JobInvocation: 
Starting job invocation 
test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has 
started a component necessary for the execution. Be sure to run the pipeline 
using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6 on Spark 
master local
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6 on Spark 
master local
20/09/23 18:09:17 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6: 
Pipeline translated successfully. Computing outputs

> Task :sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixBatchLOOPBACK
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f71503c9840> ====================
20/09/23 18:09:15 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging 
artifacts for job_f7c27e46-37f1-42fb-9294-9e0e987decdd.
20/09/23 18:09:15 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving 
artifacts for 
job_f7c27e46-37f1-42fb-9294-9e0e987decdd.ref_Environment_default_environment_1.
20/09/23 18:09:15 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 
artifacts for job_f7c27e46-37f1-42fb-9294-9e0e987decdd.null.
20/09/23 18:09:15 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts 
fully staged for job_f7c27e46-37f1-42fb-9294-9e0e987decdd.
20/09/23 18:09:15 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596
20/09/23 18:09:15 INFO org.apache.beam.runners.jobsubmission.JobInvocation: 
Starting job invocation 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has 
started a component necessary for the execution. Be sure to run the pipeline 
using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596
 on Spark master local
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596
 on Spark master local
20/09/23 18:09:16 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596:
 Pipeline translated successfully. Computing outputs
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:33053.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 32-1
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:40703.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:45335
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-3
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-4
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-5
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 32-6
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1600884555.4256392_15e2701f-7f89-4250-a7e6-997124fb7596
 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:35845
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f71503c9840> ====================
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging 
artifacts for job_f634054c-0724-4ea9-8aca-c32bbb9e0820.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving 
artifacts for 
job_f634054c-0724-4ea9-8aca-c32bbb9e0820.ref_Environment_default_environment_1.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 
artifacts for job_f634054c-0724-4ea9-8aca-c32bbb9e0820.null.
20/09/23 18:09:16 INFO 
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts 
fully staged for job_f634054c-0724-4ea9-8aca-c32bbb9e0820.
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06
20/09/23 18:09:16 INFO org.apache.beam.runners.jobsubmission.JobInvocation: 
Starting job invocation 
test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has 
started a component necessary for the execution. Be sure to run the pipeline 
using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06 on Spark 
master local
20/09/23 18:09:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06 on Spark 
master local
20/09/23 18:09:17 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06: Pipeline 
translated successfully. Computing outputs
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:45939.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 33-1
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:45815.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:40421
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-3
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-4
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-5
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-6
20/09/23 18:09:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1600884556.359002_9532f1ca-acdf-46d3-bdb2-170d2cca2f06 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
----------------------------------------------------------------------
Ran 46 tests in 60.624s

OK (skipped=14)

> Task :sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLoopback
> Task :sdks:python:test-suites:portable:py37:sparkValidatesRunner

> Task :sdks:python:test-suites:portable:py35:sparkCompatibilityMatrixBatchLOOPBACK
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:46597.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 33-1
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:39179.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:41405
20/09/23 18:09:17 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/09/23 18:09:18 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-3
20/09/23 18:09:18 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-4
20/09/23 18:09:18 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-5
20/09/23 18:09:18 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 33-6
20/09/23 18:09:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1600884557.4028282_a3519c39-4b0d-4c1c-84d7-4615669137a6 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
----------------------------------------------------------------------
Ran 46 tests in 61.497s

OK (skipped=14)

> Task :sdks:python:test-suites:portable:py35:sparkCompatibilityMatrixLoopback
> Task :sdks:python:test-suites:portable:py35:sparkValidatesRunner

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 140

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py36:createProcessWorker'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 28s
74 actionable tasks: 53 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/dmpeui3x7aj3s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
