See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/64/display/redirect>

Changes:


------------------------------------------
[...truncated 120.77 KB...]
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==>
    dist_proc/windmill/client/streaming_rpc_client.cc:697
==>
    dist_proc/dax/workflow/****/streaming/merge_windows_fn.cc:230
==>
    dist_proc/dax/workflow/****/streaming/fnapi_streaming_operators.cc:439
[...the same generic::internal status and stack repeated verbatim many more times...]
Feb 22, 2022 4:07:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:07:30.502Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:09:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:09:41.183Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:11:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:11:44.875Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:12:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:12:33.920Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:17:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:17:39.300Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:19:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:19:40.393Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:20:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:20:45.605Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:21:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:21:45.148Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:25:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:25:45.582Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:32:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:32:51.684Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:33:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:33:51.790Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:34:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:34:56.577Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 22, 2022 4:37:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-22T16:37:59.787Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Build timed out (after 240 minutes). Marking the build as aborted.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-12' is disconnected.
        at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
        at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
        at com.sun.proxy.$Proxy121.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1213)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1205)
        at hudson.Launcher$ProcStarter.join(Launcher.java:522)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:806)
        at hudson.model.Build$BuildExecution.build(Build.java:198)
        at hudson.model.Build$BuildExecution.doRun(Build.java:163)
        at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
        at hudson.model.Run.execute(Run.java:1888)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:99)
        at hudson.model.Executor.run(Executor.java:432)
Caused by: java.io.IOException: Pipe closed after 0 cycles
        at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
        at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
        at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
        at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
        at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
        at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
        at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
        at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-12 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]