[ https://issues.apache.org/jira/browse/BEAM-11486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kenneth Knowles updated BEAM-11486:
-----------------------------------
Status: Open (was: Triage Needed)
> Spark test failure: org.apache.beam.sdk.testing.PAssertTest.testSerializablePredicate
> -------------------------------------------------------------------------------------
>
> Key: BEAM-11486
> URL: https://issues.apache.org/jira/browse/BEAM-11486
> Project: Beam
> Issue Type: Sub-task
> Components: runner-spark, test-failures
> Reporter: Tyson Hamilton
> Priority: P1
> Labels: flake, portability-spark
>
> From:
> [https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Streaming/464/testReport/org.apache.beam.sdk.testing/PAssertTest/testSerializablePredicate/]
>
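> The failure is raised by TestPipeline.verifyPAssertsSucceeded (see the stack trace below), which, after the pipeline finishes, compares the number of PAssert assertions in the pipeline against the PAssert success counter reported by the runner; "Expected 1 successful assertions, but found 0" means the runner never reported the single assertion as having run. For reference only, a minimal sketch of the PAssert-with-a-serializable-predicate pattern that this test exercises is below (class, method, and data names are hypothetical, not the actual PAssertTest code):
> {code:java}
> // Hedged sketch only: illustrates the PAssert + TestPipeline pattern whose success
> // counter verifyPAssertsSucceeded checks. Names here are hypothetical, not the real test.
> import java.io.Serializable;
> import org.apache.beam.sdk.testing.PAssert;
> import org.apache.beam.sdk.testing.TestPipeline;
> import org.apache.beam.sdk.transforms.Create;
> import org.apache.beam.sdk.transforms.SerializableFunction;
> import org.apache.beam.sdk.values.PCollection;
> import org.junit.Rule;
> import org.junit.Test;
>
> public class SerializablePredicateSketchTest implements Serializable {
>
>   // TestPipeline is the JUnit rule that runs the pipeline and then verifies PAssert metrics.
>   @Rule public final transient TestPipeline pipeline = TestPipeline.create();
>
>   @Test
>   public void testSerializablePredicateSketch() {
>     PCollection<Integer> input = pipeline.apply(Create.of(1, 2, 3));
>
>     // The checker function must be serializable because it is shipped to the workers.
>     PAssert.that(input)
>         .satisfies(
>             (SerializableFunction<Iterable<Integer>, Void>)
>                 elements -> {
>                   for (Integer e : elements) {
>                     if (e == null || e <= 0) {
>                       throw new AssertionError("unexpected element: " + e);
>                     }
>                   }
>                   return null;
>                 });
>
>     // TestPipeline.run() eventually calls verifyPAssertsSucceeded(), which is where the
>     // "Expected 1 successful assertions, but found 0" error above is raised when the
>     // runner's PAssert success counter comes back as 0.
>     pipeline.run();
>   }
> }
> {code}
>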
> {code:java}
> Regression
> org.apache.beam.sdk.testing.PAssertTest.testSerializablePredicate
> Failing for the past 1 build (Since #464 ) Took 36 sec.
>
> Error Message
> java.lang.AssertionError: Expected 1 successful assertions, but found 0.
> Expected: is <1L>
>      but: was <0L>
>
> Stacktrace
> java.lang.AssertionError: Expected 1 successful assertions, but found 0.
> Expected: is <1L>
>      but: was <0L>
> 	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
> 	at org.apache.beam.sdk.testing.TestPipeline.verifyPAssertsSucceeded(TestPipeline.java:516)
> 	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:354)
> 	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:334)
> 	at org.apache.beam.sdk.testing.PAssertTest.testSerializablePredicate(PAssertTest.java:209)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
> 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:322)
> 	at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:266)
> 	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
> 	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
> 	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:365)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
> 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
> 	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:330)
> 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:78)
> 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:328)
> 	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:65)
> 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:292)
> 	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:412)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> 	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> 	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> 	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> 	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> 	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> 	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> 	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> 	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> 	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
> 	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> 	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> 	at java.lang.Thread.run(Thread.java:748)
>
> Standard Output
> Shutting SDK harness down. Shutting SDK harness down. Shutting SDK harness down. Shutting SDK harness down. Shutting SDK harness down. Shutting SDK harness down.
>
> Standard Error
> 20/12/16 12:04:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45321
> 20/12/16 12:04:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43123
> 20/12/16 12:04:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:35739
> 20/12/16 12:04:53 INFO org.apache.beam.runners.portability.PortableRunner: Using job server endpoint: localhost:35739
> 20/12/16 12:04:53 INFO org.apache.beam.runners.portability.PortableRunner: PrepareJobResponse: preparation_id: "passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef" artifact_staging_endpoint { url: "localhost:45321" } staging_session_token: "passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef"
> 20/12/16 12:04:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef.
> 20/12/16 12:04:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef.EMBEDDED.
> 20/12/16 12:04:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 313 artifacts for passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef.null.
> 20/12/16 12:04:54 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for passerttest0testisequalto-jenkins-1216120453-2b2ce63e_4a6f6abb-37ba-4a63-9e43-51db42a3a5ef.
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83
> 20/12/16 12:04:54 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83
> 20/12/16 12:04:54 INFO org.apache.beam.runners.portability.PortableRunner: RunJobResponse: job_id: "passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83"
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 313 files. (Enable logging at DEBUG level to see which files will be staged.)
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83 on Spark master local[4]
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83 on Spark master local[4]
> 20/12/16 12:04:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job passerttest0testisequalto-jenkins-1216120453-2b2ce63e_e1417217-9119-4a42-bc69-9abbb1ca6c83: Pipeline translated successfully. Computing outputs
> 20/12/16 12:04:54 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing
>   [the WARN line above recurs many times between 12:04:54 and 12:05:11; the repetitions are omitted below]
> 20/12/16 12:05:00 INFO org.apache.beam.fn.harness.FnHarness: Fn Harness started
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
> 20/12/16 12:05:00 WARN org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: No worker_id header provided in control request
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id
> 20/12/16 12:05:00 INFO org.apache.beam.fn.harness.FnHarness: Entering instruction processing loop
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 5-2
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 5-3
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 5-4
> 20/12/16 12:05:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 5-5
> 20/12/16 12:05:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 5-6
> 20/12/16 12:05:01 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: Put new watermark block: {0=SparkWatermarks{lowWatermark=294247-01-09T04:00:54.775Z, highWatermark=294247-01-10T04:00:54.775Z, synchronizedProcessingTime=2020-12-16T12:04:54.218Z}, 1=SparkWatermarks{lowWatermark=294247-01-09T04:00:54.775Z, highWatermark=294247-01-10T04:00:54.775Z, synchronizedProcessingTime=2020-12-16T12:04:54.218Z}}
> 20/12/16 12:05:01 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608120294500 has completed, watermarks have been updated.
> 20/12/16 12:05:01 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608120295000
> 20/12/16 12:05:01 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608120295000 has completed, watermarks have been updated.
>   [the two INFO lines above repeat for every 500 ms batch from 1608120295500 through 1608120311000 (12:05:01-12:05:11), interleaved with the queueStream WARN line; the repetitions are omitted here]
> 20/12/16 12:05:11 INFO
> org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
> environment urn: "EMBEDDED" capabilities: "beam:coder:bytes:v1" capabilities:
> "beam:coder:bool:v1" capabilities: "beam:coder:varint:v1" capabilities:
> "beam:coder:string_utf8:v1" capabilities: "beam:coder:iterable:v1"
> capabilities: "beam:coder:timer:v1" capabilities: "beam:coder:kv:v1"
> capabilities: "beam:coder:length_prefix:v1" capabilities:
> "beam:coder:global_window:v1" capabilities: "beam:coder:interval_window:v1"
> capabilities: "beam:coder:windowed_value:v1" capabilities:
> "beam:coder:double:v1" capabilities: "beam:coder:row:v1" capabilities:
> "beam:coder:param_windowed_value:v1" capabilities:
> "beam:coder:state_backed_iterable:v1" capabilities:
> "beam:coder:sharded_key:v1" capabilities:
> "beam:protocol:multi_core_bundle_processing:v1" capabilities:
> "beam:protocol:progress_reporting:v1" capabilities:
> "beam:version:sdk_base:apache/beam_java8_sdk:2.27.0.dev" capabilities:
> "beam:transform:sdf_truncate_sized_restrictions:v1" dependencies { type_urn:
> "beam:artifact:type:file:v1" type_payload:
> "\n\244\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/1-EMBEDDED-icedtea-sound-Iwne2hzRY_LcFqVV9QV5vKlMjP51hQ-1eOLJzzvtpbI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n=icedtea-sound-Iwne2hzRY_LcFqVV9QV5vKlMjP51hQ-1eOLJzzvtpbI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\236\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/2-EMBEDDED-jaccess-GSuLz3csmJu-X9AUZiHTbJqpaMTUV566wbey67oiHdQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n7jaccess-GSuLz3csmJu-X9AUZiHTbJqpaMTUV566wbey67oiHdQ.jar" } dependencies {
> type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\241\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/3-EMBEDDED-localedata-TEwJCunHd18wxAwBm3LvJNJx12solPe-xMDYaFol2sA.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:localedata-TEwJCunHd18wxAwBm3LvJNJx12solPe-xMDYaFol2sA.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\236\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/4-EMBEDDED-nashorn-SdgMcTpgWcQtx3JL51KmsDXzNAZxettGu89S8K56auc.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n7nashorn-SdgMcTpgWcQtx3JL51KmsDXzNAZxettGu89S8K56auc.jar" } dependencies {
> type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\237\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/5-EMBEDDED-cldrdata-OUiyQiNqFFtu58zOPukvi1Butg3ZPuK250n_3RmVExU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n8cldrdata-OUiyQiNqFFtu58zOPukvi1Butg3ZPuK250n_3RmVExU.jar" } dependencies
> { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\234\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/6-EMBEDDED-dnsns-R-BZtd9v6E5wbD8eGFy7fqsxltVReeEAFsXCJxHGNrw.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n5dnsns-R-BZtd9v6E5wbD8eGFy7fqsxltVReeEAFsXCJxHGNrw.jar" } dependencies {
> type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\244\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/7-EMBEDDED-gradle-worker-qf5IiwZXu67peCt-moRUDeA9ftsB4QLMU30y-NdWzC8.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n=gradle-worker-qf5IiwZXu67peCt-moRUDeA9ftsB4QLMU30y-NdWzC8.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\277\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/8-EMBEDDED-beam-runners-spark-2.27.0-SNAPSHOT-tests-VIUwWCUBH9VYrRfgJ3JgkXOH_rL8lwAXCrO3No5eqhs.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nXbeam-runners-spark-2.27.0-SNAPSHOT-tests-VIUwWCUBH9VYrRfgJ3JgkXOH_rL8lwAXCrO3No5eqhs.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\271\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/9-EMBEDDED-beam-runners-spark-2.27.0-SNAPSHOT-roDcluFjvJNigr8KNT1mYws9zKKLnT9KBmLXJGHrakA.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nRbeam-runners-spark-2.27.0-SNAPSHOT-roDcluFjvJNigr8KNT1mYws9zKKLnT9KBmLXJGHrakA.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/10-EMBEDDED-beam-runners-portability-java-2.27.0-SNAPSHOT-tests-k6rmbO4nQYSnVfGyVKueSIwhszdOeM0Ln-vO"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\ncbeam-runners-portability-java-2.27.0-SNAPSHOT-tests-k6rmbO4nQYSnVfGyVKueSIwhszdOeM0Ln-vOOb1MPmI.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/11-EMBEDDED-beam-runners-portability-java-2.27.0-SNAPSHOT-rnaz2uH7bTuALGVp2OiUkziXEVGgWgkfy2FYrt0Gd4"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n]beam-runners-portability-java-2.27.0-SNAPSHOT-rnaz2uH7bTuALGVp2OiUkziXEVGgWgkfy2FYrt0Gd4w.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/12-EMBEDDED-beam-sdks-java-harness-2.27.0-SNAPSHOT-unshaded-nYh58N3Io3bOmnY4RSS60khe4fFu_STm-bLYDeed"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n_beam-sdks-java-harness-2.27.0-SNAPSHOT-unshaded-nYh58N3Io3bOmnY4RSS60khe4fFu_STm-bLYDeedV84.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\276\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/13-EMBEDDED-beam-sdks-java-harness-2.27.0-SNAPSHOT-gh6Mx8dlVqTJlqeMiGJpGfwQHxVyXfv71JfZxT9XlAs.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nVbeam-sdks-java-harness-2.27.0-SNAPSHOT-gh6Mx8dlVqTJlqeMiGJpGfwQHxVyXfv71JfZxT9XlAs.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/14-EMBEDDED-beam-runners-core-java-2.27.0-SNAPSHOT-tests-RDLdZGC6JzZSXd5C3QOfmvw0_5AsazsorSCUoRhjukg"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n\\beam-runners-core-java-2.27.0-SNAPSHOT-tests-RDLdZGC6JzZSXd5C3QOfmvw0_5AsazsorSCUoRhjukg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\276\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/15-EMBEDDED-beam-runners-core-java-2.27.0-SNAPSHOT-EoJ25ciIZMjsAsXPS5rF9lMtKKNpvgoVfoiRRL_Y070.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nVbeam-runners-core-java-2.27.0-SNAPSHOT-EoJ25ciIZMjsAsXPS5rF9lMtKKNpvgoVfoiRRL_Y070.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/16-EMBEDDED-beam-runners-java-job-service-2.27.0-SNAPSHOT-uX1c_5frzX2Xl8Dxwhv74l1LtK_tJN0lnW47YmuX_m"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n]beam-runners-java-job-service-2.27.0-SNAPSHOT-uX1c_5frzX2Xl8Dxwhv74l1LtK_tJN0lnW47YmuX_mg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\277\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/17-EMBEDDED-beam-sdks-java-io-kafka-2.27.0-SNAPSHOT--3tpjyY5fnm6MnYASABrYhNPATm_9NQWNJ5O0OzNd9g.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nWbeam-sdks-java-io-kafka-2.27.0-SNAPSHOT--3tpjyY5fnm6MnYASABrYhNPATm_9NQWNJ5O0OzNd9g.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/18-EMBEDDED-beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-Pq7-h5rVFSiymvpGKRHg_tR7ri-Pw5Fpm-Oku_A"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n`beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-Pq7-h5rVFSiymvpGKRHg_tR7ri-Pw5Fpm-Oku_AFSf0.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/19-EMBEDDED-beam-runners-java-fn-execution-2.27.0-SNAPSHOT-F7yo2w8-i6VnpVD9viLcI04VkkboYBM2c7mSZqfHp"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n^beam-runners-java-fn-execution-2.27.0-SNAPSHOT-F7yo2w8-i6VnpVD9viLcI04VkkboYBM2c7mSZqfHpLc.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/20-EMBEDDED-beam-runners-core-construction-java-2.27.0-SNAPSHOT-CIMmn5THHKVYsTOdd9ZB67FVO6Ru7B_q48Of"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\ncbeam-runners-core-construction-java-2.27.0-SNAPSHOT-CIMmn5THHKVYsTOdd9ZB67FVO6Ru7B_q48Of6usYPc4.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/21-EMBEDDED-beam-runners-core-construction-java-2.27.0-SNAPSHOT-tests-O6I6jZM0kJWyiJHL-qHgPRFSUwXEEj"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nibeam-runners-core-construction-java-2.27.0-SNAPSHOT-tests-O6I6jZM0kJWyiJHL-qHgPRFSUwXEEj-oJVNOB4027p4.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/22-EMBEDDED-beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-RGTlUSZ9ATn9KQPtZ4Q"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\ntbeam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-RGTlUSZ9ATn9KQPtZ4QuP0pbls8RBTH-gdR-EFJhkYM.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/23-EMBEDDED-beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-Pmfe35mZE8zLMJpjz-sptwayCpdMEM8Z04gJSvwMALs."
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n[beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-Pmfe35mZE8zLMJpjz-sptwayCpdMEM8Z04gJSvwMALs.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/24-EMBEDDED-beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-AYlO4ihGFJdqvbnpluofQsA8Z1QEUN"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nibeam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-AYlO4ihGFJdqvbnpluofQsA8Z1QEUN-0HOitHv_f_ww.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/25-EMBEDDED-beam-sdks-java-core-2.27.0-SNAPSHOT-tests-FB4KyxowVLZuqUVDK6zQ0kXL6qK9mF6mrJARQCHcw10.ja"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nYbeam-sdks-java-core-2.27.0-SNAPSHOT-tests-FB4KyxowVLZuqUVDK6zQ0kXL6qK9mF6mrJARQCHcw10.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\273\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/26-EMBEDDED-beam-sdks-java-core-2.27.0-SNAPSHOT-UrazWBt9fU5ArRv6sZ1H15t_HWtduxqPkHElrCI0asI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nSbeam-sdks-java-core-2.27.0-SNAPSHOT-UrazWBt9fU5ArRv6sZ1H15t_HWtduxqPkHElrCI0asI.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/27-EMBEDDED-beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-A11QO_xoqKlDnci-I1cfTYwtNTqmLjC18EwBHjNmHJg"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n\\beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-A11QO_xoqKlDnci-I1cfTYwtNTqmLjC18EwBHjNmHJg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/28-EMBEDDED-guava-testlib-25.1-jre-Gs-YhZbSzRrSwEz1JPl5w-hUaJjNF3C9wAfy8mjXtmQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFguava-testlib-25.1-jre-Gs-YhZbSzRrSwEz1JPl5w-hUaJjNF3C9wAfy8mjXtmQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/29-EMBEDDED-spark-sql_2.11-2.4.7-CsTKJqhM5x-U_FGx7jDozW9Fht4c_cDu74ZBnyGX6fw.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nDspark-sql_2.11-2.4.7-CsTKJqhM5x-U_FGx7jDozW9Fht4c_cDu74ZBnyGX6fw.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\262\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/30-EMBEDDED-spark-streaming_2.11-2.4.7-ue_7LAg7Qr813xkNgtQfU9eAIy07ReCHArqpz9Cv0Ew.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJspark-streaming_2.11-2.4.7-ue_7LAg7Qr813xkNgtQfU9eAIy07ReCHArqpz9Cv0Ew.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\261\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/31-EMBEDDED-spark-catalyst_2.11-2.4.7-HPCOgFf3FaqeYmff418i_Br9I2v89E_YDyEEJizcIU0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nIspark-catalyst_2.11-2.4.7-HPCOgFf3FaqeYmff418i_Br9I2v89E_YDyEEJizcIU0.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\255\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/32-EMBEDDED-spark-core_2.11-2.4.7-kfYudFrbOp2tu2iliTUL91CGXawNYRvw3hHz-VnbUFI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEspark-core_2.11-2.4.7-kfYudFrbOp2tu2iliTUL91CGXawNYRvw3hHz-VnbUFI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/33-EMBEDDED-hadoop-client-2.10.1-bZMdxnHBprJoceVSpimHjHyDAlR_8INTS5Ku1LylsAI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nDhadoop-client-2.10.1-bZMdxnHBprJoceVSpimHjHyDAlR_8INTS5Ku1LylsAI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\272\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/34-EMBEDDED-hadoop-mapreduce-client-app-2.10.1-NH4nvQw_Czc4M4WzRq2tj6-1cFEdzq5PBdwN4QG4RBE.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nRhadoop-mapreduce-client-app-2.10.1-NH4nvQw_Czc4M4WzRq2tj6-1cFEdzq5PBdwN4QG4RBE.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/35-EMBEDDED-hadoop-mapreduce-client-jobclient-2.10.1-nfZ_Ocdnp24qWUZ6RM9JQ3wro8n2JRrCanuecVq7Ubg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nXhadoop-mapreduce-client-jobclient-2.10.1-nfZ_Ocdnp24qWUZ6RM9JQ3wro8n2JRrCanuecVq7Ubg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\276\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/36-EMBEDDED-hadoop-mapreduce-client-shuffle-2.10.1-UTrRBwBQKG40z_CG64RfISUBsktoZ6cl7KEunywMYfQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nVhadoop-mapreduce-client-shuffle-2.10.1-UTrRBwBQKG40z_CG64RfISUBsktoZ6cl7KEunywMYfQ.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\275\001/tmp/beam-artifact-staging/2707c96f1fced59993ea3da6444b6592acbca62f27616ed750ad7aad6fd28900/37-EMBEDDED-hadoop-yarn-server-nodemanager-2.10.1-tgNbEbkCuTUhAYpNN5XJZnAX1iNurbPv99ysUI6cCaU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nUhadoop-yarn-server-nodemanager-2.10.1-tgNbEbkCuTUhAYpNN5XJZnAX1iNurbPv99ysUI6cCaU.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" t ...[truncated
> 1095061 chars]...
> 1e09d6608/232-EMBEDDED-protobuf-java-3.12.0-qY7VoCcs3aa96Y_hXnlOPZnsBFVNuguo4KSf9c7MXp4.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nDprotobuf-java-3.12.0-qY7VoCcs3aa96Y_hXnlOPZnsBFVNuguo4KSf9c7MXp4.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/233-EMBEDDED-gson-2.8.6-yPtIOQVNKAswM_gA0fWpfeLwKOuLoutFitKH5Tbz8l8.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:gson-2.8.6-yPtIOQVNKAswM_gA0fWpfeLwKOuLoutFitKH5Tbz8l8.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\244\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/234-EMBEDDED-jsch-0.1.55-1JKxWm0uo_HMOcQiyVPEDBIokHPb6DYNmMD2-ex0_EQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n;jsch-0.1.55-1JKxWm0uo_HMOcQiyVPEDBIokHPb6DYNmMD2-ex0_EQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\266\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/235-EMBEDDED-htrace-core4-4.1.0-incubating-XUW3kEhXw-StNrO8xXvi0sXzCMabX2pYvYaqfUiiXvY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nMhtrace-core4-4.1.0-incubating-XUW3kEhXw-StNrO8xXvi0sXzCMabX2pYvYaqfUiiXvY.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/236-EMBEDDED-woodstox-core-5.0.3-ocBLZPv-IK6fLGCjvxYz_tZoiuMZNba9SkV6G7sugtQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCwoodstox-core-5.0.3-ocBLZPv-IK6fLGCjvxYz_tZoiuMZNba9SkV6G7sugtQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\250\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/237-EMBEDDED-stax2-api-4.2.1-Z4Vn5ItRpCxlxpnyZlOa09Z21LGlsK19iezoudV3JXk.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n?stax2-api-4.2.1-Z4Vn5ItRpCxlxpnyZlOa09Z21LGlsK19iezoudV3JXk.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\245\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/238-EMBEDDED-hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n<hamcrest-2.1-upOy46ViMiukMvChtTrdzFXLGIJTMZoCDtd_gk5pIFA.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\257\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/239-EMBEDDED-reflectasm-1.07-shaded-CKcOrbSydO2u_BGUwfdXBiGlGwqaoDaqFdzbe5J-fHY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFreflectasm-1.07-shaded-CKcOrbSydO2u_BGUwfdXBiGlGwqaoDaqFdzbe5J-fHY.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/240-EMBEDDED-minlog-1.2-pnjLGqj10D2QHJksdXQYQdmKm8PVXa0C6E1lMVxOYPI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:minlog-1.2-pnjLGqj10D2QHJksdXQYQdmKm8PVXa0C6E1lMVxOYPI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\257\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/241-EMBEDDED-j2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFj2objc-annotations-1.3-Ia8wySJnvWEiwOC00gzMtmQaN-r5VsZUDsRx1YTmSns.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\250\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/242-EMBEDDED-javaruntype-1.3-hGIPTL4YOPBHEuOnoizGWeK7lC6c27i1TX8uRrF9Q-g.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n?javaruntype-1.3-hGIPTL4YOPBHEuOnoizGWeK7lC6c27i1TX8uRrF9Q-g.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\244\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/243-EMBEDDED-ognl-3.1.12-dLY_oM2x1HGOaAfy7RAFrC8VpRORDWgDmvmlWRlhlek.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n;ognl-3.1.12-dLY_oM2x1HGOaAfy7RAFrC8VpRORDWgDmvmlWRlhlek.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\260\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/244-EMBEDDED-generics-resolver-2.0.1-LT7P9gZLIjSTlVVcCZQJSTcacMt8Dg_e23EVZrE_KTE.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nGgenerics-resolver-2.0.1-LT7P9gZLIjSTlVVcCZQJSTcacMt8Dg_e23EVZrE_KTE.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\265\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/245-EMBEDDED-auto-value-annotations-1.7.2-hzmNqWKhIQOhlXuyOcVW-OjWbKESv4q5msQEyAojhas.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nLauto-value-annotations-1.7.2-hzmNqWKhIQOhlXuyOcVW-OjWbKESv4q5msQEyAojhas.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/246-EMBEDDED-opencensus-api-0.24.0-9WGxzCZzhEKI5Zbd9btlloaKhHL9LLiZOVP8XANLI1I.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEopencensus-api-0.24.0-9WGxzCZzhEKI5Zbd9btlloaKhHL9LLiZOVP8XANLI1I.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/247-EMBEDDED-grpc-context-1.32.2-0H-oAV0WIUvlA0wvEXe_uKBRhqwRIR-eLAJr7PQ6844.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCgrpc-context-1.32.2-0H-oAV0WIUvlA0wvEXe_uKBRhqwRIR-eLAJr7PQ6844.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\260\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/248-EMBEDDED-google-extensions-0.5.1-iwhiythblUnzVf44PGxjgW0vGVKWNOAzrgbQEHqxELk.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nGgoogle-extensions-0.5.1-iwhiythblUnzVf44PGxjgW0vGVKWNOAzrgbQEHqxELk.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\265\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/249-EMBEDDED-flogger-system-backend-0.5.1-aF3jO1PrMTBJu-7n9LeoDdCejnVOlrBIo-2rLOuzZEI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nLflogger-system-backend-0.5.1-aF3jO1PrMTBJu-7n9LeoDdCejnVOlrBIo-2rLOuzZEI.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\255\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/250-EMBEDDED-zookeeper-jute-3.5.7-AjJ2xg8IPcfJGZqfwkbEb-l3XELhhcJ_RTiH4zyDtwU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nDzookeeper-jute-3.5.7-AjJ2xg8IPcfJGZqfwkbEb-l3XELhhcJ_RTiH4zyDtwU.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\263\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/251-EMBEDDED-audience-annotations-0.5.0-yCYx8Gx11Gv2Uk2V8NbC467xs-tKe1hMopZiTvDUdL4.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJaudience-annotations-0.5.0-yCYx8Gx11Gv2Uk2V8NbC467xs-tKe1hMopZiTvDUdL4.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\263\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/252-EMBEDDED-netty-handler-4.1.51.Final-RGGXDwT01euREq2UJVzhmHOUzmTebDyHaQvwhlyTYlg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJnetty-handler-4.1.51.Final-RGGXDwT01euREq2UJVzhmHOUzmTebDyHaQvwhlyTYlg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/253-EMBEDDED-netty-transport-native-epoll-4.1.51.Final-jhp8_fw4knNnkBZ4PgG68gbezxSQmhnAGpocbjesu9M.j"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nYnetty-transport-native-epoll-4.1.51.Final-jhp8_fw4knNnkBZ4PgG68gbezxSQmhnAGpocbjesu9M.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\262\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/254-EMBEDDED-hadoop-hdfs-client-2.10.1-PY6m6joD2T6vr-wkECaRkhrDk3J5PT6sa_ybU6-bjkQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nIhadoop-hdfs-client-2.10.1-PY6m6joD2T6vr-wkECaRkhrDk3J5PT6sa_ybU6-bjkQ.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\245\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/255-EMBEDDED-shims-0.7.45-jqsADNtdJKUQB6hTc082FGkyfte63kS6gYD9Pk-w_V0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n<shims-0.7.45-jqsADNtdJKUQB6hTc082FGkyfte63kS6gYD9Pk-w_V0.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/256-EMBEDDED-javax.ws.rs-api-2.0.1-OGB9Ym8iiNj7wbH4piw2nmOAbZoxOsfLxfnWyU9LRm0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEjavax.ws.rs-api-2.0.1-OGB9Ym8iiNj7wbH4piw2nmOAbZoxOsfLxfnWyU9LRm0.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/257-EMBEDDED-hk2-locator-2.4.0-b34-6kfr9-1W73UQVXEM-tNoQLzDY4PPOHxKljtBRHwGb48.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEhk2-locator-2.4.0-b34-6kfr9-1W73UQVXEM-tNoQLzDY4PPOHxKljtBRHwGb48.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\252\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/258-EMBEDDED-hk2-api-2.4.0-b34-brBxquoycBWsPaGNUGbDZMGjmXj0tvlGRBWGdcpbnO0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nAhk2-api-2.4.0-b34-brBxquoycBWsPaGNUGbDZMGjmXj0tvlGRBWGdcpbnO0.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\257\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/259-EMBEDDED-javax.inject-2.4.0-b34-_b-AoBuFQEW9QAS3xrH9wtqB20db-9CO1XTu_8-aexo.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFjavax.inject-2.4.0-b34-_b-AoBuFQEW9QAS3xrH9wtqB20db-9CO1XTu_8-aexo.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\261\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/260-EMBEDDED-javax.annotation-api-1.2-WQmzlso6K-ENDuoyx073jYFuG06tId4deN4fiQ0DPgQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nHjavax.annotation-api-1.2-WQmzlso6K-ENDuoyx073jYFuG06tId4deN4fiQ0DPgQ.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/261-EMBEDDED-jersey-guava-2.22.2-D9zHXQJa_0Ay07i-kJtaCCkTsn2VOtgt1d8q0prqY2s.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCjersey-guava-2.22.2-D9zHXQJa_0Ay07i-kJtaCCkTsn2VOtgt1d8q0prqY2s.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\264\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/262-EMBEDDED-osgi-resource-locator-1.0.1-d1ADvld-iAb1G25EK-EDPYO-LLIgciezSb4L8W5sCEM.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nKosgi-resource-locator-1.0.1-d1ADvld-iAb1G25EK-EDPYO-LLIgciezSb4L8W5sCEM.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\263\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/263-EMBEDDED-validation-api-1.1.0.Final-8517pyU-NfWsSAgewbwoxd-bMqxLfbIIU-Wo52v3sO0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJvalidation-api-1.1.0.Final-8517pyU-NfWsSAgewbwoxd-bMqxLfbIIU-Wo52v3sO0.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\246\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/264-EMBEDDED-janino-3.0.16-9h24Y75jpbOBXYwclGRugdEWiJXv3lrNNIPCQiG5xlU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n=janino-3.0.16-9h24Y75jpbOBXYwclGRugdEWiJXv3lrNNIPCQiG5xlU.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\260\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/265-EMBEDDED-commons-compiler-3.0.16-C4BjaTC2IZexotsEriG57Vz6yXFctE7dbcb2dfQXlwg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nGcommons-compiler-3.0.16-C4BjaTC2IZexotsEriG57Vz6yXFctE7dbcb2dfQXlwg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/266-EMBEDDED-antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\253\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/267-EMBEDDED-antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nBantlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\253\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/268-EMBEDDED-aircompressor-0.10-pUcavdyZqVk5q_wEBc3bIhPE-6Vh3pT4iNbmJVZugmw.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nBaircompressor-0.10-pUcavdyZqVk5q_wEBc3bIhPE-6Vh3pT4iNbmJVZugmw.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\257\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/269-EMBEDDED-parquet-jackson-1.10.1-m8RDI886Nr-xqxl_W48rE6OiYTuqBIm7JSszVTVipSg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFparquet-jackson-1.10.1-m8RDI886Nr-xqxl_W48rE6OiYTuqBIm7JSszVTVipSg.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/270-EMBEDDED-arrow-format-0.10.0-ITh71gEtmLvHCD80n5Vp3EeYzXLFt8mqcTCSu84ZOes.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCarrow-format-0.10.0-ITh71gEtmLvHCD80n5Vp3EeYzXLFt8mqcTCSu84ZOes.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/271-EMBEDDED-hppc-0.7.2-ez3WZh6D4xPXC0qoLFGAuzlTXlNqNDX6dB__lydDO2o.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:hppc-0.7.2-ez3WZh6D4xPXC0qoLFGAuzlTXlNqNDX6dB__lydDO2o.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\263\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/272-EMBEDDED-flatbuffers-1.2.0-3f79e055-dD-XMWCWum6FKJFOorBi9qAvyR7HPJilpGJA1tZ-aJg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJflatbuffers-1.2.0-3f79e055-dD-XMWCWum6FKJFOorBi9qAvyR7HPJilpGJA1tZ-aJg.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\270\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/273-EMBEDDED-animal-sniffer-annotations-1.18-R_BYUrSO6brv74D6PYzqYO-kdTwAExId1_5e7y5ccp0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nOanimal-sniffer-annotations-1.18-R_BYUrSO6brv74D6PYzqYO-kdTwAExId1_5e7y5ccp0.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\250\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/274-EMBEDDED-httpcore-4.4.13-4G6J1AlDJF_Po57FN82_zjdirs3o-cWXeA0rAMK0NCQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n?httpcore-4.4.13-4G6J1AlDJF_Po57FN82_zjdirs3o-cWXeA0rAMK0NCQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\245\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/275-EMBEDDED-jettison-1.1-N3lAKIsGQ8SHgBN_b2hXiTfh6lyitzgwqCDFCnt-2AE.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n<jettison-1.1-N3lAKIsGQ8SHgBN_b2hXiTfh6lyitzgwqCDFCnt-2AE.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\250\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/276-EMBEDDED-jaxb-impl-2.3.3-5ReNDHlIJH91oTxom_NvTV1JEKEh9xKqOyCulDdwadg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n?jaxb-impl-2.3.3-5ReNDHlIJH91oTxom_NvTV1JEKEh9xKqOyCulDdwadg.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\240\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/277-EMBEDDED-asm-3.1-Mz_1NpBDl1t-AxuLJyBpN0QYVHOOA4wfR_mNByogQ3o.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n7asm-3.1-Mz_1NpBDl1t-AxuLJyBpN0QYVHOOA4wfR_mNByogQ3o.jar" } dependencies {
> type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/278-EMBEDDED-java-xmlbuilder-0.4-aB5TxP_Vn6EgaIA7JZ46g9Q_B6R8ES50ihh97heesx8.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCjava-xmlbuilder-0.4-aB5TxP_Vn6EgaIA7JZ46g9Q_B6R8ES50ihh97heesx8.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/279-EMBEDDED-nimbus-jose-jwt-7.9-tPWEU-GAqYHrdEoZtNVq-xLxDD3TXnUxzFjubL9b8oY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCnimbus-jose-jwt-7.9-tPWEU-GAqYHrdEoZtNVq-xLxDD3TXnUxzFjubL9b8oY.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\247\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/280-EMBEDDED-json-smart-2.3-kD9IyKpMP2QmRAuNMt6J-h3COxFpq94l5OHQaKpncIs.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n>json-smart-2.3-kD9IyKpMP2QmRAuNMt6J-h3COxFpq94l5OHQaKpncIs.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/281-EMBEDDED-accessors-smart-1.2-DHwmXWL8AHEk3DK5EzbpxCcmUdYpvF-hpOTjvHWOsuQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCaccessors-smart-1.2-DHwmXWL8AHEk3DK5EzbpxCcmUdYpvF-hpOTjvHWOsuQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\242\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/282-EMBEDDED-asm-5.0.4-iWYY7YrmJwJSGni8e-QrfEkaCOaSChX4mj7N7DHpoiA.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n9asm-5.0.4-iWYY7YrmJwJSGni8e-QrfEkaCOaSChX4mj7N7DHpoiA.jar" } dependencies
> { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\242\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/283-EMBEDDED-ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n9ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar" } dependencies
> { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/284-EMBEDDED-antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCantlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/285-EMBEDDED-javassist-3.20.0-GA-12kQYvt3nCOBZAyPcqy6LCOHOwHCQ4ZtQcFdxMiEjqI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCjavassist-3.20.0-GA-12kQYvt3nCOBZAyPcqy6LCOHOwHCQ4ZtQcFdxMiEjqI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\246\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/286-EMBEDDED-flogger-0.5.1-tezRSD4EEZcBJ4b3SZaKYgY8GWTT7Pv5a6kqlXl7uPU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n=flogger-0.5.1-tezRSD4EEZcBJ4b3SZaKYgY8GWTT7Pv5a6kqlXl7uPU.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\262\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/287-EMBEDDED-checker-compat-qual-2.5.3-12ua_qYcfAgpCAI_DLwUJ_q5q9LfkVyLij56UJvMvG0.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nIchecker-compat-qual-2.5.3-12ua_qYcfAgpCAI_DLwUJ_q5q9LfkVyLij56UJvMvG0.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\261\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/288-EMBEDDED-netty-codec-4.1.51.Final-_3QaqjX3BIpr58cAqkhRv2Q5F2SOpbfAy62i84SMK-4.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nHnetty-codec-4.1.51.Final-_3QaqjX3BIpr58cAqkhRv2Q5F2SOpbfAy62i84SMK-4.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\300\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/289-EMBEDDED-netty-transport-native-unix-common-4.1.51.Final-FHWV_0ViQv0bMtHmzXgKZluNjgOk4b9W2ot3Kzh"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n_netty-transport-native-unix-common-4.1.51.Final-FHWV_0ViQv0bMtHmzXgKZluNjgOk4b9W2ot3Kzhjxtk.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\265\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/290-EMBEDDED-netty-transport-4.1.51.Final-5b4lnzWiRr9QStk-qPXfMYcrWr6_t1E4DquV1dyEDUQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nLnetty-transport-4.1.51.Final-5b4lnzWiRr9QStk-qPXfMYcrWr6_t1E4DquV1dyEDUQ.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\264\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/291-EMBEDDED-netty-resolver-4.1.51.Final-yKd3ZeSB-_WQbFlutEHeSQlrNUvK4DVrdASsXpY5k1A.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nKnetty-resolver-4.1.51.Final-yKd3ZeSB-_WQbFlutEHeSQlrNUvK4DVrdASsXpY5k1A.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\262\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/292-EMBEDDED-netty-buffer-4.1.51.Final-w8O3EOG1qN89YM1GAuCnQ0gdXmCeSqhS-iYp5OQS0kU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nInetty-buffer-4.1.51.Final-w8O3EOG1qN89YM1GAuCnQ0gdXmCeSqhS-iYp5OQS0kU.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\262\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/293-EMBEDDED-netty-common-4.1.51.Final-EQ4GUV9DkTorusI-GqeLf1muCdRmsAr1_POZpPmvG2s.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nInetty-common-4.1.51.Final-EQ4GUV9DkTorusI-GqeLf1muCdRmsAr1_POZpPmvG2s.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\245\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/294-EMBEDDED-minlog-1.3.0-97OZ06VHik8-DZi9HJ9HdmEZxmQUvDOqD2zeAGbyTMI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n<minlog-1.3.0-97OZ06VHik8-DZi9HJ9HdmEZxmQUvDOqD2zeAGbyTMI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\245\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/295-EMBEDDED-okhttp-2.7.5-iKyf0btR-CvMZkzB65wiXJDcQ4nWYCMbTMc3vr_n0Ko.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n<okhttp-2.7.5-iKyf0btR-CvMZkzB65wiXJDcQ4nWYCMbTMc3vr_n0Ko.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/296-EMBEDDED-hk2-utils-2.4.0-b34-cCEbH5GIGb9q-_adPRnUrm4qddbib2w5up8g645WEtc.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nChk2-utils-2.4.0-b34-cCEbH5GIGb9q-_adPRnUrm4qddbib2w5up8g645WEtc.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\271\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/297-EMBEDDED-aopalliance-repackaged-2.4.0-b34-XTywzs5yLHuoq5h7kxBTzbywyxKtXIyKdpHrb35gpks.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nPaopalliance-repackaged-2.4.0-b34-XTywzs5yLHuoq5h7kxBTzbywyxKtXIyKdpHrb35gpks.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\261\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/298-EMBEDDED-servlet-api-2.5-20081211-BodWCWmW_gD2BKw7ZnLW9mPcd36kqDBW4kDQRW535HI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nHservlet-api-2.5-20081211-BodWCWmW_gD2BKw7ZnLW9mPcd36kqDBW4kDQRW535HI.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\263\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/299-EMBEDDED-jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nJjakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\261\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/300-EMBEDDED-jakarta.activation-1.2.2-AhVnc-SunQSNFKVq011kS-6fEFKnkdBy3z3tPGVubho.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nHjakarta.activation-1.2.2-AhVnc-SunQSNFKVq011kS-6fEFKnkdBy3z3tPGVubho.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\257\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/301-EMBEDDED-jcip-annotations-1.0-1-T8z_g4Kq_FiZYsTtsmL2qlleNPHhHmEFfRxqluj8cyM.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nFjcip-annotations-1.0-1-T8z_g4Kq_FiZYsTtsmL2qlleNPHhHmEFfRxqluj8cyM.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\270\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/302-EMBEDDED-org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nOorg.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\251\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/303-EMBEDDED-javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n...@javax.json-1.0.4-dh3sqkht6wwuelhtqwiu7gusxpudelwxbmsogvm9vrq.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/304-EMBEDDED-icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\254\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/305-EMBEDDED-annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nCannotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\243\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/306-EMBEDDED-okio-1.6.0-EUvcH0czimi8vJWr8vXNxyvu7JGBLy_Ne1IcGTeHYmY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n:okio-1.6.0-EUvcH0czimi8vJWr8vXNxyvu7JGBLy_Ne1IcGTeHYmY.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\247\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/307-EMBEDDED-stax-api-1.0-2-6McOvXb5gslYKoLvgs9s4Up9WKSk3KXLe3_JiMgAibc.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n>stax-api-1.0-2-6McOvXb5gslYKoLvgs9s4Up9WKSk3KXLe3_JiMgAibc.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\242\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/308-EMBEDDED-guice-3.0-GlnQQh_9NVzAtwtC3xwumvdEyKLQyS2jefX8ovB_HSI.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n9guice-3.0-GlnQQh_9NVzAtwtC3xwumvdEyKLQyS2jefX8ovB_HSI.jar" } dependencies
> { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\247\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/309-EMBEDDED-javax.inject-1-kcdwRKUMSBY2wy2Rb9ickRinIZU5BFLIEGUID5V95_8.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n>javax.inject-1-kcdwRKUMSBY2wy2Rb9ickRinIZU5BFLIEGUID5V95_8.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\275\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/310-EMBEDDED-geronimo-jcache_1.0_spec-1.0-alpha-1-AHChLlj0kblXGTkTJSmaYpRTDubDziXlC9yYsLcAlmw.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nTgeronimo-jcache_1.0_spec-1.0-alpha-1-AHChLlj0kblXGTkTJSmaYpRTDubDziXlC9yYsLcAlmw.jar"
> } dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/311-EMBEDDED-mssql-jdbc-6.2.1.jre7-nPollFCuNHHS5uLD2K78ziNuPa74s3NNIdyTw6W76AY.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEmssql-jdbc-6.2.1.jre7-nPollFCuNHHS5uLD2K78ziNuPa74s3NNIdyTw6W76AY.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\250\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/312-EMBEDDED-aopalliance-1.0-Ct3sZw_tzT8RPFyAkdeDKA0j9146y4QbYanNsHk3agg.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\n?aopalliance-1.0-Ct3sZw_tzT8RPFyAkdeDKA0j9146y4QbYanNsHk3agg.jar" }
> dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
> "\n\256\001/tmp/beam-artifact-staging/d272049bfc04e55c494cb069356a651448009af4b1a90a9ee591f0b1e09d6608/313-EMBEDDED-cglib-2.2.1-v20090111-QuHfsmvsvxpjPyW0fjn8xCK4XnfkwEaNmkT4hfX6C-I.jar"
> role_urn: "beam:artifact:role:staging_to:v1" role_payload:
> "\nEcglib-2.2.1-v20090111-QuHfsmvsvxpjPyW0fjn8xCK4XnfkwEaNmkT4hfX6C-I.jar" }
> 20/12/16 12:08:59 INFO
> org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn
> Logging clients still connected during shutdown. 20/12/16 12:08:59 WARN
> org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown
> endpoint. 20/12/16 12:08:59 ERROR
> org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Failed to handle for
> url: "InProcessServer_58"
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException:
> CANCELLED: Multiplexer hanging up at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Status.asRuntimeException(Status.java:533)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ClientCalls$StreamObserverToCallListenerAdapter.onClose(ClientCalls.java:449)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:700)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:399)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:521)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:66)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:641)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$700(ClientCallImpl.java:529)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:703)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:692)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748) 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120540000 20/12/16 12:09:00
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120540000 has completed, watermarks have been
> updated. Dec 16, 2020 12:09:00 PM
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
> cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=426,
> target=directaddress:///InProcessServer_58} was not shutdown properly!!!
> ~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
> awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
> allocation site at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
> at
> org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
> at
> org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.lambda$getClientFor$0(BeamFnDataGrpcClient.java:116)
> at
> java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
> at
> org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.getClientFor(BeamFnDataGrpcClient.java:110)
> at
> org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.send(BeamFnDataGrpcClient.java:101)
> at
> org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.send(QueueingBeamFnDataClient.java:141)
> at
> org.apache.beam.fn.harness.BeamFnDataWriteRunner.registerForOutput(BeamFnDataWriteRunner.java:169)
> at
> org.apache.beam.fn.harness.data.PTransformFunctionRegistry.lambda$register$0(PTransformFunctionRegistry.java:108)
> at
> org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:301)
> at
> org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
> at
> org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748) 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:00 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120540500 20/12/16 12:09:00
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120540500 has completed, watermarks have been
> updated. 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120541000 20/12/16 12:09:01
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120541000 has completed, watermarks have been
> updated. 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:01 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120541500 20/12/16 12:09:01
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120541500 has completed, watermarks have been
> updated. 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120542000 20/12/16 12:09:02
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120542000 has completed, watermarks have been
> updated. 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:02 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120542500 20/12/16 12:09:02
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120542500 has completed, watermarks have been
> updated. 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120543000 20/12/16 12:09:03
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120543000 has completed, watermarks have been
> updated. 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:03 INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
> could be computed upon completion of batch: 1608120543500 20/12/16 12:09:03
> INFO
> org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
> Batch with timestamp: 1608120543500 has completed, watermarks have been
> updated. 20/12/16 12:09:04 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:04 WARN
> org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
> support checkpointing 20/12/16 12:09:04 WARN
> org.apache.spark.streaming.util.BatchedWriteAheadLog: BatchedWriteAheadLog
> Writer queue interrupted. 20/12/16 12:09:04 INFO
> org.apache.beam.runners.spark.SparkPipelineRunner: Job
> passerttest0testglobalwindowcontainsinanyorder-jenkins-1216120832-91d6670c_79bfe2a9-ea0a-43ae-883c-f0ed9de1c9a5
> finished. 20/12/16 12:09:04 WARN
> org.apache.spark.streaming.StreamingContext: StreamingContext has already
> been stopped 20/12/16 12:09:07 INFO
> org.apache.beam.runners.jobsubmission.InMemoryJobService: Getting job metrics
> for
> passerttest0testglobalwindowcontainsinanyorder-jenkins-1216120832-91d6670c_79bfe2a9-ea0a-43ae-883c-f0ed9de1c9a5
> 20/12/16 12:09:07 INFO
> org.apache.beam.runners.jobsubmission.InMemoryJobService: Finished getting
> job metrics for
> passerttest0testglobalwindowcontainsinanyorder-jenkins-1216120832-91d6670c_79bfe2a9-ea0a-43ae-883c-f0ed9de1c9a5
> Dec 16, 2020 12:09:07 PM
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
> cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=422,
> target=directaddress:///InProcessServer_59} was not shutdown properly!!!
> ~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
> awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
> allocation site at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
> at
> org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
> at
> org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:89)
> at
> org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:79)
> at
> org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.createBeamFnStateClient(BeamFnStateGrpcClientCache.java:75)
> at
> java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
> at
> org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.forApiServiceDescriptor(BeamFnStateGrpcClientCache.java:71)
> at
> org.apache.beam.fn.harness.control.ProcessBundleHandler.createBundleProcessor(ProcessBundleHandler.java:456)
> at
> org.apache.beam.fn.harness.control.ProcessBundleHandler.lambda$processBundle$0(ProcessBundleHandler.java:284)
> at
> org.apache.beam.fn.harness.control.ProcessBundleHandler$BundleProcessorCache.get(ProcessBundleHandler.java:572)
> at
> org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:279)
> at
> org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
> at
> org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748) Dec 16, 2020 12:09:07 PM
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
> cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=418,
> target=directaddress:///InProcessServer_54} was not shutdown properly!!!
> ~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
> awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
> allocation site at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
> at
> org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
> at
> org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
> at org.apache.beam.fn.harness.FnHarness.main(FnHarness.java:194) at
> org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.lambda$createEnvironment$0(EmbeddedEnvironmentFactory.java:100)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266) at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748) 20/12/16 12:09:07 INFO
> org.apache.beam.runners.jobsubmission.JobServerDriver: JobServer stopped on
> localhost:36035 20/12/16 12:09:07 INFO
> org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingServer
> stopped on localhost:39093 20/12/16 12:09:07 INFO
> org.apache.beam.runners.jobsubmission.JobServerDriver: Expansion stopped on
> localhost:43057
> {code}
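>
> Side note on the repeated "Channel ManagedChannelImpl{...} was not shutdown properly" messages in the log above: the warning itself states the remedy, i.e. call shutdown()/shutdownNow() and wait until awaitTermination() returns true. Below is a minimal, hypothetical sketch of that pattern using the plain io.grpc API (Beam relocates gRPC under org.apache.beam.vendor.grpc, as the stack traces show); it is illustration only, not Beam's actual harness cleanup code.
>
> {code:java}
> // Hypothetical helper, not taken from Beam's source: shows the shutdown
> // sequence the warning asks for on a leaked io.grpc.ManagedChannel.
> import io.grpc.ManagedChannel;
> import java.util.concurrent.TimeUnit;
>
> class ChannelCleanup {
>   // Begin an orderly shutdown, wait briefly, then force-cancel if still open.
>   static void closeQuietly(ManagedChannel channel) throws InterruptedException {
>     channel.shutdown();
>     if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
>       channel.shutdownNow();
>       channel.awaitTermination(5, TimeUnit.SECONDS);
>     }
>   }
> }
> {code}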
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)