See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2264/display/redirect?page=changes>
Changes:
[Etienne Chauchot] [BEAM-12591] Put Spark Structured Streaming runner sources back to main
[Etienne Chauchot] [BEAM-12629] As spark DataSourceV2 is only available for spark 2,
[Etienne Chauchot] [BEAM-12627] Deal with spark Encoders breaking change between spark 2 and
[Etienne Chauchot] [BEAM-12591] move SchemaHelpers to correct package
[Etienne Chauchot] [BEAM-8470] Disable wait for termination in a streaming pipeline because
[Etienne Chauchot] [BEAM-12630] Deal with breaking change in streaming pipelines start by
[Etienne Chauchot] [BEAM-12629] Make source tests spark version agnostic and move them back
[Etienne Chauchot] [BEAM-12629] Make a spark 3 source impl
[Etienne Chauchot] [BEAM-12591] Fix checkstyle and spotless
[Etienne Chauchot] [BEAM-12629] Reduce serializable to only needed classes and Fix schema
[Etienne Chauchot] [BEAM-12591] Add checkstyle exceptions for version specific classes
[Etienne Chauchot] [BEAM-12629] Fix sources javadocs and improve impl
[Etienne Chauchot] [BEAM-12591] Add spark 3 to structured streaming validates runner tests
[noreply] [BEAM-6516] Fixes race condition in RabbitMqIO causing duplicate acks
------------------------------------------
[...truncated 355.13 KB...]
    at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
    at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
    at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
    at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
    at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
  BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
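
Note: the plan above is Beam SQL's project and filter push-down at work: only the four referenced fields are read, and the supported filter is handed to the BigQuery Storage read API. A minimal sketch of how a query of this shape is applied with Beam SQL, assuming a PCollection<Row> named hackerNews carrying the HACKER_NEWS schema (illustrative, not the IT's actual code):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // `hackerNews` is an assumed input; SqlTransform exposes it as PCOLLECTION.
    PCollection<Row> result =
        hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS `author`, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));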
Aug 05, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Aug 05, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 05, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115864 bytes, hash 1b7eda3a071ec23cdf1d54b5d9f0eb42a9b3380f91a746db8b37e91313106ea4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-G37aOgcewjzfHVS12fDrQqmzOA-Rp0bbizfpExMQbqQ.pb
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /tmp/test5213171920567842883.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6vm5Txh9PP0wqhHbEqdfmZc-K66HxA5wo2gNuc37Brs.jar
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-YjcXWNInX9ekye2Ilinimy8QNJZBoZtCQNEtODTeKIs.jar
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests-z5Sh9dWo0DAeG5w5NaSrkooLfsJ26c7yIXDik9SzDm8.jar
Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 244 files cached, 4 files newly uploaded in 0 seconds
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
Aug 05, 2021 12:45:25 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
    at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    at java.lang.Thread.run(Thread.java:748)
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s4
Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_05_45_25-2918618145126024327?project=apache-beam-testing
Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-08-05_05_45_25-2918618145126024327
Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-05_05_45_25-2918618145126024327
Aug 05, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-08-05T12:45:29.066Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:35.564Z: Worker configuration: e2-standard-2 in us-central1-c.
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.261Z: Expanding CoGroupByKey operations into optimizable parts.
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.317Z: Expanding GroupByKey operations into optimizable parts.
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.348Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.410Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.448Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.483Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.510Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.885Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Aug 05, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:36.976Z: Starting 5 workers in us-central1-c...
Aug 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:45:57.614Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 05, 2021 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:46:16.368Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
Aug 05, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:46:43.359Z: Workers have started successfully.
Aug 05, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:46:43.423Z: Workers have started successfully.
Aug 05, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:47:12.912Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Aug 05, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:47:13.077Z: Cleaning up.
Aug 05, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:47:13.150Z: Stopping worker pool...
Aug 05, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:49:39.839Z: Autoscaling: Resized worker pool from 5 to 0.
Aug 05, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:49:39.904Z: Worker pool stopped.
Aug 05, 2021 12:49:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-08-05_05_45_25-2918618145126024327 finished with status DONE.
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
Load test results for test (ID): c7e99327-1572-43d4-86e7-71937e6b8141 and timestamp: 2021-08-05T12:49:45.486000000Z:
                 Metric:                    Value:
             fields_read                 4375276.0
               read_time                     8.862

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
Aug 05, 2021 12:49:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
WARNING: Missing property -- measurement/database. Metrics won't be published.
Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed

Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 36.652 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 5m 25s
152 actionable tasks: 101 executed, 51 from cache
Publishing build scan...
https://gradle.com/s/i4y474hum3qv4
Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]