See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2386/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12769] Adds support for expanding a Java cross-language transform
------------------------------------------
[...truncated 352.86 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2134971427]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
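Note: this failure is the one the exception message itself prescribes a fix for: a PCollection<Row> has no inferable coder, so a schema must be attached with PCollection.setRowSchema. A minimal, self-contained sketch of that pattern follows; the class name and the four fields are assumptions lifted from the SELECT list later in this log, not the real HACKER_NEWS table schema.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      // Assumed schema: only the four columns the test's query selects.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("by")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of(Arrays.asList("alice,story,hello,3")))
                .apply(ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void process(@Element String line, OutputReceiver<Row> out) {
                    String[] f = line.split(",");
                    out.output(Row.withSchema(SCHEMA)
                        .addValues(f[0], f[1], f[2], Integer.parseInt(f[3]))
                        .build());
                  }
                }))
                // Without this call, coder inference for Row fails with the
                // IllegalStateException shown above.
                .setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }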
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
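Note: the projection (usedFields) and filter the planner pushed down above are the SQL-shell equivalent of what a hand-written direct read expresses through BigQueryIO's withSelectedFields and withRowRestriction. A sketch under assumptions: the table reference is a placeholder, and actually running it needs the usual GCP pipeline options and credentials.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class DirectReadPushDown {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table
                .withMethod(Method.DIRECT_READ)
                // The Storage API reads only these columns...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and filters rows server-side, matching the pushed-down
                // predicate logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }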
    Sep 05, 2021 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2021 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112463 bytes, hash f41cb218a1041a39cf3cf7afdb72df9a847c5a7d57d926c1dad92dd8bd1e3f95> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9ByyGKEEGjnPPPev23LfmoR8Wn1X2SbB2tkt2L0eP5U.pb
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2707828256434146384.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-jf_V0bMI3mpHFu88TUVTELO7vySycrb3r_Bta5kQDEQ.jar
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2021 12:45:32 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
        at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
        at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
        at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
        at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
        at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
        at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
        at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
        at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
        at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
        at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
        at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
        at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
        at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
        at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
        at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
        at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
        at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
        at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
        at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
        at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
        at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
        at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
        at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
        at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
        at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
        at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
        at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
        at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
        at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
        at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
        at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
        at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
        at java.lang.Thread.run(Thread.java:748)
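Note: the SEVERE entry above is gRPC's orphaned-channel detector, not a test assertion: gRPC records the allocation site of every ManagedChannel and logs this RuntimeException when a channel is garbage-collected without an orderly shutdown. The remedy the message asks for looks like the following in generic gRPC code (a sketch of the pattern, not the Beam-internal call site that leaked here):

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdown {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity()
                .build();
        try {
          // ... use the channel ...
        } finally {
          // What the warning asks for: initiate shutdown, then wait until
          // awaitTermination() reports the channel is fully terminated.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }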
    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-04_17_45_33-10693535116842576941?project=apache-beam-testing
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-04_17_45_33-10693535116842576941
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-04_17_45_33-10693535116842576941
    Sep 05, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-05T00:45:36.848Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:43.482Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.128Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.164Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.200Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.273Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.307Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.348Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.812Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.900Z: Starting 5 workers in us-central1-c...
    Sep 05, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:07.834Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
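Note: the quota message above points at the Monitoring API's metricDescriptors.list/delete methods for pruning old custom metrics. A hedged cleanup sketch using the google-cloud-monitoring Java client follows; the project ID is taken from this job, the client methods should be verified against the client-library version you depend on, and the delete call is left commented out because deleting a descriptor also deletes its time series.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class PruneMetricDescriptors {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor d :
              client.listMetricDescriptors(ProjectName.of("apache-beam-testing"))
                  .iterateAll()) {
            // Only custom metric descriptors are subject to the
            // 100-descriptor behavior described in the log message above.
            if (d.getType().startsWith("custom.googleapis.com/")) {
              System.out.println("candidate for deletion: " + d.getName());
              // client.deleteMetricDescriptor(d.getName()); // destructive; verify first
            }
          }
        }
      }
    }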
    Sep 05, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:19.665Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:19.693Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
    Sep 05, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:29.998Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:53.614Z: Workers have started successfully.
    Sep 05, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:53.691Z: Workers have started successfully.
    Sep 05, 2021 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:24.873Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:25.026Z: Cleaning up.
    Sep 05, 2021 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:25.120Z: Stopping worker pool...
    Sep 05, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:49:42.646Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:49:42.694Z: Worker pool stopped.
    Sep 05, 2021 12:49:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-04_17_45_33-10693535116842576941 finished with status DONE.
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f926465d-b3e6-40ed-9814-76040c167621 and timestamp: 2021-09-05T00:49:48.149000000Z:
                     Metric:                    Value:
                   read_time                     9.682
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 12:49:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
Gradle Test Executor 3 finished executing tests.
> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 34.361 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 5m 28s
152 actionable tasks: 101 executed, 51 from cache
Publishing build scan...
https://gradle.com/s/4u2tvivgrblcw
Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure