See
<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/6745/display/redirect?page=changes>
Changes:
[noreply] [Playground] Use current Go SDK by default (#24256)
[noreply] Fix dashboard links
------------------------------------------
[...truncated 342.84 KB...]
INFO: Monitor thread successfully connected to server with description
ServerDescription{address=146.148.86.139:27017, type=STANDALONE,
state=CONNECTED, ok=true, version=ServerVersion{versionList=[6, 0, 3]},
minWireVersion=0, maxWireVersion=17, maxDocumentSize=16777216,
logicalSessionTimeoutMinutes=30, roundTripTimeNanos=5486405}
Nov 21, 2022 4:55:40 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to
146.148.86.139:27017
Nov 21, 2022 4:55:41 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Nov 21, 2022 4:55:42 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Nov 21, 2022 4:55:42 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Nov 21, 2022 4:55:42 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 21, 2022 4:55:43 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 283 files. Enable logging at DEBUG level to see
which files will be staged.
Nov 21, 2022 4:55:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Nov 21, 2022 4:55:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 284 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Nov 21, 2022 4:55:48 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.44.0-SNAPSHOT-sqK5ZIBDuufDKf3k3RD3W3dJh4-3cX-eGaeTr7BOLr8.jar
Nov 21, 2022 4:55:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading /tmp/main7380338371314834805.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-5cc7zWIYGBMU7YLUQsU6vn-lCJ_QHFlFhCrsNPnj1WI.jar
Nov 21, 2022 4:55:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading /tmp/test7747745880519400729.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-0Pdx1WESqkeREhzgEVIBp79i5LWu_cpmfiBbYxyXnlw.jar
Nov 21, 2022 4:55:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading /tmp/test4135825672590742589.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-GEuKfH-cGKZJn-0RWl2ys_W1kHADEXsGAYcSW6wMfvQ.jar
Nov 21, 2022 4:55:49 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 281 files cached, 3 files newly uploaded in 1
seconds
Nov 21, 2022 4:55:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Nov 21, 2022 4:55:49 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <118876 bytes, hash
2bba4408c514fa3fd18ea1361c0ea3ccecda1bca3596e785024feae01f96089e> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-K7pECMUU-j_RjqE2HA6jzOzaG8o1lueFAk_q4B-WCJ4.pb
Nov 21, 2022 4:55:52 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
Nov 21, 2022 4:55:52 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Produce documents/Map as step s2
Nov 21, 2022 4:55:52 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect write time metric as step s3
Nov 21, 2022 4:55:52 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
Nov 21, 2022 4:55:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
Nov 21, 2022 4:55:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-21_08_55_52-5199153894683648621?project=apache-beam-testing
Nov 21, 2022 4:55:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-11-21_08_55_52-5199153894683648621
Nov 21, 2022 4:55:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-11-21_08_55_52-5199153894683648621
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:08.536Z: Worker configuration: e2-standard-2 in
us-central1-b.
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.045Z: Expanding CoGroupByKey operations into
optimizable parts.
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.073Z: Expanding GroupByKey operations into
optimizable parts.
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.100Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.176Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.208Z: Fusing consumer Produce documents/Map into
Generate sequence/Read(BoundedCountingSource)
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.230Z: Fusing consumer Collect write time metric
into Produce documents/Map
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.255Z: Fusing consumer Write documents to
MongoDB/ParDo(Write) into Collect write time metric
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.601Z: Executing operation Generate
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time
metric+Write documents to MongoDB/ParDo(Write)
Nov 21, 2022 4:56:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:10.664Z: Starting 5 workers in us-central1-b...
Nov 21, 2022 4:56:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:24.548Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
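For reference, a minimal, untested sketch of what that advice amounts to, using
the Cloud Monitoring v3 Java client (this assumes a google-cloud-monitoring
dependency on the classpath; the class name and the custom.googleapis.com/*
filter are illustrative only and are not part of this test run):

import com.google.api.MetricDescriptor;
import com.google.cloud.monitoring.v3.MetricServiceClient;

public class ListCustomMetricDescriptors {
  public static void main(String[] args) throws Exception {
    // Project named in the log above.
    String project = "projects/apache-beam-testing";
    try (MetricServiceClient client = MetricServiceClient.create()) {
      for (MetricDescriptor d : client.listMetricDescriptors(project).iterateAll()) {
        // Only descriptors of the form custom.googleapis.com/* are affected by
        // the quota warning above.
        if (d.getType().startsWith("custom.googleapis.com/")) {
          System.out.println(d.getName());
          // Uncomment to delete a descriptor that is no longer needed:
          // client.deleteMetricDescriptor(d.getName());
        }
      }
    }
  }
}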
Nov 21, 2022 4:56:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:56:54.816Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Nov 21, 2022 4:57:29 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:57:28.789Z: Workers have started successfully.
Nov 21, 2022 4:57:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:57:48.182Z: All workers have finished the startup
processes and began to receive work requests.
Nov 21, 2022 4:58:45 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:58:44.885Z: Finished operation Generate
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time
metric+Write documents to MongoDB/ParDo(Write)
Nov 21, 2022 4:58:45 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:58:45.027Z: Cleaning up.
Nov 21, 2022 4:58:45 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T16:58:45.109Z: Stopping worker pool...
Nov 21, 2022 5:01:02 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T17:01:01.474Z: Autoscaling: Resized worker pool from 5 to 0.
Nov 21, 2022 5:01:02 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-21T17:01:01.510Z: Worker pool stopped.
Nov 21, 2022 5:01:08 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-11-21_08_55_52-5199153894683648621 finished with status DONE.
Nov 21, 2022 5:01:08 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Nov 21, 2022 5:01:08 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Nov 21, 2022 5:01:08 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Nov 21, 2022 5:01:08 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 21, 2022 5:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 283 files. Enable logging at DEBUG level to see
which files will be staged.
Nov 21, 2022 5:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Nov 21, 2022 5:01:11 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 284 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Nov 21, 2022 5:01:11 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.44.0-SNAPSHOT-sqK5ZIBDuufDKf3k3RD3W3dJh4-3cX-eGaeTr7BOLr8.jar
Nov 21, 2022 5:01:11 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 284 files cached, 0 files newly uploaded in 0
seconds
Nov 21, 2022 5:01:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Nov 21, 2022 5:01:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <192554 bytes, hash
c242e8d9e643b52e8debd4f6fdfad632bcf488330edcfee09a6388b1eb2e2ef5> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-wkLo2eZDtS6N69T2_frWMrz0iDMO3P7gmmOIsesuLvU.pb
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read all documents/Read(BoundedMongoDbSource) as step s1
Nov 21, 2022 5:01:14 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Cluster created with settings {hosts=[146.148.86.139:27017],
mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms',
maxWaitQueueSize=500}
Nov 21, 2022 5:01:14 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Cluster description not yet available. Waiting for 30000 ms before
timing out
Nov 21, 2022 5:01:14 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Opened connection [connectionId{localValue:3, serverValue:23}] to
146.148.86.139:27017
Nov 21, 2022 5:01:14 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Monitor thread successfully connected to server with description
ServerDescription{address=146.148.86.139:27017, type=STANDALONE,
state=CONNECTED, ok=true, version=ServerVersion{versionList=[6, 0, 3]},
minWireVersion=0, maxWireVersion=17, maxDocumentSize=16777216,
logicalSessionTimeoutMinutes=30, roundTripTimeNanos=1181696}
Nov 21, 2022 5:01:14 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Opened connection [connectionId{localValue:4, serverValue:24}] to
146.148.86.139:27017
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect read time metrics as step s2
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Map documents to Strings/Map as step s3
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s4
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step
s5
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s6
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/Values/Values/Map as step s7
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate
hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s8
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step
s9
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s10
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/ProduceDefault as step s11
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Calculate hashcode/Flatten.PCollections as step s12
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step
s13
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s14
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as
step s15
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s16
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as
step s17
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s18
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s19
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s20
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s21
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GetPane/Map as step s22
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/RunChecks as step s23
Nov 21, 2022 5:01:14 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s24
Nov 21, 2022 5:01:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
Nov 21, 2022 5:01:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-21_09_01_14-13301366438917436978?project=apache-beam-testing
Nov 21, 2022 5:01:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-11-21_09_01_14-13301366438917436978
Nov 21, 2022 5:01:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-11-21_09_01_14-13301366438917436978
org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead SKIPPED
> Task :sdks:java:io:mongodb:integrationTest FAILED
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker Thread 3,5,main])
completed. Took 5 mins 41.123 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 1s
135 actionable tasks: 79 executed, 54 from cache, 2 up-to-date
Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=b7994a2d-714a-478f-af8f-4d63a9f40adf,
currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 3064113
log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-3064113.out.log
----- Last 20 lines from daemon log file - daemon-3064113.out.log -----
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 1s
135 actionable tasks: 79 executed, 54 from cache, 2 up-to-date
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure