See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2882/display/redirect?page=changes>
Changes:
[github] Flink 1.10 yarn deployment fix (#11146)
[github] [BEAM-9539] Fix copy-pasted comment in load-tests' build.gradle (#11155)
------------------------------------------
[...truncated 229.14 KB...]
> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is caee10b4d2831232bea512df2185d9ae
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.056 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is de0cdae69fdfbd2de716672acac439f1
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@fa7841b: {executionTime=8262, hostName=apache-beam-jenkins-2, operatingSystem=Linux, buildInvocationId=tcrklelm6nh5pfremiu6o6ek5q, creationTime=1584516095329, identity=:runners:google-cloud-dataflow-java:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_Compressed_TextIOIT_HDFS/src}
Unpacked trees for task ':runners:google-cloud-dataflow-java:compileTestJava' from cache.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.157 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 7bd5a2813b485434d0dc2d8ca69171aa
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.028 secs.
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is b173d579170474b436e2833bdd2b6d1d
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS
Total Jars: 16 (includes project)
Total Time: 3.081s [3081ms]
Average Time/Jar: 0.1925625s [192.5625ms]
*******************
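As a sanity check, the shadow plugin's reported average follows directly from its own totals (3081 ms over 16 jars):

```shell
# Recompute "Average Time/Jar" from the totals printed in the stats block:
# 3081 ms / 16 jars = 192.5625 ms, matching the reported 0.1925625 s.
awk 'BEGIN { printf "%.4f\n", 3081 / 16 }'
```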
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':',5,main]) completed. Took 3.885 secs.
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.
> Task :sdks:java:io:mongodb:integrationTest
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is 4757f3b9ad0c7723968e342d7170b163
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb>
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--mongoDBDatabaseName=beam","--mongoDBHostName=34.69.202.210","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"]
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
-Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US
-Duser.language=en -Duser.variant -ea -cp
/home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Mar 18, 2020 1:28:09 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Cluster created with settings {hosts=[34.69.202.210:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
Mar 18, 2020 1:28:09 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Cluster description not yet available. Waiting for 30000 ms before timing out
Mar 18, 2020 1:28:09 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Opened connection [connectionId{localValue:1, serverValue:1}] to 34.69.202.210:27017
Mar 18, 2020 1:28:09 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Monitor thread successfully connected to server with description ServerDescription{address=34.69.202.210:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 2, 3]}, minWireVersion=0, maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=3644754}
Mar 18, 2020 1:28:09 PM com.mongodb.diagnostics.logging.SLF4JLogger info
INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to 34.69.202.210:27017
Mar 18, 2020 1:28:11 PM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Mar 18, 2020 1:28:11 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://www.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Mar 18, 2020 1:28:11 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 189 files. Enable logging at DEBUG level to see which files will be staged.
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 190 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
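Note that the literal, unexpanded `${dataflowWorkerJar}` appears both in the `-DbeamTestPipelineOptions` JVM argument above and in this warning, which suggests the placeholder was never substituted by the invoking script. Shell single-quoting is one common way this happens; a minimal illustration (the variable and path here are hypothetical, not the actual Jenkins invocation):

```shell
# Hypothetical variable standing in for the built worker-jar path.
dataflowWorkerJar=/tmp/legacy-worker.jar

# Single quotes suppress parameter expansion, so the literal placeholder
# leaks through to the JVM, exactly as seen in the logged options:
echo '--dataflowWorkerJar=${dataflowWorkerJar}'
# Double quotes expand the variable as intended:
echo "--dataflowWorkerJar=${dataflowWorkerJar}"
```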
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-t0zmbSfCuLZtfAsCKnSYog.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.21.0-SNAPSHOT-wZ-OzYpe5dqtPGpA0sBjeQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-HaYEAfel4aZFwnJLnJqrPQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.21.0-SNAPSHOT-A9r_LfIWIgDqiR0I8kvmTw.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-KkM0_9Mf6P8necDoHTKnwg.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-n3JKwiHRFH4u5qM79vAP7w.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/test>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-zzvCzKyd5NYPwqrTA2w68g.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-LtVc7W-6QVgDVXrtjFAq2Q.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/main>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-5sAIl46ylkDmNuuBEYI_vQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-JOyokii5lK9-qT420h9uuQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests-Fu_1EHA0maqqdBYVoaMuKw.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-vlNhTt4WsMpp3y8N-gkQUw.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-kg4-Vpjloesm3Ptj4cLOWQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-uAWPguUBardp4aTp2S4kWg.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-4majzadBZ3gw0ETkbXk_eg.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-UZKpVuAMjcP8g41Vsx-f9Q.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-ofMh7G4AvP_UiwibDGvigQ.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-evm2Zwxo83QPBvJ0HIsm3Q.jar
Mar 18, 2020 1:28:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-7_K5H9483OrqWc5RWZoZng.jar
Mar 18, 2020 1:28:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-1K1voEnfiRO88T5JBM7OTg.jar
Mar 18, 2020 1:28:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-vYrto5T8fvM1NPSr3sD5Vg.jar
Mar 18, 2020 1:28:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar>
to
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT-MU5CFAXAmL0dt6dUB_RNPg.jar
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 167 files cached, 22 files newly uploaded in 1 seconds
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Produce documents/Map as step s2
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect write time metric as step s3
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Mar 18, 2020 1:28:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <8594 bytes, hash ZOwQTSvfPEyvBm--j_CMYA> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-ZOwQTSvfPEyvBm--j_CMYA.pb
Mar 18, 2020 1:28:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
Mar 18, 2020 1:28:15 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory create
WARNING: Region will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
Mar 18, 2020 1:28:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-18_06_28_15-16266623838281836388?project=apache-beam-testing
Mar 18, 2020 1:28:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-03-18_06_28_15-16266623838281836388
Mar 18, 2020 1:28:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-03-18_06_28_15-16266623838281836388
Mar 18, 2020 1:28:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:20.031Z: Checking permissions granted to controller Service Account.
Mar 18, 2020 1:28:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:28.752Z: Worker configuration: n1-standard-1 in us-central1-f.
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.375Z: Expanding CoGroupByKey operations into optimizable parts.
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.434Z: Expanding GroupByKey operations into optimizable parts.
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.463Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.546Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.573Z: Fusing consumer Produce documents/Map into Generate sequence/Read(BoundedCountingSource)
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.598Z: Fusing consumer Collect write time metric into Produce documents/Map
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.622Z: Fusing consumer Write documents to MongoDB/ParDo(Write) into Collect write time metric
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.927Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time metric+Write documents to MongoDB/ParDo(Write)
Mar 18, 2020 1:28:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-18T13:28:29.991Z: Starting 5 workers in us-central1-f...
Mar 18, 2020 1:28:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-03-18_06_28_15-16266623838281836388
org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead SKIPPED
> Task :sdks:java:io:mongodb:integrationTest FAILED
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 39.123 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
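For context on the failure above: exit value 143 is 128 + 15, the standard shell convention for a process terminated by SIGTERM (signal 15). That points to the test JVM being killed externally (for example a build timeout or the daemon shutting down) rather than a test assertion failing. A minimal illustration of the convention:

```shell
# A process killed by SIGTERM reports exit status 128 + 15 = 143, the same
# value the Gradle test executor reported above.
sleep 60 &
pid=$!
kill -TERM "$pid"
wait "$pid"
echo $?   # 143
```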
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1m 15s
81 actionable tasks: 54 executed, 27 from cache
Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=42ce9bb9-0e38-4e3c-aae5-16799b0a5aaf, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 7772
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-7772.out.log
----- Last 20 lines from daemon log file - daemon-7772.out.log -----
* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1m 15s
81 actionable tasks: 54 executed, 27 from cache
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]