See 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2965/display/redirect>

Changes:


------------------------------------------
[...truncated 239.48 KB...]
Packing task 
':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava 
(Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 23.9 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes 
(Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
Skipping task 
':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no 
actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes 
(Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar 
(Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task 
':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is 
4d95201031447e9893925d2864f3a854
Caching disabled for task 
':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching 
has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is 
not up-to-date because:
  No history is available.
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.495 secs. 1576 
duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:google-cloud-platform:compileTestJava'
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Daemon 
worker,5,main]) completed. Took 17.348 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Daemon worker,5,main]) 
started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no 
actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Daemon worker,5,main]) 
completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Daemon worker,5,main]) 
started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is 
6d2409cf32cc2596d61a3b729a499658
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': 
Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Daemon worker,5,main]) 
completed. Took 0.122 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Daemon 
worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 3.714s [3714ms]
Average Time/Jar: 0.232125s [232.125ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar 
(Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 5.295 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is fe089ba4c95371c5821c500997c7f9f2
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
All input files are considered out-of-date for incremental task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.082 secs. 1576 
duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:compileTestJava'
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Daemon 
worker,5,main]) completed. Took 7.018 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Daemon worker,5,main]) 
started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Daemon worker,5,main]) 
completed. Took 0.001 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Daemon worker,5,main]) 
started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 
4401c3c34c7ff4f4005c066bfc92de78
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': 
Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Daemon worker,5,main]) 
completed. Took 0.044 secs.
:sdks:java:io:mongodb:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:io:mongodb:integrationTest
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is 
95aacfd119393bfcddcd6b35e667f006
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Starting process 'Gradle Test Executor 2'. Working directory: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--mongoDBDatabaseName=beam","--mongoDBHostName=34.69.92.113","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}","--region=${dataflowRegion}"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Apr 08, 2020 9:11:13 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster created with settings {hosts=[34.69.92.113:27017], 
mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', 
maxWaitQueueSize=500}

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
    Apr 08, 2020 9:11:14 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before 
timing out
    Apr 08, 2020 9:11:14 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:1, serverValue:1}] to 
34.69.92.113:27017
    Apr 08, 2020 9:11:14 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Monitor thread successfully connected to server with description 
ServerDescription{address=34.69.92.113:27017, type=STANDALONE, state=CONNECTED, 
ok=true, version=ServerVersion{versionList=[4, 2, 5]}, minWireVersion=0, 
maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, 
roundTripTimeNanos=4655064}
    Apr 08, 2020 9:11:14 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to 
34.69.92.113:27017
    Apr 08, 2020 9:11:15 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Apr 08, 2020 9:11:15 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://www.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Apr 08, 2020 9:11:15 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 189 files. Enable logging at DEBUG level to see 
which files will be staged.
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 190 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-xY8C58TxnswUcjEBGx1tWA.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-vj6UX1nsIijSlSmat9jXwA.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-aU0vUFmB9ofjom-LOXVrgA.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-Ts5YU7_UF04HnmQpJ2cZeA.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-y3CbjTwkmZTTQU6VFvcMeg.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-9Zsxre-9_GzbM6HaR2t8xw.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-fphpDJYHjoMRM8ltY2WZog.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/test>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-uoOnR15A_vzHrbOZi1M_Gg.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests-OI_ar11pdqhQWPbsIIio4Q.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-R_N9GDJG8apuyS2SjRf8Ig.jar
    Apr 08, 2020 9:11:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-odth1BEoLZJUf01JLWOXEA.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-KJOQ1sQ_rAKwiTsK29aiow.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.21.0-SNAPSHOT-uweUKyDOsWn1c-QVXt-yLA.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-5mZEeXl0VXSOkR2m7D13OA.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/main>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-Ha2eAV3kT2E09gOMO8bF0g.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.21.0-SNAPSHOT-IrN382inFnmLmb0nwt2rDg.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-nXSMFUGOQSfhNTJ2TiNsqg.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-3FSignM0B1rg4xFp-AM2EA.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-ore_NDzwg3frLPAQbVUJmQ.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT-O2kOhKEmOmBsYwmzoQrEmQ.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-Q5Jn-yUgTeZafrhfocl0fA.jar
    Apr 08, 2020 9:11:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-vr_cN6URudnHYqY0tJ0GFw.jar
    Apr 08, 2020 9:11:18 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 167 files cached, 22 files newly uploaded in 
2 seconds
    Apr 08, 2020 9:11:18 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
    Apr 08, 2020 9:11:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Produce documents/Map as step s2
    Apr 08, 2020 9:11:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect write time metric as step s3
    Apr 08, 2020 9:11:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
    Apr 08, 2020 9:11:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Apr 08, 2020 9:11:19 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <8591 bytes, hash t2Ys7ObGbPgwJLYrAjfHaQ> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-t2Ys7ObGbPgwJLYrAjfHaQ.pb
    Apr 08, 2020 9:11:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
    Apr 08, 2020 9:11:19 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 403, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/$%7BdataflowRegion%7D/jobs.
 

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:mongodb:integrationTest FAILED

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: Permission 
denied on 'locations/${dataflowregion}' (or it may not exist).
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:970)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:184)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:317)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at 
org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:165)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 
Forbidden
        {
          "code" : 403,
          "errors" : [ {
            "domain" : "global",
            "message" : "Permission denied on 'locations/${dataflowregion}' (or 
it may not exist).",
            "reason" : "forbidden"
          } ],
          "message" : "Permission denied on 'locations/${dataflowregion}' (or 
it may not exist).",
          "status" : "PERMISSION_DENIED"
        }
            at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:443)
            at 
com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1108)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:541)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:474)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:591)
            at 
org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
            at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:956)
            ... 5 more

1 test completed, 1 failed
Finished generating test XML results (0.026 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
:sdks:java:io:mongodb:integrationTest (Thread[Daemon worker,5,main]) completed. 
Took 9.761 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 9s
81 actionable tasks: 80 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/qu7llfir75aiy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
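Note on the failure mode: the 403 "Permission denied on 'locations/${dataflowregion}'" above is consistent with the pipeline options shown earlier, where `--region=${dataflowRegion}` and `--dataflowWorkerJar=${dataflowWorkerJar}` reached the runner as literal, unexpanded placeholders (the stage step also logged "Skipping non-existent file to stage ${dataflowWorkerJar}"). A minimal, hypothetical pre-flight check for this class of misconfiguration might look like the sketch below; the function name and option list are illustrative, not part of the build:

```python
import re

# Matches literal, unexpanded ${...} placeholders in option values,
# e.g. "--region=${dataflowRegion}" as seen in the log above.
PLACEHOLDER = re.compile(r"\$\{[^}]*\}")

def unexpanded_options(options):
    """Return the options that still contain a literal ${...} placeholder."""
    return [opt for opt in options if PLACEHOLDER.search(opt)]

# Option values taken from the -DbeamTestPipelineOptions line in the log.
opts = [
    "--project=apache-beam-testing",
    "--runner=DataflowRunner",
    "--dataflowWorkerJar=${dataflowWorkerJar}",
    "--region=${dataflowRegion}",
]
print(unexpanded_options(opts))
# ['--dataflowWorkerJar=${dataflowWorkerJar}', '--region=${dataflowRegion}']
```

Failing fast on a non-empty result would surface the missing Gradle property substitution before the job is ever submitted to the Dataflow service.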

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]