See 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2935/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-9638] Strengthen worker region & zone options tests.

[boyuanz] [BEAM-9454] Add Deduplication PTransform


------------------------------------------
[...truncated 235.30 KB...]
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':',5,main]) 
started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:mongodb:integrationTest
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is 
524c44c566c04bdf87f11d9627535e25
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--mongoDBDatabaseName=beam","--mongoDBHostName=34.69.47.7","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Mar 31, 2020 7:01:06 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster created with settings {hosts=[34.69.47.7:27017], mode=SINGLE, 
requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', 
maxWaitQueueSize=500}
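
For reference, the settings reported above are client-side configuration: a single host yields mode=SINGLE, and serverSelectionTimeout is how long the driver waits to find a usable server. Below is a minimal sketch (not part of the test) of building such a client with the driver's MongoClientSettings API, assuming a driver version that provides it (3.7+); the host and the 30-second timeout are taken from the log line, everything else is hypothetical.

    import com.mongodb.MongoClientSettings;
    import com.mongodb.ServerAddress;
    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;

    import java.util.Collections;
    import java.util.concurrent.TimeUnit;

    public class SingleHostClientSketch {
      public static void main(String[] args) {
        MongoClientSettings settings = MongoClientSettings.builder()
            .applyToClusterSettings(cluster -> cluster
                // A single host gives mode=SINGLE, as reported in the log above.
                .hosts(Collections.singletonList(new ServerAddress("34.69.47.7", 27017)))
                // Matches serverSelectionTimeout='30000 ms' from the settings line.
                .serverSelectionTimeout(30, TimeUnit.SECONDS))
            .build();
        try (MongoClient client = MongoClients.create(settings)) {
          // Force a round trip so server selection actually happens.
          client.listDatabaseNames().first();
        }
      }
    }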

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
    Mar 31, 2020 7:01:06 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before 
timing out
    Mar 31, 2020 7:01:16 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Exception in monitor thread while connecting to server 
34.69.47.7:27017
    com.mongodb.MongoSocketOpenException: Exception opening socket
        at 
com.mongodb.internal.connection.SocketChannelStream.open(SocketChannelStream.java:63)
        at 
com.mongodb.internal.connection.InternalStreamConnection.open(InternalStreamConnection.java:126)
        at 
com.mongodb.internal.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:117)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.net.SocketTimeoutException
        at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:129)
        at 
com.mongodb.internal.connection.SocketStreamHelper.initialize(SocketStreamHelper.java:64)
        at 
com.mongodb.internal.connection.SocketChannelStream.initializeSocketChannel(SocketChannelStream.java:72)
        at 
com.mongodb.internal.connection.SocketChannelStream.open(SocketChannelStream.java:60)
        ... 3 more

    Mar 31, 2020 7:01:17 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:2, serverValue:1}] to 
34.69.47.7:27017
    Mar 31, 2020 7:01:17 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Monitor thread successfully connected to server with description 
ServerDescription{address=34.69.47.7:27017, type=STANDALONE, state=CONNECTED, 
ok=true, version=ServerVersion{versionList=[4, 2, 5]}, minWireVersion=0, 
maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, 
roundTripTimeNanos=3622481}
    Mar 31, 2020 7:01:17 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:3, serverValue:2}] to 
34.69.47.7:27017
    Mar 31, 2020 7:01:18 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Mar 31, 2020 7:01:19 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://www.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Mar 31, 2020 7:01:19 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Mar 31, 2020 7:01:19 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 189 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 31, 2020 7:01:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 31, 2020 7:01:19 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 190 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 31, 2020 7:01:19 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-DbDzvOLY2V3-JVaJA-Bt9w.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/test>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-cNiSccKvSiIG6MijKlDYWw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-bgkx8CrhWzpycBfZJM4xXA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/main>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-fCxeUt8bLWbB508qv0bHqw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests-CAEY4lDL1PG1aLQua281dw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-MyqbyxaKC7vcfEpuKrPUPw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-q6zFqcdKGX_BCNGiBfQUqA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-e2CcgWwqSXs1d5BMqc7hNg.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-FOdI57MDwwEf_AeMH5fjbA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-s0lrazqBjJe9Ol2mpxKJJA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-7fYahtUS9_AWZN7OSfQWoA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-9gZAjMmJ0R3swAOSCicxCw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-kMy69jniVm9YJVu9InBWoA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-RqRjYyZscgTnRIaQPWSFew.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-k5Ce5MOfwKYEW1y1W7K1LA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.21.0-SNAPSHOT-uA-jd8tWWriQP_w_7Qwlog.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-nD-HkUE0nTU0l3JWVL9f9w.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.21.0-SNAPSHOT-ijYiRM6iTt8knUfvf_N_iQ.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-8GFRAFiQmk70FVnjf8s1uA.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT--8cpo1riGPbMYTYdteI2Fw.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-t6Lwmw5evOHq--IqswC6og.jar
    Mar 31, 2020 7:01:20 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-80OkgVpq4FKWaBb__A-1tw.jar
    Mar 31, 2020 7:01:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 167 files cached, 22 files newly uploaded in 
2 seconds
    Mar 31, 2020 7:01:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
    Mar 31, 2020 7:01:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Produce documents/Map as step s2
    Mar 31, 2020 7:01:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect write time metric as step s3
    Mar 31, 2020 7:01:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
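
The four steps registered above make up the write half of the test: a bounded sequence is generated, mapped to BSON documents, timed, and written to MongoDB. The sketch below is not the test's actual source; it is a minimal Beam pipeline with the same step names, using a placeholder collection name, the host and record count from the pipeline options above, and omitting the write-time metric collection (step s3).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.mongodb.MongoDbIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.bson.Document;

    public class MongoDbWriteSketch {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Step s1: bounded source of 10,000,000 longs (--numberOfRecords above).
            .apply("Generate sequence", GenerateSequence.from(0).to(10_000_000))
            // Step s2: map each long to a BSON document (step s3, the metric ParDo, is omitted).
            .apply("Produce documents", MapElements
                .into(TypeDescriptor.of(Document.class))
                .via((Long i) -> new Document("index", i)))
            // Step s4: write to MongoDB; the collection name here is hypothetical.
            .apply("Write documents to MongoDB", MongoDbIO.write()
                .withUri("mongodb://34.69.47.7:27017")
                .withDatabase("beam")
                .withCollection("test_collection"));

        pipeline.run().waitUntilFinish();
      }
    }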
    Mar 31, 2020 7:01:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Mar 31, 2020 7:01:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <8587 bytes, hash GS_6qqDtXwCTnChdn0R92A> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-GS_6qqDtXwCTnChdn0R92A.pb
    Mar 31, 2020 7:01:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
    Mar 31, 2020 7:01:23 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory
 create
    WARNING: Region will default to us-central1. Future releases of Beam will 
require the user to set the region explicitly. 
https://cloud.google.com/compute/docs/regions-zones/regions-zones
    Mar 31, 2020 7:01:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-31_12_01_23-9019861710269652086?project=apache-beam-testing
    Mar 31, 2020 7:01:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-03-31_12_01_23-9019861710269652086
    Mar 31, 2020 7:01:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2020-03-31_12_01_23-9019861710269652086
    Mar 31, 2020 7:01:27 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:27.100Z: Checking permissions granted to controller 
Service Account.
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:34.845Z: Worker configuration: n1-standard-1 in 
us-central1-a.
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.514Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.560Z: Expanding GroupByKey operations into 
optimizable parts.
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.591Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.690Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.730Z: Fusing consumer Produce documents/Map into 
Generate sequence/Read(BoundedCountingSource)
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.765Z: Fusing consumer Collect write time metric 
into Produce documents/Map
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:35.806Z: Fusing consumer Write documents to 
MongoDB/ParDo(Write) into Collect write time metric
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:36.467Z: Executing operation Generate 
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time 
metric+Write documents to MongoDB/ParDo(Write)
    Mar 31, 2020 7:01:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:01:36.557Z: Starting 5 workers in us-central1-a...
    Mar 31, 2020 7:01:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-03-31T19:01:56.534Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Mar 31, 2020 7:02:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:02:00.777Z: Autoscaling: Raised the number of workers 
to 5 based on the rate of progress in the currently running stage(s).
    Mar 31, 2020 7:02:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:02:17.648Z: Workers have started successfully.
    Mar 31, 2020 7:02:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:02:17.688Z: Workers have started successfully.
    Mar 31, 2020 7:04:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:04:13.111Z: Finished operation Generate 
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time 
metric+Write documents to MongoDB/ParDo(Write)
    Mar 31, 2020 7:04:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:04:13.380Z: Cleaning up.
    Mar 31, 2020 7:04:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:04:13.496Z: Stopping worker pool...
    Mar 31, 2020 7:06:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:05:58.233Z: Autoscaling: Resized worker pool from 5 to 
0.
    Mar 31, 2020 7:06:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-31T19:05:58.293Z: Worker pool stopped.
    Mar 31, 2020 7:06:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-03-31_12_01_23-9019861710269652086 finished with status DONE.
    Mar 31, 2020 7:06:04 PM com.mongodb.diagnostics.logging.SLF4JLogger warn
    WARNING: Got socket exception on connection [connectionId{localValue:3, 
serverValue:2}] to 34.69.47.7:27017. All connections to 34.69.47.7:27017 will 
be closed.
    Mar 31, 2020 7:06:04 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Closed connection [connectionId{localValue:3, serverValue:2}] to 
34.69.47.7:27017 because there was a socket exception raised by this connection.

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    com.mongodb.MongoSocketReadException: Exception receiving message
        at 
com.mongodb.internal.connection.InternalStreamConnection.translateReadException(InternalStreamConnection.java:559)
        at 
com.mongodb.internal.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:444)
        at 
com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:295)
        at 
com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:255)
        at 
com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99)
        at 
com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:444)
        at 
com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72)
        at 
com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:200)
        at 
com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269)
        at 
com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131)
        at 
com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:123)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:242)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:213)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:205)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:115)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:108)
        at 
com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:56)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:179)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:182)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:151)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:146)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:136)
        at 
org.apache.beam.sdk.io.mongodb.MongoDBIOIT.getCollectionSizeInBytes(MongoDBIOIT.java:192)
        at 
org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:168)

        Caused by:
        java.io.IOException: Connection reset by peer
            at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
            at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
            at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
            at sun.nio.ch.IOUtil.read(IOUtil.java:197)
            at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:377)
            at 
com.mongodb.internal.connection.SocketChannelStream.read(SocketChannelStream.java:113)
            at 
com.mongodb.internal.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:570)
            at 
com.mongodb.internal.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:441)
            ... 22 more
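
The failing frame, MongoDBIOIT.getCollectionSizeInBytes (MongoDBIOIT.java:192), reads the collection size through MongoDatabase.runCommand, which is where the reset connection surfaces. Below is a minimal sketch of that kind of size check via the collStats command; the URI and database come from the pipeline options above, while the collection name is a placeholder, and this is not the test's exact code.

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoDatabase;
    import org.bson.Document;

    public class CollectionSizeSketch {
      public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://34.69.47.7:27017")) {
          MongoDatabase database = client.getDatabase("beam");
          // collStats returns collection statistics, including "size" in bytes;
          // a connection reset during this round trip would surface here as a
          // MongoSocketReadException, as in the stack trace above.
          Document stats = database.runCommand(new Document("collStats", "test_collection"));
          System.out.println("Collection size in bytes: " + stats.get("size"));
        }
      }
    }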

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    Mar 31, 2020 7:06:04 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:4, serverValue:13}] to 
34.69.47.7:27017

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:mongodb:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.02 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':',5,main]) 
completed. Took 5 mins 2.276 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 0s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/2qv35usb7mbri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
