See 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2900/display/redirect>

Changes:


------------------------------------------
[...truncated 239.03 KB...]
    INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to 
35.188.133.78:27017
    Mar 23, 2020 12:55:52 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Mar 23, 2020 12:55:53 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://www.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Mar 23, 2020 12:55:53 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 189 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 190 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-BlHpceHaXV9Z3iTTx10TdA.jar
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-hFffI0Y4_2q4wLF9XEFOUg.jar
    Mar 23, 2020 12:55:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-O5WDP4J-QsaOlm6JGw4Ppg.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/main>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-NwUdtjNCCrWEIHsUSVd-hQ.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-BFubUiEADizV0zDf4xBgGA.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-sX7XPeQ8nmRbNg5bHZ7pBQ.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-xxFtlmKIpPJxX0uraORe2A.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/test>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-n5BoLq2NI0N6WHv8ZZcziQ.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-N6K56jXlKxy4av7zzRKwGw.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-4KIyRBXpU-BEYv8KGF6JlQ.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-hBW0YhZ-Wc5wAVRRcgQG2Q.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests-dqAgmAMZMiuY4ysOl5yn7Q.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-xfH2zytBtyHzO6gFBc0oJg.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-KQoQ90obxnCCRMs5mKMfiA.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-mwh9rEDaGpQdOHbS0qDoPQ.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-AjMKWqHFa7PlWnlZGbCvmg.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.21.0-SNAPSHOT-6t3GJI2ik3FMx-YM0uHaeA.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.21.0-SNAPSHOT-oar1HVPE3E2ozY1aOlaixw.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-FkrXrU06lg04GXHoHMDpKA.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-D7hiBbJBD5AbBFoYmeejzw.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-Ar34nd7WFnkTufy86HGZNg.jar
    Mar 23, 2020 12:55:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar>
 to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT-AaSpMBCxNeb7Yrd0MdoU0g.jar
    Mar 23, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 167 files cached, 22 files newly uploaded in 
2 seconds
    Mar 23, 2020 12:55:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
    Mar 23, 2020 12:55:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Produce documents/Map as step s2
    Mar 23, 2020 12:55:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect write time metric as step s3
    Mar 23, 2020 12:55:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
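
(The four steps added above correspond, roughly, to a Beam write pipeline of the following shape. This is a minimal sketch, not the test's actual code: the MongoDB address is reused from this log, while the database name, collection name, element count and document layout are illustrative placeholders.)

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.mongodb.MongoDbIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.bson.Document;

    public class MongoDbWriteSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        pipeline
            // Step s1 in the log.
            .apply("Generate sequence", GenerateSequence.from(0).to(1_000_000))
            // Step s2: turn each index into a BSON document (layout is a placeholder).
            .apply("Produce documents", MapElements.via(
                new SimpleFunction<Long, Document>() {
                  @Override
                  public Document apply(Long i) {
                    return new Document("_id", i).append("value", "payload-" + i);
                  }
                }))
            // Step s3 ("Collect write time metric") is a measurement transform omitted here.
            // Step s4: the MongoDbIO write whose WriteFn appears in the errors further below.
            .apply("Write documents to MongoDB", MongoDbIO.write()
                .withUri("mongodb://35.188.133.78:27017") // address from this log
                .withDatabase("beam")                     // placeholder
                .withCollection("test_collection"));      // placeholder
        pipeline.run().waitUntilFinish();
      }
    }
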
    Mar 23, 2020 12:55:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Mar 23, 2020 12:55:56 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <8595 bytes, hash 5q7VSiu4eJCnSn_8yqoifg> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-5q7VSiu4eJCnSn_8yqoifg.pb
    Mar 23, 2020 12:55:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
    Mar 23, 2020 12:55:56 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory
 create
    WARNING: Region will default to us-central1. Future releases of Beam will 
require the user to set the region explicitly. 
https://cloud.google.com/compute/docs/regions-zones/regions-zones
    Mar 23, 2020 12:55:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-22_17_55_56-6146097462188316255?project=apache-beam-testing
    Mar 23, 2020 12:55:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-03-22_17_55_56-6146097462188316255
    Mar 23, 2020 12:55:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2020-03-22_17_55_56-6146097462188316255
    Mar 23, 2020 12:56:01 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:00.960Z: Checking permissions granted to controller 
Service Account.
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:12.789Z: Worker configuration: n1-standard-1 in 
us-central1-f.
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-03-23T00:56:13.319Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
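
(The warning above points at the Monitoring API for cleaning up old descriptors. A minimal sketch of such a cleanup with the Cloud Monitoring v3 Java client, assuming the descriptors to delete are the Dataflow-created custom metrics under the "custom.googleapis.com/dataflow" prefix; both that prefix and the project name below are assumptions to adjust.)

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class CleanUpMetricDescriptors {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor descriptor :
              client.listMetricDescriptors(ProjectName.of("apache-beam-testing")).iterateAll()) {
            // Assumption: Dataflow-created custom metrics live under this prefix.
            if (descriptor.getType().startsWith("custom.googleapis.com/dataflow")) {
              client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }
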
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:13.580Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:13.757Z: Expanding GroupByKey operations into 
optimizable parts.
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:13.840Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.009Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.057Z: Fusing consumer Produce documents/Map into 
Generate sequence/Read(BoundedCountingSource)
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.136Z: Fusing consumer Collect write time metric 
into Produce documents/Map
    Mar 23, 2020 12:56:14 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.182Z: Fusing consumer Write documents to 
MongoDB/ParDo(Write) into Collect write time metric
    Mar 23, 2020 12:56:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.471Z: Executing operation Generate 
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time 
metric+Write documents to MongoDB/ParDo(Write)
    Mar 23, 2020 12:56:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:14.541Z: Starting 5 workers in us-central1-f...
    Mar 23, 2020 12:56:39 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:38.857Z: Autoscaling: Raised the number of workers 
to 4 based on the rate of progress in the currently running step(s).
    Mar 23, 2020 12:56:39 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:38.890Z: Resized worker pool to 4, though goal was 
5.  This could be a quota issue.
    Mar 23, 2020 12:56:45 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:44.353Z: Autoscaling: Raised the number of workers 
to 5 based on the rate of progress in the currently running step(s).
    Mar 23, 2020 12:56:58 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:57.634Z: Workers have started successfully.
    Mar 23, 2020 12:56:58 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:56:57.662Z: Workers have started successfully.
    Mar 23, 2020 12:57:42 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-03-23T00:57:41.391Z: com.mongodb.MongoTimeoutException: Timed 
out after 30000 ms while waiting to connect. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.188.133.78:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
        at 
com.mongodb.internal.connection.BaseCluster.getDescription(BaseCluster.java:179)
        at 
com.mongodb.internal.connection.SingleServerCluster.getDescription(SingleServerCluster.java:41)
        at 
com.mongodb.client.internal.MongoClientDelegate.getConnectedClusterDescription(MongoClientDelegate.java:136)
        at 
com.mongodb.client.internal.MongoClientDelegate.createClientSession(MongoClientDelegate.java:94)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.getClientSession(MongoClientDelegate.java:249)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:190)
        at 
com.mongodb.client.internal.MongoCollectionImpl.executeInsertMany(MongoCollectionImpl.java:520)
        at 
com.mongodb.client.internal.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:504)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:970)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:954)

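(The timeout above is raised inside MongoDbIO's WriteFn while it flushes a batch with insertMany. Below is a standalone sketch of that same insertMany path with the connect and server-selection timeouts spelled out; it assumes the synchronous "mongodb-driver-sync" client API and placeholder database and collection names, with the address taken from this log.)

    import com.mongodb.ConnectionString;
    import com.mongodb.MongoClientSettings;
    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import java.util.Collections;
    import java.util.concurrent.TimeUnit;
    import org.bson.Document;

    public class MongoConnectProbe {
      public static void main(String[] args) {
        MongoClientSettings settings = MongoClientSettings.builder()
            .applyConnectionString(new ConnectionString("mongodb://35.188.133.78:27017"))
            // Socket connect timeout; the SocketTimeoutException above comes from this phase.
            .applyToSocketSettings(b -> b.connectTimeout(10, TimeUnit.SECONDS))
            // Server selection timeout; the 30000 ms in the message is the driver default.
            .applyToClusterSettings(b -> b.serverSelectionTimeout(30, TimeUnit.SECONDS))
            .build();
        try (MongoClient client = MongoClients.create(settings)) {
          client.getDatabase("beam")              // placeholder
              .getCollection("test_collection")   // placeholder
              .insertMany(Collections.singletonList(new Document("probe", 1)));
        }
      }
    }
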
    Mar 23, 2020 12:57:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-03-23T00:57:43.984Z: com.mongodb.MongoTimeoutException: Timed 
out after 30000 ms while waiting to connect. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.188.133.78:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
        at 
com.mongodb.internal.connection.BaseCluster.getDescription(BaseCluster.java:179)
        at 
com.mongodb.internal.connection.SingleServerCluster.getDescription(SingleServerCluster.java:41)
        at 
com.mongodb.client.internal.MongoClientDelegate.getConnectedClusterDescription(MongoClientDelegate.java:136)
        at 
com.mongodb.client.internal.MongoClientDelegate.createClientSession(MongoClientDelegate.java:94)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.getClientSession(MongoClientDelegate.java:249)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:190)
        at 
com.mongodb.client.internal.MongoCollectionImpl.executeInsertMany(MongoCollectionImpl.java:520)
        at 
com.mongodb.client.internal.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:504)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:970)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:954)

    Mar 23, 2020 12:57:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-03-23T00:57:50.405Z: com.mongodb.MongoTimeoutException: Timed 
out after 30000 ms while waiting to connect. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.188.133.78:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
        at 
com.mongodb.internal.connection.BaseCluster.getDescription(BaseCluster.java:179)
        at 
com.mongodb.internal.connection.SingleServerCluster.getDescription(SingleServerCluster.java:41)
        at 
com.mongodb.client.internal.MongoClientDelegate.getConnectedClusterDescription(MongoClientDelegate.java:136)
        at 
com.mongodb.client.internal.MongoClientDelegate.createClientSession(MongoClientDelegate.java:94)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.getClientSession(MongoClientDelegate.java:249)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:190)
        at 
com.mongodb.client.internal.MongoCollectionImpl.executeInsertMany(MongoCollectionImpl.java:520)
        at 
com.mongodb.client.internal.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:504)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:970)
        at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:954)

    Mar 23, 2020 12:59:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:59:21.173Z: Finished operation Generate 
sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time 
metric+Write documents to MongoDB/ParDo(Write)
    Mar 23, 2020 12:59:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:59:22.060Z: Cleaning up.
    Mar 23, 2020 12:59:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T00:59:22.136Z: Stopping worker pool...
    Mar 23, 2020 1:01:01 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T01:01:00.148Z: Autoscaling: Resized worker pool from 5 to 
0.
    Mar 23, 2020 1:01:01 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-03-23T01:01:00.189Z: Worker pool stopped.
    Mar 23, 2020 1:01:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-03-22_17_55_56-6146097462188316255 finished with status DONE.
    Mar 23, 2020 1:01:11 AM com.mongodb.diagnostics.logging.SLF4JLogger warn
    WARNING: Got socket exception on connection [connectionId{localValue:2, 
serverValue:2}] to 35.188.133.78:27017. All connections to 35.188.133.78:27017 
will be closed.
    Mar 23, 2020 1:01:11 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Closed connection [connectionId{localValue:2, serverValue:2}] to 
35.188.133.78:27017 because there was a socket exception raised by this 
connection.

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    com.mongodb.MongoSocketWriteException: Exception sending message
        at 
com.mongodb.internal.connection.InternalStreamConnection.translateWriteException(InternalStreamConnection.java:541)
        at 
com.mongodb.internal.connection.InternalStreamConnection.sendMessage(InternalStreamConnection.java:429)
        at 
com.mongodb.internal.connection.InternalStreamConnection.sendCommandMessage(InternalStreamConnection.java:269)
        at 
com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:253)
        at 
com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99)
        at 
com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:444)
        at 
com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72)
        at 
com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:200)
        at 
com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269)
        at 
com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131)
        at 
com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:123)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:242)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:213)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:205)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:115)
        at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:108)
        at 
com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:56)
        at 
com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:179)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:182)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:151)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:146)
        at 
com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:136)
        at 
org.apache.beam.sdk.io.mongodb.MongoDBIOIT.getCollectionSizeInBytes(MongoDBIOIT.java:192)
        at 
org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:168)

        Caused by:
        java.io.IOException: Connection reset by peer
            at sun.nio.ch.FileDispatcherImpl.writev0(Native Method)
            at sun.nio.ch.SocketDispatcher.writev(SocketDispatcher.java:51)
            at sun.nio.ch.IOUtil.write(IOUtil.java:148)
            at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:501)
            at java.nio.channels.SocketChannel.write(SocketChannel.java:502)
            at 
com.mongodb.internal.connection.SocketChannelStream.write(SocketChannelStream.java:102)
            at 
com.mongodb.internal.connection.InternalStreamConnection.sendMessage(InternalStreamConnection.java:426)
            ... 22 more

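(The failure above occurs in MongoDBIOIT.getCollectionSizeInBytes, which reaches the server through MongoDatabase.runCommand and fails there when the connection is reset. A hedged sketch of a size check of that kind, assuming the helper issues a "collStats" command and reads its "size" field; the names below are placeholders, not the test's actual code.)

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import org.bson.Document;

    public class CollectionSizeProbe {
      // Returns the uncompressed collection size in bytes reported by collStats.
      static long getCollectionSizeInBytes(MongoClient client, String db, String collection) {
        Document stats = client.getDatabase(db).runCommand(new Document("collStats", collection));
        return ((Number) stats.get("size")).longValue();
      }

      public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://35.188.133.78:27017")) {
          System.out.println(getCollectionSizeInBytes(client, "beam", "test_collection"));
        }
      }
    }
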
org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    Mar 23, 2020 1:01:12 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:3, serverValue:16}] to 
35.188.133.78:27017

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:mongodb:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':' Thread 
3,5,main]) completed. Took 5 mins 26.241 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 11s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/5rjmgaliubh5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
