See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2952/display/redirect>
Changes:


------------------------------------------
[...truncated 233.09 KB...]
No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS
Total Jars: 16 (includes project)
Total Time: 3.06s [3060ms]
Average Time/Jar: 0.19125s [191.25ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3.859 secs.
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:mongodb:integrationTest
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is 033e51419bc21cdd09330d4f68fe7d42
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--mongoDBDatabaseName=beam","--mongoDBHostName=35.193.56.236","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Apr 05, 2020 12:55:37 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster created with settings {hosts=[35.193.56.236:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
    Apr 05, 2020 12:55:37 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before timing out
    Apr 05, 2020 12:55:37 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:1, serverValue:1}] to 35.193.56.236:27017
    Apr 05, 2020 12:55:37 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Monitor thread successfully connected to server with description ServerDescription{address=35.193.56.236:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 2, 5]}, minWireVersion=0, maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=3136591}
    Apr 05, 2020 12:55:37 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to 35.193.56.236:27017
    Apr 05, 2020 12:55:38 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Apr 05, 2020 12:55:38 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://www.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    Apr 05, 2020 12:55:38 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 189 files. Enable logging at DEBUG level to see which files will be staged.
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 190 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-I6FdML43-z2AkLqR2bkkOg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/main> to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-43kGnQnvAgsu1Wps_Rw0oA.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-g-CQkJUpe9oQdb36rogsgg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-lRzjoP1DchNYtNEr_ymvFA.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-mAYp1MMnv51w6f5JrTHxQw.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-EbSSSSy_XaTAubkafP4yoA.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-8aVBkZbjVh-R2j8XiNcQxQ.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.21.0-SNAPSHOT-WnBmRrg32S7rcBKDn6-CkQ.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/classes/java/test> to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-cBYPOrssbkCaWXOZOb90Lg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-o0ijtD7PQaFnv9_V57RVtQ.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-H5b2dWM7n_zgCjupQeJldw.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-L4WDMrTBD9YRrrMdN42F5w.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-fPKO1OoFv9QRmXe-VSnLkg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests--S-xpOU5tIjg0Xaz2I24yg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-X2tvV_a1o3hrar8QDPxuLg.jar
    Apr 05, 2020 12:55:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-og9DI4mhal5HnXQvaqqSPQ.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.21.0-SNAPSHOT-IsY8S65BFVwWm6QFH1yjJg.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-pThKsY_DewhpYc09z0X5vw.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-tCxaUCmqIUDswun-0g_rXw.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-TuiWKO7BhR60RLHuA84ppw.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-2iez2DMtz7-04NvlJQYX0g.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT-EUN41O-ynBtLqbTRIsmg8g.jar
    Apr 05, 2020 12:55:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 167 files cached, 22 files newly uploaded in 1 seconds
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate sequence/Read(BoundedCountingSource) as step s1
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Produce documents/Map as step s2
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect write time metric as step s3
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write documents to MongoDB/ParDo(Write) as step s4
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <8593 bytes, hash YrVeh-Py7X-kFSWFauXYQw> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-YrVeh-Py7X-kFSWFauXYQw.pb
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
    Apr 05, 2020 12:55:41 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory create
    WARNING: Region will default to us-central1. Future releases of Beam will require the user to set the region explicitly.
    https://cloud.google.com/compute/docs/regions-zones/regions-zones
    Apr 05, 2020 12:55:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-04_17_55_41-13248156623775382275?project=apache-beam-testing
    Apr 05, 2020 12:55:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-04-04_17_55_41-13248156623775382275
    Apr 05, 2020 12:55:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-04-04_17_55_41-13248156623775382275
    Apr 05, 2020 12:55:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:45.700Z: Checking permissions granted to controller Service Account.
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:54.706Z: Worker configuration: n1-standard-1 in us-central1-a.
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.376Z: Expanding CoGroupByKey operations into optimizable parts.
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.421Z: Expanding GroupByKey operations into optimizable parts.
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.452Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.541Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.579Z: Fusing consumer Produce documents/Map into Generate sequence/Read(BoundedCountingSource)
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.618Z: Fusing consumer Collect write time metric into Produce documents/Map
    Apr 05, 2020 12:55:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:55.659Z: Fusing consumer Write documents to MongoDB/ParDo(Write) into Collect write time metric
    Apr 05, 2020 12:55:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:56.066Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time metric+Write documents to MongoDB/ParDo(Write)
    Apr 05, 2020 12:55:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:55:56.155Z: Starting 5 workers in us-central1-a...
    Apr 05, 2020 12:56:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-04-05T00:56:20.643Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Apr 05, 2020 12:56:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:56:21.416Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Apr 05, 2020 12:56:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:56:21.456Z: Resized worker pool to 4, though goal was 5. This could be a quota issue.
    Apr 05, 2020 12:56:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:56:26.779Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Apr 05, 2020 12:56:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:56:41.652Z: Workers have started successfully.
    Apr 05, 2020 12:56:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:56:41.686Z: Workers have started successfully.
    Apr 05, 2020 12:58:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:58:27.673Z: Finished operation Generate sequence/Read(BoundedCountingSource)+Produce documents/Map+Collect write time metric+Write documents to MongoDB/ParDo(Write)
    Apr 05, 2020 12:58:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:58:27.876Z: Cleaning up.
    Apr 05, 2020 12:58:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:58:27.969Z: Stopping worker pool...
    Apr 05, 2020 12:59:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:59:56.220Z: Autoscaling: Resized worker pool from 5 to 0.
    Apr 05, 2020 12:59:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-04-05T00:59:56.265Z: Worker pool stopped.
    Apr 05, 2020 1:00:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-04-04_17_55_41-13248156623775382275 finished with status DONE.
    Apr 05, 2020 1:00:02 AM com.mongodb.diagnostics.logging.SLF4JLogger warn
    WARNING: Got socket exception on connection [connectionId{localValue:2, serverValue:2}] to 35.193.56.236:27017. All connections to 35.193.56.236:27017 will be closed.
    Apr 05, 2020 1:00:02 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Closed connection [connectionId{localValue:2, serverValue:2}] to 35.193.56.236:27017 because there was a socket exception raised by this connection.
org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    com.mongodb.MongoSocketReadException: Exception receiving message
        at com.mongodb.internal.connection.InternalStreamConnection.translateReadException(InternalStreamConnection.java:559)
        at com.mongodb.internal.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:444)
        at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:295)
        at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:255)
        at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99)
        at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:444)
        at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72)
        at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:200)
        at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269)
        at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131)
        at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:123)
        at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:242)
        at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:213)
        at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:205)
        at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:115)
        at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:108)
        at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:56)
        at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:179)
        at com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:182)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:151)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:146)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:136)
        at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.getCollectionSizeInBytes(MongoDBIOIT.java:192)
        at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:168)

        Caused by: java.io.IOException: Connection reset by peer
            at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
            at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
            at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
            at sun.nio.ch.IOUtil.read(IOUtil.java:197)
            at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:377)
            at com.mongodb.internal.connection.SocketChannelStream.read(SocketChannelStream.java:113)
            at com.mongodb.internal.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:570)
            at com.mongodb.internal.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:441)
            ... 22 more
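
The failing frame is MongoDBIOIT.getCollectionSizeInBytes (MongoDBIOIT.java:192), which issues a database command through MongoDatabaseImpl.runCommand right after the Dataflow write job finishes; the underlying "Connection reset by peer" means the client's pooled connection to 35.193.56.236:27017 had already been dropped by the server while the roughly four-minute job ran. A minimal sketch of that kind of size lookup, assuming a collStats command is what is being run (the collection name, class name, and standalone client below are illustrative, not taken from the test source):

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import org.bson.Document;

    public class CollectionSizeCheck {
      public static void main(String[] args) {
        // Illustrative host and database values mirroring the pipeline options above.
        try (MongoClient client = MongoClients.create("mongodb://35.193.56.236:27017")) {
          // collStats returns server-side statistics for a collection;
          // the "size" field is the uncompressed data size in bytes.
          Document stats = client.getDatabase("beam")
              .runCommand(new Document("collStats", "test_collection")); // collection name is hypothetical
          long sizeInBytes = ((Number) stats.get("size")).longValue();
          System.out.println("collection size in bytes: " + sizeInBytes);
          // If the server has already reset the pooled connection (as in this run),
          // runCommand surfaces a MongoSocketReadException instead of returning stats.
        }
      }
    }
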
org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    Apr 05, 2020 1:00:02 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:3, serverValue:13}] to 35.193.56.236:27017
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:mongodb:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.018 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 28.767 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/iakmpny6heqha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
