See <https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/3946/display/redirect?page=changes>
Changes:

[noreply] [BEAM-11312] Minor improvement in retrieving build log (#13501)

------------------------------------------
[...truncated 265.05 KB...]
> Task :runners:google-cloud-dataflow-java:classes
Skipping task ':runners:google-cloud-dataflow-java:classes' as it has no actions.
:runners:google-cloud-dataflow-java:classes (Thread[Daemon ****,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:jar (Thread[Daemon ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:jar
Watching 1207 directories to track changes
Watching 1208 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:jar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:jar' is not up-to-date because:
  No history is available.
Watching 1208 directories to track changes
Watching 1209 directories to track changes
:runners:google-cloud-dataflow-java:jar (Thread[Daemon ****,5,main]) completed. Took 0.086 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Daemon ****,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Watching 1172 directories to track changes
Watching 1173 directories to track changes
Watching 1173 directories to track changes
Watching 1230 directories to track changes
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 9950d00044f5c02c09b45c2ca677d64d
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Watching 1230 directories to track changes
Watching 1230 directories to track changes
Watching 1230 directories to track changes
Watching 1250 directories to track changes
Watching 1259 directories to track changes
Watching 1260 directories to track changes
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 9950d00044f5c02c09b45c2ca677d64d
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** for ':',5,main]) completed. Took 0.343 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** for ':',5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/src/main/java'>, not found
Watching 1210 directories to track changes
Watching 1210 directories to track changes
Watching 1210 directories to track changes
Watching 1263 directories to track changes
Watching 1295 directories to track changes
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is 372840dfa9d99f430311b54ff6299823
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
Watching 1295 directories to track changes
Watching 1295 directories to track changes
Watching 1295 directories to track changes
Watching 1327 directories to track changes
Watching 1337 directories to track changes
Watching 1339 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key 372840dfa9d99f430311b54ff6299823
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Daemon ****,5,main]) completed. Took 0.298 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Daemon ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Daemon ****,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Daemon ****,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Watching 1260 directories to track changes
Watching 1261 directories to track changes
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
Watching 1261 directories to track changes
Watching 1339 directories to track changes
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** for ':' Thread 6,5,main]) completed. Took 0.242 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 6,5,main]) started.
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
Watching 1339 directories to track changes
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main'>.
Watching 1339 directories to track changes
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package'>.
Watching 1339 directories to track changes
Watching 1340 directories to track changes
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is c27ed34ae94c5708a9f37658148949cc
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
Watching 1340 directories to track changes
Watching 1341 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key c27ed34ae94c5708a9f37658148949cc
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Daemon ****,5,main]) completed. Took 0.219 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Watching 1340 directories to track changes
Watching 1340 directories to track changes
Watching 1340 directories to track changes
Watching 1351 directories to track changes
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is dfef2276b611cff825530ccbcdc5060e
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Watching 1351 directories to track changes
Watching 1351 directories to track changes
Watching 1351 directories to track changes
Watching 1362 directories to track changes
Watching 1363 directories to track changes
Watching 1364 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key dfef2276b611cff825530ccbcdc5060e
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 6,5,main]) completed. Took 0.182 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1364 directories to track changes
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>.
Watching 1364 directories to track changes
Watching 1365 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1365 directories to track changes
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, not found
Watching 1365 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 6,5,main]) completed. Took 0.036 secs.
:sdks:java:io:mongodb:integrationTest (Thread[Execution **** for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:mongodb:integrationTest
Watching 1365 directories to track changes
Watching 1365 directories to track changes
Watching 1365 directories to track changes
Watching 1365 directories to track changes
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is 352aaebb97cb4d543369d2b0aa74234d
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Watching 1365 directories to track changes
Watching 1365 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--influxMeasurement=mongodbioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--mongoDBDatabaseName=beam","--mongoDBHostName=34.66.180.171","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.7/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Dec 08, 2020 12:55:35 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster created with settings {hosts=[34.66.180.171:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
    Dec 08, 2020 12:55:35 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:1, serverValue:1}] to 34.66.180.171:27017
    Dec 08, 2020 12:55:35 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before timing out
    Dec 08, 2020 12:55:35 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Monitor thread successfully connected to server with description ServerDescription{address=34.66.180.171:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=4785275}
    Dec 08, 2020 12:55:45 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Closed connection [connectionId{localValue:2}] to 34.66.180.171:27017 because there was a socket exception raised by this connection.

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    com.mongodb.MongoSocketOpenException: Exception opening socket
        at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:70)
        at com.mongodb.internal.connection.InternalStreamConnection.open(InternalStreamConnection.java:128)
        at com.mongodb.internal.connection.UsageTrackingInternalConnection.open(UsageTrackingInternalConnection.java:50)
        at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.open(DefaultConnectionPool.java:398)
        at com.mongodb.internal.connection.DefaultConnectionPool.get(DefaultConnectionPool.java:115)
        at com.mongodb.internal.connection.DefaultConnectionPool.get(DefaultConnectionPool.java:101)
        at com.mongodb.internal.connection.DefaultServer.getConnection(DefaultServer.java:92)
        at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.getConnection(ClusterBinding.java:133)
        at com.mongodb.client.internal.ClientSessionBinding$SessionBindingConnectionSource.getConnection(ClientSessionBinding.java:135)
        at com.mongodb.operation.CommandOperationHelper$5.call(CommandOperationHelper.java:207)
        at com.mongodb.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:463)
        at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:203)
        at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:198)
        at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:59)
        at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:194)
        at com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:194)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:163)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:158)
        at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:148)
        at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.getCollectionSizeInBytes(MongoDBIOIT.java:203)
        at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:164)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at com.mongodb.internal.connection.SocketStreamHelper.initialize(SocketStreamHelper.java:64)
            at com.mongodb.internal.connection.SocketStream.initializeSocket(SocketStream.java:79)
            at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:65)
            ... 20 more

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    Dec 08, 2020 12:55:45 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before timing out
    Dec 08, 2020 12:55:46 PM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:3, serverValue:2}] to 34.66.180.171:27017

Gradle Test Executor 1 finished executing tests.
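Editor's note: the root cause above is a plain TCP connect timeout from the test runner to 34.66.180.171:27017, which the MongoDB driver wraps in MongoSocketOpenException. A minimal, self-contained probe can separate that case (packets silently dropped by a firewall rule or a dead VM, as seen here) from a connection refusal (host up, mongod down). The sketch below is hypothetical diagnostic code, not part of the Beam test suite; the class name MongoProbe, the status strings, and the 5-second timeout are our own choices.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class MongoProbe {

    /**
     * Attempt a bare TCP connect with a deadline, mirroring the step that
     * fails in com.mongodb.internal.connection.SocketStream.open above.
     * Returns a status string instead of throwing, so the three outcomes
     * are explicit.
     */
    static String probe(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return "open";                   // SYN/ACK came back: port reachable
        } catch (SocketTimeoutException e) {
            return "timeout";                // no reply at all, as in this build log
        } catch (IOException e) {
            return "refused-or-unreachable"; // e.g. ConnectException: host up, port closed
        }
    }

    public static void main(String[] args) {
        // Host and port taken from the pipeline options above; the timeout is arbitrary.
        System.out.println(probe("34.66.180.171", 27017, 5000));
    }
}
```

A "timeout" result from the Jenkins worker combined with "open" from elsewhere would point at a firewall or routing problem with the MongoDB instance rather than at the test code itself.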
> Task :sdks:java:io:mongodb:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
Watching 1367 directories to track changes
Watching 1373 directories to track changes
Watching 1374 directories to track changes
:sdks:java:io:mongodb:integrationTest (Thread[Execution **** for ':' Thread 6,5,main]) completed. Took 14.183 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 45s
107 actionable tasks: 64 executed, 43 from cache
Watching 1374 directories to track changes

Publishing build scan...
https://gradle.com/s/d2lrcfppnfai6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
