See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/2960/display/redirect?page=changes>
Changes:

[alex] Add Beam Schema Options to changelog

[alex] [BEAM-9704] Deprecate FieldType metadata

[eekkaaadrian] [BEAM-9705] Go sdk add value length validation checking on write to


------------------------------------------
[...truncated 219.40 KB...]
No history is available.
All input files are considered out-of-date for incremental task ':runners:java-fn-execution:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 5.718 secs. 29280 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:java-fn-execution:compileJava'
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':',5,main]) completed. Took 14.783 secs.
:runners:java-fn-execution:classes (Thread[Execution worker for ':',5,main]) started.

> Task :runners:java-fn-execution:classes
Skipping task ':runners:java-fn-execution:classes' as it has no actions.
:runners:java-fn-execution:classes (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:runners:java-fn-execution:jar (Thread[Execution worker for ':',5,main]) started.

> Task :runners:java-fn-execution:jar
Build cache key for task ':runners:java-fn-execution:jar' is d7f6b2a66d641bad21a2851250cff6cb
Caching disabled for task ':runners:java-fn-execution:jar': Caching has not been enabled for the task
Task ':runners:java-fn-execution:jar' is not up-to-date because:
  No history is available.
:runners:java-fn-execution:jar (Thread[Execution worker for ':',5,main]) completed. Took 0.05 secs.
:runners:direct-java:compileJava (Thread[Execution worker for ':',5,main]) started.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Daemon worker,5,main]) started.

> Task :runners:direct-java:compileJava
Build cache key for task ':runners:direct-java:compileJava' is 0f639da449965f1ce916badd76d4d6d1
Task ':runners:direct-java:compileJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:direct-java:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.056 secs. 1 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:direct-java:compileJava'
:runners:direct-java:compileJava (Thread[Execution worker for ':',5,main]) completed. Took 9.965 secs.
:runners:direct-java:classes (Thread[Execution worker for ':',5,main]) started.

> Task :runners:direct-java:classes
Skipping task ':runners:direct-java:classes' as it has no actions.
:runners:direct-java:classes (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
file or directory '<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is 34efab8236e182308b8125211bf0c2c7
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
file or directory '<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Compiling with error-prone compiler

> Task :runners:direct-java:shadowJar
Build cache key for task ':runners:direct-java:shadowJar' is 95b77b200f173455e79a0ad0633dc219
Caching disabled for task ':runners:direct-java:shadowJar': Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.617s [617ms]
Average Time/Jar: 0.1028333333333s [102.8333333333ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.887 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is c39b006b816cdd6e4d4aa3c97c2cef61
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.129 secs. 302 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Daemon worker,5,main]) completed. Took 18.631 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Daemon worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Daemon worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is fed90b8ec03f6ca7672afb8b494355ea
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 3.252s [3252ms]
Average Time/Jar: 0.20325s [203.25ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Daemon worker,5,main]) completed. Took 4.056 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.511 secs. 1576 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:google-cloud-platform:compileTestJava'
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 14.366 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is 6d2409cf32cc2596d61a3b729a499658
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.057 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 7767523345e94cace7536b6838096b2a
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.074 secs. 1576 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:compileTestJava'
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 8.526 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 4401c3c34c7ff4f4005c066bfc92de78
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.029 secs.
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:mongodb:integrationTest
Build cache key for task ':sdks:java:io:mongodb:integrationTest' is bb4dcbf65ddbb5162badcbc409d7a892
Task ':sdks:java:io:mongodb:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:mongodb:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=10000000","--bigQueryDataset=beam_performance","--bigQueryTable=mongodbioit_results","--mongoDBDatabaseName=beam","--mongoDBHostName=35.223.169.207","--mongoDBPort=27017","--runner=DataflowRunner","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.mongodb.MongoDBIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Apr 07, 2020 6:57:05 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster created with settings {hosts=[35.223.169.207:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead STANDARD_ERROR
    Apr 07, 2020 6:57:05 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Cluster description not yet available. Waiting for 30000 ms before timing out
    Apr 07, 2020 6:57:05 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:1, serverValue:1}] to 35.223.169.207:27017
    Apr 07, 2020 6:57:05 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Monitor thread successfully connected to server with description ServerDescription{address=35.223.169.207:27017, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 2, 5]}, minWireVersion=0, maxWireVersion=8, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=3820708}
    Apr 07, 2020 6:57:05 AM com.mongodb.diagnostics.logging.SLF4JLogger info
    INFO: Opened connection [connectionId{localValue:2, serverValue:2}] to 35.223.169.207:27017

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:mongodb:integrationTest FAILED

org.apache.beam.sdk.io.mongodb.MongoDBIOIT > testWriteAndRead FAILED
    java.lang.RuntimeException: Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions)
        at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
        at org.apache.beam.sdk.util.InstanceBuilder.build(InstanceBuilder.java:155)
        at org.apache.beam.sdk.PipelineRunner.fromOptions(PipelineRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:165)

        Caused by: java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:214)
            ... 6 more

            Caused by: java.lang.IllegalArgumentException: Missing required values: region
                at org.apache.beam.runners.dataflow.DataflowRunner.fromOptions(DataflowRunner.java:246)
                ... 11 more

1 test completed, 1 failed
Finished generating test XML results (0.021 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
:sdks:java:io:mongodb:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 5.459 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:mongodb:integrationTest'.
> There were failing tests.
  See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 54s
81 actionable tasks: 68 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zdqlrbucp56vo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
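Note on the failure above: the test never reaches MongoDB or Dataflow. DataflowRunner.fromOptions validates the pipeline options built from the -DbeamTestPipelineOptions list shown earlier, and that list contains no --region entry, which produces "Missing required values: region". The likely fix for this Jenkins job is adding a --region argument to beamTestPipelineOptions. The sketch below reproduces that validation path in isolation; it is not part of the Beam test suite, and the class name RegionOptionExample and the region value us-central1 are illustrative assumptions only.

    // Minimal sketch (assumed example, not from the Beam repository): shows how a
    // --region value satisfies the check that DataflowRunner.fromOptions performs above.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RegionOptionExample { // hypothetical class name
      public static void main(String[] args) {
        // args would look like the beamTestPipelineOptions list, e.g.
        // {"--runner=DataflowRunner", "--project=apache-beam-testing", "--region=us-central1"}.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        if (options.getRegion() == null) {
          options.setRegion("us-central1"); // assumed region; the real job may need a different one
        }
        System.out.println("Dataflow region: " + options.getRegion());
      }
    }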
