See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/171/display/redirect?page=changes>
Changes:

[kirillkozlov] Fix MongoDb SQL Integration Tests

[kirillkozlov] Add MongoDbIT back to build file

[kirillkozlov] Update JavaDoc comment and remove pipeline options

[pabloem] [BEAM-876] Support schemaUpdateOption in BigQueryIO (#9524)


------------------------------------------
[...truncated 213.72 KB...]
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
file or directory '<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is c1e399915581e42118f9183df5cf9aed
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
file or directory '<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Compiling with error-prone compiler

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 1.233 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:google-cloud-platform:compileTestJava'
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 12.068 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is 73f93c6b2fa58ef01cc55b46b6bd5fa8
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.07 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 7,5,main]) started.
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is f44f475cdb1fb123e86e9e2bf87debfe
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler

> Task :sdks:java:io:bigquery-io-perf-tests:compileTestJava
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is ee0f95edada4ff1f0d538c1bbb7168d8
Task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Created classpath snapshot for incremental compilation in 0.082 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 3.337 secs.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :sdks:java:io:bigquery-io-perf-tests:testClasses
Skipping task ':sdks:java:io:bigquery-io-perf-tests:testClasses' as it has no actions.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.065 secs. 1478 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:compileTestJava'
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 8.775 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is fb2f0158e9b08eb586f66c79c65e0560
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.034 secs.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.129 secs. 14 duplicate classes found in classpath (see all with --debug).
Packing task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 15.647 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is b797e89516b8633aff7c336a819457c9
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS
Total Jars: 16 (includes project)
Total Time: 3.194s [3194ms]
Average Time/Jar: 0.199625s [199.625ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3.993 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is a67be5a3fb6d2ef731f46621c3721158
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Starting process 'Gradle Test Executor 1'.
Working directory: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.18.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.RuntimeException: Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions)
        at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
        at org.apache.beam.sdk.util.InstanceBuilder.build(InstanceBuilder.java:155)
        at org.apache.beam.sdk.PipelineRunner.fromOptions(PipelineRunner.java:55)
        at org.apache.beam.sdk.Pipeline.create(Pipeline.java:147)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:152)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testJsonWrite(BigQueryIOIT.java:134)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:120)

        Caused by: java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:214)
            ... 6 more

        Caused by: java.lang.IllegalArgumentException: DataflowRunner requires gcpTempLocation, but failed to retrieve a value from PipelineOptions
            at org.apache.beam.runners.dataflow.DataflowRunner.fromOptions(DataflowRunner.java:257)
            ... 11 more

        Caused by: java.lang.IllegalArgumentException: Error constructing default value for gcpTempLocation: tempLocation is not a valid GCS path, gs://temp-storage-for-perf-tests/loadtests.
            at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:311)
            at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:288)
            at org.apache.beam.sdk.options.ProxyInvocationHandler.returnDefaultHelper(ProxyInvocationHandler.java:592)
            at org.apache.beam.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:533)
            at org.apache.beam.sdk.options.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:158)
            at com.sun.proxy.$Proxy12.getGcpTempLocation(Unknown Source)
            at org.apache.beam.runners.dataflow.DataflowRunner.fromOptions(DataflowRunner.java:255)
            ... 11 more

        Caused by: java.lang.RuntimeException: Unable to verify that GCS bucket gs://temp-storage-for-perf-tests exists.
            at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.verifyPathIsAccessible(GcsPathValidator.java:86)
            at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.validateOutputFilePrefixSupported(GcsPathValidator.java:53)
            at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:308)
            ... 17 more

        Caused by: java.io.IOException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified. It is possible to skip checking for Compute Engine metadata by specifying the environment variable NO_GCE_CHECK=true.
            at com.google.auth.oauth2.ComputeEngineCredentials.refreshAccessToken(ComputeEngineCredentials.java:156)
            at com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:181)
            at com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:167)
            at com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
            at com.google.cloud.hadoop.util.ChainingHttpRequestInitializer.initialize(ChainingHttpRequestInitializer.java:52)
            at com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:397)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at com.google.cloud.hadoop.util.ResilientOperation$AbstractGoogleClientRequestExecutor.call(ResilientOperation.java:171)
            at com.google.cloud.hadoop.util.ResilientOperation.retry(ResilientOperation.java:67)
            at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.getBucket(GcsUtil.java:490)
            at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.bucketAccessible(GcsUtil.java:478)
            at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.bucketAccessible(GcsUtil.java:451)
            at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.verifyPathIsAccessible(GcsPathValidator.java:83)
            ... 19 more

Gradle Test Executor 1 finished executing tests.
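The chain of causes above reduces to one problem: DataflowRunner#fromOptions could not derive gcpTempLocation from tempLocation, because verifying the gs://temp-storage-for-perf-tests bucket needed credentials and the default service-account lookup against the Compute Engine metadata server returned 404. For orientation only (this is not the test's own code, and it does not fix the Jenkins worker's metadata problem), here is a minimal Java sketch of how these options are normally wired, reusing the project, bucket, and runner values from the command line above:

import java.io.IOException;

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

import com.google.auth.oauth2.GoogleCredentials;

public class TempLocationSketch {
  public static void main(String[] args) throws IOException {
    // Same shape as the options passed via -DbeamTestPipelineOptions above.
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setProject("apache-beam-testing");
    options.setRunner(DataflowRunner.class);

    // DataflowRunner needs gcpTempLocation; by default it is derived from
    // tempLocation, which is why the validation of
    // gs://temp-storage-for-perf-tests/loadtests sits on the failure path above.
    options.setTempLocation("gs://temp-storage-for-perf-tests/loadtests");
    // Setting it explicitly bypasses the GcpTempLocationFactory default that failed.
    options.setGcpTempLocation("gs://temp-storage-for-perf-tests/loadtests");

    // The root cause was a 404 from the GCE metadata server during the default
    // credential lookup; supplying credentials explicitly (e.g. Application Default
    // Credentials via GOOGLE_APPLICATION_CREDENTIALS) avoids relying on metadata.
    options.setGcpCredential(GoogleCredentials.getApplicationDefault());

    Pipeline pipeline = Pipeline.create(options);
    // ... build and run the pipeline as usual.
  }
}

On a worker without usable credentials, submission would still fail; the error message's own suggestion of setting NO_GCE_CHECK=true, or pointing GOOGLE_APPLICATION_CREDENTIALS at a service-account key, are the environment-level alternatives.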
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > classMethod FAILED
    com.google.cloud.bigquery.BigQueryException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified. It is possible to skip checking for Compute Engine metadata by specifying the environment variable NO_GCE_CHECK=true.
        at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:99)
        at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.deleteTable(HttpBigQueryRpc.java:274)
        at com.google.cloud.bigquery.BigQueryImpl$9.call(BigQueryImpl.java:352)
        at com.google.cloud.bigquery.BigQueryImpl$9.call(BigQueryImpl.java:349)
        at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
        at com.google.cloud.RetryHelper.run(RetryHelper.java:76)
        at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
        at com.google.cloud.bigquery.BigQueryImpl.delete(BigQueryImpl.java:349)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.tearDown(BigQueryIOIT.java:115)

        Caused by: java.io.IOException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified. It is possible to skip checking for Compute Engine metadata by specifying the environment variable NO_GCE_CHECK=true.
            at com.google.auth.oauth2.ComputeEngineCredentials.refreshAccessToken(ComputeEngineCredentials.java:156)
            at com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:181)
            at com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:167)
            at com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
            at com.google.cloud.http.HttpTransportOptions$1.initialize(HttpTransportOptions.java:159)
            at com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:397)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.deleteTable(HttpBigQueryRpc.java:271)
            ... 7 more

2 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3.375 secs.
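The classMethod failure is the same credential problem surfacing in BigQueryIOIT.tearDown, which removes the test table through the google-cloud-bigquery client. A minimal sketch of that kind of cleanup call, with dataset and table names taken from the pipeline options above purely for illustration (this is not the test's actual tearDown code):

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;

public class DeleteTestTableSketch {
  public static void main(String[] args) {
    // Client construction uses Application Default Credentials; on this Jenkins
    // worker that lookup fell through to the GCE metadata server and got a 404,
    // which is what surfaces as the BigQueryException in tearDown above.
    BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();

    // --testBigQueryDataset / --testBigQueryTable values from the command line.
    TableId table = TableId.of("beam_performance", "bqio_write_10GB_java");
    boolean deleted = bigQuery.delete(table);
    System.out.println(deleted ? "Table deleted" : "Table not found");
  }
}

Note that BigQuery.delete returns false rather than throwing when the table is already gone, so the 404 in the log comes from the access-token fetch, not from a missing table.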
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 4s
80 actionable tasks: 58 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/jsrp5n76umle4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
