See <https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/201/display/redirect?page=changes>
Changes:

[github] [BEAM-8803] BigQuery Streaming Inserts are always retried by default.

------------------------------------------
[...truncated 522 B...]
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 885ecbf3f49257e8b6b4ac376cb5a79ed6282580 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 885ecbf3f49257e8b6b4ac376cb5a79ed6282580
Commit message: "[BEAM-8803] BigQuery Streaming Inserts are always retried by default. (#10195)"
 > git rev-list --no-walk 639fc24b2a1d6886197750fe6be75e8ad8029205 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_GBK_Dataflow_Streaming] $ /bin/bash -xe /tmp/jenkins6813877594107440949.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=org.apache.beam.sdk.loadtests.GroupByKeyLoadTest -Prunner=:runners:google-cloud-dataflow-java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Dataflow_streaming_GBK_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_dataflow_streaming_GBK_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --numWorkers=5 --autoscalingAlgorithm=NONE --streaming=true --inputWindowDurationSec=1200 --runner=DataflowRunner' :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Configure project :sdks:java:container
Found go 1.12 in /usr/bin/go, use it.
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:kafka:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.RuntimeException: Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions)
	at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:224)
	at org.apache.beam.sdk.util.InstanceBuilder.build(InstanceBuilder.java:155)
	at org.apache.beam.sdk.PipelineRunner.fromOptions(PipelineRunner.java:55)
	at org.apache.beam.sdk.Pipeline.create(Pipeline.java:147)
	at org.apache.beam.sdk.loadtests.LoadTest.<init>(LoadTest.java:75)
	at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.<init>(GroupByKeyLoadTest.java:78)
	at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.main(GroupByKeyLoadTest.java:131)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.beam.sdk.util.InstanceBuilder.buildFromMethod(InstanceBuilder.java:214)
	... 6 more
Caused by: java.lang.IllegalArgumentException: DataflowRunner requires gcpTempLocation, but failed to retrieve a value from PipelineOptions
	at org.apache.beam.runners.dataflow.DataflowRunner.fromOptions(DataflowRunner.java:257)
	... 11 more
Caused by: java.lang.IllegalArgumentException: Error constructing default value for gcpTempLocation: tempLocation is not a valid GCS path, gs://temp-storage-for-perf-tests/loadtests.
	at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:311)
	at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:288)
	at org.apache.beam.sdk.options.ProxyInvocationHandler.returnDefaultHelper(ProxyInvocationHandler.java:592)
	at org.apache.beam.sdk.options.ProxyInvocationHandler.getDefault(ProxyInvocationHandler.java:533)
	at org.apache.beam.sdk.options.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:158)
	at com.sun.proxy.$Proxy0.getGcpTempLocation(Unknown Source)
	at org.apache.beam.runners.dataflow.DataflowRunner.fromOptions(DataflowRunner.java:255)
	... 11 more
Caused by: java.lang.RuntimeException: Unable to verify that GCS bucket gs://temp-storage-for-perf-tests exists.
	at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.verifyPathIsAccessible(GcsPathValidator.java:86)
	at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.validateOutputFilePrefixSupported(GcsPathValidator.java:53)
	at org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory.create(GcpOptions.java:308)
	... 17 more
Caused by: java.io.IOException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified. It is possible to skip checking for Compute Engine metadata by specifying the environment variable NO_GCE_CHECK=true.
	at com.google.auth.oauth2.ComputeEngineCredentials.refreshAccessToken(ComputeEngineCredentials.java:156)
	at com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:181)
	at com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:167)
	at com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
	at com.google.cloud.hadoop.util.ChainingHttpRequestInitializer.initialize(ChainingHttpRequestInitializer.java:52)
	at com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:397)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
	at com.google.cloud.hadoop.util.ResilientOperation$AbstractGoogleClientRequestExecutor.call(ResilientOperation.java:171)
	at com.google.cloud.hadoop.util.ResilientOperation.retry(ResilientOperation.java:67)
	at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.getBucket(GcsUtil.java:490)
	at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.bucketAccessible(GcsUtil.java:478)
	at org.apache.beam.sdk.extensions.gcp.util.GcsUtil.bucketAccessible(GcsUtil.java:451)
	at org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator.verifyPathIsAccessible(GcsPathValidator.java:83)
	... 19 more

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43s
70 actionable tasks: 46 executed, 24 from cache

Publishing build scan...
https://gradle.com/s/43oglyeq6q3jg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
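
A note on the failure chain above: the innermost cause is credentials, not path syntax. `gs://temp-storage-for-perf-tests/loadtests` is a well-formed GCS path; the "tempLocation is not a valid GCS path" message surfaces only because the bucket-existence check behind it could not obtain an access token (404 from the Compute Engine metadata server). A minimal, hypothetical sketch of the bucket/prefix split that happens before that bucket check (illustrative names only, not Beam's actual `GcsPath` API):

```java
// Hypothetical helper showing how a gs:// temp location decomposes into a
// bucket (whose existence check requires credentials, and is what failed in
// the log above) and an object prefix. Not Beam's real implementation.
public class GcsTempLocationSketch {
    static String bucketOf(String gcsPath) {
        if (!gcsPath.startsWith("gs://")) {
            throw new IllegalArgumentException("Not a GCS path: " + gcsPath);
        }
        // Strip the scheme; the bucket is everything up to the first '/'.
        String rest = gcsPath.substring("gs://".length());
        int slash = rest.indexOf('/');
        return slash < 0 ? rest : rest.substring(0, slash);
    }

    public static void main(String[] args) {
        // The option from the failing run: the bucket printed below is what
        // GcsPathValidator tried (and failed) to verify.
        System.out.println(bucketOf("gs://temp-storage-for-perf-tests/loadtests"));
        // prints: temp-storage-for-perf-tests
    }
}
```

Under that reading, the fix is to provide working Google Cloud credentials on the Jenkins worker (or a valid metadata-server scope), rather than to change the `--tempLocation` value itself.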
