See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/5/display/redirect?page=changes>
Changes:

[iambruceactor] added more meetups
[suztomo] Google-cloud-clients to use 2019 versions
[lcwik] [BEAM-8298] Fully specify the necessary details to support side input
[chadrik] [BEAM-7746] Introduce a protocol to handle various types of partitioning
[iemejia] [BEAM-6957] Enable Counter/Distribution metrics tests for Portable Spark
[kcweaver] [BEAM-9200] fix portable jar test version property
[iemejia] [BEAM-9204] Refactor HBaseUtils methods to depend on Ranges
[iemejia] [BEAM-9204] Fix HBase SplitRestriction to be based on provided Range
[echauchot] [BEAM-9205] Add ValidatesRunner annotation to the MetricsPusherTest
[echauchot] [BEAM-9205] Fix validatesRunner tests configuration in spark module
[jbonofre] [BEAM-7427] Refactore JmsCheckpointMark to be usage via Coder
[iemejia] [BEAM-7427] Adjust JmsIO access levels and other minor fixes
[pabloem] Merge pull request #10346 from [BEAM-7926] Data-centric Interactive
[chamikara] Fix Spanner auth endpoints
[chadrik] [BEAM-7746] Stop automatically creating staticmethods in register_urn

------------------------------------------
[...truncated 72.99 KB...]
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-5 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/791401ed-8682-3f59-8223-d95b633d755b].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
......................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-5]
Cluster placed in zone [us-central1-a].
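The `metadata+=` lines in the trace above build up the value passed to `gcloud dataproc clusters create --metadata` as comma-separated `key=value` pairs. A hypothetical sketch of that assembly step (this is not the actual `flink_cluster.sh` code, just the same joining logic with a subset of the keys from this log):

```python
# Hypothetical reconstruction of the --metadata flag assembly seen in the
# trace above: key=value pairs joined with commas into one flag value.
metadata = {
    "flink-start-yarn-session": "true",
    "flink-taskmanager-slots": "1",
    "beam-sdk-harness-images-to-pull": "gcr.io/apache-beam-testing/beam_portability/java_sdk:latest",
    "beam-job-server-image": "gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest",
}

# Join into the single comma-separated value gcloud expects.
metadata_flag = "--metadata " + ",".join(f"{k}={v}" for k, v in metadata.items())
print(metadata_flag)
```

The shell script does the same thing by string concatenation, which is why a trailing comma is visible in the actual `gcloud` invocation.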
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-5
Warning: Permanently added 'compute.4413311484161737698' (ECDSA) to the list of known hosts.
20/01/29 12:42:30 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-5-m/10.128.0.90:8032
+ read line
+ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ sed 's/ .*//'
+ application_ids[$i]=application_1580301682932_0001
++ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ sed 's/.*beam-loadtests-java-portable-flink-batch-5/beam-loadtests-java-portable-flink-batch-5/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-5'
bf6d9a16e805ef405c55d568954c2ab8e381bebac694b82c2f23d54fa0bfa9a6
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301682932_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-284e6819-944f-4584-a730-42dfae65c3b8"},{"key":"jobmanager.rpc.port","value":"35959"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301682932_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301682932_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-284e6819-944f-4584-a730-42dfae65c3b8"},{"key":"jobmanager.rpc.port","value":"35959"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301682932_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=35959
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1026063840166715975.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
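The tunnel setup in the trace above fetches Flink's `/jobmanager/config` REST endpoint (a JSON array of `{"key": ..., "value": ...}` pairs) and extracts `jobmanager.rpc.port` with an inline python one-liner. A standalone sketch of that same extraction, using a trimmed-down sample of the config from this log (the real response contains many more pairs):

```python
import json

# Trimmed sample of the /jobmanager/config response seen in this log.
config_json = """[
  {"key": "jobmanager.rpc.address",
   "value": "beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},
  {"key": "jobmanager.rpc.port", "value": "35959"},
  {"key": "taskmanager.numberOfTaskSlots", "value": "1"}
]"""

def lookup(config_text, key):
    # Same logic as the inline one-liner in the trace: take the first
    # entry whose "key" matches and return its "value" (a string).
    return [e["value"] for e in json.loads(config_text) if e["key"] == key][0]

jobmanager_rpc_port = lookup(config_json, "jobmanager.rpc.port")
print(jobmanager_rpc_port)  # 35959
```

That port is then forwarded through the SSH tunnel (`-L 35959:...:35959`) so the job server on the master can reach the JobManager RPC endpoint.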
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
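The job description calls this "2GB of 10B records", which follows from the `--sourceOptions` passed to the Gradle invocation above. A quick sanity check of that arithmetic (plain Python, not part of the test harness):

```python
# sourceOptions from the invocation above:
# {"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9}
num_records = 200_000_000
key_size_bytes = 1
value_size_bytes = 9

record_size = key_size_bytes + value_size_bytes   # 10 B per record
total_bytes = num_records * record_size

print(record_size)   # 10
print(total_bytes)   # 2000000000
```

That 2 x 10^9 figure matches the `total_bytes_count` of 2.0E9 reported in the load test results.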
Load test results for test (ID): a1f68f7a-7f40-4c8c-959a-86b32f49eab6 and timestamp: 2020-01-29T12:42:53.067000000Z:
                 Metric:       Value:
             runtime_sec     1200.452
       total_bytes_count        2.0E9

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 21m 9s
61 actionable tasks: 10 executed, 5 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/3aynkctmo24xw

[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3795386177324874177.sh
+ echo Changing number of workers to 16
Changing number of workers to 16
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
FLINK_NUM_WORKERS=16
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4596650120239358043.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-java-portable-flink-batch-5-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-java-portable-flink-batch-5 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/d8d222ba-f485-344b-8a6c-c5cce2d41d7b].
Waiting for cluster deletion operation...
.....................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.delete) Operation [projects/apache-beam-testing/regions/global/operations/d8d222ba-f485-344b-8a6c-c5cce2d41d7b] timed out.
Build step 'Execute shell' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
