See <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/221/display/redirect?page=changes>
Changes:

[ankurgoenka] [BEAM-9402] Remove options overwrite
[chadrik] Add pre-commit hook for pylint
[suztomo] grpc-google-cloud-pubsub-v1 1.85.1
[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible
[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the
[github] Update Python roadmap for 2.7 eol
[mxm] [BEAM-9474] Improve robustness of BundleFactory and ProcessEnvironment
[github] [BEAM-7815] update MemoryReporter comments about using guppy3 (#11073)
[rohde.samuel] [BEAM-8335] Modify the StreamingCache to subclass the CacheManager
[sunjincheng121] [BEAM-9298] Drop support for Flink 1.7

------------------------------------------
[...truncated 46.51 KB...]
c80bc4af4ded: Layer already exists
91daf9fc6311: Layer already exists
162804eaaa1e: Layer already exists
d040e6423b7a: Layer already exists
00adafc8e77b: Layer already exists
2c995a2087c1: Layer already exists
91b635539185: Pushed
9adbecfa9860: Pushed
3ce5cf5f8395: Pushed
cb6f2fe4ebcd: Pushed
latest: digest: sha256:b255f9988f2e83d4fd5fbc494a495b28ee9dc4f0c34fd60e7b7102b22afc5fa0 size: 4526
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.10:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.10:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:flink:1.10:job-server:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:flink:1.10:job-server-container:copyLicenses
> Task :model:job-management:processResources
> Task :runners:flink:1.10:job-server-container:dockerClean UP-TO-DATE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.10:copySourceOverrides
> Task :runners:flink:1.10:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.10:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 21s
61 actionable tasks: 42 executed, 18 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/rdulyfrj33bmm

[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins62583144800345978.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4282766393031629755.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5841048431576393093.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8977489282343395776.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
49333dc0d6d3: Preparing
065bbd8c9021: Preparing
222351333c13: Preparing
f9ac396c8ed1: Preparing
7f2fcf9e7e07: Preparing
ac3e2c206c49: Preparing
3663b7fed4c9: Preparing
832f129ebea4: Preparing
6670e930ed33: Preparing
c7f27a4eb870: Preparing
e70dfb4c3a48: Preparing
1c76bd0dc325: Preparing
ac3e2c206c49: Waiting
3663b7fed4c9: Waiting
6670e930ed33: Waiting
832f129ebea4: Waiting
c7f27a4eb870: Waiting
e70dfb4c3a48: Waiting
1c76bd0dc325: Waiting
065bbd8c9021: Pushed
49333dc0d6d3: Pushed
222351333c13: Pushed
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
c7f27a4eb870: Layer already exists
e70dfb4c3a48: Layer already exists
1c76bd0dc325: Layer already exists
7f2fcf9e7e07: Pushed
f9ac396c8ed1: Pushed
latest: digest: sha256:414a07127d61d0fc9f6541c4967c90eb37f082f52f6db7180d6c568796e901b1 size: 2841
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-python-combine-flink-batch-221
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-batch-221
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6937580288019241311.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6302400549300706121.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-combine-flink-batch-221-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]
/ [1 files][  2.3 KiB/  2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]
/ [2 files][  6.0 KiB/  6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.4 KiB]
/ [3 files][ 13.4 KiB/ 13.4 KiB]
- Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-combine-flink-batch-221 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/29e3fd20-8a0c-315a-8f1d-8bb43668d769].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
................................................................................................................................
WARNING: Cluster beam-loadtests-python-combine-flink-batch-221 failed to create. Beginning automated resource cleanup process.
done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/29e3fd20-8a0c-315a-8f1d-8bb43668d769] failed: Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/58d938a3-2904-4cfe-8fd5-30c701fd39cd/beam-loadtests-python-combine-flink-batch-221-m/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
