See <https://builds.apache.org/job/beam_LoadTests_Python_ParDo_Flink_Batch/91/display/redirect?page=changes>
Changes:

[sunjincheng121] [BEAM-7949] Add time-based cache threshold support in the data service
[sunjincheng121] [BEAM-7949] Introduce PeriodicThread for time-based cache threshold
[relax] Merge pull request #10444: [BEAM-9010] Proper TableRow size calculation

------------------------------------------
[...truncated 53.97 KB...]
> Task :model:job-management:extractProto
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:kafka:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 2s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/pnyvxkbdgch2q

[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2868442187833761128.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins936771583852189961.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8157969673273076527.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6310901029255877954.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server]
649023ae6f2c: Preparing
985a2b417dcf: Preparing
92637ee02aca: Preparing
2ee490fbc316: Preparing
b18043518924: Preparing
9a11244a7e74: Preparing
5f3a5adb8e97: Preparing
73bfa217d66f: Preparing
91ecdd7165d3: Preparing
e4b20fcc48f4: Preparing
9a11244a7e74: Waiting
5f3a5adb8e97: Waiting
91ecdd7165d3: Waiting
e4b20fcc48f4: Waiting
2ee490fbc316: Layer already exists
b18043518924: Layer already exists
9a11244a7e74: Layer already exists
5f3a5adb8e97: Layer already exists
73bfa217d66f: Layer already exists
91ecdd7165d3: Layer already exists
e4b20fcc48f4: Layer already exists
649023ae6f2c: Pushed
92637ee02aca: Pushed
985a2b417dcf: Pushed
latest: digest: sha256:109fbe2769058c85e5a781f84ba0c7fee7cc93b6cda29874da7e8c44ec9cabc1 size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
CLUSTER_NAME=beam-loadtests-python-pardo-flink-batch-91
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-pardo-flink-batch-91
GCLOUD_ZONE=us-central1-a
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5993155086654947551.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5158988985421881825.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_ParDo_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-pardo-flink-batch-91-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]
/ [1 files][  2.3 KiB/  2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]
/ [2 files][  6.0 KiB/  6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.4 KiB]
/ [3 files][ 13.4 KiB/ 13.4 KiB]
- Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-pardo-flink-batch-91 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/c272c142-b33d-361a-8089-491f28785cf5].
Waiting for cluster creation operation...
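Aside: for readers following the set -x trace, the metadata string handed to gcloud dataproc clusters create can be read as the following minimal Python sketch. The function name and signature are illustrative, not part of flink_cluster.sh; the real logic is the shell string concatenation shown above.

def build_dataproc_metadata(flink_url, hadoop_jar_url, taskmanager_slots,
                            harness_images=None, job_server_image=None):
    # Mandatory entries, mirroring the shell "metadata=" / "metadata+=" lines.
    metadata = ("flink-snapshot-url=%s,"
                "flink-start-yarn-session=true,"
                "flink-taskmanager-slots=%s,"
                "hadoop-jar-url=%s") % (flink_url, taskmanager_slots,
                                        hadoop_jar_url)
    # Image entries are appended only when the corresponding variables are
    # non-empty, matching the [[ -n ... ]] checks in the trace.
    if harness_images:
        metadata += ",beam-sdk-harness-images-to-pull=%s" % harness_images
    if job_server_image:
        metadata += ",beam-job-server-image=%s" % job_server_image
    return metadata

The resulting string is passed verbatim as the --metadata flag of the gcloud command above.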
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.................................................................................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-pardo-flink-batch-91]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-pardo-flink-batch-91-m '--command=yarn application -list'
++ grep beam-loadtests-python-pardo-flink-batch-91
Warning: Permanently added 'compute.3682459856002798672' (ECDSA) to the list of known hosts.
19/12/25 14:01:22 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-pardo-flink-batch-91-m/10.128.0.87:8032
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-pardo-flink-batch-91/beam-loadtests-python-pardo-flink-batch-91/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master:
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-pardo-flink-batch-91-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-pardo-flink-batch-91'
a8b7cde8851ff1781170a53f0d358491973b68128ca5e7ff1fc958e8e6839f4e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-pardo-flink-batch-91-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
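Note on the traceback: it comes from the inline Python helper that flink_cluster.sh uses to read jobmanager.rpc.port out of the Flink REST /jobmanager/config response. Because YARN_APPLICATION_MASTER was empty, the curl target degenerated to http:///jobmanager/config, curl -s produced no output, and json.load received an empty stream. A self-contained sketch of the same extraction (the function name and sample response are illustrative):

import json

def extract_config_value(config_json, key):
    # /jobmanager/config returns a JSON array of {"key": ..., "value": ...}
    # pairs; take the value of the first entry with the requested key.
    return [e['value'] for e in json.loads(config_json) if e['key'] == key][0]

# A well-formed response parses fine (the port value is made up):
sample = '[{"key": "jobmanager.rpc.port", "value": "37129"}]'
print(extract_config_value(sample, 'jobmanager.rpc.port'))  # 37129

# The empty string the build actually piped in cannot be parsed; this is
# the ValueError in the traceback above (on Python 3 it would surface as
# json.JSONDecodeError, a ValueError subclass):
try:
    extract_config_value('', 'jobmanager.rpc.port')
except ValueError as err:
    print('parse failed: %s' % err)

With nothing parsed, yarn_application_master_host and jobmanager_rpc_port stay empty, which produces the malformed -L 8081: and -L :: forwards in the tunnel command that follows.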
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-pardo-flink-batch-91-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-pardo-flink-batch-91-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-pardo-flink-batch-91-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure
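This is the step that fails: with the host and port both empty, -L 8081: and -L :: are not valid ssh port-forward specifications, so the tunnel command exits non-zero and the bash -xe wrapper marks the build as failed. A hypothetical fail-fast guard, sketched in Python and not part of flink_cluster.sh, would have surfaced the root cause, the missing YARN application master, before the tunnel was attempted:

import sys

def require_application_master(master):
    # Empty here means "yarn application -list" matched no running Flink
    # session, which is exactly what the get_leader trace above shows.
    if not master or ':' not in master:
        sys.exit('No YARN application master found; cannot build the '
                 'SSH tunnel to the Flink JobManager.')
    host, port = master.split(':', 1)
    return host, port

host, port = require_application_master('example-master-host:39031')  # illustrative value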
