See <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/123/display/redirect?page=changes>
Changes:
[ehudm] [BEAM-8489] Filter: don't use callable's output type
[kcweaver] [BEAM-8251] plumb worker_(region|zone) to Environment proto
[rohde.samuel] change definition of has_unbounded_sources in PIN to a pre-determined
[rohde.samuel] typo
[rohde.samuel] lint
[rohde.samuel] remove BigQueryReader from list
[rohde.samuel] lint
[kcweaver] Add null checks for worker region/zone options
[lostluck] [GoSDK] Handle data write errors & stream recreate
[sambvfx] [BEAM-8836] Make ExternalTransform unique_name unique
[sambvfx] add simple unique_name test; remove all uses of
[sambvfx] fixup: pylint fix
[rohde.samuel] remove external
[rohde.samuel] remove external
[michal.walenia] [BEAM-8869] Exclude system metrics test from legacy runner test suite
[github] Merge pull request #10248: [BEAM-7274] Add type conversions factory
[chamikara] Merge pull request #10262: [BEAM-8575] Revert validates runner test tag
[github] [BEAM-8835] Disable Flink Uber Jar by default. (#10270)
[lostluck] [GoSDK] Cancel stream context on dataWriter error
[github] [BEAM-8651] [BEAM-8874] Change pickle_lock to be a reentrant lock, and
[lostluck] [GoSDK] Don't panic if debug symbols are striped
[lcwik] [BEAM-8523] Regenerate Go protos with respect to changes in #9959
[kcweaver] [BEAM-8883] downgrade 'Failed to remove job staging directory' log level

------------------------------------------
[...truncated 54.17 KB...]
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 1s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ben2uleml7r2y

[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3176559693189957924.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1902636002885022708.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1509255970345493719.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1668381774497949131.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
5cd882d98da1: Preparing
1d513d0888ae: Preparing
019b3097366b: Preparing
2ee490fbc316: Preparing
b18043518924: Preparing
9a11244a7e74: Preparing
5f3a5adb8e97: Preparing
73bfa217d66f: Preparing
91ecdd7165d3: Preparing
e4b20fcc48f4: Preparing
91ecdd7165d3: Waiting
e4b20fcc48f4: Waiting
9a11244a7e74: Waiting
5f3a5adb8e97: Waiting
73bfa217d66f: Waiting
2ee490fbc316: Layer already exists
9a11244a7e74: Layer already exists
5f3a5adb8e97: Layer already exists
73bfa217d66f: Layer already exists
b18043518924: Layer already exists
e4b20fcc48f4: Layer already exists
91ecdd7165d3: Layer already exists
5cd882d98da1: Pushed
019b3097366b: Pushed
1d513d0888ae: Pushed
latest: digest: sha256:393f0155897ac9181ee0d377ba530d00368a76053355cd30e551af5897518f49 size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-combine-flink-batch-123
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-batch-123
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8024062912549193301.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7751388800946610633.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-combine-flink-batch-123-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][ 0.0 B/ 2.3 KiB]
/ [1 files][ 2.3 KiB/ 2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][ 2.3 KiB/ 6.0 KiB]
/ [2 files][ 6.0 KiB/ 6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][ 6.0 KiB/ 13.4 KiB]
/ [3 files][ 13.4 KiB/ 13.4 KiB]
- Operation completed over 3 objects/13.4 KiB.
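The create_cluster step below assembles the Dataproc --metadata argument by appending comma-suffixed fragments to a shell string; note the stray trailing comma that ends up before --image-version in the resulting gcloud command. As a minimal sketch, here is an equivalent, comma-safe assembly in Python (values copied from the trace below; the dict-based approach is illustrative and is not how flink_cluster.sh actually does it):

    # Illustrative restatement of the bash string concatenation in
    # flink_cluster.sh: joining key=value pairs avoids the stray
    # trailing comma visible in the actual gcloud invocation.
    metadata = {
        'flink-snapshot-url': 'https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz',
        'flink-start-yarn-session': 'true',
        'flink-taskmanager-slots': '1',
        'hadoop-jar-url': 'https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar',
        'beam-sdk-harness-images-to-pull': 'gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest',
        'beam-job-server-image': 'gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest',
    }
    print(','.join('{}={}'.format(k, v) for k, v in metadata.items()))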
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-combine-flink-batch-123 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/9c39c9c4-661b-3401-a03f-aee9bcb66347].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..............................................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-combine-flink-batch-123]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-123-m '--command=yarn application -list'
++ grep beam-loadtests-python-combine-flink-batch-123
Warning: Permanently added 'compute.4584424557532310572' (ECDSA) to the list of known hosts.
19/12/04 15:41:07 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-combine-flink-batch-123-m/10.128.0.60:8032
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-combine-flink-batch-123/beam-loadtests-python-combine-flink-batch-123/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master:
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-123-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-batch-123'
d7e33c58c503e7d65c23f06c1cab4a252f1f6c217bdf896719c56bbfeaa4a300
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-combine-flink-batch-123-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-123-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-123-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-123-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure
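Note on the failure: 'yarn application -list' matched no running application, so application_masters stayed empty and YARN_APPLICATION_MASTER was never set. The config fetch therefore hit a host-less URL (curl -s "http:///jobmanager/config") and produced no output, the inline Python died with "No JSON object could be decoded", and the tunnel command was evaluated with malformed forwards (-L 8081: and -L ::), leaving the 'Execute shell' step to exit non-zero. For reference, a minimal sketch of what the inline one-liner extracts when Flink's /jobmanager/config REST endpoint does respond (the sample payload is illustrative, not captured from this run):

    # Expanded form of the "python -c" one-liner in start_tunnel.
    # /jobmanager/config returns a JSON array of {"key": ..., "value": ...}
    # objects; the values below are made up for illustration.
    import json

    sample = '[{"key": "jobmanager.rpc.address", "value": "10.128.0.60"},' \
             ' {"key": "jobmanager.rpc.port", "value": "43855"}]'

    entries = json.loads(sample)
    rpc_port = [e['value'] for e in entries if e['key'] == 'jobmanager.rpc.port'][0]
    print(rpc_port)  # 43855

    # In this build the curl output was empty, so json.load(sys.stdin)
    # raised ValueError before the list comprehension ever ran.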
