See <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/158/display/redirect?page=changes>
Changes:

[rehman.muradali] onTimer/setTimer signature updates
[lcwik] [BEAM-9059] Migrate PTransformTranslation to use string constants
[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency
[iemejia] [BEAM-8701] Update commons-io to version 2.6
[github] Restrict the upper bound for pyhamcrest, since new version does not work
[apilloud] [BEAM-9027] [SQL] Fix ZetaSQL Byte Literals
[github] [BEAM-9058] Fix line-too-long exclusion regex and re-enable
[altay] Readability/Lint fixes
[hannahjiang] BEAM-8780 reuse RC images instead of recreate images
[iemejia] [BEAM-9041] Add missing equals methods for GenericRecord <-> Row
[iemejia] [BEAM-9042] Fix RowToGenericRecordFn Avro schema serialization
[iemejia] [BEAM-9042] Update SchemaCoder doc with info about functions requiring
[iemejia] [BEAM-9042] Test serializability and equality of Row<->GenericRecord
[tvalentyn] [BEAM-9062] Improve assertion error for equal_to (#10504)
[chamikara] [BEAM-8960]: Add an option for user to opt out of using insert id for
[36090911+boyuanzz] [BEAM-8932] [BEAM-9036] Revert reverted commit to use PubsubMessage as

------------------------------------------
[...truncated 54.30 KB...]
> Task :model:fn-execution:extractProto
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 7s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/4bq6azg33dwzs

[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8956396290528686392.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7706218785804551469.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins505222997443456337.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8078944319853365158.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server]
70ca6999b6ac: Preparing
667c426e458b: Preparing
d283196a1894: Preparing
d32e23a0d8e9: Preparing
e7fe5541de5f: Preparing
03ff63c55220: Preparing
bee1e39d7c3a: Preparing
1f59a4b2e206: Preparing
0ca7f54856c0: Preparing
ebb9ae013834: Preparing
1f59a4b2e206: Waiting
0ca7f54856c0: Waiting
ebb9ae013834: Waiting
03ff63c55220: Waiting
bee1e39d7c3a: Waiting
d32e23a0d8e9: Layer already exists
e7fe5541de5f: Layer already exists
03ff63c55220: Layer already exists
bee1e39d7c3a: Layer already exists
1f59a4b2e206: Layer already exists
0ca7f54856c0: Layer already exists
ebb9ae013834: Layer already exists
70ca6999b6ac: Pushed
d283196a1894: Pushed
667c426e458b: Pushed
latest: digest: sha256:77ef39298640daf3b01c14b2cafe804c12db9836b6d885064a700d810d34e9fd size: 2427
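The two Jenkins shell steps above retag the freshly built job-server image as :latest and push it to GCR. Condensed into one place, this amounts to the sketch below; the image name is taken verbatim from the log, but the Python wrapper and the tag_and_push helper are purely illustrative, since the job runs the two docker commands directly:

    # Sketch only: mirrors the `docker tag` + `docker push` steps above.
    import subprocess

    IMAGE = "gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server"

    def tag_and_push(image, tag="latest"):
        # docker tag <image> <image>:<tag>
        subprocess.check_call(["docker", "tag", image, "%s:%s" % (image, tag)])
        # docker push <image>:<tag>
        subprocess.check_call(["docker", "push", "%s:%s" % (image, tag)])

    if __name__ == "__main__":
        tag_and_push(IMAGE)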
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
CLUSTER_NAME=beam-loadtests-python-combine-flink-batch-158
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-batch-158
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4477801743745725074.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4793395540453138697.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-combine-flink-batch-158-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-combine-flink-batch-158 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/4a2faf43-ebb3-3794-91d9-8f5c8e8109d8].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-combine-flink-batch-158]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-158-m '--command=yarn application -list'
++ grep beam-loadtests-python-combine-flink-batch-158
Warning: Permanently added 'compute.245709708159633883' (ECDSA) to the list of known hosts.
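At this point the script is looking up the Flink session's YARN application master: it lists YARN applications on the Dataproc master over SSH, greps for the cluster name, and then (in the trace that follows) strips each matching line down to an application id and an application master host with two sed expressions. A minimal sketch of that extraction, assuming the two sed calls visible in the trace are the whole parsing logic; the Python rendering and the parse_yarn_line helper are illustrative, since the real script does this in shell:

    import re

    CLUSTER_NAME = "beam-loadtests-python-combine-flink-batch-158"

    def parse_yarn_line(line):
        # sed 's/ .*//' -- keep everything before the first space (the application id)
        application_id = re.sub(r" .*", "", line)
        # sed 's/.*<cluster-name>/<cluster-name>/' -- drop everything before the
        # application master host, which starts with the cluster name
        tail = re.sub(r".*" + re.escape(CLUSTER_NAME), CLUSTER_NAME, line)
        # sed 's/ .*//' again -- keep just the host token
        application_master = re.sub(r" .*", "", tail)
        return application_id, application_master

    # On an empty line (grep matched nothing) both fields come back empty:
    print(parse_yarn_line(""))  # ('', '')

Because grep matched no line here, the loop ran once over empty input, both fields came back empty, and YARN_APPLICATION_MASTER was left blank, which is exactly what the trace below shows.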
20/01/08 15:43:09 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-combine-flink-batch-158-m/10.128.0.157:8032
+ read line
+ echo
++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-combine-flink-batch-158/beam-loadtests-python-combine-flink-batch-158/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master:
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-158-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-batch-158'
693c9b3c7687be412857e39eb7f1d91121a5ae492023f60d3087a21d7923af0c
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-combine-flink-batch-158-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-158-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-158-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-combine-flink-batch-158-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure
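As far as the trace shows, the failure chain starts with that empty YARN_APPLICATION_MASTER: start_tunnel curled "http:///jobmanager/config" (a URL with no host), curl returned nothing, and the inline Python that pulls jobmanager.rpc.port out of Flink's /jobmanager/config JSON died with "No JSON object could be decoded". The tunnel command was then assembled with empty host and port fields (note the bare "-L 8081: -L ::"), and the final gcloud compute ssh invocation failed, which marked the build as a failure. A more defensive version of that extraction might look like the sketch below; rpc_port and its empty-input check are hypothetical, not part of the actual job scripts:

    import json
    import sys

    def rpc_port(config_json, key="jobmanager.rpc.port"):
        # Flink's /jobmanager/config endpoint returns a JSON array of
        # {"key": ..., "value": ...} entries, as the one-liner above assumes.
        if not config_json.strip():
            return None  # curl produced no output (e.g. the URL had no host)
        entries = json.loads(config_json)
        matches = [e["value"] for e in entries if e["key"] == key]
        return matches[0] if matches else None

    if __name__ == "__main__":
        port = rpc_port(sys.stdin.read())
        if port is None:
            sys.exit("could not determine jobmanager.rpc.port")
        print(port)

Exiting with a clear message instead of a traceback would make the real root cause here, that no matching YARN application was found, much easier to spot in future runs.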
