See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/257/display/redirect?page=changes>
Changes:

[Chad Dombrova] [BEAM-7746] Add type checking to coders
[noreply] [BEAM-9154] Ensure Chicago Taxi Example is disabled on Jenkins (#12929)
[noreply] Add to_pcollection example to wordcount_dataframe (#12923)

------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ed60540875a79fe2ba3442d5f87f8c88223be6b7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ed60540875a79fe2ba3442d5f87f8c88223be6b7 # timeout=10
Commit message: "Merge pull request #12884 [BEAM-7746] Add type checking to coders"
 > git rev-list --no-walk dc99f0448402e6b4765140b788687f92e965ce84 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-postcommit-python-chicago-taxi-flink-257
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-postcommit-python-chicago-taxi-flink-257
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
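The [EnvInject] step above takes a properties-style block of `KEY=VALUE` lines and exposes each one as an environment variable to the rest of the build. A minimal bash sketch of that behavior (the `load_props` helper is illustrative, not part of the EnvInject plugin; the sample properties are copied from the log):

```shell
#!/bin/bash
# Load KEY=VALUE properties lines into the environment, roughly what
# the EnvInject plugin does for a build step.
load_props() {
  while IFS='=' read -r key value; do
    [ -z "$key" ] && continue        # skip blank lines
    export "$key=$value"             # value keeps any later '=' signs
  done <<< "$1"
}

props='CLUSTER_NAME=beam-postcommit-python-chicago-taxi-flink-257
FLINK_NUM_WORKERS=5
FLINK_TASKMANAGER_SLOTS=1'

load_props "$props"
echo "$CLUSTER_NAME has $FLINK_NUM_WORKERS workers"
# beam-postcommit-python-chicago-taxi-flink-257 has 5 workers
```

Because `read` assigns everything after the first `=` to the last variable, values containing `=` (such as URLs with query strings) survive intact.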
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins6972022017080707720.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins7893379450779928624.sh
+ cd <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-postcommit-python-chicago-taxi-flink-257-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-postcommit-python-chicago-taxi-flink-257 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/2d2c2796-258a-3030-bb24-47678dca2967].
Waiting for cluster creation operation...
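The `create_cluster` trace above assembles the Dataproc `--metadata` string incrementally: the mandatory Flink keys are appended unconditionally, and the SDK-harness and job-server image keys are appended only when the corresponding variables are non-empty. A standalone sketch of that assembly, under the assumption that it mirrors the real flink_cluster.sh (the URLs and image names below are placeholders, not the values used by the job):

```shell
#!/bin/bash
# Build the --metadata argument the way the trace does: mandatory keys
# first, image keys appended only if configured.
build_metadata() {
  local metadata="flink-snapshot-url=${FLINK_DOWNLOAD_URL},"
  metadata+="flink-start-yarn-session=true,"
  metadata+="flink-taskmanager-slots=${FLINK_TASKMANAGER_SLOTS},"
  metadata+="hadoop-jar-url=${HADOOP_DOWNLOAD_URL}"
  [[ -n "$HARNESS_IMAGES_TO_PULL" ]] && metadata+=",beam-sdk-harness-images-to-pull=${HARNESS_IMAGES_TO_PULL}"
  [[ -n "$JOB_SERVER_IMAGE" ]] && metadata+=",beam-job-server-image=${JOB_SERVER_IMAGE}"
  echo "$metadata"
}

# Placeholder configuration; JOB_SERVER_IMAGE left empty to show the
# conditional append being skipped.
FLINK_DOWNLOAD_URL=https://example.org/flink.tgz
FLINK_TASKMANAGER_SLOTS=1
HADOOP_DOWNLOAD_URL=https://example.org/hadoop-uber.jar
HARNESS_IMAGES_TO_PULL=gcr.io/example/sdk:latest
JOB_SERVER_IMAGE=
build_metadata
```

Keeping the conditional keys last means the comma separator can be prepended to each optional entry, so an unset image never leaves a dangling comma in the middle of the string.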
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-postcommit-python-chicago-taxi-flink-257]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ grep beam-postcommit-python-chicago-taxi-flink-257
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-postcommit-python-chicago-taxi-flink-257-m '--command=yarn application -list'
Warning: Permanently added 'compute.7773916023689928744' (ECDSA) to the list of known hosts.
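The `get_leader` step that starts above finds the Flink application master by listing YARN applications over SSH and picking fields apart with `sed`: the first whitespace-delimited field is the application id, and a greedy `s/.*NAME/NAME/` substitution strips everything before the last occurrence of the cluster name, leaving the tracking-URL host:port. That parsing can be reproduced locally on a sample row copied from the log (no cluster needed):

```shell
#!/bin/bash
# Mirror get_leader's parsing of one `yarn application -list` row.
CLUSTER_NAME=beam-postcommit-python-chicago-taxi-flink-257
line='application_1600970536119_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783'

# Application id: delete everything from the first space onward.
app_id=$(echo "$line" | sed 's/ .*//')

# Application master: greedy .* anchors the match at the LAST
# occurrence of the cluster name, i.e. the tracking-URL hostname.
app_master=$(echo "$line" | sed "s/.*$CLUSTER_NAME/$CLUSTER_NAME/")

echo "$app_id"       # application_1600970536119_0001
echo "$app_master"   # beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
```

The greedy match is what makes the second `sed` skip the earlier occurrence of the cluster name in the master-node hostname and land on the worker hosting the application master.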
20/09/24 18:04:03 INFO client.RMProxy: Connecting to ResourceManager at beam-postcommit-python-chicago-taxi-flink-257-m/10.128.0.43:8032
+ read line
+ echo application_1600970536119_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
application_1600970536119_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
++ echo application_1600970536119_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
++ sed 's/ .*//'
+ application_ids[$i]=application_1600970536119_0001
++ echo application_1600970536119_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
++ sed 's/ .*//'
++ sed 's/.*beam-postcommit-python-chicago-taxi-flink-257/beam-postcommit-python-chicago-taxi-flink-257/'
+ application_masters[$i]=beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
+ echo 'Using Yarn Application master: beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783'
Using Yarn Application master: beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-postcommit-python-chicago-taxi-flink-257-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest --flink-master=beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783 --artifacts-dir=gs://beam-flink-cluster/beam-postcommit-python-chicago-taxi-flink-257'
1a6f55ee831b55914b4120b48b094bc028c7c31911c41232ae8cc52a39b62291
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-postcommit-python-chicago-taxi-flink-257-m '--command=curl -s "http://beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783/jobmanager/config"'
+ local 'job_server_config=[{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.1.jar"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1600970536119_0001"},{"key":"jobmanager.rpc.address","value":"beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.memory.jvm-metaspace.size","value":"512 mb"},{"key":"taskmanager.memory.task.off-heap.size","value":"256 mb"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1600970536119_0001"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"jobmanager.heap.size","value":"12288m"},{"key":"taskmanager.memory.process.size","value":"12 gb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-0e663fab-62fe-46bc-8552-4946a8486cbb"},{"key":"jobmanager.rpc.port","value":"46723"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"taskmanager.memory.managed.fraction","value":"0.5"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal"},{"key":"state.backend","value":"filesystem"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"state.checkpoints.dir","value":"gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/checkpoints"}]'
+ local key=jobmanager.rpc.port
++ echo beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783
++ cut -d : -f1
+ local yarn_application_master_host=beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.1.jar"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1600970536119_0001"},{"key":"jobmanager.rpc.address","value":"beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.memory.jvm-metaspace.size","value":"512' 'mb"},{"key":"taskmanager.memory.task.off-heap.size","value":"256' 'mb"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1600970536119_0001"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"jobmanager.heap.size","value":"12288m"},{"key":"taskmanager.memory.process.size","value":"12' 'gb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-0e663fab-62fe-46bc-8552-4946a8486cbb"},{"key":"jobmanager.rpc.port","value":"46723"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"taskmanager.memory.managed.fraction","value":"0.5"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal"},{"key":"state.backend","value":"filesystem"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"state.checkpoints.dir","value":"gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/checkpoints"}]'
+ local jobmanager_rpc_port=46723
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-postcommit-python-chicago-taxi-flink-257-m -- -L 8081:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783 -L 46723:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:46723 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-postcommit-python-chicago-taxi-flink-257-m -- -L 8081:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783 -L 46723:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:46723 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-postcommit-python-chicago-taxi-flink-257-m -- -L 8081:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:43783 -L 46723:beam-postcommit-python-chicago-taxi-flink-257-w-4.c.apache-beam-testing.internal:46723 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/src/gradlew> -PgcsRoot=gs://temp-storage-for-perf-tests/chicago-taxi "-PpipelineOptions=--parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest --environment_type=DOCKER --execution_mode_for_batch=BATCH_FORCED" :sdks:python:test-suites:portable:py37:chicagoTaxiExample
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* What went wrong:
Task 'chicagoTaxiExample' not found in project ':sdks:python:test-suites:portable:py37'.

* Try:
Run gradlew tasks to get a list of available tasks. Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36s

Publishing build scan...
https://gradle.com/s/ufjrl4og3gjlu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
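The `start_tunnel` trace above discovers `jobmanager.rpc.port` by fetching Flink's `/jobmanager/config` endpoint, which returns a JSON array of `{"key": ..., "value": ...}` pairs, and filtering it with an inline Python snippet. The same extraction can be exercised locally; the `get_config_value` helper and the two-entry sample below are illustrative stand-ins for the full response shown in the log (the original one-liner used Python 2-style `u''` literals, `python3` here):

```shell
#!/bin/bash
# Extract a value from a Flink /jobmanager/config-style JSON array,
# mirroring the inline python used by start_tunnel.
config='[{"key":"jobmanager.rpc.port","value":"46723"},{"key":"rest.address","value":"example-host.internal"}]'

get_config_value() {
  local key=$1
  # Feed the JSON on stdin; pick the value whose key matches.
  echo "$config" | python3 -c "import sys, json; print([e['value'] for e in json.load(sys.stdin) if e['key'] == '$key'][0])"
}

get_config_value jobmanager.rpc.port   # 46723
```

Once the port is known, the script forwards it (together with the Flink web UI and the job-server ports 8097-8099) through a single detached `gcloud compute ssh` tunnel to the Dataproc master, which is why the later Gradle invocation can use `--job_endpoint=localhost:8099`.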