See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Flink_Streaming/1014/display/redirect>
Changes:
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Flink_Streaming
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Using shallow clone with depth 1
Avoid fetching tags
Cloning repository https://github.com/apache/beam.git
 > git init /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Flink_Streaming/src # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --no-tags --force --progress --depth=1 -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Using shallow fetch with depth 1
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --no-tags --force --progress --depth=1 -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f433fbe65abb4b3e890066c4c57f41de73526c6a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f433fbe65abb4b3e890066c4c57f41de73526c6a # timeout=10
Commit message: "Bump transformers (#27108)"
 > git rev-list --no-walk e07c461cd45e3942fa48ab4499ec4ee3317dbcff # timeout=10
First time build. Skipping changelog.
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1
[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
CLUSTER_NAME=beam-loadtests-python-combine-flink-streaming-1014
FLINK_NUM_WORKERS=16
DETACHED_MODE=true
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
GCS_BUCKET=gs://beam-flink-cluster
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python3.8_sdk:latest
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
GCLOUD_ZONE=us-central1-a
FLINK_TASKMANAGER_SLOTS=1
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-combine-flink-streaming-1014
[EnvInject] - Variables injected successfully.
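[Editor's note] The injected properties above are the knobs for the Flink-on-Dataproc cluster that the next build step brings up via .test-infra/dataproc/flink_cluster.sh (traced below). A rough sketch of reproducing that step outside Jenkins, assuming a local checkout of apache/beam under src/, gcloud/gsutil authenticated against the apache-beam-testing project, and that flink_cluster.sh reads these variables from the environment as the trace suggests (values copied from the properties above):

    # Hypothetical manual reproduction of the "Setting up flink cluster" build step
    export CLUSTER_NAME=beam-loadtests-python-combine-flink-streaming-1014
    export GCLOUD_ZONE=us-central1-a
    export FLINK_NUM_WORKERS=16
    export FLINK_TASKMANAGER_SLOTS=1
    export DETACHED_MODE=true
    export GCS_BUCKET=gs://beam-flink-cluster
    export FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
    export HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
    export HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python3.8_sdk:latest
    export JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
    export ARTIFACTS_DIR=gs://beam-flink-cluster/${CLUSTER_NAME}
    cd src/.test-infra/dataproc
    # Uploads the init-actions to GCS, then runs 'gcloud dataproc clusters create' as traced below
    ./flink_cluster.sh create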
[beam_LoadTests_Python_Combine_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins4115044291970554057.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_Combine_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins471650307227025601.sh
+ cd /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Flink_Streaming/src/.test-infra/dataproc
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=2.1-debian
++ echo us-central1-a
++ sed -E 's/(-[a-z])?$//'
+ GCLOUD_REGION=us-central1
+ MASTER_NAME=beam-loadtests-python-combine-flink-streaming-1014-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]
/ [1 files][  2.3 KiB/  2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]
/ [2 files][  6.0 KiB/  6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.5 KiB]
/ [3 files][ 13.5 KiB/ 13.5 KiB]
-
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_python3.8_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.8_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
+ local image_version=2.1-debian
+ echo 'Starting dataproc cluster. Dataproc version: 2.1-debian'
Starting dataproc cluster. Dataproc version: 2.1-debian
+ gcloud dataproc clusters create beam-loadtests-python-combine-flink-streaming-1014 --region=us-central1 --num-****s=16 --master-machine-type=n1-standard-2 --****-machine-type=n1-standard-2 --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.8_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest, --image-version=2.1-debian --zone=us-central1-a --optional-components=FLINK,DOCKER --quiet
Waiting on operation [projects/apache-beam-testing/regions/us-central1/operations/ecfcab27-27aa-3af3-8e33-1b76c112d2a9].
Waiting for cluster creation operation...
WARNING: Consider using Auto Zone rather than selecting a zone manually. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/auto-zone
WARNING: Permissions are missing for the default service account '[email protected]', missing permissions: [storage.objects.get, storage.objects.update] on the staging_bucket 'projects/_/buckets/dataproc-staging-us-central1-844138762903-fb7i4jhf'. This usually happens when a custom resource (ex: custom staging bucket) or a user-managed VM Service account has been provided and the default/user-managed service account hasn't been granted enough permissions on the resource. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/service-accounts#VM_service_account.
WARNING: Permissions are missing for the default service account '[email protected]', missing permissions: [storage.objects.get, storage.objects.update] on the temp_bucket 'projects/_/buckets/dataproc-temp-us-central1-844138762903-khnec1vl'. This usually happens when a custom resource (ex: custom staging bucket) or a user-managed VM Service account has been provided and the default/user-managed service account hasn't been granted enough permissions on the resource. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/service-accounts#VM_service_account.
WARNING: The firewall rules for specified network or subnetwork would allow ingress traffic from 0.0.0.0/0, which could be a security risk.
FATAL: command execution failed
java.io.IOException: Pipe closed after 0 cycles
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
    at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
    at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
    at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
    at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
    at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Caused: java.io.IOException: Backing channel 'apache-beam-jenkins-2' is disconnected.
    at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
    at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
    at com.sun.proxy.$Proxy146.isAlive(Unknown Source)
    at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
    at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
    at hudson.tasks.CommandInterpreter.join(CommandInterpreter.java:195)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:145)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:92)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
    at hudson.model.Build$BuildExecution.build(Build.java:199)
    at hudson.model.Build$BuildExecution.doRun(Build.java:164)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
    at hudson.model.Run.execute(Run.java:1896)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
    at hudson.model.ResourceController.execute(ResourceController.java:101)
    at hudson.model.Executor.run(Executor.java:442)
FATAL: Unable to delete script file /tmp/jenkins471650307227025601.sh
java.io.IOException: Pipe closed after 0 cycles
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
    at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
    at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
    at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
    at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
    at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Caused: hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@7da3577c:apache-beam-jenkins-2": Remote call on apache-beam-jenkins-2 failed. The channel is closing down or has closed down
    at hudson.remoting.Channel.call(Channel.java:993)
    at hudson.FilePath.act(FilePath.java:1194)
    at hudson.FilePath.act(FilePath.java:1183)
    at hudson.FilePath.delete(FilePath.java:1730)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:163)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:92)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
    at hudson.model.Build$BuildExecution.build(Build.java:199)
    at hudson.model.Build$BuildExecution.doRun(Build.java:164)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
    at hudson.model.Run.execute(Run.java:1896)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
    at hudson.model.ResourceController.execute(ResourceController.java:101)
    at hudson.model.Executor.run(Executor.java:442)
Build step 'Execute shell' marked build as failure
ERROR: apache-beam-jenkins-2 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
