See <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/130/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Fix script location in build_release_candidate.sh

[Kenneth Knowles] Build release candidate from tag

[Kenneth Knowles] Run release scripts from PWD not cloned repo

[Kenneth Knowles] Take build_release_candidate.sh parameters on command line for easy

[Kenneth Knowles] Use SSH auth URL for pushing beam-site branch

[Robert Bradshaw] Add runner capabilities to Beam model proto.

[Robert Bradshaw] Runner Capabilities, go proto.

[Robert Bradshaw] Plumb runner capabilities to Beam SDKs.

[Kenneth Knowles] Refactor PR template to separate test types and label test variants

[noreply] Update pardo.md

[Boyuan Zhang] [BEAM-12160] Add TODO for fixing warning

[Robert Bradshaw] Avoid sending zero msec counters.

[Ismaël Mejía] [BEAM-12091] Make file staging uniform among runners

[randomstep] [BEAM-11903] Bump achilles to 6.1.0

[randomstep] [BEAM-12172] Bump gradle to 6.8.3

[aromanenko.dev] [BEAM-2888] Added packages.confluent.io maven repo

[noreply] [BEAM-9547] DataFrame.corr cleanup (#14327)

[Andrew Pilloud] More tests for time types

[noreply] Merge pull request #14467 from [BEAM-11607] Add word count tasks

[Andrew Pilloud] [BEAM-9379] Output outside of codegen, support rows

[Kenneth Knowles] Run release scripts from same directory, not temp clone

[Kenneth Knowles] Fix invocation of download_github_actions_artifacts.py from

[Kenneth Knowles] Limit GitHub Actions artifact downloads to RC tag to avoid paging

[Kenneth Knowles] More verbose output downloading GHA artifacts

[Robert Bradshaw] [BEAM-12170] Handle duplicate metrics due to flatten unzipping.

[noreply] [BEAM-366] Populate display data in portable job representation (#14470)

[noreply] [BEAM-12118] Modify QueuingBeamFnDataClient to avoid completion latency

[noreply] [BEAM-7372] cleanup py2 codepath from apache_beam/testing (#14496)

[Robert Bradshaw] Fix one more usage.

[noreply] [BEAM-7372] cleanup py2 codepath from apache_beam/tool,

[noreply] [BEAM-12074] Add @with_docs_from decorator for generating API docs

[noreply] [BEAM-12029] Make WontImplementErrors more helpful (#14517)

[dmytrokozhevin] Don't use fake coders in interactive Beam.

[dmytrokozhevin] Formatting fixes

[dmytrokozhevin] Formatting fixes

[dmytrokozhevin] Ran yapf on changes.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 985e2f095d150261e998f58cf048e48a909d5b2b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 985e2f095d150261e998f58cf048e48a909d5b2b # timeout=10
Commit message: "Merge pull request #14543: [BEAM-12172] Bump gradle to 6.8.3"
 > git rev-list --no-walk a86dc0609f0b1bcc0c450979363b27b2657418af # timeout=10
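
To reproduce this checkout locally, a rough sketch (assuming a plain clone is sufficient; the job's extra pull-request refspec is not needed for a timer-triggered build):

  # Clone the repository and check out the exact revision this build used.
  git clone https://github.com/apache/beam.git
  cd beam
  git checkout 985e2f095d150261e998f58cf048e48a909d5b2b
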
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-go-cogbk-flink-batch-130
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-cogbk-flink-batch-130
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
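
The cluster configuration above can be exercised by hand; a minimal sketch, assuming a Beam checkout and gcloud/gsutil credentials for the apache-beam-testing project (values copied from the injected properties):

  # Export the same properties the job injects via EnvInject.
  export CLUSTER_NAME=beam-loadtests-go-cogbk-flink-batch-130
  export GCS_BUCKET=gs://beam-flink-cluster
  export FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
  export HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
  export FLINK_NUM_WORKERS=5
  export FLINK_TASKMANAGER_SLOTS=1
  export DETACHED_MODE=true
  export HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
  export JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
  export ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-cogbk-flink-batch-130
  export GCLOUD_ZONE=us-central1-a

  # Run the same script the build step invokes below.
  cd .test-infra/dataproc
  ./flink_cluster.sh create
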
[beam_LoadTests_Go_CoGBK_Flink_batch] $ /bin/bash -xe /tmp/jenkins8476474074654377893.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_CoGBK_Flink_batch] $ /bin/bash -xe /tmp/jenkins7823176908090809556.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-go-cogbk-flink-batch-130-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.5 KiB.
 
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_****s=6
+ gcloud dataproc clusters create beam-loadtests-go-cogbk-flink-batch-130 --region=global --num-****s=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/155e5ba5-74f0-3353-a54b-abeb66aed6d8].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..........done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/155e5ba5-74f0-3353-a54b-abeb66aed6d8] failed: Multiple Errors:
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/3ee0bea8-4766-4ed6-97ab-60a2aed42b0e/beam-loadtests-go-cogbk-flink-batch-130-m/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/3ee0bea8-4766-4ed6-97ab-60a2aed42b0e/beam-loadtests-go-cogbk-flink-batch-130-w-1/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/3ee0bea8-4766-4ed6-97ab-60a2aed42b0e/beam-loadtests-go-cogbk-flink-batch-130-w-5/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
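
The per-node logs named in the error can be read directly from GCS; a sketch for triage (the URI is copied from the error above, and the cluster delete is an assumption about how to clean up a partially created cluster before retrying):

  # Fetch the output of the timed-out flink.sh init action on the master node.
  gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/3ee0bea8-4766-4ed6-97ab-60a2aed42b0e/beam-loadtests-go-cogbk-flink-batch-130-m/dataproc-initialization-script-2_output

  # Assumed cleanup before retrying the load test.
  gcloud dataproc clusters delete beam-loadtests-go-cogbk-flink-batch-130 --region=global --quiet
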
