See <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/711/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update table text content overflow #23460

[Moritz Mack] [Spark dataset runner] Fix translation to run in the evaluation thread

[Moritz Mack] [Metrics] Add 'performance tests' tag to JMH dashboard (related to

[noreply] Bump github.com/aws/aws-sdk-go-v2/credentials in /sdks (#24318)

[noreply] Update apache beam installation in notebook (#24336)

[noreply] Adds GCP core dependency to the test expansion service (#24308)

[noreply] Update dataflow containers to coincide with objsize 0.6.1 update

[noreply] Add test configurations for deterministic outputs on Dataflow (#24325)

[noreply] Updates ExpansionService to support dynamically discovering and

[noreply] Enable streaming runner v2 tests that were forgotten to be enabled.

[noreply] A schema transform implementation for SpannerIO.Write (#24278)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 09606de8779ca284666581c7c2da314df4f9ddb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 09606de8779ca284666581c7c2da314df4f9ddb1 # timeout=10
Commit message: "A schema transform implementation for SpannerIO.Write (#24278)"
 > git rev-list --no-walk cf904dc72aa8cd9f4973342409d8271fcd09f90b # timeout=10
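For reference, a minimal sketch of reproducing this checkout by hand (the repository URL and revision are taken from the log above; the directory name is arbitrary):

    # Clone apache/beam and check out the exact revision built by this job.
    git clone https://github.com/apache/beam.git beam-src
    cd beam-src
    git checkout 09606de8779ca284666581c7c2da314df4f9ddb1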
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
CLUSTER_NAME=beam-loadtests-go-cogbk-flink-batch-711
FLINK_NUM_WORKERS=5
DETACHED_MODE=true
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
GCS_BUCKET=gs://beam-flink-cluster
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
GCLOUD_ZONE=us-central1-a
FLINK_TASKMANAGER_SLOTS=1
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-cogbk-flink-batch-711

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_CoGBK_Flink_batch] $ /bin/bash -xe /tmp/jenkins1035101717532984410.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_CoGBK_Flink_batch] $ /bin/bash -xe /tmp/jenkins72124369696924194.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=preview-debian11
++ echo us-central1-a
++ sed -E 's/(-[a-z])?$//'
+ GCLOUD_REGION=us-central1
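The echo/sed pair above derives the region from the zone by stripping the trailing single-letter suffix. A standalone sketch of that step (variable names here are illustrative):

    # Derive a GCP region from a zone, e.g. us-central1-a -> us-central1.
    zone=us-central1-a
    region=$(echo "$zone" | sed -E 's/(-[a-z])?$//')
    echo "$region"   # prints: us-central1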
+ MASTER_NAME=beam-loadtests-go-cogbk-flink-batch-711-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
+ local image_version=preview-debian11
+ echo 'Starting dataproc cluster. Dataproc version: preview-debian11'
Starting dataproc cluster. Dataproc version: preview-debian11
+ gcloud dataproc clusters create beam-loadtests-go-cogbk-flink-batch-711 --region=us-central1 --num-workers=5 --master-machine-type=n1-standard-2 --worker-machine-type=n1-standard-2 --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest, --image-version=preview-debian11 --zone=us-central1-a --optional-components=FLINK,DOCKER --quiet
Waiting on operation [projects/apache-beam-testing/regions/us-central1/operations/6eb4de64-2b3d-3ebb-8d66-cead7eb15b15].
Waiting for cluster creation operation...
WARNING: Consider using Auto Zone rather than selecting a zone manually. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/auto-zone
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/us-central1/clusters/beam-loadtests-go-cogbk-flink-batch-711]
Cluster placed in zone [us-central1-a].
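Not part of the job itself, but a hedged follow-up check one could run by hand to confirm the cluster came up with the FLINK and DOCKER optional components (the --format field paths are assumptions about the Dataproc cluster resource layout):

    # Inspect the new cluster's state and optional components.
    gcloud dataproc clusters describe beam-loadtests-go-cogbk-flink-batch-711 \
      --region=us-central1 \
      --format='value(status.state,config.softwareConfig.optionalComponents)'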
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-cogbk-flink-batch-711-m '--command=yarn application -list'
++ grep 'Apache Flink'
Writing 3 keys to /home/jenkins/.ssh/google_compute_known_hosts
kex_exchange_identification: Connection closed by remote host

Recommendation: To check for possible causes of SSH connectivity issues and get recommendations, rerun the ssh command with the --troubleshoot option.

gcloud compute ssh beam-loadtests-go-cogbk-flink-batch-711-m --project=apache-beam-testing --zone=us-central1-a --troubleshoot

Or, to investigate an IAP tunneling issue:

gcloud compute ssh beam-loadtests-go-cogbk-flink-batch-711-m --project=apache-beam-testing --zone=us-central1-a --troubleshoot --tunnel-through-iap

ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed -E 's#.*(https?://)##'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master: 
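get_leader never found a Flink application master here: the SSH command exited with code 255 before 'yarn application -list' could run, so application_ids, application_masters and YARN_APPLICATION_MASTER all stayed empty. A minimal sketch of what that step extracts when SSH succeeds, reconstructed from the trace above (not the verbatim script):

    # On the Dataproc master: list YARN applications, keep the Apache Flink one,
    # and strip everything but the host:port of its tracking URL.
    yarn application -list 2>/dev/null \
      | grep 'Apache Flink' \
      | sed -E 's#.*(https?://)##' \
      | sed 's/ .*//'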
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-cogbk-flink-batch-711-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-go-cogbk-flink-batch-711'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
Unable to find image 'gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest' locally
latest: Pulling from apache-beam-testing/beam_portability/beam_flink1.15_job_server
(docker pull progress for 12 image layers omitted; all layers downloaded and extracted successfully)
Digest: sha256:f51f9a870aeba9c5d8d4865c670d4493b4ebb35a9717681d79c984343da28f95
Status: Downloaded newer image for gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
b67243897f2cfbccf7fe53c378a6a4cadcc9c3b75e3bc5cdc95bac724b4c0fda
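An illustrative check (not part of the job): confirm the job server container stayed up on the master node. The docker ps ancestor filter is standard, but this exact invocation is an assumption:

    # Verify the Beam job server container is running on the Dataproc master.
    gcloud compute ssh --zone=us-central1-a --quiet \
      yarn@beam-loadtests-go-cogbk-flink-batch-711-m \
      '--command=sudo docker ps --filter "ancestor=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest"'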
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-go-cogbk-flink-batch-711-m '--command=curl -s "http:///jobmanager/config";'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python3.8/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/usr/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 1 (char 1)
+ local jobmanager_rpc_port=
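The JSONDecodeError and the empty jobmanager_rpc_port are downstream symptoms: YARN_APPLICATION_MASTER was empty, so the curl above hit "http:///jobmanager/config" and returned nothing for the inline Python to parse. A hedged sketch of what this step would do with a reachable Flink master:

    # Query the Flink REST config and pick out jobmanager.rpc.port;
    # fail fast if get_leader found no application master.
    master=${YARN_APPLICATION_MASTER:?no Flink application master found}
    config=$(curl -s "http://${master}/jobmanager/config")
    echo "$config" | python -c 'import sys, json; print([e["value"] for e in json.load(sys.stdin) if e["key"] == "jobmanager.rpc.port"][0])'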
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-cogbk-flink-batch-711-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-cogbk-flink-batch-711-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-cogbk-flink-batch-711-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
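Because both the application master address and the RPC port were empty, the forwarding arguments above collapsed to '-L 8081:' and '-L ::', which are not valid ssh port-forwarding specs, so the command failed and the step exited non-zero. For comparison, a hedged sketch of the tunnel the script appears to build when get_leader succeeds (host and port values below are placeholders):

    # Forward the Flink UI (8081), the JobManager RPC port, and the Beam job
    # server ports (8097-8099) from the Jenkins agent, plus a SOCKS proxy.
    master=example-master-host:8088   # placeholder: YARN application master host:port
    master_host=${master%%:*}
    rpc_port=6123                     # placeholder: jobmanager.rpc.port from the REST config
    gcloud compute ssh --zone=us-central1-a --quiet \
      yarn@beam-loadtests-go-cogbk-flink-batch-711-m -- \
      -L "8081:${master}" \
      -L "${rpc_port}:${master_host}:${rpc_port}" \
      -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 \
      -D 1080 -Nf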
Build step 'Execute shell' marked build as failure
