See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/6290/display/redirect?page=changes>

Changes:

[clementg] allow non-lts jvm version, fallback on java 11 for runner

[clementg] Add a stricter java version method

[clementg] fall back to the nearest lts version

[noreply] Keep stale action from closing issues (#23067)

[noreply] Merge pull request #22996: [BEAM-11205] Update GCP Libraries BOM


------------------------------------------
[...truncated 1.53 KB...]
Checking out Revision 4ec319d27999aa7d5b7d9dfa1b9aed4e130e0bf2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4ec319d27999aa7d5b7d9dfa1b9aed4e130e0bf2 # timeout=10
Commit message: "Merge pull request #22996: [BEAM-11205] Update GCP Libraries 
BOM dependencies to version 26.1.1"
 > git rev-list --no-walk 0d937d4cd725965572d4720811fa2d6efaa8edf8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7492343744328375958.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
kubeconfig entry generated for io-datastores.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5482294978269255924.sh
+ cp /home/jenkins/.kube/config 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins689532141660900405.sh
+ 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh>
 createNamespace beam-performancetests-textioit-hdfs-6290
+ 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=default'
+ createNamespace beam-performancetests-textioit-hdfs-6290
+ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 create namespace beam-performancetests-textioit-hdfs-6290'
++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 create namespace beam-performancetests-textioit-hdfs-6290
namespace/beam-performancetests-textioit-hdfs-6290 created
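The xtrace above reconstructs the wrapper pattern used by `.test-infra/kubernetes/kubernetes.sh`: a shared `$KUBECTL` prefix carrying the kubeconfig and namespace, with small helpers evaluated through it. A minimal sketch of that pattern, inferred from the trace (not the exact script; the paths below are placeholders, not the real workspace locations):

```shell
# Sketch of the wrapper visible in the xtrace. Function and variable names
# come from the trace; the kubeconfig path is a placeholder.
KUBECONFIG=/path/to/kubeconfig            # placeholder for the workspace file
KUBERNETES_NAMESPACE=default
KUBECTL="kubectl --kubeconfig=${KUBECONFIG} --namespace=${KUBERNETES_NAMESPACE}"

# createNamespace <name>: create a per-build namespace.
createNamespace() {
  eval "$KUBECTL create namespace $1"
}

# apply <file>: recursively apply manifests in the configured namespace.
apply() {
  eval "$KUBECTL apply -R -f $1"
}
```

The per-build namespace (here `beam-performancetests-textioit-hdfs-6290`) keeps each run's HDFS cluster isolated and easy to tear down.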
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-performancetests-textioit-hdfs-6290

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4298596809047396968.sh
+ 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh>
 apply 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
+ KUBERNETES_NAMESPACE=beam-performancetests-textioit-hdfs-6290
+ KUBECTL='kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290'
+ apply 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 apply -R -f 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>'

++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 apply -R -f 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
service/hadoop created
service/hadoop-datanodes created
statefulset.apps/datanode created
pod/namenode-0 created
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6888258824211383697.sh
+ set -eo pipefail
+ eval 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh>
 loadBalancerIP hadoop
++ 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh>
 loadBalancerIP hadoop
+ sed 's/^/LOAD_BALANCER_IP=/'
+ 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
+ KUBERNETES_NAMESPACE=beam-performancetests-textioit-hdfs-6290
+ KUBECTL='kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290'
+ loadBalancerIP hadoop
+ local name=hadoop
+ local 'command=kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
'-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 1 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
'-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 2 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
'-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 3 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
-ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl 
--kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-beam-performancetests-textioit-hdfs-6290>
 --namespace=beam-performancetests-textioit-hdfs-6290 get svc hadoop 
'-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=34.72.119.95
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 34.72.119.95 ]]
+ echo 34.72.119.95
+ return 0
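The trace above shows `kubernetes.sh` polling for the `hadoop` service's external IP: `retry '<command>' 36 10` re-runs the `kubectl get svc ... -ojsonpath=...` query every 10 seconds for up to 36 attempts, succeeding once the command exits 0 with non-empty output (here `34.72.119.95`, on the fourth attempt). A sketch of that loop, reconstructed from the xtrace (not the exact kubernetes.sh source; `$KUBECTL` is assumed to be the wrapper set up earlier in the trace):

```shell
# retry '<command>' <max_retries> <sleep_time>
# Re-run <command> until it exits 0 with non-empty output, or give up.
retry() {
  local command=$1
  local max_retries=$2
  local sleep_time=$3
  local output status i
  for (( i = 1; i <= max_retries; i++ )); do
    output=$(eval "$command")
    status=$?
    # Success means exit 0 AND something printed (the load balancer IP).
    if [[ $status == 0 && -n $output ]]; then
      echo "$output"
      return 0
    fi
    # Out of attempts: fail instead of sleeping again.
    if [[ $i == "$max_retries" ]]; then
      return 1
    fi
    sleep "$sleep_time"
  done
}

# loadBalancerIP <svc>: poll until the service has an external ingress IP.
loadBalancerIP() {
  local name=$1
  local command="${KUBECTL:-kubectl} get svc $name -ojsonpath='{.status.loadBalancer.ingress[0].ip}'"
  retry "$command" 36 10
}
```

The non-empty check matters because `kubectl get svc` exits 0 even while the load balancer is still pending, as the first three attempts in the trace show.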
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 
'job.properties'
[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/gradlew>
 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses 
--info 
-DintegrationTestPipelineOptions=["--bigQueryDataset=beam_performance","--bigQueryTable=textioit_hdfs_results","--influxMeasurement=textioit_hdfs_results","--numberOfRecords=25000000","--expectedHash=f8453256ccf861e8a312c125dfe0e436","--datasetSize=1062290000","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--runner=DataflowRunner","--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--hdfsConfiguration=[{\"fs.defaultFS\":\"hdfs:34.72.119.95:9000\",\"dfs.replication\":1}]","--filenamePrefix=hdfs://34.72.119.95:9000/TEXTIO_IT_";]
 -Dfilesystem=hdfs -DintegrationTestRunner=dataflow 
:sdks:java:io:file-based-io-tests:integrationTest --tests 
org.apache.beam.sdk.io.text.TextIOIT
Initialized native services in: /home/jenkins/.gradle/native
Initialized jansi services in: /home/jenkins/.gradle/native
The client will now receive all logging from the daemon (pid: 1624357). The 
daemon log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-1624357.out.log
Starting 2nd build in daemon [uptime: 42 mins 48.627 secs, performance: 100%, 
GC rate: 0.00/s, heap usage: 1% of 4 GiB]
Using 8 **** leases.
Configuration on demand is an incubating feature.
Closing daemon's stdin at end of input.
The daemon will no longer process any standard input.
Watching the file system is configured to be disabled
File system watching is inactive
Starting Build
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/7.5.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/7.5.1/fileHashes/resourceHashesCache.bin
Settings evaluated using settings file 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/settings.gradle.kts>'.
Using local directory build cache for the root build (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).
Projects loaded. Root project using build file 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/build.gradle.kts>'.
Included projects: [root project 'beam', project ':beam-test-infra-metrics', 
project ':beam-test-jenkins', project ':beam-test-tools', project 
':beam-validate-runner', project ':examples', project ':model', project 
':playground', project ':release', project ':runners', project ':sdks', project 
':vendor', project ':website', project ':examples:java', project 
':examples:kotlin', project ':examples:multi-language', project 
':model:fn-execution', project ':model:job-management', project 
':model:pipeline', project ':playground:backend', project 
':playground:frontend', project ':playground:terraform', project 
':release:go-licenses', project ':runners:core-construction-java', project 
':runners:core-java', project ':runners:direct-java', project 
':runners:extensions-java', project ':runners:flink', project 
':runners:google-cloud-dataflow-java', project ':runners:java-fn-execution', 
project ':runners:java-job-service', project ':runners:jet', project 
':runners:local-java', project ':runners:portability', project 
':runners:samza', project ':runners:spark', project ':runners:twister2', 
project ':sdks:go', project ':sdks:java', project ':sdks:python', project 
':vendor:calcite-1_28_0', project ':vendor:grpc-1_48_1', project 
':vendor:guava-26_0-jre', project ':examples:java:twitter', project 
':playground:backend:containers', project ':release:go-licenses:go', project 
':release:go-licenses:java', project ':release:go-licenses:py', project 
':runners:extensions-java:metrics', project ':runners:flink:1.12', project 
':runners:flink:1.13', project ':runners:flink:1.14', project 
':runners:flink:1.15', project ':runners:google-cloud-dataflow-java:examples', 
project ':runners:google-cloud-dataflow-java:examples-streaming', project 
':runners:google-cloud-dataflow-java:****', project 
':runners:portability:java', project ':runners:samza:job-server', project 
':runners:spark:2', project ':runners:spark:3', project ':sdks:go:container', 
project ':sdks:go:examples', project ':sdks:go:test', project ':sdks:java:bom', 
project ':sdks:java:build-tools', project ':sdks:java:container', project 
':sdks:java:core', project ':sdks:java:expansion-service', project 
':sdks:java:extensions', project ':sdks:java:fn-execution', project 
':sdks:java:harness', project ':sdks:java:io', project ':sdks:java:javadoc', 
project ':sdks:java:maven-archetypes', project ':sdks:java:testing', project 
':sdks:python:apache_beam', project ':sdks:python:container', project 
':sdks:python:test-suites', project ':playground:backend:containers:go', 
project ':playground:backend:containers:java', project 
':playground:backend:containers:python', project 
':playground:backend:containers:router', project 
':playground:backend:containers:scio', project 
':runners:flink:1.12:job-server', project 
':runners:flink:1.12:job-server-container', project 
':runners:flink:1.13:job-server', project 
':runners:flink:1.13:job-server-container', project 
':runners:flink:1.14:job-server', project 
':runners:flink:1.14:job-server-container', project 
':runners:flink:1.15:job-server', project 
':runners:flink:1.15:job-server-container', project 
':runners:google-cloud-dataflow-java:****:legacy-****', project 
':runners:google-cloud-dataflow-java:****:windmill', project 
':runners:spark:2:job-server', project ':runners:spark:3:job-server', project 
':sdks:go:test:load', project ':sdks:java:bom:gcp', project 
':sdks:java:container:java11', project ':sdks:java:container:java17', project 
':sdks:java:container:java8', project ':sdks:java:core:jmh', project 
':sdks:java:expansion-service:app', project ':sdks:java:extensions:arrow', 
project ':sdks:java:extensions:euphoria', project 
':sdks:java:extensions:google-cloud-platform-core', project 
':sdks:java:extensions:jackson', project ':sdks:java:extensions:join-library', 
project ':sdks:java:extensions:kryo', project ':sdks:java:extensions:ml', 
project ':sdks:java:extensions:protobuf', project 
':sdks:java:extensions:python', project ':sdks:java:extensions:sbe', project 
':sdks:java:extensions:schemaio-expansion-service', project 
':sdks:java:extensions:sketching', project ':sdks:java:extensions:sorter', 
project ':sdks:java:extensions:sql', project 
':sdks:java:extensions:timeseries', project ':sdks:java:extensions:zetasketch', 
project ':sdks:java:harness:jmh', project ':sdks:java:io:amazon-web-services', 
project ':sdks:java:io:amazon-web-services2', project ':sdks:java:io:amqp', 
project ':sdks:java:io:azure', project ':sdks:java:io:bigquery-io-perf-tests', 
project ':sdks:java:io:cassandra', project ':sdks:java:io:cdap', project 
':sdks:java:io:clickhouse', project ':sdks:java:io:common', project 
':sdks:java:io:contextualtextio', project ':sdks:java:io:debezium', project 
':sdks:java:io:elasticsearch', project ':sdks:java:io:elasticsearch-tests', 
project ':sdks:java:io:expansion-service', project 
':sdks:java:io:file-based-io-tests', project 
':sdks:java:io:google-cloud-platform', project ':sdks:java:io:hadoop-common', 
project ':sdks:java:io:hadoop-file-system', project 
':sdks:java:io:hadoop-format', project ':sdks:java:io:hbase', project 
':sdks:java:io:hcatalog', project ':sdks:java:io:influxdb', project 
':sdks:java:io:jdbc', project ':sdks:java:io:jms', project 
':sdks:java:io:kafka', project ':sdks:java:io:kinesis', project 
':sdks:java:io:kudu', project ':sdks:java:io:mongodb', project 
':sdks:java:io:mqtt', project ':sdks:java:io:neo4j', project 
':sdks:java:io:parquet', project ':sdks:java:io:pulsar', project 
':sdks:java:io:rabbitmq', project ':sdks:java:io:redis', project 
':sdks:java:io:snowflake', project ':sdks:java:io:solr', project 
':sdks:java:io:sparkreceiver', project ':sdks:java:io:splunk', project 
':sdks:java:io:synthetic', project ':sdks:java:io:thrift', project 
':sdks:java:io:tika', project ':sdks:java:io:xml', project 
':sdks:java:maven-archetypes:examples', project 
':sdks:java:maven-archetypes:gcp-bom-examples', project 
':sdks:java:maven-archetypes:starter', project 
':sdks:java:testing:expansion-service', project 
':sdks:java:testing:jpms-tests', project ':sdks:java:testing:kafka-service', 
project ':sdks:java:testing:load-tests', project ':sdks:java:testing:nexmark', 
project ':sdks:java:testing:test-utils', project ':sdks:java:testing:tpcds', 
project ':sdks:java:testing:watermarks', project 
':sdks:python:apache_beam:testing', project ':sdks:python:container:py37', 
project ':sdks:python:container:py38', project ':sdks:python:container:py39', 
project ':sdks:python:test-suites:dataflow', project 
':sdks:python:test-suites:direct', project ':sdks:python:test-suites:portable', 
project ':sdks:python:test-suites:tox', project 
':runners:spark:2:job-server:container', project 
':runners:spark:3:job-server:container', project 
':sdks:java:extensions:sql:datacatalog', project 
':sdks:java:extensions:sql:expansion-service', project 
':sdks:java:extensions:sql:hcatalog', project ':sdks:java:extensions:sql:jdbc', 
project ':sdks:java:extensions:sql:payloads', project 
':sdks:java:extensions:sql:perf-tests', project 
':sdks:java:extensions:sql:shell', project ':sdks:java:extensions:sql:udf', 
project ':sdks:java:extensions:sql:udf-test-provider', project 
':sdks:java:extensions:sql:zetasql', project 
':sdks:java:io:debezium:expansion-service', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-2', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-5', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-6', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-7', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-8', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-common', project 
':sdks:java:io:google-cloud-platform:expansion-service', project 
':sdks:java:io:kinesis:expansion-service', project 
':sdks:java:io:snowflake:expansion-service', project 
':sdks:python:apache_beam:testing:benchmarks', project 
':sdks:python:apache_beam:testing:load_tests', project 
':sdks:python:test-suites:dataflow:py37', project 
':sdks:python:test-suites:dataflow:py38', project 
':sdks:python:test-suites:dataflow:py39', project 
':sdks:python:test-suites:direct:py37', project 
':sdks:python:test-suites:direct:py38', project 
':sdks:python:test-suites:direct:py39', project 
':sdks:python:test-suites:direct:xlang', project 
':sdks:python:test-suites:portable:py37', project 
':sdks:python:test-suites:portable:py38', project 
':sdks:python:test-suites:portable:py39', project 
':sdks:python:test-suites:tox:py37', project 
':sdks:python:test-suites:tox:py38', project 
':sdks:python:test-suites:tox:py39', project 
':sdks:python:test-suites:tox:pycommon', project 
':sdks:python:apache_beam:testing:benchmarks:nexmark']

> Configure project :buildSrc
Evaluating project ':buildSrc' using build file 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/build.gradle.kts>'.
Build cache key for Kotlin DSL accessors for project ':buildSrc' is 
51f3398668973406f074354f17710946
Skipping Kotlin DSL accessors for project ':buildSrc' as it is up-to-date.
Task name matched 'build'
Selected primary task 'build' from project :
Resolve mutations for :buildSrc:compileJava (Thread[Execution ****,5,main]) 
started.
Resolve mutations for :buildSrc:compileJava (Thread[Execution ****,5,main]) 
completed. Took 0.001 secs.
:buildSrc:compileJava (Thread[Execution ****,5,main]) started.

> Task :buildSrc:compileJava NO-SOURCE
Skipping task ':buildSrc:compileJava' as it has no source files and no previous 
output files.
:buildSrc:compileJava (Thread[Execution ****,5,main]) completed. Took 0.005 
secs.
Resolve mutations for :buildSrc:compileGroovy (Thread[included builds,5,main]) 
started.
Resolve mutations for :buildSrc:compileGroovy (Thread[included builds,5,main]) 
completed. Took 0.0 secs.
:buildSrc:compileGroovy (Thread[Execution **** Thread 6,5,main]) started.

> Task :buildSrc:compileGroovy FROM-CACHE
Build cache key for task ':buildSrc:compileGroovy' is 
f99b94e41c2f0a012a1f2f86a2a2a234
Task ':buildSrc:compileGroovy' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':buildSrc:compileGroovy' with cache key 
f99b94e41c2f0a012a1f2f86a2a2a234
:buildSrc:compileGroovy (Thread[Execution **** Thread 6,5,main]) completed. 
Took 0.135 secs.
Resolve mutations for :buildSrc:pluginDescriptors (Thread[included 
builds,5,main]) started.
Resolve mutations for :buildSrc:pluginDescriptors (Thread[included 
builds,5,main]) completed. Took 0.0 secs.
:buildSrc:pluginDescriptors (Thread[Execution **** Thread 7,5,main]) started.

> Task :buildSrc:pluginDescriptors
Caching disabled for task ':buildSrc:pluginDescriptors' because:
  Not worth caching
Task ':buildSrc:pluginDescriptors' is not up-to-date because:
  No history is available.
:buildSrc:pluginDescriptors (Thread[Execution **** Thread 7,5,main]) completed. 
Took 0.005 secs.
Resolve mutations for :buildSrc:processResources (Thread[Execution **** Thread 
6,5,main]) started.
Resolve mutations for :buildSrc:processResources (Thread[Execution **** Thread 
6,5,main]) completed. Took 0.0 secs.
:buildSrc:processResources (Thread[Execution **** Thread 5,5,main]) started.

> Task :buildSrc:processResources
Caching disabled for task ':buildSrc:processResources' because:
  Not worth caching
Task ':buildSrc:processResources' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/src/main/resources>',
 not found
Resolve mutations for :buildSrc:classes (Thread[Execution **** Thread 
7,5,main]) started.
:buildSrc:processResources (Thread[Execution **** Thread 5,5,main]) completed. 
Took 0.009 secs.
Resolve mutations for :buildSrc:classes (Thread[Execution **** Thread 
7,5,main]) completed. Took 0.0 secs.
:buildSrc:classes (Thread[Execution **** Thread 6,5,main]) started.

> Task :buildSrc:classes
Skipping task ':buildSrc:classes' as it has no actions.
:buildSrc:classes (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :buildSrc:jar (Thread[included builds,5,main]) started.
Resolve mutations for :buildSrc:jar (Thread[included builds,5,main]) completed. 
Took 0.0 secs.
:buildSrc:jar (Thread[included builds,5,main]) started.

> Task :buildSrc:jar
Caching disabled for task ':buildSrc:jar' because:
  Not worth caching
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/build/classes/java/main>',
 not found
:buildSrc:jar (Thread[included builds,5,main]) completed. Took 0.113 secs.
Resolve mutations for :buildSrc:assemble (Thread[Execution **** Thread 
6,5,main]) started.
Resolve mutations for :buildSrc:assemble (Thread[Execution **** Thread 
6,5,main]) completed. Took 0.0 secs.
:buildSrc:assemble (Thread[Execution **** Thread 6,5,main]) started.

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:buildSrc:assemble (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :buildSrc:spotlessGroovy (Thread[included builds,5,main]) 
started.
Resolve mutations for :buildSrc:spotlessGroovy (Thread[included builds,5,main]) 
completed. Took 0.0 secs.
:buildSrc:spotlessGroovy (Thread[included builds,5,main]) started.
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=ab1f5557-69b1-42f0-b703-21e1f820bdb0, 
currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 1624357
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-1624357.out.log
----- Last  20 lines from daemon log file - daemon-1624357.out.log -----
Resolve mutations for :buildSrc:jar (Thread[included builds,5,main]) started.
Resolve mutations for :buildSrc:jar (Thread[included builds,5,main]) completed. 
Took 0.0 secs.
:buildSrc:jar (Thread[included builds,5,main]) started.
Caching disabled for task ':buildSrc:jar' because:
  Not worth caching
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/build/classes/java/main>',
 not found
:buildSrc:jar (Thread[included builds,5,main]) completed. Took 0.113 secs.
Resolve mutations for :buildSrc:assemble (Thread[Execution **** Thread 
6,5,main]) started.
Resolve mutations for :buildSrc:assemble (Thread[Execution **** Thread 
6,5,main]) completed. Took 0.0 secs.
:buildSrc:assemble (Thread[Execution **** Thread 6,5,main]) started.
Skipping task ':buildSrc:assemble' as it has no actions.
:buildSrc:assemble (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :buildSrc:spotlessGroovy (Thread[included builds,5,main]) 
started.
Resolve mutations for :buildSrc:spotlessGroovy (Thread[included builds,5,main]) 
completed. Took 0.0 secs.
:buildSrc:spotlessGroovy (Thread[included builds,5,main]) started.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/src/test/groovy>',
 not found
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/src/test/groovy>',
 not found
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

> Task :buildSrc:spotlessGroovy
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/src/test/groovy>',
 not found
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/buildSrc/src/test/groovy>',
 not found
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
