See <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/6494/display/redirect>

Changes:


------------------------------------------
[...truncated 639 B...]
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1ba7821a09cb1b9a1793f7044ed64aabf55f7713 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1ba7821a09cb1b9a1793f7044ed64aabf55f7713 # timeout=10
Commit message: "Merge pull request #24554: Split some IO precommits out of the Java precommit"
 > git rev-list --no-walk 1ba7821a09cb1b9a1793f7044ed64aabf55f7713 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_ParquetIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1986561092711936286.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
CRITICAL: ACTION REQUIRED: gke-gcloud-auth-plugin, which is needed for 
continued use of kubectl, was not found or is not executable. Install 
gke-gcloud-auth-plugin for use with kubectl by following 
https://cloud.google.com/blog/products/containers-kubernetes/kubectl-auth-changes-in-gke
kubeconfig entry generated for io-datastores.
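
The CRITICAL warning above names its own remediation: install the GKE auth plugin on the machine that runs kubectl. A minimal sketch, assuming the gcloud component manager is available on the Jenkins worker (on Debian-based images the plugin may instead be installed as the google-cloud-sdk-gke-gcloud-auth-plugin apt package):

    gcloud components install gke-gcloud-auth-plugin     # install the kubectl auth plugin
    gke-gcloud-auth-plugin --version                      # confirm it is on PATH
    export USE_GKE_GCLOUD_AUTH_PLUGIN=True                # opt in on older gcloud releases
    gcloud container clusters get-credentials io-datastores --zone=us-central1-a
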
[beam_PerformanceTests_ParquetIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5744271793509537367.sh
+ cp /home/jenkins/.kube/config <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494>

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_ParquetIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins149762835823162181.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-performancetests-parquetioit-hdfs-6494
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=default'
+ createNamespace beam-performancetests-parquetioit-hdfs-6494
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> create namespace beam-performancetests-parquetioit-hdfs-6494'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> create namespace beam-performancetests-parquetioit-hdfs-6494
namespace/beam-performancetests-parquetioit-hdfs-6494 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-performancetests-parquetioit-hdfs-6494

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_ParquetIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6733701015284538482.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494>
+ KUBERNETES_NAMESPACE=beam-performancetests-parquetioit-hdfs-6494
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
service/hadoop created
service/hadoop-datanodes created
statefulset.apps/datanode created
pod/namenode-0 created
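
For reference, the xtrace above corresponds roughly to the following helpers in .test-infra/kubernetes/kubernetes.sh (a simplified sketch reconstructed from the trace, not the script's verbatim source):

    # Single kubectl prefix built from the variables injected by the job.
    KUBECTL="kubectl --kubeconfig=${KUBECONFIG} --namespace=${KUBERNETES_NAMESPACE}"

    # Create the per-build namespace, e.g. beam-performancetests-parquetioit-hdfs-6494.
    function createNamespace() {
      eval "${KUBECTL} create namespace $1"
    }

    # Recursively apply a manifest, here hdfs-multi-datanode-cluster.yml.
    function apply() {
      eval "${KUBECTL} apply -R -f $1"
    }
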
[beam_PerformanceTests_ParquetIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7379765128302340710.sh
+ set -eo pipefail
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP hadoop
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP hadoop
+ sed 's/^/LOAD_BALANCER_IP=/'
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494>
+ KUBERNETES_NAMESPACE=beam-performancetests-parquetioit-hdfs-6494
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494'
+ loadBalancerIP hadoop
+ local name=hadoop
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 1 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 2 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 3 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/config-beam-performancetests-parquetioit-hdfs-6494> --namespace=beam-performancetests-parquetioit-hdfs-6494 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=34.118.207.163
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 34.118.207.163 ]]
+ echo 34.118.207.163
+ return 0
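
The retry trace above expands to roughly the following logic (a sketch reconstructed from the xtrace; kubernetes.sh's exact error handling may differ): poll the hadoop LoadBalancer service every 10 seconds, up to 36 times, until GKE reports an external IP.

    function loadBalancerIP() {
      local name=$1
      # The jsonpath expression prints an empty string until the external IP is assigned.
      retry "${KUBECTL} get svc ${name} -ojsonpath='{.status.loadBalancer.ingress[0].ip}'" 36 10
    }

    function retry() {
      local command=$1
      local max_retries=$2
      local sleep_time=$3
      for ((i = 1; i <= max_retries; i++)); do
        local output
        output=$(eval "${command}")
        local status=$?
        # Succeed only when the command exits 0 AND prints something.
        if [[ ${status} == 0 && -n ${output} ]]; then
          echo "${output}"
          return 0
        fi
        [[ ${i} == "${max_retries}" ]] && return 1
        sleep "${sleep_time}"
      done
    }

The sed 's/^/LOAD_BALANCER_IP=/' filter then turns the printed IP (34.118.207.163 here) into a property line, which EnvInject picks up from 'job.properties' in the next build step.
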
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 
'job.properties'
[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/gradlew> --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses --info 
-DintegrationTestPipelineOptions=["--bigQueryDataset=beam_performance","--bigQueryTable=parquetioit_hdfs_results","--influxMeasurement=parquetioit_hdfs_results","--numberOfRecords=225000000","--expectedHash=2f9f5ca33ea464b25109c0297eb6aecb","--datasetSize=1087370000","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--runner=DataflowRunner","--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--hdfsConfiguration=[{\"fs.defaultFS\":\"hdfs:34.118.207.163:9000\",\"dfs.replication\":1}]","--filenamePrefix=hdfs://34.118.207.163:9000/TEXTIO_IT_";]
 -Dfilesystem=hdfs -DintegrationTestRunner=dataflow :sdks:java:io:file-based-io-tests:integrationTest --tests org.apache.beam.sdk.io.parquet.ParquetIOIT
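
Stripped of the job-specific options, the invocation above reduces to the following shape (a hedged sketch for local reproduction; the pipeline option values below are illustrative placeholders, substitute the real ones from the command line above):

    ./gradlew :sdks:java:io:file-based-io-tests:integrationTest \
      --tests org.apache.beam.sdk.io.parquet.ParquetIOIT \
      --info \
      -Dfilesystem=hdfs \
      -DintegrationTestRunner=dataflow \
      -DintegrationTestPipelineOptions='["--runner=DataflowRunner","--project=<your-gcp-project>","--tempRoot=gs://<your-temp-bucket>","--numberOfRecords=225000000"]'
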
Initialized native services in: /home/jenkins/.gradle/native
Initialized jansi services in: /home/jenkins/.gradle/native
The client will now receive all logging from the daemon (pid: 1865571). The 
daemon log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-1865571.out.log
Starting 2nd build in daemon [uptime: 3 mins 42.306 secs, performance: 100%]
Using 8 worker leases.
Configuration on demand is an incubating feature.
Closing daemon's stdin at end of input.
The daemon will no longer process any standard input.
Watching the file system is configured to be disabled
File system watching is inactive
Starting Build
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/7.5.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/7.5.1/fileHashes/resourceHashesCache.bin
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/7.5.1/kotlin-dsl/executionHistory.bin
Skipping Kotlin DSL script compilation (Settings/TopLevel/stage1) as it is 
up-to-date.
Skipping Kotlin DSL script compilation (Settings/TopLevel/stage2) as it is 
up-to-date.
Settings evaluated using settings file '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/settings.gradle.kts>'.
Using local directory build cache for the root build (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).
Projects loaded. Root project using build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/build.gradle.kts>'.
Included projects: [root project 'beam', project ':beam-test-infra-metrics', 
project ':beam-test-jenkins', project ':beam-test-tools', project 
':beam-validate-runner', project ':examples', project ':model', project 
':playground', project ':release', project ':runners', project ':sdks', project 
':vendor', project ':website', project ':examples:java', project 
':examples:kotlin', project ':examples:multi-language', project 
':model:fn-execution', project ':model:job-management', project 
':model:pipeline', project ':playground:backend', project 
':playground:frontend', project ':playground:terraform', project 
':release:go-licenses', project ':runners:core-construction-java', project 
':runners:core-java', project ':runners:direct-java', project 
':runners:extensions-java', project ':runners:flink', project 
':runners:google-cloud-dataflow-java', project ':runners:java-fn-execution', 
project ':runners:java-job-service', project ':runners:jet', project 
':runners:local-java', project ':runners:portability', project 
':runners:samza', project ':runners:spark', project ':runners:twister2', 
project ':sdks:go', project ':sdks:java', project ':sdks:python', project 
':vendor:bytebuddy-1_12_8', project ':vendor:calcite-1_28_0', project 
':vendor:grpc-1_48_1', project ':vendor:guava-26_0-jre', project 
':examples:java:twitter', project ':playground:backend:containers', project 
':playground:frontend:playground_components', project 
':release:go-licenses:go', project ':release:go-licenses:java', project 
':release:go-licenses:py', project ':runners:extensions-java:metrics', project 
':runners:flink:1.12', project ':runners:flink:1.13', project 
':runners:flink:1.14', project ':runners:flink:1.15', project 
':runners:google-cloud-dataflow-java:examples', project 
':runners:google-cloud-dataflow-java:examples-streaming', project 
':runners:google-cloud-dataflow-java:worker', project 
':runners:portability:java', project ':runners:samza:job-server', project 
':runners:spark:2', project ':runners:spark:3', project ':sdks:go:container', 
project ':sdks:go:examples', project ':sdks:go:test', project ':sdks:java:bom', 
project ':sdks:java:build-tools', project ':sdks:java:container', project 
':sdks:java:core', project ':sdks:java:expansion-service', project 
':sdks:java:extensions', project ':sdks:java:fn-execution', project 
':sdks:java:harness', project ':sdks:java:io', project ':sdks:java:javadoc', 
project ':sdks:java:maven-archetypes', project ':sdks:java:testing', project 
':sdks:python:apache_beam', project ':sdks:python:container', project 
':sdks:python:test-suites', project ':playground:backend:containers:go', 
project ':playground:backend:containers:java', project 
':playground:backend:containers:python', project 
':playground:backend:containers:router', project 
':playground:backend:containers:scio', project 
':runners:flink:1.12:job-server', project 
':runners:flink:1.12:job-server-container', project 
':runners:flink:1.13:job-server', project 
':runners:flink:1.13:job-server-container', project 
':runners:flink:1.14:job-server', project 
':runners:flink:1.14:job-server-container', project 
':runners:flink:1.15:job-server', project 
':runners:flink:1.15:job-server-container', project 
':runners:google-cloud-dataflow-java:worker:legacy-worker', project 
':runners:google-cloud-dataflow-java:worker:windmill', project 
':runners:spark:2:job-server', project ':runners:spark:3:job-server', project 
':sdks:go:test:load', project ':sdks:java:bom:gcp', project 
':sdks:java:container:agent', project ':sdks:java:container:java11', project 
':sdks:java:container:java17', project ':sdks:java:container:java8', project 
':sdks:java:core:jmh', project ':sdks:java:expansion-service:app', project 
':sdks:java:extensions:arrow', project ':sdks:java:extensions:euphoria', 
project ':sdks:java:extensions:google-cloud-platform-core', project 
':sdks:java:extensions:jackson', project ':sdks:java:extensions:join-library', 
project ':sdks:java:extensions:kryo', project ':sdks:java:extensions:ml', 
project ':sdks:java:extensions:protobuf', project 
':sdks:java:extensions:python', project ':sdks:java:extensions:sbe', project 
':sdks:java:extensions:schemaio-expansion-service', project 
':sdks:java:extensions:sketching', project ':sdks:java:extensions:sorter', 
project ':sdks:java:extensions:sql', project 
':sdks:java:extensions:timeseries', project ':sdks:java:extensions:zetasketch', 
project ':sdks:java:harness:jmh', project ':sdks:java:io:amazon-web-services', 
project ':sdks:java:io:amazon-web-services2', project ':sdks:java:io:amqp', 
project ':sdks:java:io:azure', project ':sdks:java:io:bigquery-io-perf-tests', 
project ':sdks:java:io:cassandra', project ':sdks:java:io:cdap', project 
':sdks:java:io:clickhouse', project ':sdks:java:io:common', project 
':sdks:java:io:contextualtextio', project ':sdks:java:io:debezium', project 
':sdks:java:io:elasticsearch', project ':sdks:java:io:elasticsearch-tests', 
project ':sdks:java:io:expansion-service', project 
':sdks:java:io:file-based-io-tests', project 
':sdks:java:io:fileschematransform', project 
':sdks:java:io:google-cloud-platform', project ':sdks:java:io:hadoop-common', 
project ':sdks:java:io:hadoop-file-system', project 
':sdks:java:io:hadoop-format', project ':sdks:java:io:hbase', project 
':sdks:java:io:hcatalog', project ':sdks:java:io:influxdb', project 
':sdks:java:io:jdbc', project ':sdks:java:io:jms', project 
':sdks:java:io:kafka', project ':sdks:java:io:kinesis', project 
':sdks:java:io:kudu', project ':sdks:java:io:mongodb', project 
':sdks:java:io:mqtt', project ':sdks:java:io:neo4j', project 
':sdks:java:io:parquet', project ':sdks:java:io:pulsar', project 
':sdks:java:io:rabbitmq', project ':sdks:java:io:redis', project 
':sdks:java:io:singlestore', project ':sdks:java:io:snowflake', project 
':sdks:java:io:solr', project ':sdks:java:io:sparkreceiver', project 
':sdks:java:io:splunk', project ':sdks:java:io:synthetic', project 
':sdks:java:io:thrift', project ':sdks:java:io:tika', project 
':sdks:java:io:xml', project ':sdks:java:maven-archetypes:examples', project 
':sdks:java:maven-archetypes:gcp-bom-examples', project 
':sdks:java:maven-archetypes:starter', project 
':sdks:java:testing:expansion-service', project 
':sdks:java:testing:jpms-tests', project ':sdks:java:testing:kafka-service', 
project ':sdks:java:testing:load-tests', project ':sdks:java:testing:nexmark', 
project ':sdks:java:testing:test-utils', project ':sdks:java:testing:tpcds', 
project ':sdks:java:testing:watermarks', project 
':sdks:python:apache_beam:testing', project ':sdks:python:container:py310', 
project ':sdks:python:container:py37', project ':sdks:python:container:py38', 
project ':sdks:python:container:py39', project 
':sdks:python:test-suites:dataflow', project ':sdks:python:test-suites:direct', 
project ':sdks:python:test-suites:portable', project 
':sdks:python:test-suites:tox', project 
':runners:spark:2:job-server:container', project 
':runners:spark:3:job-server:container', project 
':sdks:java:extensions:sql:datacatalog', project 
':sdks:java:extensions:sql:expansion-service', project 
':sdks:java:extensions:sql:hcatalog', project ':sdks:java:extensions:sql:jdbc', 
project ':sdks:java:extensions:sql:payloads', project 
':sdks:java:extensions:sql:perf-tests', project 
':sdks:java:extensions:sql:shell', project ':sdks:java:extensions:sql:udf', 
project ':sdks:java:extensions:sql:udf-test-provider', project 
':sdks:java:extensions:sql:zetasql', project 
':sdks:java:io:debezium:expansion-service', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-2', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-5', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-6', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-7', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-8', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-common', project 
':sdks:java:io:google-cloud-platform:expansion-service', project 
':sdks:java:io:kinesis:expansion-service', project 
':sdks:java:io:snowflake:expansion-service', project 
':sdks:java:io:sparkreceiver:2', project 
':sdks:python:apache_beam:testing:benchmarks', project 
':sdks:python:apache_beam:testing:load_tests', project 
':sdks:python:test-suites:dataflow:py310', project 
':sdks:python:test-suites:dataflow:py37', project 
':sdks:python:test-suites:dataflow:py38', project 
':sdks:python:test-suites:dataflow:py39', project 
':sdks:python:test-suites:direct:py310', project 
':sdks:python:test-suites:direct:py37', project 
':sdks:python:test-suites:direct:py38', project 
':sdks:python:test-suites:direct:py39', project 
':sdks:python:test-suites:direct:xlang', project 
':sdks:python:test-suites:portable:py310', project 
':sdks:python:test-suites:portable:py37', project 
':sdks:python:test-suites:portable:py38', project 
':sdks:python:test-suites:portable:py39', project 
':sdks:python:test-suites:tox:py310', project 
':sdks:python:test-suites:tox:py37', project 
':sdks:python:test-suites:tox:py38', project 
':sdks:python:test-suites:tox:py39', project 
':sdks:python:test-suites:tox:pycommon', project 
':sdks:python:apache_beam:testing:benchmarks:nexmark']

> Configure project :buildSrc
Evaluating project ':buildSrc' using build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/buildSrc/build.gradle.kts>'.
Build cache key for Kotlin DSL accessors for project ':buildSrc' is 
51f3398668973406f074354f17710946
Skipping Kotlin DSL accessors for project ':buildSrc' as it is up-to-date.
Task name matched 'build'
Selected primary task 'build' from project :
Resolve mutations for :buildSrc:compileJava (Thread[Execution worker,5,main]) started.
Resolve mutations for :buildSrc:compileJava (Thread[Execution worker,5,main]) completed. Took 0.0 secs.
:buildSrc:compileJava (Thread[Execution worker Thread 5,5,main]) started.

> Task :buildSrc:compileJava NO-SOURCE
Skipping task ':buildSrc:compileJava' as it has no source files and no previous 
output files.
:buildSrc:compileJava (Thread[Execution worker Thread 5,5,main]) completed. Took 0.006 secs.
Resolve mutations for :buildSrc:compileGroovy (Thread[Execution worker Thread 7,5,main]) started.
Resolve mutations for :buildSrc:compileGroovy (Thread[Execution worker Thread 7,5,main]) completed. Took 0.0 secs.
:buildSrc:compileGroovy (Thread[included builds,5,main]) started.

> Task :buildSrc:compileGroovy FROM-CACHE
Build cache key for task ':buildSrc:compileGroovy' is 
35adbf2f9549c705a1b536285724098c
Task ':buildSrc:compileGroovy' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':buildSrc:compileGroovy' with cache key 
35adbf2f9549c705a1b536285724098c
:buildSrc:compileGroovy (Thread[included builds,5,main]) completed. Took 0.06 
secs.
Resolve mutations for :buildSrc:pluginDescriptors (Thread[Execution worker Thread 7,5,main]) started.
Resolve mutations for :buildSrc:pluginDescriptors (Thread[Execution worker Thread 7,5,main]) completed. Took 0.0 secs.
:buildSrc:pluginDescriptors (Thread[Execution worker Thread 6,5,main]) started.

> Task :buildSrc:pluginDescriptors
Caching disabled for task ':buildSrc:pluginDescriptors' because:
  Not worth caching
Task ':buildSrc:pluginDescriptors' is not up-to-date because:
  No history is available.
:buildSrc:pluginDescriptors (Thread[Execution worker Thread 6,5,main]) completed. Took 0.004 secs.
Resolve mutations for :buildSrc:processResources (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :buildSrc:processResources (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:buildSrc:processResources (Thread[included builds,5,main]) started.

> Task :buildSrc:processResources
Caching disabled for task ':buildSrc:processResources' because:
  Not worth caching
Task ':buildSrc:processResources' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/buildSrc/src/main/resources>', not found
:buildSrc:processResources (Thread[included builds,5,main]) completed. Took 
0.004 secs.
Resolve mutations for :buildSrc:classes (Thread[Execution worker Thread 7,5,main]) started.
Resolve mutations for :buildSrc:classes (Thread[Execution worker Thread 7,5,main]) completed. Took 0.0 secs.
:buildSrc:classes (Thread[Execution worker Thread 5,5,main]) started.

> Task :buildSrc:classes
Skipping task ':buildSrc:classes' as it has no actions.
:buildSrc:classes (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :buildSrc:jar (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :buildSrc:jar (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:buildSrc:jar (Thread[included builds,5,main]) started.
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=59205f58-962e-40b7-b491-4a86bec2b5ec, currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 1865571
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-1865571.out.log
----- Last  20 lines from daemon log file - daemon-1865571.out.log -----
Caching disabled for task ':buildSrc:processResources' because:
  Not worth caching
Task ':buildSrc:processResources' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/buildSrc/src/main/resources>', not found
:buildSrc:processResources (Thread[included builds,5,main]) completed. Took 
0.004 secs.
Resolve mutations for :buildSrc:classes (Thread[Execution worker Thread 7,5,main]) started.
Resolve mutations for :buildSrc:classes (Thread[Execution worker Thread 7,5,main]) completed. Took 0.0 secs.
:buildSrc:classes (Thread[Execution worker Thread 5,5,main]) started.
Skipping task ':buildSrc:classes' as it has no actions.
:buildSrc:classes (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :buildSrc:jar (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :buildSrc:jar (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:buildSrc:jar (Thread[included builds,5,main]) started.
Caching disabled for task ':buildSrc:jar' because:
  Not worth caching
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/buildSrc/build/classes/java/main>', not found
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

> Task :buildSrc:jar
Caching disabled for task ':buildSrc:jar' because:
  Not worth caching
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src/buildSrc/build/classes/java/main>', not found
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
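
The failure itself is the Gradle daemon dying mid-build, which on shared Jenkins workers is often a memory kill rather than a Gradle bug. A few hedged diagnostic steps, assuming shell access to the worker (none of them are specific to this job):

    # Inspect the tail of the daemon log referenced above.
    tail -n 100 /home/jenkins/.gradle/daemon/7.5.1/daemon-1865571.out.log
    # Check whether the kernel OOM killer took pid 1865571.
    dmesg -T | grep -i -E 'killed process|out of memory'
    # Stop lingering daemons and retry in the foreground with more output.
    ./gradlew --stop
    ./gradlew --no-daemon --stacktrace :sdks:java:io:file-based-io-tests:integrationTest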
