See <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/4359/display/redirect?page=changes>
Changes:

[rohde.samuel] Update V1Beta3 API and add DebugOptions

[83289+avalanche123] [BEAM-12081] Fix AwsOptions Jackson (de)serialization of integer values

[noreply] [BEAM-12069] Remove mock from base_image_requirements.txt (#14389)

[noreply] [BEAM-5537] Allow google-cloud-bigquery 2.x (#14391)

[noreply] Merge pull request #14274 from [BEAM-9547] Initial implementation for

------------------------------------------
[...truncated 83 B...]
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b43952ca369418e6f0d6cc2d4c54f991d7aafb25 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b43952ca369418e6f0d6cc2d4c54f991d7aafb25 # timeout=10
Commit message: "Merge pull request #14380 from [BEAM-10994] Java: Update V1Beta3 API and add DebugOptions"
 > git rev-list --no-walk 7963cd3329f5349cb3ad93e0bbdebebdeeb3b86f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7111326660445107674.sh
+ cp /home/jenkins/.kube/config <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359>
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8835857561954714669.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
kubeconfig entry generated for io-datastores.
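For reference, the kubeconfig bootstrap above amounts to only a few commands. Below is a minimal sketch of running the same steps by hand; the local paths are illustrative, and it assumes gcloud is already authenticated against the project that owns the io-datastores cluster:

    # Hedged sketch of the cluster-credential setup traced above (not the Jenkins script itself).
    set -euxo pipefail
    export KUBECONFIG="$PWD/config-io-datastores"   # per-build kubeconfig file, as the job uses
    cp ~/.kube/config "$KUBECONFIG"                 # start from the agent's existing config
    gcloud container clusters get-credentials io-datastores --zone=us-central1-a
    kubectl config current-context                  # sanity-check the generated entry

With KUBECONFIG exported this way, every later kubectl call in the job is isolated to this build's credentials file rather than the agent-wide ~/.kube/config.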
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6773664859718780166.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-performancetests-compressed-textioit-hdfs-4359
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=default'
+ createNamespace beam-performancetests-compressed-textioit-hdfs-4359
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> create namespace beam-performancetests-compressed-textioit-hdfs-4359'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> create namespace beam-performancetests-compressed-textioit-hdfs-4359
namespace/beam-performancetests-compressed-textioit-hdfs-4359 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
KUBERNETES_NAMESPACE=beam-performancetests-compressed-textioit-hdfs-4359
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2082060242816235684.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359>
+ KUBERNETES_NAMESPACE=beam-performancetests-compressed-textioit-hdfs-4359
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
service/hadoop created
service/hadoop-datanodes created
statefulset.apps/datanode created
pod/namenode-0 created
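The kubernetes.sh helper being traced here is essentially a thin wrapper that pins kubectl to the per-build kubeconfig and namespace. A rough sketch of the two entry points used above, reconstructed from the trace rather than copied from .test-infra/kubernetes/kubernetes.sh, so details may differ:

    #!/usr/bin/env bash
    # Approximate reconstruction of the createNamespace/apply helpers traced above.
    KUBECONFIG="${KUBECONFIG:-$HOME/.kube/config}"
    KUBERNETES_NAMESPACE="${KUBERNETES_NAMESPACE:-default}"
    KUBECTL="kubectl --kubeconfig=${KUBECONFIG} --namespace=${KUBERNETES_NAMESPACE}"

    createNamespace() {
      # Namespace creation is cluster-scoped, so the trace shows no --namespace flag here.
      eval "kubectl --kubeconfig=${KUBECONFIG} create namespace $1"
    }

    apply() {
      # -R recurses into directories; a single manifest such as
      # hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml also works.
      eval "${KUBECTL} apply -R -f $1"
    }

    "$@"   # dispatch: kubernetes.sh createNamespace <name> | kubernetes.sh apply <path>

Applying hdfs-multi-datanode-cluster.yml into the freshly created namespace is what produces the hadoop services, the datanode StatefulSet, and the namenode-0 pod reported above.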
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8879743162168844896.sh
+ set -eo pipefail
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP hadoop
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP hadoop
+ sed 's/^/LOAD_BALANCER_IP=/'
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359>
+ KUBERNETES_NAMESPACE=beam-performancetests-compressed-textioit-hdfs-4359
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359'
+ loadBalancerIP hadoop
+ local name=hadoop
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 1 == \3\6 ]]
+ sleep 10
+ (( i++ ))
[...retry iterations 2-9 truncated: the same kubectl probe keeps returning an empty IP, followed by 'sleep 10'...]
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/config-beam-performancetests-compressed-textioit-hdfs-4359> --namespace=beam-performancetests-compressed-textioit-hdfs-4359 get svc hadoop '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=35.222.162.184
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 35.222.162.184 ]]
+ echo 35.222.162.184
+ return 0
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
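The polling above (36 attempts, 10 seconds apart, until the hadoop service reports an external IP) comes from two more helpers in the same kubernetes.sh. A hedged reconstruction, inferred from the trace rather than copied from the script:

    # Approximate reconstruction of loadBalancerIP/retry as traced above; the real helpers may differ.
    loadBalancerIP() {
      local name=$1
      local command="${KUBECTL} get svc ${name} -ojsonpath='{.status.loadBalancer.ingress[0].ip}'"
      retry "${command}" 36 10
    }

    retry() {
      local command=$1
      local max_retries=$2
      local sleep_time=$3
      for ((i = 1; i <= max_retries; i++)); do
        local output
        output=$(eval "${command}")
        local status=$?
        if [[ ${status} == 0 && -n ${output} ]]; then
          echo "${output}"              # e.g. 35.222.162.184
          return 0
        fi
        if [[ ${i} == "${max_retries}" ]]; then
          return 1                      # gave up: the LoadBalancer never got an external IP
        fi
        sleep "${sleep_time}"
      done
    }

The sed 's/^/LOAD_BALANCER_IP=/' on the caller side appears to turn the echoed IP into a LOAD_BALANCER_IP=35.222.162.184 property, which EnvInject then loads from job.properties for the Gradle step that follows.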
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-pull-licenses --info -DintegrationTestPipelineOptions=["--bigQueryDataset=beam_performance","--bigQueryTable=compressed_textioit_hdfs_results","--influxMeasurement=compressed_textioit_hdfs_results","--numberOfRecords=450000000","--expectedHash=8a3de973354abc6fba621c6797cc0f06","--datasetSize=1097840000","--compressionType=GZIP","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--runner=DataflowRunner","--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--hdfsConfiguration=[{\"fs.defaultFS\":\"hdfs:35.222.162.184:9000\",\"dfs.replication\":1}]","--filenamePrefix=hdfs://35.222.162.184:9000/TEXTIO_IT_"] -Dfilesystem=hdfs -DintegrationTestRunner=dataflow :sdks:java:io:file-based-io-tests:integrationTest --tests org.apache.beam.sdk.io.text.TextIOIT
Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Previous Daemon (26031) stopped at Thu Apr 01 18:23:34 UTC 2021 other compatible daemons were started and after being idle for 0 minutes and not recently used
Starting a Gradle Daemon, 6 busy and 1 stopped Daemons could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/6.8 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xmx4g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-6.8-bin/1jblhjyydfkclfzx1agp92nyl/gradle-6.8/lib/gradle-launcher-6.8.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 6.8
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 2.958 secs.
The client will now receive all logging from the daemon (pid: 31600). The daemon log file: /home/jenkins/.gradle/daemon/6.8/daemon-31600.out.log
Starting build in new daemon [memory: 3.5 GiB]
Closing daemon's stdin at end of input.
The daemon will no longer process any standard input.
Using 12 worker leases.
Watching the file system is enabled
Now considering [<https://ci-beam.apache.org/job/beam_PerformanceTests_Compressed_TextIOIT_HDFS/ws/src>] as hierarchies to watch
Starting Build

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Gradle could not start your build.
> Could not create service of type ExecutionHistoryStore using DependencyManagementGradleUserHomeScopeServices.createExecutionHistoryStore().
   > Timeout waiting to lock execution history cache (/home/jenkins/.gradle/caches/6.8/executionHistory). It is currently in use by another Gradle instance.
     Owner PID: 2796
     Our PID: 31600
     Owner Operation:
     Our operation:
     Lock file: /home/jenkins/.gradle/caches/6.8/executionHistory/executionHistory.lock

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Timeout waiting to lock artifact cache (/home/jenkins/.gradle/caches/modules-2). It is currently in use by another Gradle instance.
  Owner PID: 2796
  Our PID: 31600
  Owner Operation:
  Our operation:
  Lock file: /home/jenkins/.gradle/caches/modules-2/modules-2.lock

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 2m 10s

Watching 0 directories to track changes
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
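Both failures point at the same root cause: another Gradle process (Owner PID 2796) on this agent was holding the shared caches under /home/jenkins/.gradle past the lock timeout, which is consistent with the "Starting a Gradle Daemon, 6 busy" line earlier in the log. As a hedged sketch, generic diagnostics one might run on the agent (these commands are not part of the job):

    # Check whether the lock owner reported by Gradle is still alive.
    ps -fp 2796
    # List the daemons known to this Gradle version (idle/busy), matching the wrapper the job uses.
    ./gradlew --status
    # If the owner is a dead or wedged daemon, stopping daemons releases the cache locks.
    ./gradlew --stop

Concurrent builds sharing one GRADLE_USER_HOME on the agent will contend for executionHistory.lock and modules-2.lock; retrying once the other build finishes, or isolating GRADLE_USER_HOME per executor, avoids this timeout.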
