See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1774/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-9541] Add flink_versions to gradle.properties.

[Kyle Weaver] [BEAM-9541] All Gradle tasks use latest Flink version.

[Kyle Weaver] Disable Flink classloader leak check when using local execution mode.

[noreply] [BEAM-8829] only drop event_timestamp when it exists (#13638)

[Kyle Weaver] [BEAM-11570] Comment with link to context.

[noreply] [BEAM-9980] do not hardcode Python version for dataflow validate runner


------------------------------------------
[...truncated 1.06 MB...]
See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
8dcb7e8037ef: Preparing
1e12f9ca3ed5: Preparing
04c6843faeb5: Preparing
85ba0874a968: Preparing
16b092fc63e1: Preparing
687b24b3b94b: Preparing
e909ad4d2dd0: Preparing
e58dad108ca6: Preparing
88c88a78e28a: Preparing
4e3769015b80: Preparing
65e24e5ad481: Preparing
f85e383859a1: Preparing
e909ad4d2dd0: Waiting
e58dad108ca6: Waiting
4e3769015b80: Waiting
ffb4778f8a52: Preparing
687b24b3b94b: Waiting
f85e383859a1: Waiting
e528f2c31deb: Preparing
c5f4367d4a59: Preparing
ceecb62b2fcc: Preparing
e528f2c31deb: Waiting
ffb4778f8a52: Waiting
193bc1d68b80: Preparing
f0e10b20de19: Preparing
ceecb62b2fcc: Waiting
f0e10b20de19: Waiting
88c88a78e28a: Waiting
65e24e5ad481: Waiting
c5f4367d4a59: Waiting
8dcb7e8037ef: Pushed
85ba0874a968: Pushed
16b092fc63e1: Pushed
1e12f9ca3ed5: Pushed
e58dad108ca6: Pushed
04c6843faeb5: Pushed
e909ad4d2dd0: Pushed
f85e383859a1: Layer already exists
ffb4778f8a52: Layer already exists
e528f2c31deb: Layer already exists
c5f4367d4a59: Layer already exists
ceecb62b2fcc: Layer already exists
193bc1d68b80: Layer already exists
f0e10b20de19: Layer already exists
4e3769015b80: Pushed
65e24e5ad481: Pushed
687b24b3b94b: Pushed
88c88a78e28a: Pushed
20210106005220: digest: sha256:79d75cf7471bc1c82817218055c943dd3bddca726519629a54e9976755a32653 size: 4098
:runners:google-cloud-dataflow-java:buildAndPushDockerContainer (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 6.858 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 8,5,main]) started.
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not 
up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: docker rmi --force 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:79d75cf7471bc1c82817218055c943dd3bddca726519629a54e9976755a32653
Starting process 'command 'gcloud''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: gcloud --quiet container images delete --force-delete-tags 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220
Successfully started process 'command 'gcloud''
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:79d75cf7471bc1c82817218055c943dd3bddca726519629a54e9976755a32653
  Associated tags:
 - 20210106005220
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220].
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:79d75cf7471bc1c82817218055c943dd3bddca726519629a54e9976755a32653].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2.599 secs.

> Task :sdks:java:io:kafka:integrationTest
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is 
bbf1f8679d559f2247758aef37e5daa2
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results_sdf_wrapper","--influxMeasurement=kafkaioit_results_sdf_wrapper","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=35.238.52.105:32400,34.123.102.236:32401,35.238.224.170:32402","--kafkaTopic=beam-runnerv2","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--experiments=beam_fn_api,use_runner_v2,use_unified_worker","--workerHarnessContainerImage=us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210106005220","--region=us-central1";]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.7.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.io.kafka.KafkaIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.28.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInStreaming STANDARD_ERROR
    Jan 06, 2021 12:56:39 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jan 06, 2021 12:56:39 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Jan 06, 2021 12:56:39 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jan 06, 2021 12:56:40 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 225 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jan 06, 2021 12:56:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jan 06, 2021 12:56:42 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 225 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jan 06, 2021 12:56:43 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 225 files cached, 0 files newly uploaded in 0 
seconds
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Jan 06, 2021 12:56:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step 
s6
    Jan 06, 2021 12:56:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jan 06, 2021 12:56:43 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <100272 bytes, hash 
dcb671162da68ad1a3f1c9f99bbdc452d03f2aaeb9c36a2390fe360751d70efc> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-3LZxFi2mitGj8cn5m73EUtA_Kq65w2ojkP42B1HXDvw.pb
    Jan 06, 2021 12:56:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.28.0-SNAPSHOT
    Jan 06, 2021 12:56:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-05_16_56_43-250494063182232742?project=apache-beam-testing
    Jan 06, 2021 12:56:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-01-05_16_56_43-250494063182232742
    Jan 06, 2021 12:56:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-01-05_16_56_43-250494063182232742
    Jan 06, 2021 12:56:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-01-06T00:56:49.831Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-skj3. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
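The label restriction the warning references can be checked locally. A minimal sketch (the regex only approximates the rules on the linked restrictions page, and `is_valid_label` is a hypothetical helper, not part of any Beam or gcloud tooling):

```shell
# Rough check of Cloud Label value rules: lowercase letter first, then
# lowercase letters, digits, or hyphens, 63 characters at most.
# (Sketch only; see the linked restrictions page for the exact rules.)
is_valid_label() {
  printf '%s' "$1" | grep -Eq '^[a-z][a-z0-9-]{0,62}$'
}

# The modified job name from the warning above passes this check:
if is_valid_label 'kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-skj3'; then
  echo 'valid label'
fi
```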
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:00.381Z: Worker configuration: n1-standard-1 in 
us-central1-f.
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.070Z: Expanding SplittableParDo operations into 
optimizable parts.
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.168Z: Expanding CollectionToSingleton operations 
into optimizable parts.
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.264Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.312Z: Expanding GroupByKey operations into 
optimizable parts.
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.388Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.418Z: Fusing consumer Generate 
records/ParDo(OutputSingleSource) into Generate records/Impulse
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.452Z: Fusing consumer s3/PairWithRestriction into 
Generate records/ParDo(OutputSingleSource)
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.484Z: Fusing consumer s3/SplitWithSizing into 
s3/PairWithRestriction
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.540Z: Fusing consumer Measure write time into 
s3/ProcessElementAndRestrictionWithSizing
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.573Z: Fusing consumer Write to Kafka/Kafka 
ProducerRecord/Map into Measure write time
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:01.602Z: Fusing consumer Write to 
Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka 
ProducerRecord/Map
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:02.201Z: Executing operation Generate 
records/Impulse+Generate 
records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Jan 06, 2021 12:57:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:02.269Z: Starting 5 workers in us-central1-f...
    Jan 06, 2021 12:57:07 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:07.878Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jan 06, 2021 12:57:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:30.680Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Jan 06, 2021 12:57:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:30.712Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Jan 06, 2021 12:57:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:40.969Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jan 06, 2021 12:57:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:56.552Z: Workers have started successfully.
    Jan 06, 2021 12:57:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-06T00:57:56.579Z: Workers have started successfully.

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInStreaming SKIPPED

> Task :sdks:java:io:kafka:integrationTest FAILED
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 19 mins 4.991 secs.
:sdks:java:io:kafka:cleanUp (Thread[Daemon worker Thread 2,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Daemon worker Thread 2,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> Process 'Gradle Test Executor 2' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/6.7.1/userguide/java_testing.html#sec:test_execution
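For context on the exit value: 143 is 128 + 15, i.e. the test JVM received SIGTERM and was killed from outside (typically a timeout or daemon shutdown), rather than exiting on a test assertion failure. A quick demonstration of where the number comes from:

```shell
# Exit value 143 = 128 + 15 (SIGTERM): the process was terminated
# externally rather than exiting on its own.
status=0
sh -c 'kill -TERM $$' || status=$?   # inner shell sends SIGTERM to itself
echo "exit status: $status"          # 143 on POSIX shells
```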

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 22s
138 actionable tasks: 27 executed, 111 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.
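The inotify warning is a host configuration issue rather than a build error. A sketch of inspecting and raising the limit on a Linux agent (524288 is a commonly used value, not something this log prescribes):

```shell
# Inspect the current inotify watch limit that Gradle's file-system
# watching depends on (Linux only).
limit=$(cat /proc/sys/fs/inotify/max_user_watches)
echo "fs.inotify.max_user_watches = $limit"
# To raise it persistently one would typically run (requires root):
#   echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf
#   sudo sysctl -p
```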

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=870eb8d0-ef8d-4f2c-9ec1-7bdfcdc81041, currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 582
  log file: /home/jenkins/.gradle/daemon/6.7.1/daemon-582.out.log
----- Last  20 lines from daemon log file - daemon-582.out.log -----
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> Process 'Gradle Test Executor 2' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/6.7.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 22s
138 actionable tasks: 27 executed, 111 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
