See 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1901/display/redirect>

Changes:


------------------------------------------
[...truncated 272.62 KB...]
Watching 1333 directories to track changes
Watching 1333 directories to track changes
Watching 1333 directories to track changes
Watching 1340 directories to track changes
Watching 1342 directories to track changes
Watching 1344 directories to track changes
Loaded cache entry for task 
':sdks:java:io:bigquery-io-perf-tests:compileTestJava' with cache key 
25383f939d70037c942e939116ed3501
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution **** for 
':' Thread 8,5,main]) completed. Took 0.169 secs.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution **** for ':' 
Thread 8,5,main]) started.

> Task :sdks:java:io:bigquery-io-perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:bigquery-io-perf-tests:testClasses' as it has no 
actions.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution **** for ':' 
Thread 8,5,main]) completed. Took 0.0 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Watching 1322 directories to track changes
Watching 1323 directories to track changes
Watching 1323 directories to track changes
Watching 1354 directories to track changes
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is b33197310334c76ee166bf3c6fd8fa88
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
Watching 1354 directories to track changes
Watching 1354 directories to track changes
Watching 1354 directories to track changes
Watching 1365 directories to track changes
Watching 1366 directories to track changes
Watching 1367 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
b33197310334c76ee166bf3c6fd8fa88
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for 
':' Thread 5,5,main]) completed. Took 0.2 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' 
Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1367 directories to track changes
Watching 1367 directories to track changes
Watching 1368 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1368 directories to track changes
file or directory 
'<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, 
not found
Watching 1368 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' 
Thread 5,5,main]) completed. Took 0.048 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for 
':' Thread 5,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Watching 1368 directories to track changes
Watching 1368 directories to track changes
Watching 1368 directories to track changes
Watching 1368 directories to track changes
Watching 1368 directories to track changes
Custom actions are attached to task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' 
is 045e5f01e58355d8fbf6f72159c3b2ac
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date 
because:
  Task.upToDateWhen is false.
Watching 1368 directories to track changes
Watching 1368 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_0123150408","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
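The warning above is SLF4J 1.x detecting two copies of `org/slf4j/impl/StaticLoggerBinder.class` on the classpath: one shaded into the legacy worker jar and one from `slf4j-jdk14`. It finds them by asking the class loader for every resource with that name. A stdlib-only sketch of that lookup (the class and method names here are hypothetical, not SLF4J's code):

```java
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class BindingScanSketch {
    // SLF4J 1.x locates bindings by scanning the classpath for
    // org/slf4j/impl/StaticLoggerBinder.class; more than one hit
    // triggers the "multiple bindings" warning seen in the log.
    public static List<URL> findResources(String path) throws IOException {
        return Collections.list(
            Thread.currentThread().getContextClassLoader().getResources(path));
    }

    public static void main(String[] args) throws IOException {
        // On a JVM this always finds at least java.base's own class files.
        System.out.println(findResources("java/lang/Object.class").size());
    }
}
```

SLF4J then picks whichever copy the class loader resolves first, which is why the "Actual binding" line reports only one of the two jars.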

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_ERROR
    Jan 23, 2021 6:33:55 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jan 23, 2021 6:33:55 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 210 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jan 23, 2021 6:33:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jan 23, 2021 6:33:58 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 211 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT-2SoFhojL-3Ddn-VPHvQQQMaM5G0EvqGXPUL8TzA3o_c.jar
    Jan 23, 2021 6:33:59 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test1327257988221722231.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-miKap2f6BgI5xSAa6mVNaRgj7z_yL7NyH89206nS0Qs.jar
    Jan 23, 2021 6:33:59 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 210 files cached, 1 files newly uploaded in 0 
seconds
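The staged file names above (`...-2SoFhojL....jar`, `test-miKap2f6....jar`) embed a URL-safe Base64 digest of the file contents, which is how 210 of the 211 files could be skipped as already present in the staging bucket. A stdlib sketch of that content-addressed naming, assuming SHA-256 and unpadded URL-safe Base64 (the exact digest and encoding are an assumption from the names in the log):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class StagedNameSketch {
    // Content-addressed name: <base>-<urlsafeBase64(sha256(bytes))>.jar
    // Identical bytes always produce the same name, so a re-upload can be
    // skipped when the object already exists at the staging location.
    public static String stagedName(String base, byte[] contents) throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-256").digest(contents);
        String token = Base64.getUrlEncoder().withoutPadding().encodeToString(hash);
        return base + "-" + token + ".jar";
    }

    public static void main(String[] args) throws Exception {
        System.out.println(stagedName("test", "hello".getBytes(StandardCharsets.UTF_8)));
    }
}
```

A 32-byte SHA-256 digest encodes to 43 unpadded Base64 characters, matching the length of the tokens in the staged names above.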
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) 
as step s5
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
as step s8
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Jan 23, 2021 6:33:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
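Steps s7 and s12 above reflect how BigQueryIO makes streaming inserts effectively once: `TagWithUniqueIds` attaches an insert id to every record before the final `StreamingWrite`, and the intervening `Reshuffle` checkpoints those ids so a retried write re-sends the same ids, which BigQuery uses to drop duplicates. A stdlib-only illustration of the id-based dedup (the names and the in-memory `seenIds` set are hypothetical stand-ins, not Beam's code):

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.UUID;

public class StreamingInsertDedupSketch {
    // Attach one unique id per record, analogous to TagWithUniqueIds (s7).
    public static List<Map.Entry<String, String>> tagWithUniqueIds(List<String> rows) {
        List<Map.Entry<String, String>> tagged = new ArrayList<>();
        for (String row : rows) {
            tagged.add(new AbstractMap.SimpleEntry<>(UUID.randomUUID().toString(), row));
        }
        return tagged;
    }

    // A retried batch re-sends the same ids, so the sink inserts each id once.
    public static List<String> insertDeduped(
            List<Map.Entry<String, String>> batch, Set<String> seenIds) {
        List<String> inserted = new ArrayList<>();
        for (Map.Entry<String, String> e : batch) {
            if (seenIds.add(e.getKey())) {
                inserted.add(e.getValue());
            }
        }
        return inserted;
    }
}
```

Replaying the same tagged batch against the same `seenIds` set inserts nothing the second time, which is the property the retry path relies on.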
    Jan 23, 2021 6:33:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Jan 23, 2021 6:33:59 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <110018 bytes, hash 
6e7ad9f3131c88d0a52f2d6c4e6839ad7dccbb871c2bac9c9c77b082678edacf> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bnrZ8xMciNClLy1sTmg5rX3Mu4ccK6ycnHewgmeO2s8.pb
    Jan 23, 2021 6:34:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.28.0-SNAPSHOT
    Jan 23, 2021 6:34:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-23_10_34_00-3935164951784654554?project=apache-beam-testing
    Jan 23, 2021 6:34:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-01-23_10_34_00-3935164951784654554
    Jan 23, 2021 6:34:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-01-23_10_34_00-3935164951784654554
    Jan 23, 2021 6:34:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-01-23T18:34:00.102Z: The requested max number of ****s (5) is 
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:11.550Z: Worker configuration: n1-standard-1 in 
us-central1-f.
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.215Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.344Z: Expanding GroupByKey operations into 
optimizable parts.
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.366Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.444Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.469Z: Fusing consumer Gather time into Read from 
source
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.495Z: Fusing consumer Map records into Gather time
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.518Z: Fusing consumer Write to 
BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.543Z: Fusing consumer Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to 
BQ/PrepareWrite/ParDo(Anonymous)
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.563Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.590Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.619Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.706Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write 
to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.728Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.755Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow 
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.777Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.798Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Jan 23, 2021 6:34:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:12.822Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
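The "Fusing consumer X into Y" messages are the Dataflow optimizer collapsing adjacent steps of this pipeline into single execution stages, so records flow through composed functions rather than being materialized between steps. A minimal stdlib sketch of the idea (the two step functions are invented stand-ins for `Gather time` and `Map records`):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class FusionSketch {
    // Two "steps" that would otherwise each produce an intermediate collection.
    static Function<String, String> gatherTime = s -> s + "|t";
    static Function<String, String> mapRecords = s -> s.toUpperCase();

    // Fusion: compose the functions and make a single pass over the data,
    // with no intermediate list between the two steps.
    public static List<String> runFused(List<String> input) {
        Function<String, String> fused = gatherTime.andThen(mapRecords);
        return input.stream().map(fused).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(runFused(List.of("a", "b")));
    }
}
```

Fusion stops at shuffle boundaries, which is why the `Reshuffle/GroupByKey` steps above remain separate stages in the execution listing that follows.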
    Jan 23, 2021 6:34:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:13.383Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jan 23, 2021 6:34:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:13.451Z: Starting 5 ****s in us-central1-f...
    Jan 23, 2021 6:34:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:13.495Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jan 23, 2021 6:34:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:13.623Z: Executing operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jan 23, 2021 6:34:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:25.582Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jan 23, 2021 6:34:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:39.301Z: Autoscaling: Raised the number of ****s to 
1 based on the rate of progress in the currently running stage(s).
    Jan 23, 2021 6:34:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:39.317Z: Resized **** pool to 1, though goal was 5.  
This could be a quota issue.
    Jan 23, 2021 6:34:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:34:49.604Z: Autoscaling: Raised the number of ****s to 
5 based on the rate of progress in the currently running stage(s).
    Jan 23, 2021 6:35:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:35:03.068Z: Workers have started successfully.
    Jan 23, 2021 6:35:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:35:03.090Z: Workers have started successfully.
    Jan 23, 2021 6:37:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:37:04.370Z: Finished operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jan 23, 2021 6:37:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:37:04.441Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jan 23, 2021 6:37:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:37:04.486Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jan 23, 2021 6:37:06 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-01-23T18:37:04.558Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
Build timed out (after 100 minutes). Marking the build as aborted.
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel 
"hudson.remoting.Channel@12edf30e:apache-beam-jenkins-15": Remote call on 
apache-beam-jenkins-15 failed. The channel is closing down or has closed down
        at hudson.remoting.Channel.call(Channel.java:991)
        at 
hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
        at com.sun.proxy.$Proxy145.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1147)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1139)
        at hudson.Launcher$ProcStarter.join(Launcher.java:469)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
        at hudson.model.Build$BuildExecution.build(Build.java:206)
        at hudson.model.Build$BuildExecution.doRun(Build.java:163)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
        at hudson.model.Run.execute(Run.java:1880)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:428)
Caused by: java.io.IOException
        at hudson.remoting.Channel.close(Channel.java:1490)
        at hudson.remoting.Channel.close(Channel.java:1446)
        at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:872)
        at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:113)
        at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:763)
        at 
jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
        at 
jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
FATAL: Channel "hudson.remoting.Channel@12edf30e:apache-beam-jenkins-15": 
Remote call on apache-beam-jenkins-15 failed. The channel is closing down or 
has closed down
java.io.IOException
        at hudson.remoting.Channel.close(Channel.java:1490)
        at hudson.remoting.Channel.close(Channel.java:1446)
        at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:872)
        at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:113)
        at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:763)
        at 
jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
        at 
jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused: hudson.remoting.ChannelClosedException: Channel 
"hudson.remoting.Channel@12edf30e:apache-beam-jenkins-15": Remote call on 
apache-beam-jenkins-15 failed. The channel is closing down or has closed down
        at hudson.remoting.Channel.call(Channel.java:991)
        at hudson.Launcher$RemoteLauncher.kill(Launcher.java:1083)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:510)
        at hudson.model.Run.execute(Run.java:1880)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:428)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
