See
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/132/display/redirect?page=changes>
Changes:
[kirillkozlov] Filter push-down for BigQuery (kind of) working.
[kirillkozlov] Added IT test for BigQuery. spotlesApply.
[altay] Separate pydocs generation from py2 precommit tests.
[altay] Add settings file
[ningk] [BEAM-8379] Cache Eviction
[kirillkozlov] review comments
[wenjialiu] [BEAM-8575] Test a customized window fn work as expected
[wenjialiu] fixup
[wenjialiu] fixup
[thw] [BEAM-8670] Manage environment parallelism in DefaultJobBundleFactory
[migryz] Reduce Java Examples Dataflow Precommit timeout
[robertwb] Merge pull request #10117 [BEAM-8335] Add service and tagged output
------------------------------------------
[...truncated 479.35 KB...]
    at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Nov 16, 2019 1:00:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2019-11-16T01:00:17.140Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Invalid table ID \"bqio_write_10GB_java_b3b10281-0252-47a2-9aa4-c6f0fa5cba7a\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
    "reason" : "invalid"
  } ],
  "message" : "Invalid table ID \"bqio_write_10GB_java_b3b10281-0252-47a2-9aa4-c6f0fa5cba7a\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
  "status" : "INVALID_ARGUMENT"
}
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Invalid table ID \"bqio_write_10GB_java_b3b10281-0252-47a2-9aa4-c6f0fa5cba7a\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
    "reason" : "invalid"
  } ],
  "message" : "Invalid table ID \"bqio_write_10GB_java_b3b10281-0252-47a2-9aa4-c6f0fa5cba7a\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
  "status" : "INVALID_ARGUMENT"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
    at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
    at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
    at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
    at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
    at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
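The 400 above is triggered by the hyphens in the UUID-suffixed table ID: per the error message, BigQuery table IDs may contain only letters, digits, and underscores, up to 1024 characters. A minimal sketch of sanitizing such an ID before table creation (hypothetical helper, not part of BigQueryIOIT):

```java
import java.util.regex.Pattern;

/** Hypothetical helper enforcing the constraint from the error message:
 *  table IDs must be alphanumeric (plus underscores), at most 1024 chars. */
class TableIdSanitizer {
    private static final Pattern INVALID_CHARS = Pattern.compile("[^A-Za-z0-9_]");
    private static final int MAX_LENGTH = 1024;

    static String sanitize(String rawId) {
        // Replace every disallowed character (e.g. the hyphens in a UUID)
        // with an underscore, then truncate to the length limit.
        String cleaned = INVALID_CHARS.matcher(rawId).replaceAll("_");
        return cleaned.length() <= MAX_LENGTH ? cleaned : cleaned.substring(0, MAX_LENGTH);
    }

    public static void main(String[] args) {
        // Prints bqio_write_10GB_java_b3b10281_0252_47a2_9aa4_c6f0fa5cba7a
        System.out.println(sanitize("bqio_write_10GB_java_b3b10281-0252-47a2-9aa4-c6f0fa5cba7a"));
    }
}
```

The equivalent fix in the test would be stripping or replacing the dashes when the UUID is appended to the table name.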
Nov 16, 2019 1:00:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-16T01:00:17.267Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Nov 16, 2019 1:00:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2019-11-16T01:00:17.428Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  testpipeline-jenkins-1116-11151658-os5k-harness-zzjj
      Root cause: Work item failed.,
  testpipeline-jenkins-1116-11151658-os5k-harness-8wpw
      Root cause: Work item failed.,
  testpipeline-jenkins-1116-11151658-os5k-harness-zzjj
      Root cause: Work item failed.,
  testpipeline-jenkins-1116-11151658-os5k-harness-8wpw
      Root cause: Work item failed.
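The "failed 4 times" wording above reflects Dataflow's batch retry policy: each work item is retried until it succeeds or has failed four times, after which the whole job fails. A rough sketch of that policy (hypothetical helper, not Dataflow worker code):

```java
import java.util.function.Supplier;

/** Hypothetical sketch of the retry policy described in the log:
 *  retry a work item up to 4 attempts, then fail the job with the last cause. */
class WorkItemRetry {
    static final int MAX_ATTEMPTS = 4;

    static <T> T runWithRetries(Supplier<T> workItem) {
        RuntimeException lastFailure = null;
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            try {
                return workItem.get();
            } catch (RuntimeException e) {
                // Each failure would be logged with its cause, as in the entries above.
                lastFailure = e;
            }
        }
        throw new RuntimeException("Work item failed " + MAX_ATTEMPTS + " times", lastFailure);
    }
}
```

Here every attempt fails for the same deterministic reason (the invalid table ID), so all four retries are doomed and the job is failed.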
Nov 16, 2019 1:00:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-16T01:00:17.596Z: Cleaning up.
Nov 16, 2019 1:00:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-16T01:00:17.776Z: Stopping worker pool...
Nov 16, 2019 1:02:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-16T01:02:18.811Z: Autoscaling: Resized worker pool from 5 to 0.
Nov 16, 2019 1:02:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-16T01:02:18.891Z: Worker pool stopped.
Nov 16, 2019 1:02:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2019-11-15_16_58_13-630726305557046916 failed with status FAILED.
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): b3b10281-0252-47a2-9aa4-c6f0fa5cba7a and timestamp: 2019-11-16T00:58:06.373000000Z:
    Metric:      Value:
    write_time   0.0
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2362)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1662)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)
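The precondition that fires here can be sketched in isolation (a hypothetical stand-in for the vendored Guava Preconditions call; the real check lives in BigQueryIO.Write.expand): avro-formatted output is rejected unless the write method is FILE_LOADS.

```java
/** Hypothetical sketch of the guard that produced the IllegalArgumentException
 *  above: avro output is only valid when the write method is FILE_LOADS. */
class WriteConfigCheck {
    enum Method { FILE_LOADS, STREAMING_INSERTS }
    enum Format { AVRO, JSON }

    // Minimal stand-in for Guava's Preconditions.checkArgument.
    static void checkArgument(boolean condition, String message) {
        if (!condition) {
            throw new IllegalArgumentException(message);
        }
    }

    static void validate(Format format, Method method) {
        // Reject avro unless the method is FILE_LOADS, mirroring the message in the trace.
        checkArgument(
            format != Format.AVRO || method == Method.FILE_LOADS,
            "Writing avro formatted data is only supported for FILE_LOADS, however the method was " + method);
    }
}
```

In the test itself the fix would be either switching the write method to FILE_LOADS for the avro case or writing JSON rows when STREAMING_INSERTS is requested.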
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
1 test completed, 1 failed
Finished generating test XML results (0.04 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.049 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 20.845 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 5m 43s
80 actionable tasks: 59 executed, 21 from cache
Publishing build scan...
https://gradle.com/s/26zpea73e2xds
Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Nov 09 00:56:50 UTC 2019.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4432} MaxSize{4500}, CacheStats{hitCount=10, missCount=4020, loadSuccessCount=4021, loadExceptionCount=0, totalLoadTime=758390390, evictionCount=1}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=4538, loadSuccessCount=4539, loadExceptionCount=0, totalLoadTime=829324342, evictionCount=451}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=4988, loadSuccessCount=4989, loadExceptionCount=0, totalLoadTime=889456983, evictionCount=901}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=5438, loadSuccessCount=5439, loadExceptionCount=0, totalLoadTime=957065194, evictionCount=1351}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=5888, loadSuccessCount=5889, loadExceptionCount=0, totalLoadTime=1003404554, evictionCount=1801}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=6338, loadSuccessCount=6339, loadExceptionCount=0, totalLoadTime=1046591791, evictionCount=2251}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=6788, loadSuccessCount=6789, loadExceptionCount=0, totalLoadTime=1089768529, evictionCount=2701}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=7238, loadSuccessCount=7239, loadExceptionCount=0, totalLoadTime=1131501274, evictionCount=3151}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=7688, loadSuccessCount=7689, loadExceptionCount=0, totalLoadTime=1187968866, evictionCount=3601}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=8138, loadSuccessCount=8139, loadExceptionCount=0, totalLoadTime=1236156326, evictionCount=4051}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=8588, loadSuccessCount=8589, loadExceptionCount=0, totalLoadTime=1277173989, evictionCount=4501}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=9038, loadSuccessCount=9039, loadExceptionCount=0, totalLoadTime=1323576599, evictionCount=4951}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=9488, loadSuccessCount=9489, loadExceptionCount=0, totalLoadTime=1368556796, evictionCount=5401}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=9938, loadSuccessCount=9939, loadExceptionCount=0, totalLoadTime=1408927107, evictionCount=5851}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=10388, loadSuccessCount=10389, loadExceptionCount=0, totalLoadTime=1463051170, evictionCount=6301}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=10838, loadSuccessCount=10839, loadExceptionCount=0, totalLoadTime=1507527108, evictionCount=6751}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=11288, loadSuccessCount=11289, loadExceptionCount=0, totalLoadTime=1559680887, evictionCount=7201}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=11738, loadSuccessCount=11739, loadExceptionCount=0, totalLoadTime=1610839808, evictionCount=7651}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=12188, loadSuccessCount=12189, loadExceptionCount=0, totalLoadTime=1657763824, evictionCount=8101}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=12638, loadSuccessCount=12639, loadExceptionCount=0, totalLoadTime=1705577109, evictionCount=8551}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Cache entries evicted. In-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin: Size{4500} MaxSize{4500}, CacheStats{hitCount=10, missCount=13088, loadSuccessCount=13089, loadExceptionCount=0, totalLoadTime=1752382678, evictionCount=9001}
Performance may suffer from in-memory cache misses. Increase max heap size of Gradle build process to reduce cache misses.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 4.108 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]