See 
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/202/display/redirect?page=changes>

Changes:

[chadrik] [BEAM-8523] JobAPI: Give access to timestamped state change history

[chadrik] Rename GetJobStateResponse to JobStateEvent

[chadrik] Move state history utilities to AbstractBeamJob

[chadrik] Small bugfix to FlinkBeamJob job state mapping

[chadrik] Fix existing bugs in AbstractJobServiceServicer

[chadrik] Use timestamp.Timestamp instead of float

[echauchot] [BEAM-8470] Update capability matrix: add Spark Structured Streaming

[echauchot] [BEAM-8470] Update Spark runner page: add Spark Structured Streaming


------------------------------------------
[...truncated 430.20 KB...]
        "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
        at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
        at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
        at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
        at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
        at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
        at 
org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
        at 
org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
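
The root cause of the error above is the generated table ID itself: the test
appends its run UUID (f01b6481-ec2d-406c-8db7-573756f21160) verbatim, and the
hyphens in that UUID violate the quoted rule that table IDs may contain only
alphanumeric characters and underscores, up to 1024 characters. A minimal
sketch of a sanitizer, assuming the name is built from a prefix plus the run
UUID; sanitizeTableId is a hypothetical helper, not existing Beam or test code:

    import java.util.UUID;

    public class TableIdSanitizer {
      // Hypothetical helper: make a generated table ID legal for BigQuery,
      // i.e. only [A-Za-z0-9_] and at most 1024 characters.
      static String sanitizeTableId(String prefix, UUID runId) {
        String raw = prefix + "_" + runId;
        String legal = raw.replaceAll("[^A-Za-z0-9_]", "_"); // hyphens become underscores
        return legal.length() > 1024 ? legal.substring(0, 1024) : legal;
      }

      public static void main(String[] args) {
        System.out.println(sanitizeTableId(
            "bqio_write_10GB_java",
            UUID.fromString("f01b6481-ec2d-406c-8db7-573756f21160")));
        // prints: bqio_write_10GB_java_f01b6481_ec2d_406c_8db7_573756f21160
      }
    }

With that substitution the failing ID becomes
bqio_write_10GB_java_f01b6481_ec2d_406c_8db7_573756f21160, which satisfies the
constraint quoted in the error message.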

    Dec 03, 2019 12:49:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-03T12:49:57.658Z: java.lang.RuntimeException: 
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad 
Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
        at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
    Caused by: 
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad 
Request
    {
      "code" : 400,
      "errors" : [ {
        "domain" : "global",
        "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
        "reason" : "invalid"
      } ],
      "message" : "Invalid table ID 
\"bqio_write_10GB_java_f01b6481-ec2d-406c-8db7-573756f21160\". Table IDs must 
be alphanumeric (plus underscores) and must be at most 1024 characters long. 
Also, Table decorators cannot be used.",
      "status" : "INVALID_ARGUMENT"
    }
        at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
        at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
        at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
        at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
        at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
        at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
        at 
org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
        at 
org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
        at 
org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
        at 
org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown
 Source)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
        at 
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
        at 
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at 
org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at 
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
        at 
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

    Dec 03, 2019 12:49:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-03T12:49:57.713Z: Finished operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 03, 2019 12:49:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-03T12:49:57.849Z: Workflow failed. Causes: S02:Read from 
source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write 
to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., 
The job failed because a work item has failed 4 times. Look in previous log 
entries for the cause of each one of the 4 failures. For more information, see 
https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was 
attempted on these workers: 
      testpipeline-jenkins-1203-12030448-kosd-harness-579q
          Root cause: Work item failed.,
      testpipeline-jenkins-1203-12030448-kosd-harness-580g
          Root cause: Work item failed.,
      testpipeline-jenkins-1203-12030448-kosd-harness-pp3w
          Root cause: Work item failed.,
      testpipeline-jenkins-1203-12030448-kosd-harness-579q
          Root cause: Work item failed.
    Dec 03, 2019 12:49:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-03T12:49:58.004Z: Cleaning up.
    Dec 03, 2019 12:49:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-03T12:49:58.100Z: Stopping worker pool...
    Dec 03, 2019 12:53:10 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-03T12:53:08.862Z: Autoscaling: Resized worker pool from 5 to 
0.
    Dec 03, 2019 12:53:10 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-03T12:53:08.902Z: Worker pool stopped.
    Dec 03, 2019 12:53:15 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-03_04_48_12-11591846819273240468 failed with status 
FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_OUT
    Load test results for test (ID): f01b6481-ec2d-406c-8db7-573756f21160 and 
timestamp: 2019-12-03T12:48:06.644000000Z:
                     Metric:                    Value:
                  write_time                       0.0

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.IllegalArgumentException: Writing avro formatted data is only 
supported for FILE_LOADS, however the method was STREAMING_INSERTS
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2394)
        at 
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1665)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
        at 
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)
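
This second failure is an API precondition rather than a service error:
BigQueryIO's Write.expand refuses to combine Avro-formatted output with the
STREAMING_INSERTS method, so the test's Avro path has to use batch file loads.
A hedged sketch of the test-side adjustment, assuming only the public
withMethod builder; withAvroCompatibleMethod is a hypothetical helper name:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    class AvroWriteMethodFix {
      // Hedged sketch: given a BigQueryIO write configured to emit Avro rows
      // (e.g. via an Avro format function, as in this test), force the batch
      // file-loads path. Leaving the method as STREAMING_INSERTS trips the
      // checkArgument in BigQueryIO.Write.expand shown in the trace above.
      static <T> BigQueryIO.Write<T> withAvroCompatibleMethod(BigQueryIO.Write<T> write) {
        return write.withMethod(BigQueryIO.Write.Method.FILE_LOADS);
      }
    }

The equivalent direct fix is to call withMethod(BigQueryIO.Write.Method.FILE_LOADS)
wherever the test configures its Avro write, instead of relying on streaming inserts.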

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.025 secs) into: 
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: 
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Daemon 
worker,5,main]) completed. Took 5 mins 10.844 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 49s
80 actionable tasks: 52 executed, 28 from cache

Publishing build scan...
https://scans.gradle.com/s/myblcn4zicwqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

