See
<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/389/display/redirect?page=changes>
Changes:
[iemejia] [BEAM-6079] Add ability for CassandraIO to delete data
[iemejia] [BEAM-6079] Fix access level and clean up generics issues
------------------------------------------
[...truncated 1.35 MB...]
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s63
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization) as step s64
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/View.CreatePCollectionView as step s65
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/rewindowIntoGlobal/Window.Assign as step s66
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/WriteBundlesToFiles as step s67
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/GroupByDestination as step s68
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/WriteGroupedRecords as step s69
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/FlattenFiles as step s70
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/View.AsIterable/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization) as step s71
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/View.AsIterable/View.CreatePCollectionView as step s72
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Impulse as step s73
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource) as step s74
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key as step s75
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign as step s76
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey as step s77
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s78
Dec 10, 2018 4:51:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map as step s79
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource) as step s80
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyResults/ParDo(Anonymous) as step s81
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/WritePartitionUntriggered as step s82
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsReshuffle/Window.Into()/Window.Assign as step s83
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsReshuffle/GroupByKey as step s84
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsReshuffle/ExpandIterable as step s85
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/ParMultiDo(WriteTables) as step s86
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/WithKeys/AddKeys/Map as step s87
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/Window.Into()/Window.Assign as step s88
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/GroupByKey as step s89
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/Values/Values/Map as step s90
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/MultiPartitionsWriteTables/ParDo(GarbageCollectTemporaryFiles) as step s91
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/View.AsIterable/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization) as step s92
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/View.AsIterable/View.CreatePCollectionView as step s93
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Impulse as step s94
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource) as step s95
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key as step s96
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign as step s97
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey as step s98
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s99
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map as step s100
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource) as step s101
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/ReifyRenameInput/ParDo(Anonymous) as step s102
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/WriteRenameUntriggered as step s103
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/Window.Into()/Window.Assign as step s104
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey as step s105
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/ExpandIterable as step s106
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables) as step s107
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map as step s108
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign as step s109
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/GroupByKey as step s110
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/Values/Values/Map as step s111
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/SinglePartitionWriteTables/ParDo(GarbageCollectTemporaryFiles) as step s112
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Impulse as step s113
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/ParDo(SplitBoundedSource) as step s114
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key as step s115
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign as step s116
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey as step s117
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s118
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map as step s119
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BigQueryIO.Write/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource)/ParDo(ReadFromBoundedSource) as step s120
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DropInputs as step s121
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1210165051-2a0b53ef/output/results/staging/
Dec 10, 2018 4:51:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <348872 bytes, hash iqaSmJl_9-Hv_2DbC4CosA> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1210165051-2a0b53ef/output/results/staging/pipeline-iqaSmJl_9-Hv_2DbC4CosA.pb
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom STANDARD_OUT
Dataflow SDK version: 2.10.0-SNAPSHOT
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom STANDARD_ERROR
Dec 10, 2018 4:51:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_51_02-1065202652686330085?project=apache-beam-testing
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom STANDARD_OUT
Submitted job: 2018-12-10_08_51_02-1065202652686330085
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom STANDARD_ERROR
Dec 10, 2018 4:51:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-12-10_08_51_02-1065202652686330085
Dec 10, 2018 4:51:03 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-12-10_08_51_02-1065202652686330085 with 0 expected assertions.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:02.546Z: Autoscaling is enabled for job 2018-12-10_08_51_02-1065202652686330085. The number of workers will be between 1 and 1000.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:02.595Z: Autoscaling was automatically enabled for job 2018-12-10_08_51_02-1065202652686330085.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:05.222Z: Checking permissions granted to controller Service Account.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:09.379Z: Worker configuration: n1-standard-1 in us-central1-b.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-12-10T16:51:09.941Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/6987 instances, 1/0 CPUs, 250/69951 disk GB, 0/4046 SSD disk GB, 1/288 instance groups, 1/288 managed instance groups, 1/263 instance templates, 1/241 in-use IP addresses.
Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:10.026Z: Cleaning up.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-10T16:51:10.084Z: Worker pool stopped.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
INFO: Dataflow job 2018-12-10_08_51_02-1065202652686330085 threw exception. Failure message was: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/6987 instances, 1/0 CPUs, 250/69951 disk GB, 0/4046 SSD disk GB, 1/288 instance groups, 1/288 managed instance groups, 1/263 instance templates, 1/241 in-use IP addresses.
Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-12-10_08_51_02-1065202652686330085 failed with status FAILED.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
WARNING: Metrics not present for Dataflow job 2018-12-10_08_51_02-1065202652686330085.
Dec 10, 2018 4:51:15 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
WARNING: Dataflow job 2018-12-10_08_51_02-1065202652686330085 did not output a success or failure metric.
Dec 10, 2018 4:51:15 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT cleanBqEnvironment
INFO: Start to clean up tables and datasets.
Dec 10, 2018 4:51:15 PM org.apache.beam.sdk.io.gcp.testing.BigqueryClient deleteDataset
INFO: Successfully deleted dataset: bq_query_to_table_1544460651649_155
Gradle Test Executor 4 finished executing tests.
> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest FAILED
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom FAILED
java.lang.RuntimeException: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/6987 instances, 1/0 CPUs, 250/69951 disk GB, 0/4046 SSD disk GB, 1/288 instance groups, 1/288 managed instance groups, 1/263 instance templates, 1/241 in-use IP addresses.
Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT.runBigQueryToTablePipeline(BigQueryToTableIT.java:111)
    at org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT.testStandardQueryWithoutCustom(BigQueryToTableIT.java:295)
7 tests completed, 5 failed
Finished generating test XML results (0.028 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformFnApiWorkerIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.02 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformFnApiWorkerIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 10 mins 50.92 secs.
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 3,5,main]) started.
> Task :beam-runners-google-cloud-dataflow-java:cleanUpDockerImages
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181210163145
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181210163145
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1dbd4e7b17be1b2e58c1ee8b77c0ad5d9166dcafcb43ddb98b7b6a9c3e6c622
Starting process 'command 'gcloud''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181210163145
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1dbd4e7b17be1b2e58c1ee8b77c0ad5d9166dcafcb43ddb98b7b6a9c3e6c622
  Associated tags:
 - 20181210163145
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181210163145
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181210163145].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d1dbd4e7b17be1b2e58c1ee8b77c0ad5d9166dcafcb43ddb98b7b6a9c3e6c622].
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 2.587 secs.
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:examplesJavaFnApiWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaFnApiWorkerIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformFnApiWorkerIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 19m 46s
90 actionable tasks: 84 executed, 5 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/ymq4pujxwecso
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]