See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/234/display/redirect>

Changes:


------------------------------------------
[...truncated 278.77 KB...]
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
        at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)

    Dec 09, 2019 1:41:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:41:54.300Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:47:54.263Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:50:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T13:50:00.884Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
        at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
        at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
        ... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
        at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
        at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:115)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)

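The SEVERE entry above reports worker heap exhaustion during the shuffle read and suggests requesting higher-memory instances through PipelineOptions. A minimal, illustrative sketch of that suggestion follows; the class name, machine type, and worker count are assumptions for illustration, not settings taken from this job.

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Hypothetical example class; not part of the Beam test suite.
    public class HighMemoryWorkersSketch {
      public static void main(String[] args) {
        // Parse --project/--region/--tempLocation etc., then view the options
        // through the Dataflow-specific interface.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        // Illustrative values: a high-memory machine type gives each worker JVM
        // more heap than the default, which is what the log's hint asks for.
        options.setWorkerMachineType("n1-highmem-4");
        options.setNumWorkers(5);

        Pipeline pipeline = Pipeline.create(options);
        // ... attach the same BigQuery write-then-read transforms here ...
        pipeline.run().waitUntilFinish();
      }
    }

The same settings can also be passed as command-line flags (--workerMachineType=n1-highmem-4, --numWorkers=5), since PipelineOptionsFactory maps flags onto these setters.
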
    Dec 09, 2019 1:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:53:54.263Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 1:59:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T13:59:54.390Z: Checking permissions granted to controller Service Account.
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:30.169Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-12-09T14:03:30.370Z: Workflow failed. Causes: S04:Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
      testpipeline-jenkins-1209-12090523-t2aj-harness-pwd0
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-lq9p
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-lq9p
          Root cause: The worker lost contact with the service.,
      testpipeline-jenkins-1209-12090523-t2aj-harness-2v28
          Root cause: The worker lost contact with the service.
    Dec 09, 2019 2:03:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:30.513Z: Cleaning up.
    Dec 09, 2019 2:03:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:03:31.085Z: Stopping worker pool...
    Dec 09, 2019 2:07:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:07:47.541Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 09, 2019 2:07:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-12-09T14:07:47.606Z: Worker pool stopped.
    Dec 09, 2019 2:07:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2019-12-09_05_23_35-7411825093350387049 failed with status FAILED.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 4928f165-dbe1-4286-84b1-961de06a1725 and timestamp: 2019-12-09T13:23:29.293000000Z:
                     Metric:                    Value:
                  write_time                   140.456

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_ERROR
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 183 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/resources/test.>
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/classes/java/main.>
    Dec 09, 2019 2:07:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    WARNING: Skipping non-existent file to stage <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/resources/main.>
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 178 files cached, 0 files newly uploaded in 0 seconds
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/TriggerIdCreation/Read(CreateSource) as step s1
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s2
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s3
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s4
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s5
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s6
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s7
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s8
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView as step s9
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/Read(BigQueryTableSource) as step s10
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity) as step s11
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s12
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/View.AsIterable/CreateDataflowView as step s13
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/Create(CleanupOperation)/Read(CreateSource) as step s14
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from BQ/PassThroughThenCleanup/Cleanup as step s15
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s16
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Dec 09, 2019 2:07:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <48505 bytes, hash YzbfkhBux1jcRt1EPMbWkg> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YzbfkhBux1jcRt1EPMbWkg.pb
    Dec 09, 2019 2:07:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.19.0-SNAPSHOT
    Dec 09, 2019 2:07:58 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: (9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:974)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:188)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testRead(BigQueryIOIT.java:190)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:131)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
        {
          "code" : 400,
          "errors" : [ {
            "domain" : "global",
            "message" : "(9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
            "reason" : "failedPrecondition"
          } ],
          "message" : "(9182231a18263cc9): The workflow could not be created. Causes: (46e2ca637dda3b74): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
          "status" : "FAILED_PRECONDITION"
        }
            at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
            at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
            at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:960)
            ... 5 more

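The failure above is a jobs-per-project quota rejection: apache-beam-testing is already running 300 jobs, and the message asks the caller to check quota usage or wait for workflows to finish. As one illustrative way to make that check programmatically, the sketch below lists the project's ACTIVE jobs with the versioned Dataflow API client the runner itself uses (google-api-services-dataflow); the project ID, region, and application name are assumptions mirroring the log, and only the first page of results is examined.

    import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
    import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
    import com.google.api.client.json.jackson2.JacksonFactory;
    import com.google.api.services.dataflow.Dataflow;
    import com.google.api.services.dataflow.model.Job;
    import com.google.api.services.dataflow.model.ListJobsResponse;
    import java.util.Collections;

    // Hypothetical helper; not part of the Beam test suite.
    public class ActiveJobCountSketch {
      public static void main(String[] args) throws Exception {
        // Application-default credentials with the cloud-platform scope.
        GoogleCredential credential =
            GoogleCredential.getApplicationDefault()
                .createScoped(Collections.singletonList(
                    "https://www.googleapis.com/auth/cloud-platform"));
        Dataflow dataflow =
            new Dataflow.Builder(
                    GoogleNetHttpTransport.newTrustedTransport(),
                    JacksonFactory.getDefaultInstance(),
                    credential)
                .setApplicationName("active-job-count-sketch")
                .build();

        // Query the same regional endpoint the failed request used, filtered to
        // jobs that are still ACTIVE.
        ListJobsResponse response =
            dataflow.projects().locations().jobs()
                .list("apache-beam-testing", "us-central1")
                .setFilter("ACTIVE")
                .execute();

        int active = response.getJobs() == null ? 0 : response.getJobs().size();
        System.out.println("Active Dataflow jobs (first page): " + active);
        if (response.getJobs() != null) {
          for (Job job : response.getJobs()) {
            System.out.println(job.getId() + "  " + job.getName());
          }
        }
      }
    }

If the count sits at the quota ceiling, the remaining options in the message apply: wait for a workflow to finish or request a quota increase.
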
Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.036 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.054 secs) into: <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 44 mins 31.608 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 47m 16s
80 actionable tasks: 79 executed, 1 from cache

Publishing build scan...
https://scans.gradle.com/s/h3fmtx25u5w7m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
