See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Dataflow/438/display/redirect?page=changes>
Changes:
[mxm] [BEAM-3089] Use Flink cluster parallelism if no parallelism provided
[mxm] [flink] Revert default checkpointing mode to EXACTLY_ONCE
[mxm] [flink] Use default value for checkpoint timeout
[mxm] [flink] Set default master url to [auto]
[mxm] [BEAM-3089] Test default values of FlinkPipelineOptions
------------------------------------------
[...truncated 1.27 MB...]
2018-09-19T18:17:47.898Z DONE Query12
==========================================================================================
Run started 2018-09-19T17:23:21.974Z and ran for PT3265.961S
Default configuration:
{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
Configurations:
Conf Description
0000 query:0; exportSummaryToBigQuery:true; numEvents:10000000
0001 query:1; exportSummaryToBigQuery:true; numEvents:10000000
0002 query:2; exportSummaryToBigQuery:true; numEvents:10000000
0003 query:3; exportSummaryToBigQuery:true; numEvents:10000000
0004 query:4; exportSummaryToBigQuery:true; numEvents:1000000
0005 query:5; exportSummaryToBigQuery:true; numEvents:10000000
0006 query:6; exportSummaryToBigQuery:true; numEvents:1000000
0007 query:7; exportSummaryToBigQuery:true; numEvents:10000000
0008 query:8; exportSummaryToBigQuery:true; numEvents:10000000
0009 query:9; exportSummaryToBigQuery:true; numEvents:1000000
0010 query:10; exportSummaryToBigQuery:true; numEvents:10000000
0011 query:11; exportSummaryToBigQuery:true; numEvents:10000000
0012 query:12; exportSummaryToBigQuery:true; numEvents:10000000
Performance:
  Conf  Runtime(sec)  (Baseline)  Events(/sec)  (Baseline)  Results  (Baseline)
0000 59.6 167897.9 10000000
0001 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0002 27.6 362594.7 83527
0003 36.3 275178.9 59507
0004 39.2 25516.7 140
0005 70.8 141250.9 4567
0006 39.2 25525.2 8902
0007 78.6 127179.5 100
0008 41.7 240067.2 92833
0009 41.6 24059.3 32548
0010 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0011 54.2 184481.4 1839657
0012 143.8 69564.7 199929
==========================================================================================
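As an illustrative aside (not part of the build output): the two `-1.0` rows above line up with the `*** Job was unexpectedly updated ***` markers, since Nexmark prints `-1.0`/`-1` for a configuration that produced no measurement. A minimal sketch for screening rows in this table format for that sentinel — the helper name is hypothetical:

```python
# Illustrative helper (hypothetical, not part of the Nexmark build): list the
# configuration numbers in a "Performance:" table whose runtime column carries
# the -1.0 sentinel, i.e. queries that produced no measurement.
def failed_configs(table_text):
    failed = []
    for line in table_text.splitlines():
        parts = line.split()
        # Data rows look like: "0001  -1.0  -1.0  -1"
        if len(parts) == 4 and parts[0].isdigit():
            if float(parts[1]) < 0:
                failed.append(parts[0])
    return failed

sample = """\
0000 59.6 167897.9 10000000
0001 -1.0 -1.0 -1
0010 -1.0 -1.0 -1
"""
print(failed_configs(sample))  # ['0001', '0010']
```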
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 127 files. Enable logging at DEBUG level to see which files will be staged.
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 127 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 127 files cached, 0 files newly uploaded
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/PrepareWrite/ParDo(Anonymous) as step s2
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/JobIdCreationRoot/Read(CreateSource) as step s3
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/CreateJobId as step s4
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s5
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s6
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s7
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s8
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s9
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s10
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s11
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView as step s12
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/Create.Values/Read(CreateSource) as step s13
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/GetTempFilePrefix as step s14
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s15
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s16
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s17
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s18
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s19
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s20
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s21
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/CreateDataflowView as step s22
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/rewindowIntoGlobal/Window.Assign as step s23
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteBundlesToFiles as step s24
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/GroupByDestination as step s25
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteGroupedRecords as step s26
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/FlattenFiles as step s27
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyResults/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s28
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyResults/View.AsIterable/CreateDataflowView as step s29
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyResults/Create.Values/Read(CreateSource) as step s30
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyResults/ParDo(Anonymous) as step s31
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WritePartitionUntriggered as step s32
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/Window.Into()/Window.Assign as step s33
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/GroupByKey as step s34
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/ExpandIterable as step s35
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/ParMultiDo(WriteTables) as step s36
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/WithKeys/AddKeys/Map as step s37
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/Window.Into()/Window.Assign as step s38
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/GroupByKey as step s39
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/Values/Values/Map as step s40
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/ParDo(GarbageCollectTemporaryFiles) as step s41
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyRenameInput/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s42
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyRenameInput/View.AsIterable/CreateDataflowView as step s43
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource) as step s44
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyRenameInput/ParDo(Anonymous) as step s45
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteRenameUntriggered as step s46
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/Window.Into()/Window.Assign as step s47
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/GroupByKey as step s48
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/ExpandIterable as step s49
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables) as step s50
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map as step s51
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign as step s52
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/GroupByKey as step s53
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/Values/Values/Map as step s54
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/ParDo(GarbageCollectTemporaryFiles) as step s55
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource) as step s56
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DropInputs as step s57
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/nexmark/staging/
Sep 19, 2018 6:17:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <183315 bytes, hash 4QxTX9uJGo0xUrP7ortd7Q> to gs://temp-storage-for-perf-tests/nexmark/staging/pipeline-4QxTX9uJGo0xUrP7ortd7Q.pb
Dataflow SDK version: 2.8.0-SNAPSHOT
Sep 19, 2018 6:17:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
Submitted job: 2018-09-19_11_17_49-5886286226277859134
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-09-19_11_17_49-5886286226277859134?project=apache-beam-testing
Sep 19, 2018 6:17:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-09-19_11_17_49-5886286226277859134
==========================================================================================
Run started 2018-09-19T17:23:21.974Z and ran for PT3268.996S
Default configuration:
{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
Configurations:
Conf Description
0000 query:0; exportSummaryToBigQuery:true; numEvents:10000000
0001 query:1; exportSummaryToBigQuery:true; numEvents:10000000
0002 query:2; exportSummaryToBigQuery:true; numEvents:10000000
0003 query:3; exportSummaryToBigQuery:true; numEvents:10000000
0004 query:4; exportSummaryToBigQuery:true; numEvents:1000000
0005 query:5; exportSummaryToBigQuery:true; numEvents:10000000
0006 query:6; exportSummaryToBigQuery:true; numEvents:1000000
0007 query:7; exportSummaryToBigQuery:true; numEvents:10000000
0008 query:8; exportSummaryToBigQuery:true; numEvents:10000000
0009 query:9; exportSummaryToBigQuery:true; numEvents:1000000
0010 query:10; exportSummaryToBigQuery:true; numEvents:10000000
0011 query:11; exportSummaryToBigQuery:true; numEvents:10000000
0012 query:12; exportSummaryToBigQuery:true; numEvents:10000000
Performance:
  Conf  Runtime(sec)  (Baseline)  Events(/sec)  (Baseline)  Results  (Baseline)
0000 59.6 167897.9 10000000
0001 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0002 27.6 362594.7 83527
0003 36.3 275178.9 59507
0004 39.2 25516.7 140
0005 70.8 141250.9 4567
0006 39.2 25525.2 8902
0007 78.6 127179.5 100
0008 41.7 240067.2 92833
0009 41.6 24059.3 32548
0010 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0011 54.2 184481.4 1839657
0012 143.8 69564.7 199929
==========================================================================================
Exception in thread "main" java.lang.RuntimeException: Execution was not successful
	at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:174)
	at org.apache.beam.sdk.nexmark.Main.main(Main.java:477)
> Task :beam-sdks-java-nexmark:run FAILED
:beam-sdks-java-nexmark:run (Thread[Task worker for ':' Thread 4,5,main]) completed. Took 54 mins 30.495 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:run'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 57m 44s
65 actionable tasks: 61 executed, 4 from cache
Publishing build scan...
https://gradle.com/s/7n44mpyxebwi6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure