See
<https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Dataflow/873/display/redirect?page=changes>
Changes:
[migryz] Add script to sync data for Beam GitHub community metrics.
[migryz] Address PR comments.
[swegner] Add GitHub data to source freshness dashboard
[swegner] Set y axis limits to 0-100%
[migryz] Fix unittests
[migryz] Address PR changes
[amyrvold] [BEAM-5735] Contributor guide improvements
[lcwik] [BEAM-5299] Define max timestamp for global window in proto (#6381)
[lcwik] Remove unintended comment in BeamFnLoggingClient (#6796)
[migryz] Add Code Velocity dashboard json
------------------------------------------
[...truncated 887.33 KB...]
Performance:
Conf Runtime(sec) (Baseline) Events(/sec) (Baseline) Results (Baseline)
0000 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0001 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0002 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0003 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0004 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0005 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0006 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0007 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0008 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0009 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0010 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0011 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0012 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
==========================================================================================
Oct 31, 2018 12:19:49 AM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 129 files. Enable logging at DEBUG level to see which
files will be staged.
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 129 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
Oct 31, 2018 12:19:50 AM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 129 files cached, 0 files newly uploaded
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/PrepareWrite/ParDo(Anonymous) as step s2
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/JobIdCreationRoot/Read(CreateSource) as step s3
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/CreateJobId as step s4
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
as step s5
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey
as step s6
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
as step s7
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
as step s8
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
as step s9
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
as step s10
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
as step s11
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
as step s12
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/Create.Values/Read(CreateSource) as
step s13
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/GetTempFilePrefix as step s14
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
as step s15
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey
as step s16
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
as step s17
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
as step s18
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
as step s19
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
as step s20
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
as step s21
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/CreateDataflowView
as step s22
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/rewindowIntoGlobal/Window.Assign as
step s23
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteBundlesToFiles as step s24
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/GroupByDestination as step s25
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteGroupedRecords as step s26
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/FlattenFiles as step s27
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyResults/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
as step s28
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyResults/View.AsIterable/CreateDataflowView
as step s29
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyResults/Create.Values/Read(CreateSource) as
step s30
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyResults/ParDo(Anonymous) as
step s31
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WritePartitionUntriggered as step
s32
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/Window.Into()/Window.Assign
as step s33
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/GroupByKey
as step s34
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsReshuffle/ExpandIterable as step
s35
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/ParMultiDo(WriteTables)
as step s36
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/WithKeys/AddKeys/Map
as step s37
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/Window.Into()/Window.Assign
as step s38
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/GroupByKey as step s39
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/Values/Values/Map as
step s40
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/MultiPartitionsWriteTables/ParDo(GarbageCollectTemporaryFiles)
as step s41
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyRenameInput/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
as step s42
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyRenameInput/View.AsIterable/CreateDataflowView
as step s43
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource)
as step s44
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/ReifyRenameInput/ParDo(Anonymous)
as step s45
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/WriteRenameUntriggered as step s46
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/Window.Into()/Window.Assign
as step s47
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/GroupByKey as step s48
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionsReshuffle/ExpandIterable as step
s49
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables)
as step s50
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map
as step s51
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign
as step s52
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/GroupByKey as step s53
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/Values/Values/Map as
step s54
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/SinglePartitionWriteTables/ParDo(GarbageCollectTemporaryFiles)
as step s55
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding
savePerfsToBigQuery/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource) as
step s56
Oct 31, 2018 12:19:50 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DropInputs as step s57
Oct 31, 2018 12:19:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to
gs://temp-storage-for-perf-tests/nexmark/staging/
Oct 31, 2018 12:19:50 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <176184 bytes, hash 4L3F9kiA-WhLeLCjIfRdGA> to
gs://temp-storage-for-perf-tests/nexmark/staging/pipeline-4L3F9kiA-WhLeLCjIfRdGA.pb
Dataflow SDK version: 2.9.0-SNAPSHOT
Oct 31, 2018 12:19:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-30_17_19_50-15189656732554862601?project=apache-beam-testing
Submitted job: 2018-10-30_17_19_50-15189656732554862601
Oct 31, 2018 12:19:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-10-30_17_19_50-15189656732554862601
==========================================================================================
Run started 2018-10-31T00:13:03.187Z and ran for PT408.561S
Default configuration:
{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
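The default configuration is logged as one flattened JSON object. As an illustration only (nothing in the build itself does this), the object can be parsed and inspected with standard-library Python; the JSON literal below is copied verbatim from the log line above:

```python
import json

# Default Nexmark suite configuration, copied verbatim from the log.
default_config_json = '{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}'

default_config = json.loads(default_config_json)

# A few fields that govern the synthetic event stream.
print(default_config["numEvents"])   # 100000
print(default_config["rateShape"])   # SINE
print(default_config["sourceType"])  # DIRECT
```

Note that `numEvents` defaults to 100000; the per-query runs in the table below raise it to 1M or 10M events.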
Configurations:
Conf Description
0000 query:PASSTHROUGH; exportSummaryToBigQuery:true; numEvents:10000000
0001 query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; numEvents:10000000
0002 query:SELECTION; exportSummaryToBigQuery:true; numEvents:10000000
0003 query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; numEvents:10000000
0004 query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:1000000
0005 query:HOT_ITEMS; exportSummaryToBigQuery:true; numEvents:10000000
0006 query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:1000000
0007 query:HIGHEST_BID; exportSummaryToBigQuery:true; numEvents:10000000
0008 query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; numEvents:10000000
0009 query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:1000000
0010 query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; numEvents:10000000
0011 query:USER_SESSIONS; exportSummaryToBigQuery:true; numEvents:10000000
0012 query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; numEvents:10000000
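Each numbered configuration lists only the fields that differ from the default, which suggests a defaults-plus-overrides merge. A minimal sketch of that idea, assuming overrides simply replace default fields (the merge helper is hypothetical; the field names and values come from the tables above):

```python
# Subset of the default config relevant to the sketch (values from the log).
default_config = {
    "query": None,
    "exportSummaryToBigQuery": False,
    "numEvents": 100000,
    "rateShape": "SINE",
}

# Override for configuration 0000, as listed in the Configurations table.
override_0000 = {
    "query": "PASSTHROUGH",
    "exportSummaryToBigQuery": True,
    "numEvents": 10000000,
}

# Hypothetical merge: override fields win, untouched defaults carry over.
conf_0000 = {**default_config, **override_0000}

print(conf_0000["query"])      # PASSTHROUGH
print(conf_0000["rateShape"])  # SINE (inherited from the default)
```

Under this reading, all thirteen runs share the default event-generation settings and differ only in the query and event count.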
Performance:
Conf Runtime(sec) (Baseline) Events(/sec) (Baseline) Results (Baseline)
0000 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0001 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0002 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0003 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0004 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0005 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0006 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0007 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0008 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0009 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0010 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0011 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
0012 -1.0 -1.0 -1
*** Job was unexpectedly updated ***
==========================================================================================
Exception in thread "main" java.lang.RuntimeException: Execution was not successful
at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:174)
at org.apache.beam.sdk.nexmark.Main.main(Main.java:477)
> Task :beam-sdks-java-nexmark:run FAILED
:beam-sdks-java-nexmark:run (Thread[Task worker for ':' Thread 5,5,main])
completed. Took 6 mins 49.396 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:run'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_172/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 11m 41s
68 actionable tasks: 68 executed
Publishing build scan...
https://gradle.com/s/niu5zsoync5d6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]