See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/7/display/redirect?page=changes>
Changes:
[matthias] Futurize examples subpackage
[kirpichov] A couple more trivial changes extracted from #5662
[katarzyna.kucharczyk] Created static retry method which accepts retriable method, amount of
[rober] Fix build break from PR 5676, which modified method signatures.
[migryz] Implemented working prototype for PostCommit jobs
[migryz] Update post commit jobs to utilize new helper methods
[ankurgoenka] Fix cache references in portable flink runner
[Pablo] Adding ignore parameters for gitattributes
[swegner] Remove submodule
[migryz] Fix typo in _ghprb suffix
[migryz] Update suffix to be PullRequest
[ccy] Revert #5689 to fix build
[ankurgoenka] Remove unused import
[github] Make suffix shorter
[kirpichov] Make ImmutableExecutableStage constructors public
[kirpichov] [BEAM-4285] Implement Flink batch side input handler
[ekirpichov] Introduces PipelineValidator that checks the well-formedness of a
[cademarkegard] [BEAM-4325] Enforce ErrorProne analysis in the SQL project
[echauchot] [BEAM-4283] Fix naming of the BigQuery fields
[coheigea] Removing some null checks, where we already know that the variable in
[iemejia] [BEAM-3314] Set correctly host and port on RedisIO
[iemejia] [BEAM-3314] Fix error-prone warnings and add extra test for Read
------------------------------------------
[...truncated 3.83 MB...]
INFO: Shutting down BLOB cache
Jun 21, 2018 2:45:15 PM org.apache.flink.runtime.blob.AbstractBlobCache close
INFO: Shutting down BLOB cache
Jun 21, 2018 2:45:15 PM org.apache.flink.runtime.blob.BlobServer close
INFO: Stopped BLOB server at 0.0.0.0:37159
Jun 21, 2018 2:45:15 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$4
INFO: Stopped Akka RPC service.
Jun 21, 2018 2:45:15 PM org.apache.beam.runners.flink.FlinkRunner run
SEVERE: Pipeline execution failed
org.apache.flink.runtime.client.JobExecutionException: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:625)
	at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:234)
	at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
	at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:114)
	at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:116)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
	at org.apache.beam.sdk.nexmark.Main.savePerfsToBigQuery(Main.java:182)
	at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:102)
	at org.apache.beam.sdk.nexmark.Main.main(Main.java:395)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
	at org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate.processElement(DoFnRunnerWithMetricsUpdate.java:66)
	at org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction.mapPartition(FlinkDoFnFunction.java:120)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:231)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:202)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:142)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.load(WriteTables.java:269)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.access$600(WriteTables.java:80)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.processElement(WriteTables.java:159)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Source URI must be a Google Cloud Storage location: https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/ws/src/sdks/java/nexmark/nexmark-temp/BigQueryWriteTemp/beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866/2b1cebd6-8d28-486c-a208-6d4f7f3123a0",
    "reason" : "invalid"
  } ],
  "message" : "Source URI must be a Google Cloud Storage location: https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/ws/src/sdks/java/nexmark/nexmark-temp/BigQueryWriteTemp/beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866/2b1cebd6-8d28-486c-a208-6d4f7f3123a0"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1065)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:216)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:202)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:142)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.load(WriteTables.java:269)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.access$600(WriteTables.java:80)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.processElement(WriteTables.java:159)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
	at org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate.processElement(DoFnRunnerWithMetricsUpdate.java:66)
	at org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction.mapPartition(FlinkDoFnFunction.java:120)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
	at java.lang.Thread.run(Thread.java:748)
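The 400 response above is the root cause: BigQuery load jobs only accept source URIs on Google Cloud Storage (gs://bucket/object), but this run staged its BigQueryWriteTemp files under the Jenkins workspace, so the load request was rejected before any retry could succeed. A minimal sketch of the precondition being violated (plain Python, no Beam or BigQuery dependency; the helper name is hypothetical, not part of either library):

```python
def is_gcs_uri(uri: str) -> bool:
    """BigQuery load jobs require source URIs of the form gs://bucket/object."""
    return uri.startswith("gs://")

# The kind of path this run actually staged load files under (a Jenkins
# workspace directory, surfaced in the error as a builds.apache.org URL)
# versus a URI BigQuery would accept. The bucket name below is hypothetical.
workspace_uri = ("https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink"
                 "/ws/src/sdks/java/nexmark/nexmark-temp/BigQueryWriteTemp/...")
gcs_uri = "gs://some-temp-bucket/BigQueryWriteTemp/load-files"

print(is_gcs_uri(workspace_uri))  # False -> "Source URI must be a Google Cloud Storage location"
print(is_gcs_uri(gcs_uri))        # True
```

In practice the fix is usually to point the pipeline's temp location at a GCS bucket (Beam's --tempLocation pipeline option) whenever exportSummaryToBigQuery is enabled; how that option is wired into this Jenkins job is not visible in the log.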
==========================================================================================
Run started 2018-06-21T14:42:16.966Z and ran for PT178.429S
Default configuration:
{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
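Two machine-readable details in the summary header above: the runtime is printed as an ISO-8601 duration (PT178.429S) and the default configuration is a single JSON object. A small sketch of parsing both (plain Python; the excerpted config fields are copied from the log line above, the full object carries many more):

```python
import json
import re

# ISO-8601 duration as printed by the Nexmark summary: PT<seconds>S.
runtime = "PT178.429S"
seconds = float(re.fullmatch(r"PT([0-9.]+)S", runtime).group(1))
print(seconds)  # 178.429

# A few fields excerpted from the default-configuration JSON line.
config = json.loads(
    '{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL",'
    '"exportSummaryToBigQuery":false,"numEvents":100000,"streamTimeout":240}'
)
print(config["numEvents"], config["streamTimeout"])  # 100000 240
```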
Configurations:
Conf Description
0000 query:0; exportSummaryToBigQuery:true; streamTimeout:60
0001 query:1; exportSummaryToBigQuery:true; streamTimeout:60
0002 query:2; exportSummaryToBigQuery:true; streamTimeout:60
0003  query:3; exportSummaryToBigQuery:true; streamTimeout:60
0004  query:4; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0005  query:5; exportSummaryToBigQuery:true; streamTimeout:60
0006  query:6; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0007  query:7; exportSummaryToBigQuery:true; streamTimeout:60
0008  query:8; exportSummaryToBigQuery:true; streamTimeout:60
0009  query:9; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0010  query:10; exportSummaryToBigQuery:true; streamTimeout:60
0011  query:11; exportSummaryToBigQuery:true; streamTimeout:60
0012  query:12; exportSummaryToBigQuery:true; streamTimeout:60
Performance:
  Conf  Runtime(sec)  (Baseline)  Events(/sec)  (Baseline)  Results  (Baseline)
  0000           0.9               112359.6                  100000
  0001           0.5               184162.1                   92000
  0002           0.4               227790.4                     351
  0003          10.9                 9168.4                     580
  0004           6.1                 1631.1                      40
  0005           5.8                17385.3                      12
  0006           6.1                 1649.6                     103
  0007           6.4                15511.1                       1
  0008           7.8                12822.2                    6000
  0009           5.6                 1800.5                     298
  0010           6.6                15089.8                       1
  0011           7.3                13719.3                    1919
  0012           5.9                16946.3                    1919
==========================================================================================
Exception in thread "main" java.lang.RuntimeException: Pipeline execution failed
	at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:119)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
	at org.apache.beam.sdk.nexmark.Main.savePerfsToBigQuery(Main.java:182)
	at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:102)
	at org.apache.beam.sdk.nexmark.Main.main(Main.java:395)
Caused by: org.apache.flink.runtime.client.JobExecutionException: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:625)
	at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:234)
	at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
	at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:114)
	at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:116)
	... 5 more
Caused by: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
	at org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate.processElement(DoFnRunnerWithMetricsUpdate.java:66)
	at org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction.mapPartition(FlinkDoFnFunction.java:120)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Unable to insert job: beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866_6831ef916a10e30d9ef1c0e5f0e1ec7d_00001_00000-0, aborting after 9 .
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:231)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:202)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:142)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.load(WriteTables.java:269)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.access$600(WriteTables.java:80)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.processElement(WriteTables.java:159)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Source URI must be a Google Cloud Storage location: https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/ws/src/sdks/java/nexmark/nexmark-temp/BigQueryWriteTemp/beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866/2b1cebd6-8d28-486c-a208-6d4f7f3123a0",
    "reason" : "invalid"
  } ],
  "message" : "Source URI must be a Google Cloud Storage location: https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/ws/src/sdks/java/nexmark/nexmark-temp/BigQueryWriteTemp/beam_load_mainjenkins062114421865f5fecf_c0cd4ad2cd3a46569d678f11a548b866/2b1cebd6-8d28-486c-a208-6d4f7f3123a0"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1065)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:216)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:202)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:142)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.load(WriteTables.java:269)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables.access$600(WriteTables.java:80)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.processElement(WriteTables.java:159)
	at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
	at org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate.processElement(DoFnRunnerWithMetricsUpdate.java:66)
	at org.apache.beam.runners.flink.translation.functions.FlinkDoFnFunction.mapPartition(FlinkDoFnFunction.java:120)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
	at java.lang.Thread.run(Thread.java:748)
> Task :beam-sdks-java-nexmark:run FAILED
:beam-sdks-java-nexmark:run (Thread[Task worker for ':' Thread 6,5,main]) completed. Took 3 mins 1.984 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:run'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 7s
62 actionable tasks: 62 executed
Publishing build scan...
https://gradle.com/s/bamg3reczunnk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]