See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Dataflow/4485/display/redirect?page=changes>
Changes:

[pabloem] [BEAM-8335] Adds support for multi-output TestStream (#9953)

------------------------------------------
[...truncated 16.38 KB...]
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromCalciteCore
> Task :model:pipeline:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromSrc
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :sdks:java:extensions:sql:generateFmppSources
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :sdks:java:extensions:sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Dataflow/ws/src/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1. Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist. Will create one.
File "ParseException.java" does not exist. Will create one.
File "Token.java" does not exist. Will create one.
File "SimpleCharStream.java" does not exist. Will create one.
Parser generated with 0 errors and 1 warnings.
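The UNICODE_INPUT note above is only advisory: it asks that the generated parser be constructed from a java.io.Reader with an explicit character encoding rather than from a raw byte stream decoded with the platform default. A minimal sketch of that idea in plain Java (the helper class and method names are illustrative and not part of the Beam SQL module):

    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.io.Reader;
    import java.nio.charset.StandardCharsets;

    public class ReaderEncodingExample {
      // Wrap the raw stream in a Reader with an explicit charset, as the JavaCC
      // note recommends, instead of relying on the platform-default encoding.
      // A JavaCC-generated parser can then be constructed from this Reader.
      public static Reader utf8Reader(InputStream in) {
        return new InputStreamReader(in, StandardCharsets.UTF_8);
      }
    }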
> Task :sdks:java:extensions:sql:processResources
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:join-library:compileJava FROM-CACHE
> Task :sdks:java:extensions:join-library:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:compileJava FROM-CACHE
> Task :sdks:java:io:mongodb:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:jar
> Task :sdks:java:extensions:join-library:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:core:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-java:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:io:parquet:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:classes UP-TO-DATE
> Task :sdks:java:io:parquet:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:extensions:sql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:classes
> Task :sdks:java:extensions:sql:jar
> Task :sdks:java:testing:nexmark:compileJava FROM-CACHE
> Task :sdks:java:testing:nexmark:classes
> Task :sdks:java:testing:nexmark:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
> Task :sdks:java:testing:nexmark:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
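The three SLF4J lines above mean that no SLF4J binding is on the classpath of the :sdks:java:testing:nexmark:run invocation, so any log statements from the run are silently dropped by the no-operation logger. A possible remedy, sketched here as an assumption rather than something this build already does, is to add a binding such as org.slf4j:slf4j-simple (or org.slf4j:slf4j-jdk14) to the runtime classpath of the nexmark module, which would bring the run's log output back to the console.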
"org.slf4j.impl.StaticLoggerBinder". SLF4J: Defaulting to no-operation (NOP) logger implementation SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details. 2019-12-09T22:13:17.215Z Running query:PASSTHROUGH; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.216Z Running query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:1000000 2019-12-09T22:13:17.216Z Running query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.216Z Running query:SELECTION; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.216Z Running query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:1000000 2019-12-09T22:13:17.216Z Running query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.216Z Running query:HOT_ITEMS; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:USER_SESSIONS; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.216Z Running query:HIGHEST_BID; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:1000000 2019-12-09T22:13:17.215Z Running query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:17.215Z Running query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; numEvents:10000000 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 1000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 1000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.385Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.389Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.389Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.472Z Generating 1000000 events in batch mode 2019-12-09T22:13:18.481Z Generating 10000000 events in batch mode 2019-12-09T22:13:18.699Z Expected auction duration is 16667 ms 2019-12-09T22:13:18.702Z Expected auction duration is 16667 ms 2019-12-09T22:13:18.702Z Expected auction duration is 16667 ms ========================================================================================== Run started 2019-12-09T22:13:16.963Z and ran for PT13.964S Default configuration: Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Failed to create a workflow job: (854887132b43cfef): The workflow could not be created. Causes: (57b80f6eb74e3cdb): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 302 jobs. Please check the quota usage via GCP Console. 
{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","sideInputType":"DIRECT","sideInputRowCount":500,"sideInputNumShards":3,"sideInputUrl":null,"sessionGap":{"standardDays":0,"standardHours":0,"standardMinutes":10,"standardSeconds":600,"millis":600000},"numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
    at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:128)
    at org.apache.beam.sdk.nexmark.Main.main(Main.java:415)
Configurations:
  Conf  Description
  0000  query:PASSTHROUGH; exportSummaryToBigQuery:true; numEvents:10000000
  0001  query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; numEvents:10000000
  0002  query:SELECTION; exportSummaryToBigQuery:true; numEvents:10000000
  0003  query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; numEvents:10000000
  0004  query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:1000000
  0005  query:HOT_ITEMS; exportSummaryToBigQuery:true; numEvents:10000000
  0006  query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:1000000
  0007  query:HIGHEST_BID; exportSummaryToBigQuery:true; numEvents:10000000
  0008  query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; numEvents:10000000
  0009  query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:1000000
Caused by: java.lang.RuntimeException: Failed to create a workflow job: (854887132b43cfef): The workflow could not be created. Causes: (57b80f6eb74e3cdb): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 302 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:974)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:188)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
    at org.apache.beam.sdk.nexmark.NexmarkLauncher.run(NexmarkLauncher.java:1178)
    at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:90)
    at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:79)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
  0010  query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; numEvents:10000000
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
  0011  query:USER_SESSIONS; exportSummaryToBigQuery:true; numEvents:10000000
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  0012  query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; numEvents:10000000
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  0013  query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; numEvents:10000000
    at java.lang.Thread.run(Thread.java:748)
  0014  query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; numEvents:10000000
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
Performance:
  "code" : 400,
  "errors" : [ {
  Conf  Runtime(sec)  (Baseline)  Events(/sec)  (Baseline)  Results  (Baseline)
    "domain" : "global",
  0000  *** not run ***
  0001  *** not run ***
  0002  *** not run ***
    "message" : "(854887132b43cfef): The workflow could not be created. Causes: (57b80f6eb74e3cdb): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 302 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
  0003  *** not run ***
    "reason" : "failedPrecondition"
  0004  *** not run ***
  } ],
  0005  *** not run ***
  0006  *** not run ***
  0007  *** not run ***
  "message" : "(854887132b43cfef): The workflow could not be created. Causes: (57b80f6eb74e3cdb): Dataflow quota error for jobs-per-project quota. Project apache-beam-testing is running 302 jobs. Please check the quota usage via GCP Console. If it exceeds the limit, please wait for a workflow to finish or contact Google Cloud Support to request an increase in quota. If it does not, contact Google Cloud Support.",
  0008  *** not run ***
  "status" : "FAILED_PRECONDITION"
  0009  *** not run ***
}
  0010  *** not run ***
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
  0011  *** not run ***
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
  0012  *** not run ***
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
  0013  *** not run ***
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
  0014  *** not run ***
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
==========================================================================================
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
    at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:960)
    ... 12 more
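The "*** not run ***" summary rows above appear mixed into the stack trace because the Nexmark summary and the exception were written to the console at the same time. The failure itself is a pre-submission rejection rather than a pipeline error: Dataflow refused to create the job because the apache-beam-testing project was already running 302 jobs and had hit its jobs-per-project quota, so every configuration in this run is reported as not run. One way to confirm how many jobs are still active before retrying (assuming the Cloud SDK is installed and authorized for that project) is:

    gcloud dataflow jobs list --project=apache-beam-testing --status=active

As the error message itself suggests, the options are to wait for existing workflows to finish or to request a quota increase from Google Cloud Support.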
> Task :sdks:java:testing:nexmark:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:nexmark:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 4s
81 actionable tasks: 55 executed, 26 from cache

Publishing build scan...
https://scans.gradle.com/s/divtpmmnley5s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
