See 
<https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/4239/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8294] run Spark portable validates runner tests in parallel

[kirillkozlov] Filter push-down should not be reliant on project push-down [BEAM-8508]

[kirillkozlov] Fixed a bug with selecting the same field more than once. Changed a cost

[kirillkozlov] Selecting fields in a different order should not drop the Calc when

[kirillkozlov] Refactoring

[kirillkozlov] spotlesApply

[github] Bump Dataflow python container versions

[kirillkozlov] IOPushDownRule should not be applied more than once to the same IORels


------------------------------------------
[...truncated 11.22 KB...]
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:extensions:join-library:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:parquet:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:nexmark:processResources
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromCalciteCore
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromSrc
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :sdks:java:extensions:sql:generateFmppSources
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes

> Task :sdks:java:extensions:sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Flink/ws/src/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Warning: Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1. Set option FORCE_LA_CHECK to true to force checking.

File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :sdks:java:extensions:sql:processResources
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:join-library:compileJava FROM-CACHE
> Task :sdks:java:extensions:join-library:classes UP-TO-DATE
> Task :sdks:java:extensions:join-library:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:core:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:io:parquet:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:parquet:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-java:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:direct-java:jar
> Task :runners:flink:1.9:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:extensions:sql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:classes
> Task :sdks:java:extensions:sql:jar
> Task :sdks:java:testing:nexmark:compileJava FROM-CACHE
> Task :sdks:java:testing:nexmark:classes
> Task :sdks:java:testing:nexmark:jar

> Task :sdks:java:testing:nexmark:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2019-11-07T23:37:40.853Z Running query:PASSTHROUGH; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.022Z Generating 100000 events in batch mode
2019-11-07T23:37:41.301Z Running query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.306Z Generating 100000 events in batch mode

==========================================================================================
Run started 2019-11-07T23:37:40.641Z and ran for PT0.660S

Default configuration:
{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","sideInputType":"DIRECT","sideInputRowCount":500,"sideInputNumShards":3,"sideInputUrl":null,"sessionGap":{"standardDays":0,"standardHours":0,"standardMinutes":10,"standardSeconds":600,"millis":600000},"numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}

Configurations:
  Conf  Description
  0000  query:PASSTHROUGH; exportSummaryToBigQuery:true; streamTimeout:60
  0001  query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; streamTimeout:60
  0002  query:SELECTION; exportSummaryToBigQuery:true; streamTimeout:60
  0003  query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; streamTimeout:60
  0004  query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0005  query:HOT_ITEMS; exportSummaryToBigQuery:true; streamTimeout:60
  0006  query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0007  query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
  0008  query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
  0009  query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0010  query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
  0011  query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
  0012  query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
  0013  query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
  0014  query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60

Exception in thread "main" java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Classes already registered: class org.apache.beam.runners.core.SplittableParDoViaKeyedWorkItems$ProcessElements
        at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:128)
        at org.apache.beam.sdk.nexmark.Main.main(Main.java:415)
Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Classes already registered: class org.apache.beam.runners.core.SplittableParDoViaKeyedWorkItems$ProcessElements
        at org.apache.beam.runners.flink.TestFlinkRunner.run(TestFlinkRunner.java:74)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
        at org.apache.beam.sdk.nexmark.NexmarkLauncher.run(NexmarkLauncher.java:1178)
        at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:90)
        at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:79)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Classes already registered: class org.apache.beam.runners.core.SplittableParDoViaKeyedWorkItems$ProcessElements
        at org.apache.beam.runners.core.construction.PTransformTranslation$KnownTransformPayloadTranslator.loadTransformPayloadTranslators(PTransformTranslation.java:311)
        at org.apache.beam.runners.core.construction.PTransformTranslation$KnownTransformPayloadTranslator.<clinit>(PTransformTranslation.java:293)
        at org.apache.beam.runners.core.construction.PTransformTranslation.loadKnownTranslators(PTransformTranslation.java:130)
        at org.apache.beam.runners.core.construction.PTransformTranslation.<clinit>(PTransformTranslation.java:125)
        at org.apache.beam.runners.flink.FlinkTransformOverrides.getDefaultOverrides(FlinkTransformOverrides.java:37)
        at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:114)
        at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:108)
        at org.apache.beam.runners.flink.TestFlinkRunner.run(TestFlinkRunner.java:57)
        ... 11 more

Performance:
  Conf  Runtime(sec)    (Baseline)  Events(/sec)    (Baseline)       Results    (Baseline)
  0000  *** not run ***
  0001  *** not run ***
  0002  *** not run ***
  0003  *** not run ***
  0004  *** not run ***
  0005  *** not run ***
  0006  *** not run ***
  0007  *** not run ***
  0008  *** not run ***
  0009  *** not run ***
  0010  *** not run ***
  0011  *** not run ***
  0012  *** not run ***
  0013  *** not run ***
  0014  *** not run ***
==========================================================================================
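The root cause in the trace above is an IllegalArgumentException thrown while a static initializer populates a registry of transform payload translators: the same class is registered twice, typically because it appears more than once on the runtime classpath. The following self-contained sketch reproduces only that failure mode; the class and method names are hypothetical and this is not Beam's actual PTransformTranslation code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the failure mode seen in the log: a translator
// registry that refuses a second registration of the same class.
public class TranslatorRegistry {
    private final Map<String, String> translators = new HashMap<>();

    // Throws IllegalArgumentException on a duplicate registration,
    // mirroring the "Classes already registered" message above.
    public void register(String className, String translator) {
        if (translators.putIfAbsent(className, translator) != null) {
            throw new IllegalArgumentException(
                "Classes already registered: class " + className);
        }
    }

    public static void main(String[] args) {
        TranslatorRegistry registry = new TranslatorRegistry();
        registry.register("com.example.ProcessElements", "translatorA");
        try {
            // A second registration of the same class, e.g. because two
            // copies of it end up on the classpath, triggers the exception.
            registry.register("com.example.ProcessElements", "translatorB");
        } catch (IllegalArgumentException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

Because the registration here happens in a `<clinit>` static initializer, the exception surfaces the first time anything touches the class, which is why every configuration in the run above reports "not run".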

2019-11-07T23:37:41.420Z Running query:SELECTION; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.425Z Generating 100000 events in batch mode
2019-11-07T23:37:41.467Z Running query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.472Z Generating 100000 events in batch mode
2019-11-07T23:37:41.542Z Running query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2019-11-07T23:37:41.546Z Generating 10000 events in batch mode
2019-11-07T23:37:41.555Z Expected auction duration is 16667 ms
2019-11-07T23:37:41.662Z Running query:HOT_ITEMS; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.665Z Generating 100000 events in batch mode
2019-11-07T23:37:41.734Z Running query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2019-11-07T23:37:41.737Z Generating 10000 events in batch mode
2019-11-07T23:37:41.747Z Expected auction duration is 16667 ms
2019-11-07T23:37:41.789Z Running query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.791Z Generating 100000 events in batch mode
2019-11-07T23:37:41.838Z Running query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.840Z Generating 100000 events in batch mode
2019-11-07T23:37:41.865Z Running query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2019-11-07T23:37:41.868Z Generating 10000 events in batch mode
2019-11-07T23:37:41.874Z Expected auction duration is 16667 ms
2019-11-07T23:37:41.890Z Running query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.893Z Generating 100000 events in batch mode
2019-11-07T23:37:41.923Z Running query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.925Z Generating 100000 events in batch mode
2019-11-07T23:37:41.942Z Running query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.945Z Generating 100000 events in batch mode
2019-11-07T23:37:41.962Z Running query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.966Z Generating 100000 events in batch mode
2019-11-07T23:37:41.991Z Running query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
2019-11-07T23:37:41.993Z Generating 100000 events in batch mode

> Task :sdks:java:testing:nexmark:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:nexmark:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35s
76 actionable tasks: 52 executed, 24 from cache

Publishing build scan...
https://gradle.com/s/tk75vwkheadcc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
