See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/160/display/redirect?page=changes>
Changes:

[M.Yzvenn] fix - 1MB is interpreted as 1000, not 1024
[katarzyna.kucharczyk] [BEAM-6335] Added streaming GroupByKey Test that reads SyntheticSource
[katarzyna.kucharczyk] [BEAM-6335] Changed SyntheticDataPublisher to publish String UTF values
[katarzyna.kucharczyk] [BEAM-6335] Added custom PubSub Matcher that stops pipeline after
[robertwb] [BEAM-8629] Don't return mutable class type hints.
[valentyn] Restore original behavior of evaluating worker host on Windows until a
[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project
[echauchot] [BEAM-8470] Fix missing dep
[echauchot] [BEAM-8470] Add SparkPipelineOptions
[echauchot] [BEAM-8470] Start pipeline translation
[echauchot] [BEAM-8470] Add global pipeline translation structure
[echauchot] [BEAM-8470] Add nodes translators structure
[echauchot] [BEAM-8470] Wire node translators with pipeline translator
[echauchot] [BEAM-8470] Renames: better differenciate pipeline translator for
[echauchot] [BEAM-8470] Organise methods in PipelineTranslator
[echauchot] [BEAM-8470] Initialise BatchTranslationContext
[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation
[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments
[echauchot] [BEAM-8470] Improve javadocs
[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package
[echauchot] [BEAM-8470] Move common translation context components to superclass
[echauchot] [BEAM-8470] apply spotless
[echauchot] [BEAM-8470] Make codestyle and firebug happy
[echauchot] [BEAM-8470] Add TODOs
[echauchot] [BEAM-8470] Post-pone batch qualifier in all classes names for
[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per
[echauchot] [BEAM-8470] Added SparkRunnerRegistrar
[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor translatePipeline()
[echauchot] [BEAM-8470] Create PCollections manipulation methods
[echauchot] [BEAM-8470] Create Datasets manipulation methods
[echauchot] [BEAM-8470] Add Flatten transformation translator
[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation
[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable
[echauchot] [BEAM-8470] Implement read transform
[echauchot] [BEAM-8470] update TODO
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] start source instanciation
[echauchot] [BEAM-8470] Improve exception flow
[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator
[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam Source
[echauchot] [BEAM-8470] Add source mocks
[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main test.
[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work
[echauchot] [BEAM-8470] clean deps
[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode
[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode
[echauchot] [BEAM-8470] Split batch and streaming sources and translators
[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming
[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.
[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark DataSource
[echauchot] [BEAM-8470] Refactor DatasetSource fields
[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the test
[echauchot] [BEAM-8470] Add missing 0-arg public constructor
[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils
[echauchot] [BEAM-8470] Apply spotless and fix checkstyle
[echauchot] [BEAM-8470] Add a dummy schema for reader
[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Checkstyle and Findbugs
[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instaed of a main
[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of
[echauchot] [BEAM-8470] improve readability of options passing to the source
[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader
[echauchot] [BEAM-8470] Fix serialization issues
[echauchot] [BEAM-8470] Add SerializationDebugger
[echauchot] [BEAM-8470] Add serialization test
[echauchot] [BEAM-8470] Move SourceTest to same package as tested class
[echauchot] [BEAM-8470] Fix SourceTest
[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the source
[echauchot] [BEAM-8470] Put all transform translators Serializable
[echauchot] [BEAM-8470] Enable test mode
[echauchot] [BEAM-8470] Enable gradle build scan
[echauchot] [BEAM-8470] Add flatten test
[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation
[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able to
[echauchot] [BEAM-8470] Comment schema choices
[echauchot] [BEAM-8470] Fix errorprone
[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default
[echauchot] [BEAM-8470] Fix split bug
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ParDoTest
[echauchot] [BEAM-8470] Address minor review notes
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add GroupByKeyTest
[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch
[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>
[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to
[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw
[echauchot] [BEAM-8470] Add ComplexSourceTest
[echauchot] [BEAM-8470] Fail in case of having SideInouts or State/Timers
[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated type
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Fixed Javadoc error
[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename pruneOutput()
[echauchot] [BEAM-8470] Don't use deprecated
[echauchot] [BEAM-8470] Simplify logic of ParDo translator
[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround
[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency
[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest
[echauchot] [BEAM-8470] Added "testTwoPardoInRow"
[echauchot] [BEAM-8470] Add a test for the most simple possible Combine
[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for
[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in place
[echauchot] [BEAM-8470] Fix getSideInputs
[echauchot] [BEAM-8470] Extract binary schema creation in a helper class
[echauchot] [BEAM-8470] First version of combinePerKey
[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder
[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and use it
[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in place
[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests
[echauchot] [BEAM-8470] Introduce RowHelpers
[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Get back to classes in translators resolution because URNs
[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally
[echauchot] [BEAM-8470] Update test with Long
[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is used
[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema
[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the
[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch
[echauchot] [BEAM-8470] Implement WindowAssignTest
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Added SideInput support
[echauchot] [BEAM-8470] Fix CheckStyle violations
[echauchot] [BEAM-8470] Don't use Reshuffle translation
[echauchot] [BEAM-8470] Added using CachedSideInputReader
[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch
[echauchot] [BEAM-8470] And unchecked warning suppression
[echauchot] [BEAM-8470] Add streaming source initialisation
[echauchot] [BEAM-8470] Implement first streaming source
[echauchot] [BEAM-8470] Add a TODO on spark output modes
[echauchot] [BEAM-8470] Add transformators registry in PipelineTranslatorStreaming
[echauchot] [BEAM-8470] Add source streaming test
[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start
[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source
[echauchot] [BEAM-8470] Clean streaming source
[echauchot] [BEAM-8470] Continue impl of offsets for streaming source
[echauchot] [BEAM-8470] Deal with checkpoint and offset based read
[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings
[echauchot] [BEAM-8470] Disable never ending test
[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to Java 8
[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark
[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner classes
[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation
[echauchot] [BEAM-8470] Fix spotless issues after rebase
[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming translation
[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and
[echauchot] [BEAM-8470] Rename SparkPipelineResult to
[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform tests
[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)
[echauchot] [BEAM-8470] implement source.stop
[echauchot] [BEAM-8470] Update javadoc
[echauchot] [BEAM-8470] Apply Spotless
[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured Streaming
[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% faster
[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if
[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo
[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty side
[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource
[echauchot] [BEAM-8470] Update windowAssignTest
[echauchot] [BEAM-8470] Add comment about checkpoint mark
[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to conserve windowing
[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output
[echauchot] [BEAM-8470] Improve visibility of debug messages
[echauchot] [BEAM-8470] Add a test that GBK preserves windowing
[echauchot] [BEAM-8470] Add TODO in Combine translations
[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with WindowedValue and
[echauchot] [BEAM-8470] Fix comment about schemas
[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation with
[echauchot] [BEAM-8470] Output data after combine
[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally
[echauchot] [BEAM-8470] Fix encoder in combine call
[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so that
[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark
[echauchot] [BEAM-8470] Fix case when a window does not merge into any other window
[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK
[echauchot] [BEAM-8470] Fix bug in the window merging logic
[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition
[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less
[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make only
[echauchot] [BEAM-8470] Clean no more needed KVHelpers
[echauchot] [BEAM-8470] Clean not more needed RowHelpers
[echauchot] [BEAM-8470] Clean not more needed WindowingHelpers
[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner
[echauchot] [BEAM-8470] Fixed immutable list bug
[echauchot] [BEAM-8470] add comment in combine globally test
[echauchot] [BEAM-8470] Clean groupByKeyTest
[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing
[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally
[echauchot] [BEAM-8470] Add metrics support in DoFn
[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured Streaming
[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that prevented
[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves windowing
[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple outputs in
[echauchot] [BEAM-8470] Added metrics sinks and tests
[echauchot] [BEAM-8470] Make spotless happy
[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Update log4j configuration
[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.
[echauchot] [BEAM-8470] Print number of leaf datasets
[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is
[echauchot] [BEAM-8470] After testing performance and correctness, launch pipeline
[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a
[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of shuffle
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass Object
[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackEverflow
[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,
[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders wrapping
[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper
[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be
[echauchot] [BEAM-8470] Fix warning in coder construction by reflexion
[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, fqcn
[echauchot] [BEAM-8470] Fix getting the output value in code generation
[echauchot] [BEAM-8470] Fix beam coder lazy init using reflexion: use .clas + try
[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no generic
[echauchot] [BEAM-8470] Remove example code
[echauchot] [BEAM-8470] Fix equal and hashcode
[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix
[echauchot] [BEAM-8470] Add an assert of equality in the encoders test
[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs
[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions
[echauchot] [BEAM-8470] Put Encoders expressions serializable
[echauchot] [BEAM-8470] Catch Exception instead of IOException because some coders
[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey
[echauchot] [BEAM-8470] Apply new Encoders to Read source
[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls
[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest
[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation
[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner
[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2
[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey
[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with
[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc
[echauchot] [BEAM-8470] Use beam encoders also in the output of the source
[echauchot] [BEAM-8470] Remove unneeded cast
[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual Coder
[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam coders
[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in SimpleDoFnRunner
[echauchot] Fix SpotBugs
[echauchot] [BEAM-8470] simplify coders in combinePerKey translation
[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator
[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding windows). As
[echauchot] [BEAM-8470] Add a combine test with sliding windows
[echauchot] [BEAM-8470] Add a test to test combine translation on binaryCombineFn
[echauchot] [BEAM-8470] Fix tests: use correct
[echauchot] [BEAM-8470] Fix wrong expected results in
[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental
[echauchot] [BEAM-8470] Fix: create an empty accumulator in
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows
[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be
[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in
[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows
[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" from
[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp to
[valentyn] Guard pickling operations with a lock to prevent race condition in
[iemejia] [website] Add Spark Structured Runner VR badge to the github template
[tvalentyn] [BEAM-8575] Add a Python test to test windowing in DoFn finish_bundle()
[github] [BEAM-3419] Flesh out iterable side inputs and key enumeration for
[github] common --> unique
[kcweaver] [BEAM-8795] fix Spark runner build

------------------------------------------
[...truncated 45.07 KB...]
9b9135266343: Waiting
b516fdb2f92c: Waiting
35cb335ba5ee: Pushed
9294be2dc57b: Pushed
cb5ec178d71d: Pushed
00830bb28f2f: Pushed
8134ef34b406: Pushed
1a523193c04b: Layer already exists
68293f34a71c: Layer already exists
e0f6863312c5: Layer already exists
20215a41fca5: Layer already exists
a72a7e555fe1: Layer already exists
b8f8aeff56a8: Layer already exists
687890749166: Layer already exists
2f77733e9824: Layer already exists
b516fdb2f92c: Pushed
97041f29baff: Layer already exists
75de80043de3: Pushed
300bafa0c32e: Pushed
9b9135266343: Pushed
latest: digest: sha256:8b82766b3bb04a52000c854e9cc0ea13af209104893834d8f07adacef9bb8ece size: 4110
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.9:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 1s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/tag4jmmq7tpzi

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1630882223308223201.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1044795556399523082.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6358006993617592537.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8147433839283377235.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
3cdeb42bd074: Preparing
a713ef985762: Preparing
58993fe94077: Preparing
a8902d6047fe: Preparing
99557920a7c5: Preparing
7e3c900343d0: Preparing
b8f8aeff56a8: Preparing
687890749166: Preparing
2f77733e9824: Preparing
97041f29baff: Preparing
b8f8aeff56a8: Waiting
687890749166: Waiting
2f77733e9824: Waiting
97041f29baff: Waiting
7e3c900343d0: Waiting
99557920a7c5: Layer already exists
a8902d6047fe: Layer already exists
7e3c900343d0: Layer already exists
687890749166: Layer already exists
b8f8aeff56a8: Layer already exists
2f77733e9824: Layer already exists
97041f29baff: Layer already exists
3cdeb42bd074: Pushed
58993fe94077: Pushed
a713ef985762: Pushed
latest: digest: sha256:0e05076aadc4eb1134935b0a648a04b2f146af55b21138725a431ef65ad5424a size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-160
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-160
GCLOUD_ZONE=us-central1-a
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins669358712997469422.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4849622832382728024.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-160-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]
/ [1 files][  2.3 KiB/  2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]
/ [2 files][  6.0 KiB/  6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.4 KiB]
/ [3 files][ 13.4 KiB/ 13.4 KiB]
- Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-160 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/8104a723-10b4-3ddd-86f5-d11df7766d03].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
....................................................................................................................................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/8104a723-10b4-3ddd-86f5-d11df7766d03] failed: Multiple Errors:
 - Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/beam.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/f3831d39-1a73-4ee3-93ae-6b7a34343f52/beam-loadtests-python-gbk-flink-batch-160-w-1/dataproc-initialization-script-1_output
 - Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/beam.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/f3831d39-1a73-4ee3-93ae-6b7a34343f52/beam-loadtests-python-gbk-flink-batch-160-w-16/dataproc-initialization-script-1_output.
Build step 'Execute shell' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
