See <https://builds.apache.org/job/beam_PostCommit_Python2/1032/display/redirect?page=changes>
Changes:
[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project
[echauchot] [BEAM-8470] Fix missing dep
[echauchot] [BEAM-8470] Add SparkPipelineOptions
[echauchot] [BEAM-8470] Start pipeline translation
[echauchot] [BEAM-8470] Add global pipeline translation structure
[echauchot] [BEAM-8470] Add nodes translators structure
[echauchot] [BEAM-8470] Wire node translators with pipeline translator
[echauchot] [BEAM-8470] Renames: better differenciate pipeline translator for
[echauchot] [BEAM-8470] Organise methods in PipelineTranslator
[echauchot] [BEAM-8470] Initialise BatchTranslationContext
[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation
[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments
[echauchot] [BEAM-8470] Improve javadocs
[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package
[echauchot] [BEAM-8470] Move common translation context components to superclass
[echauchot] [BEAM-8470] apply spotless
[echauchot] [BEAM-8470] Make codestyle and firebug happy
[echauchot] [BEAM-8470] Add TODOs
[echauchot] [BEAM-8470] Post-pone batch qualifier in all classes names for
[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per
[echauchot] [BEAM-8470] Added SparkRunnerRegistrar
[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor translatePipeline()
[echauchot] [BEAM-8470] Create PCollections manipulation methods
[echauchot] [BEAM-8470] Create Datasets manipulation methods
[echauchot] [BEAM-8470] Add Flatten transformation translator
[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation
[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable
[echauchot] [BEAM-8470] Implement read transform
[echauchot] [BEAM-8470] update TODO
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] start source instanciation
[echauchot] [BEAM-8470] Improve exception flow
[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator
[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam Source
[echauchot] [BEAM-8470] Add source mocks
[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main test.
[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work
[echauchot] [BEAM-8470] clean deps
[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode
[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode
[echauchot] [BEAM-8470] Split batch and streaming sources and translators
[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming
[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.
[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark DataSource
[echauchot] [BEAM-8470] Refactor DatasetSource fields
[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the test
[echauchot] [BEAM-8470] Add missing 0-arg public constructor
[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils
[echauchot] [BEAM-8470] Apply spotless and fix checkstyle
[echauchot] [BEAM-8470] Add a dummy schema for reader
[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Checkstyle and Findbugs
[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instaed of a main
[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of
[echauchot] [BEAM-8470] improve readability of options passing to the source
[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader
[echauchot] [BEAM-8470] Fix serialization issues
[echauchot] [BEAM-8470] Add SerializationDebugger
[echauchot] [BEAM-8470] Add serialization test
[echauchot] [BEAM-8470] Move SourceTest to same package as tested class
[echauchot] [BEAM-8470] Fix SourceTest
[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the source
[echauchot] [BEAM-8470] Put all transform translators Serializable
[echauchot] [BEAM-8470] Enable test mode
[echauchot] [BEAM-8470] Enable gradle build scan
[echauchot] [BEAM-8470] Add flatten test
[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation
[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able to
[echauchot] [BEAM-8470] Comment schema choices
[echauchot] [BEAM-8470] Fix errorprone
[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default
[echauchot] [BEAM-8470] Fix split bug
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ParDoTest
[echauchot] [BEAM-8470] Address minor review notes
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add GroupByKeyTest
[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch
[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>
[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to
[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw
[echauchot] [BEAM-8470] Add ComplexSourceTest
[echauchot] [BEAM-8470] Fail in case of having SideInouts or State/Timers
[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated type
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Fixed Javadoc error
[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename pruneOutput()
[echauchot] [BEAM-8470] Don't use deprecated
[echauchot] [BEAM-8470] Simplify logic of ParDo translator
[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround
[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency
[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest
[echauchot] [BEAM-8470] Added "testTwoPardoInRow"
[echauchot] [BEAM-8470] Add a test for the most simple possible Combine
[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for
[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in place
[echauchot] [BEAM-8470] Fix getSideInputs
[echauchot] [BEAM-8470] Extract binary schema creation in a helper class
[echauchot] [BEAM-8470] First version of combinePerKey
[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder
[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and use it
[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in place
[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests
[echauchot] [BEAM-8470] Introduce RowHelpers
[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Get back to classes in translators resolution because URNs
[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally
[echauchot] [BEAM-8470] Update test with Long
[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is used
[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema
[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the
[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch
[echauchot] [BEAM-8470] Implement WindowAssignTest
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Added SideInput support
[echauchot] [BEAM-8470] Fix CheckStyle violations
[echauchot] [BEAM-8470] Don't use Reshuffle translation
[echauchot] [BEAM-8470] Added using CachedSideInputReader
[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch
[echauchot] [BEAM-8470] And unchecked warning suppression
[echauchot] [BEAM-8470] Add streaming source initialisation
[echauchot] [BEAM-8470] Implement first streaming source
[echauchot] [BEAM-8470] Add a TODO on spark output modes
[echauchot] [BEAM-8470] Add transformators registry in PipelineTranslatorStreaming
[echauchot] [BEAM-8470] Add source streaming test
[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start
[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source
[echauchot] [BEAM-8470] Clean streaming source
[echauchot] [BEAM-8470] Continue impl of offsets for streaming source
[echauchot] [BEAM-8470] Deal with checkpoint and offset based read
[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings
[echauchot] [BEAM-8470] Disable never ending test
[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to Java 8
[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark
[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner classes
[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation
[echauchot] [BEAM-8470] Fix spotless issues after rebase
[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming translation
[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and
[echauchot] [BEAM-8470] Rename SparkPipelineResult to
[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform tests
[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)
[echauchot] [BEAM-8470] implement source.stop
[echauchot] [BEAM-8470] Update javadoc
[echauchot] [BEAM-8470] Apply Spotless
[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured Streaming
[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% faster
[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if
[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo
[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty side
[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource
[echauchot] [BEAM-8470] Update windowAssignTest
[echauchot] [BEAM-8470] Add comment about checkpoint mark
[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to conserve windowing
[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output
[echauchot] [BEAM-8470] Improve visibility of debug messages
[echauchot] [BEAM-8470] Add a test that GBK preserves windowing
[echauchot] [BEAM-8470] Add TODO in Combine translations
[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with WindowedValue and
[echauchot] [BEAM-8470] Fix comment about schemas
[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation with
[echauchot] [BEAM-8470] Output data after combine
[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally
[echauchot] [BEAM-8470] Fix encoder in combine call
[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so that
[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark
[echauchot] [BEAM-8470] Fix case when a window does not merge into any other window
[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK
[echauchot] [BEAM-8470] Fix bug in the window merging logic
[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition
[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less
[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make only
[echauchot] [BEAM-8470] Clean no more needed KVHelpers
[echauchot] [BEAM-8470] Clean not more needed RowHelpers
[echauchot] [BEAM-8470] Clean not more needed WindowingHelpers
[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner
[echauchot] [BEAM-8470] Fixed immutable list bug
[echauchot] [BEAM-8470] add comment in combine globally test
[echauchot] [BEAM-8470] Clean groupByKeyTest
[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing
[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally
[echauchot] [BEAM-8470] Add metrics support in DoFn
[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured Streaming
[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that prevented
[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves windowing
[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple outputs in
[echauchot] [BEAM-8470] Added metrics sinks and tests
[echauchot] [BEAM-8470] Make spotless happy
[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Update log4j configuration
[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.
[echauchot] [BEAM-8470] Print number of leaf datasets
[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is
[echauchot] [BEAM-8470] After testing performance and correctness, launch pipeline
[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a
[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of shuffle
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass Object
[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackEverflow
[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,
[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders wrapping
[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper
[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be
[echauchot] [BEAM-8470] Fix warning in coder construction by reflexion
[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, fqcn
[echauchot] [BEAM-8470] Fix getting the output value in code generation
[echauchot] [BEAM-8470] Fix beam coder lazy init using reflexion: use .clas + try
[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no generic
[echauchot] [BEAM-8470] Remove example code
[echauchot] [BEAM-8470] Fix equal and hashcode
[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix
[echauchot] [BEAM-8470] Add an assert of equality in the encoders test
[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs
[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions
[echauchot] [BEAM-8470] Put Encoders expressions serializable
[echauchot] [BEAM-8470] Catch Exception instead of IOException because some coders
[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey
[echauchot] [BEAM-8470] Apply new Encoders to Read source
[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls
[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest
[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation
[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner
[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2
[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey
[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with
[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc
[echauchot] [BEAM-8470] Use beam encoders also in the output of the source
[echauchot] [BEAM-8470] Remove unneeded cast
[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual Coder
[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam coders
[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in SimpleDoFnRunner
[echauchot] Fix SpotBugs
[echauchot] [BEAM-8470] simplify coders in combinePerKey translation
[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator
[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding windows). As
[echauchot] [BEAM-8470] Add a combine test with sliding windows
[echauchot] [BEAM-8470] Add a test to test combine translation on binaryCombineFn
[echauchot] [BEAM-8470] Fix tests: use correct
[echauchot] [BEAM-8470] Fix wrong expected results in
[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental
[echauchot] [BEAM-8470] Fix: create an empty accumulator in
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows
[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be
[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in
[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows
[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" from
[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp to
------------------------------------------
[...truncated 1.12 MB...]
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak
safety net for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey)
(2/2) (69e5df33cbca0e0f4d1f8119e23ceeda) [DEPLOYING]
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda) [DEPLOYING].
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Registering task at network:
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda) [DEPLOYING].
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at
assert_that/Group/GroupByKey) (2/2) (69e5df33cbca0e0f4d1f8119e23ceeda) switched
from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce
(GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda) switched from DEPLOYING to RUNNING.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)
(120622b766c0f159c7fe0e450fabf014) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task
resources for CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) ->
GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) ->
Map (Key Extractor) (1/2) (120622b766c0f159c7fe0e450fabf014).
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:389>),
Map(<lambda at external_test.py:390>), assert_that}) -> FlatMap (FlatMap at
ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN
MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:389>),
Map(<lambda at external_test.py:390>), assert_that}) -> FlatMap (FlatMap at
ExtractOutput[0]) (2/2) (8068370e530111afaaa3c8426fa94e30) switched from
RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:389>),
Map(<lambda at external_test.py:390>), assert_that}) -> FlatMap (FlatMap at
ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Freeing task resources for CHAIN MapPartition (MapPartition at [6]{Map(<lambda
at external_test.py:389>), Map(<lambda at external_test.py:390>), assert_that})
-> FlatMap (FlatMap at ExtractOutput[0]) (2/2)
(8068370e530111afaaa3c8426fa94e30).
[CHAIN MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:389>),
Map(<lambda at external_test.py:390>), assert_that}) -> FlatMap (FlatMap at
ExtractOutput[0]) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Ensuring all FileSystem streams are closed for task CHAIN MapPartition
(MapPartition at [6]{Map(<lambda at external_test.py:389>), Map(<lambda at
external_test.py:390>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0])
(2/2) (8068370e530111afaaa3c8426fa94e30) [FINISHED]
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)
(7002f33ed3f4da098a12405897226822) switched from RUNNING to FINISHED.
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task
resources for CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) ->
GroupCombine (GroupCombine at GroupCombine: assert_that/Group/GroupByKey) ->
Map (Key Extractor) (2/2) (7002f33ed3f4da098a12405897226822).
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all
FileSystem streams are closed for task CHAIN Filter (UnionFixFilter) -> Map
(Key Extractor) -> GroupCombine (GroupCombine at GroupCombine:
assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)
(120622b766c0f159c7fe0e450fabf014) [FINISHED]
[CHAIN Filter (UnionFixFilter) -> Map (Key Extractor) -> GroupCombine
(GroupCombine at GroupCombine: assert_that/Group/GroupByKey) -> Map (Key
Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all
FileSystem streams are closed for task CHAIN Filter (UnionFixFilter) -> Map
(Key Extractor) -> GroupCombine (GroupCombine at GroupCombine:
assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)
(7002f33ed3f4da098a12405897226822) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task CHAIN
MapPartition (MapPartition at [6]{Map(<lambda at external_test.py:389>),
Map(<lambda at external_test.py:390>), assert_that}) -> FlatMap (FlatMap at
ExtractOutput[0]) 8068370e530111afaaa3c8426fa94e30.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor)
120622b766c0f159c7fe0e450fabf014.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor)
7002f33ed3f4da098a12405897226822.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition
(MapPartition at [6]{Map(<lambda at external_test.py:389>), Map(<lambda at
external_test.py:390>), assert_that}) -> FlatMap (FlatMap at ExtractOutput[0])
(2/2) (8068370e530111afaaa3c8426fa94e30) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (1/2)
(120622b766c0f159c7fe0e450fabf014) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN Filter
(UnionFixFilter) -> Map (Key Extractor) -> GroupCombine (GroupCombine at
GroupCombine: assert_that/Group/GroupByKey) -> Map (Key Extractor) (2/2)
(7002f33ed3f4da098a12405897226822) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (attempt #0) to
782db3e5-0c4f-4528-b67d-dad8ef40bb9f @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (2/2) (d333c5ee8fa003309f5ed7ec21092e0b)
switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream
leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group,
Unkey, Match}) (2/2) (d333c5ee8fa003309f5ed7ec21092e0b) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network:
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (2/2) (d333c5ee8fa003309f5ed7ec21092e0b)
switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at
assert_that/Group/GroupByKey) (2/2) (69e5df33cbca0e0f4d1f8119e23ceeda) switched
from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda) [FINISHED]
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at
assert_that/Group/GroupByKey) (1/2) (2401d2edb129e60fbebf9c26a34f9d46) switched
from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupReduce
(GroupReduce at assert_that/Group/GroupByKey) 69e5df33cbca0e0f4d1f8119e23ceeda.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) switched from CREATED to SCHEDULED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)
(2401d2edb129e60fbebf9c26a34f9d46).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)
(2401d2edb129e60fbebf9c26a34f9d46) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupReduce
(GroupReduce at assert_that/Group/GroupByKey) 2401d2edb129e60fbebf9c26a34f9d46.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) (attempt #0) to
782db3e5-0c4f-4528-b67d-dad8ef40bb9f @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce
(GroupReduce at assert_that/Group/GroupByKey) (2/2)
(69e5df33cbca0e0f4d1f8119e23ceeda) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce
(GroupReduce at assert_that/Group/GroupByKey) (1/2)
(2401d2edb129e60fbebf9c26a34f9d46) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (1/2) (b007e921fff21f81151664dfe51d4f2f)
switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream
leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group,
Unkey, Match}) (1/2) (b007e921fff21f81151664dfe51d4f2f) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network:
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (1/2) (b007e921fff21f81151664dfe51d4f2f)
switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (2/2) (d333c5ee8fa003309f5ed7ec21092e0b)
switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b).
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326) switched from
CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (2/2) (d333c5ee8fa003309f5ed7ec21092e0b)
[FINISHED]
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match})
d333c5ee8fa003309f5ed7ec21092e0b.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326) switched from
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink
(DiscardingOutput) (2/2) (attempt #0) to 782db3e5-0c4f-4528-b67d-dad8ef40bb9f @
localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink
(DiscardingOutput) (2/2).
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)
(d333c5ee8fa003309f5ed7ec21092e0b) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2)
(51cf2f73cdf353bccd4099f7225d3326) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak
safety net for task DataSink (DiscardingOutput) (2/2)
(51cf2f73cdf353bccd4099f7225d3326) [DEPLOYING]
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Registering task at network:
DataSink (DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326)
[DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2)
(51cf2f73cdf353bccd4099f7225d3326) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326) switched from
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2)
(51cf2f73cdf353bccd4099f7225d3326) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326).
[DataSink (DiscardingOutput) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task DataSink (DiscardingOutput) (2/2)
(51cf2f73cdf353bccd4099f7225d3326) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task DataSink
(DiscardingOutput) 51cf2f73cdf353bccd4099f7225d3326.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (2/2) (51cf2f73cdf353bccd4099f7225d3326) switched from
RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (1/2) (b007e921fff21f81151664dfe51d4f2f)
switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f).
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388) switched from
CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task MapPartition (MapPartition at
[3]assert_that/{Group, Unkey, Match}) (1/2) (b007e921fff21f81151664dfe51d4f2f)
[FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match})
b007e921fff21f81151664dfe51d4f2f.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388) switched from
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink
(DiscardingOutput) (1/2) (attempt #0) to 782db3e5-0c4f-4528-b67d-dad8ef40bb9f @
localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink
(DiscardingOutput) (1/2).
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(b007e921fff21f81151664dfe51d4f2f) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(eb856976be955f3c2cf97199f40b3388) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak
safety net for task DataSink (DiscardingOutput) (1/2)
(eb856976be955f3c2cf97199f40b3388) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Registering task at network:
DataSink (DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388)
[DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(eb856976be955f3c2cf97199f40b3388) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388) switched from
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(eb856976be955f3c2cf97199f40b3388) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388).
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task DataSink (DiscardingOutput) (1/2)
(eb856976be955f3c2cf97199f40b3388) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task DataSink
(DiscardingOutput) eb856976be955f3c2cf97199f40b3388.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (eb856976be955f3c2cf97199f40b3388) switched from
RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job
BeamApp-root-1120143840-4cc8184b (8bd05dfb235e0bdc485c4c8e7f8c7022) switched
from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job
8bd05dfb235e0bdc485c4c8e7f8c7022 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job
BeamApp-root-1120143840-4cc8184b(8bd05dfb235e0bdc485c4c8e7f8c7022).
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini
Cluster
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:1, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8138}, allocationId:
d5b5e769cd8566fb05e5f8e4a3cdc77b, jobId: 8bd05dfb235e0bdc485c4c8e7f8c7022).
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection
e74cb409371f773ce723084a0d2153dc: JobManager is shutting down..
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest
endpoint.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect
job manager a790dfbc1239009f53f1dfe725844734@akka://flink/user/jobmanager_1 for
job 8bd05dfb235e0bdc485c4c8e7f8c7022 from the resource manager.
[mini-cluster-io-thread-16] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job
8bd05dfb235e0bdc485c4c8e7f8c7022 with leader id
a790dfbc1239009f53f1dfe725844734 lost leadership.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:0, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8138}, allocationId:
bf2e799866ffe4dfd85b63bc942981df, jobId: 8bd05dfb235e0bdc485c4c8e7f8c7022).
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job
8bd05dfb235e0bdc485c4c8e7f8c7022 from job leader monitoring.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job 8bd05dfb235e0bdc485c4c8e7f8c7022.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job 8bd05dfb235e0bdc485c4c8e7f8c7022.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to
job 8bd05dfb235e0bdc485c4c8e7f8c7022 because it is not registered.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor
akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager
connection e74cb409371f773ce723084a0d2153dc.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing
TaskExecutor connection 782db3e5-0c4f-4528-b67d-dad8ef40bb9f because: The
TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory /tmp/flink-io-d26bb250-0f87-40b5-a8fa-86645dabbbe3
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the
network environment and its components.
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down
cluster because application is in CANCELED, diagnostics
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing
the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl -
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
- Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-10] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory
/tmp/flink-netty-shuffle-cd8e0cf2-d631-4f7a-9a0b-07af884626d4
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the
kvState service and its components.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.filecache.FileCache - removed file cache directory
/tmp/flink-dist-cache-1391befa-3810-4358-aff2-7c97b6e17586
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor
akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-8] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:46185
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 20048
msecs
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers :
MetricQueryResults(Counters(ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}:
0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2532>)_26}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: 12,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 9,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}:
7,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: 3, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: 3,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: 3,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}:
0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}:
0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: 1,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2532>)_26}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: 12,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: 12,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: 3,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: 3, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2532>)_26}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}:
0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: 1, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_13}: 6,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0, 37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: 1,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 4,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: 1,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:385>)_18}:
7,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: 12,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 9,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}:
0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: 1,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=pcollection_1}: 3,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group.out/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=pcollection_2}: 3,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:389>)_21}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:element_count:v1
{PCOLLECTION=pcollection}: 5,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
37Map(<lambda at
external_test.py:385>).None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2532>)_26}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}:
0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:390>)_22}:
0,
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}:
0)
Distributions(46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57,
count=3, min=19, max=19},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54,
count=3, min=18, max=18},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63,
count=3, min=21, max=21},
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17,
count=1, min=17, max=17},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14,
count=1, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15,
count=1, min=15, max=15},
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13,
count=1, min=13, max=13},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41,
count=1, min=41, max=41},
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16,
count=1, min=16, max=16},
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180,
count=12, min=15, max=15},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33,
count=1, min=33, max=33},
14Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13,
count=1, min=13, max=13},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58,
count=1, min=58, max=58},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72,
count=3, min=24, max=24},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168,
count=12, min=14, max=14},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168,
count=12, min=14, max=14},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168,
count=12, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19,
count=1, min=19, max=19},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45,
count=3, min=15, max=15},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192,
count=12, min=16, max=16},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54,
count=3, min=18, max=18},
46ExternalTransform(beam:transforms:xlang:count).org.apache.beam.sdk.values.PCollection.<init>:400#554dcd2b40b5f04b/beam:env:docker:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51,
count=3, min=17, max=17}))
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService -
Manifest at
/tmp/beam-artifact-staging/job_d8722f97-eb6b-4fb9-bbb9-1ad96a595a79/MANIFEST
has 0 artifact locations
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
- Removed dir
/tmp/beam-artifact-staging/job_d8722f97-eb6b-4fb9-bbb9-1ad96a595a79/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py2:crossLanguageTests
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
test_big_query_read
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3513.290s
OK (SKIP=4)
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 1m 33s
117 actionable tasks: 113 executed, 1 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/yibo62hpyxzv4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]