See 
<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/473/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project

[echauchot] [BEAM-8470] Fix missing dep

[echauchot] [BEAM-8470] Add SparkPipelineOptions

[echauchot] [BEAM-8470] Start pipeline translation

[echauchot] [BEAM-8470] Add global pipeline translation structure

[echauchot] [BEAM-8470] Add nodes translators structure

[echauchot] [BEAM-8470] Wire node translators with pipeline translator

[echauchot] [BEAM-8470] Renames: better differentiate pipeline translator for

[echauchot] [BEAM-8470] Organise methods in PipelineTranslator

[echauchot] [BEAM-8470] Initialise BatchTranslationContext

[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation

[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments

[echauchot] [BEAM-8470] Improve javadocs

[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package

[echauchot] [BEAM-8470] Move common translation context components to superclass

[echauchot] [BEAM-8470] apply spotless

[echauchot] [BEAM-8470] Make codestyle and findbugs happy

[echauchot] [BEAM-8470] Add TODOs

[echauchot] [BEAM-8470] Postpone batch qualifier in all class names for

[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per

[echauchot] [BEAM-8470] Added SparkRunnerRegistrar

[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor 
translatePipeline()

[echauchot] [BEAM-8470] Create PCollections manipulation methods

[echauchot] [BEAM-8470] Create Datasets manipulation methods

[echauchot] [BEAM-8470] Add Flatten transformation translator

[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation

[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable

[echauchot] [BEAM-8470] Implement read transform

[echauchot] [BEAM-8470] update TODO

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] start source instantiation

[echauchot] [BEAM-8470] Improve exception flow

[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator

[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam 
Source

[echauchot] [BEAM-8470] Add source mocks

[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main 
test.

[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work

[echauchot] [BEAM-8470] clean deps

[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode

[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode

[echauchot] [BEAM-8470] Split batch and streaming sources and translators

[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming

[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.

[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark 
DataSource

[echauchot] [BEAM-8470] Refactor DatasetSource fields

[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the 
test

[echauchot] [BEAM-8470] Add missing 0-arg public constructor

[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils

[echauchot] [BEAM-8470] Apply spotless and fix checkstyle

[echauchot] [BEAM-8470] Add a dummy schema for reader

[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Checkstyle and Findbugs

[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instead of a main

[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of

[echauchot] [BEAM-8470] improve readability of options passing to the source

[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader

[echauchot] [BEAM-8470] Fix serialization issues

[echauchot] [BEAM-8470] Add SerializationDebugger

[echauchot] [BEAM-8470] Add serialization test

[echauchot] [BEAM-8470] Move SourceTest to same package as tested class

[echauchot] [BEAM-8470] Fix SourceTest

[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the 
source

[echauchot] [BEAM-8470] Make all transform translators Serializable

[echauchot] [BEAM-8470] Enable test mode

[echauchot] [BEAM-8470] Enable gradle build scan

[echauchot] [BEAM-8470] Add flatten test

[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation

[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able 
to

[echauchot] [BEAM-8470] Comment schema choices

[echauchot] [BEAM-8470] Fix errorprone

[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default

[echauchot] [BEAM-8470] Fix split bug

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ParDoTest

[echauchot] [BEAM-8470] Address minor review notes

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add GroupByKeyTest

[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch

[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>

[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to

[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw

[echauchot] [BEAM-8470] Add ComplexSourceTest

[echauchot] [BEAM-8470] Fail in case of having SideInputs or State/Timers

[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated 
type

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Fixed Javadoc error

[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename 
pruneOutput()

[echauchot] [BEAM-8470] Don't use deprecated

[echauchot] [BEAM-8470] Simplify logic of ParDo translator

[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround

[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency

[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest

[echauchot] [BEAM-8470] Added "testTwoPardoInRow"

[echauchot] [BEAM-8470] Add a test for the most simple possible Combine

[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for

[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in 
place

[echauchot] [BEAM-8470] Fix getSideInputs

[echauchot] [BEAM-8470] Extract binary schema creation in a helper class

[echauchot] [BEAM-8470] First version of combinePerKey

[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder

[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and 
use it

[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in 
place

[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests

[echauchot] [BEAM-8470] Introduce RowHelpers

[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Get back to classes in translators resolution because 
URNs

[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally

[echauchot] [BEAM-8470] Update test with Long

[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is 
used

[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema

[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the

[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch

[echauchot] [BEAM-8470] Implement WindowAssignTest

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Added SideInput support

[echauchot] [BEAM-8470] Fix CheckStyle violations

[echauchot] [BEAM-8470] Don't use Reshuffle translation

[echauchot] [BEAM-8470] Added using CachedSideInputReader

[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch

[echauchot] [BEAM-8470] Add unchecked warning suppression

[echauchot] [BEAM-8470] Add streaming source initialisation

[echauchot] [BEAM-8470] Implement first streaming source

[echauchot] [BEAM-8470] Add a TODO on spark output modes

[echauchot] [BEAM-8470] Add transform translators registry in 
PipelineTranslatorStreaming

[echauchot] [BEAM-8470] Add source streaming test

[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start

[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source

[echauchot] [BEAM-8470] Clean streaming source

[echauchot] [BEAM-8470] Continue impl of offsets for streaming source

[echauchot] [BEAM-8470] Deal with checkpoint and offset based read

[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings

[echauchot] [BEAM-8470] Disable never ending test

[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to 
Java 8

[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark

[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner 
classes

[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation

[echauchot] [BEAM-8470] Fix spotless issues after rebase

[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming 
translation

[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and

[echauchot] [BEAM-8470] Rename SparkPipelineResult to

[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform 
tests

[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)

[echauchot] [BEAM-8470] implement source.stop

[echauchot] [BEAM-8470] Update javadoc

[echauchot] [BEAM-8470] Apply Spotless

[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured 
Streaming

[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% 
faster

[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if

[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo

[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty 
side

[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource

[echauchot] [BEAM-8470] Update windowAssignTest

[echauchot] [BEAM-8470] Add comment about checkpoint mark

[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to preserve windowing

[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output

[echauchot] [BEAM-8470] Improve visibility of debug messages

[echauchot] [BEAM-8470] Add a test that GBK preserves windowing

[echauchot] [BEAM-8470] Add TODO in Combine translations

[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with 
WindowedValue and

[echauchot] [BEAM-8470] Fix comment about schemas

[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation 
with

[echauchot] [BEAM-8470] Output data after combine

[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally

[echauchot] [BEAM-8470] Fix encoder in combine call

[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so 
that

[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark

[echauchot] [BEAM-8470] Fix case when a window does not merge into any other 
window

[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK

[echauchot] [BEAM-8470] Fix bug in the window merging logic

[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition

[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less

[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make 
only

[echauchot] [BEAM-8470] Clean no more needed KVHelpers

[echauchot] [BEAM-8470] Clean no more needed RowHelpers

[echauchot] [BEAM-8470] Clean no more needed WindowingHelpers

[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner

[echauchot] [BEAM-8470] Fixed immutable list bug

[echauchot] [BEAM-8470] add comment in combine globally test

[echauchot] [BEAM-8470] Clean groupByKeyTest

[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing

[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally

[echauchot] [BEAM-8470] Add metrics support in DoFn

[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured 
Streaming

[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that 
prevented

[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves 
windowing

[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple 
outputs in

[echauchot] [BEAM-8470] Added metrics sinks and tests

[echauchot] [BEAM-8470] Make spotless happy

[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.

[echauchot] [BEAM-8470] Update log4j configuration

[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.

[echauchot] [BEAM-8470] Print number of leaf datasets

[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured 
streaming.

[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is

[echauchot] [BEAM-8470] After testing performance and correctness, launch 
pipeline

[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a

[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of 
shuffle

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass 
Object

[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackOverflow

[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,

[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders 
wrapping

[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper

[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be

[echauchot] [BEAM-8470] Fix warning in coder construction by reflection

[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, 
fqcn

[echauchot] [BEAM-8470] Fix getting the output value in code generation

[echauchot] [BEAM-8470] Fix beam coder lazy init using reflection: use .class + 
try

[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no 
generic

[echauchot] [BEAM-8470] Remove example code

[echauchot] [BEAM-8470] Fix equals and hashCode

[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix

[echauchot] [BEAM-8470] Add an assert of equality in the encoders test

[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs

[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions

[echauchot] [BEAM-8470] Make Encoders expressions serializable

[echauchot] [BEAM-8470] Catch Exception instead of IOException because some 
coders

[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey

[echauchot] [BEAM-8470] Apply new Encoders to Read source

[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls

[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest

[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation

[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner

[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2

[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey

[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with

[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc

[echauchot] [BEAM-8470] Use beam encoders also in the output of the source

[echauchot] [BEAM-8470] Remove unneeded cast

[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual 
Coder

[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam 
coders

[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in 
SimpleDoFnRunner

[echauchot] Fix SpotBugs

[echauchot] [BEAM-8470] simplify coders in combinePerKey translation

[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator

[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding 
windows). As

[echauchot] [BEAM-8470] Add a combine test with sliding windows

[echauchot] [BEAM-8470] Add a test to test combine translation on 
binaryCombineFn

[echauchot] [BEAM-8470] Fix tests: use correct

[echauchot] [BEAM-8470] Fix wrong expected results in

[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental

[echauchot] [BEAM-8470] Fix: create an empty accumulator in

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows

[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be

[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in

[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows

[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" 
from

[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp 
to


------------------------------------------
[...truncated 34.04 MB...]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task Source: Impulse 
731d1d8ba7c4419859be5f7454f94a05.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey 
-> [3]assert_that/{Group, Unkey, Match} (1/2) 
(b3b3f65bad4c2eb8b82225f2ff2a9d9c) switched from DEPLOYING to RUNNING.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey 
-> [3]assert_that/{Group, Unkey, Match} (2/2) 
(dd6b25f600a43f700ca630d1fc21cfad) switched from DEPLOYING to RUNNING.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has 
been configured, using default (Memory / JobManager) MemoryStateBackend (data 
in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.streaming.runtime.tasks.StreamTask - No state backend has 
been configured, using default (Memory / JobManager) MemoryStateBackend (data 
in heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Impulse (1/1) 
(28395676430345322816a26efedd35cf) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Impulse (1/1) 
(731d1d8ba7c4419859be5f7454f94a05) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) 
(b3b3f65bad4c2eb8b82225f2ff2a9d9c) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) 
(dd6b25f600a43f700ca630d1fc21cfad) switched from DEPLOYING to RUNNING.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(2/2)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - 
Initializing heap keyed state backend with stream factory.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(1/2)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - 
Initializing heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing 
heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing 
heap keyed state backend with stream factory.
Nov 20, 2019 2:47:51 PM 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
 cleanQueue
SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=930, target=localhost:45223} 
was not shutdown properly!!! ~*~*~*
    Make sure to call shutdown()/shutdownNow() and wait until 
awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:514)
        at 
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:112)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:186)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:170)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:246)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:234)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:216)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.open(ExecutableStageDoFnOperator.java:196)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:532)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:396)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        at java.lang.Thread.run(Thread.java:748)

Nov 20, 2019 2:47:51 PM 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
 cleanQueue
SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=938, target=localhost:45223} 
was not shutdown properly!!! ~*~*~*
    Make sure to call shutdown()/shutdownNow() and wait until 
awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
        at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:514)
        at 
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory$1.close(ExternalEnvironmentFactory.java:155)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:381)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:381)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:401)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$800(DefaultJobBundleFactory.java:347)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:154)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:224)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
        at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.$closeResource(ExecutableStageDoFnOperator.java:487)
        at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.dispose(ExecutableStageDoFnOperator.java:487)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.tryDisposeAllOperators(StreamTask.java:562)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:443)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        at java.lang.Thread.run(Thread.java:748)
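
The SEVERE messages above report gRPC channels that were garbage-collected without an orderly shutdown. As a minimal illustrative sketch only (it uses the plain io.grpc API, while Beam vendors the same classes under org.apache.beam.vendor.grpc.v1p21p0.io.grpc, and "localhost:45223" is simply the target quoted in the warning), the cleanup the warning asks for looks roughly like this:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Target taken from the warning text above, for illustration only.
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("localhost:45223").usePlaintext().build();
        try {
          // ... issue RPCs over the channel here ...
        } finally {
          // Orderly shutdown, as the ManagedChannelOrphanWrapper warning requests:
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-cancel anything still in flight
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }

In this build the channels are created by the runner's ExternalEnvironmentFactory and DefaultJobBundleFactory (see the allocation-site traces above), so the missing cleanup sits in the runner's environment lifecycle handling rather than in the test code itself.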

INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:37225.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - 
Beam Fn Control client connected with id 31-1
[[4]assert_that/{Create, Group} (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group} 
(1/2) (2338cb7b42d740ca1c89f9e26169ab29) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[4]assert_that/{Create, Group} (1/2) (2338cb7b42d740ca1c89f9e26169ab29).
[[4]assert_that/{Create, Group} (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [4]assert_that/{Create, Group} (1/2) 
(2338cb7b42d740ca1c89f9e26169ab29) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
[4]assert_that/{Create, Group} 2338cb7b42d740ca1c89f9e26169ab29.
[[1]Create/FlatMap(<lambda at core.py:2532>) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at 
core.py:2532>) (1/2) (475e028d0b96e9864403bed847366f01) switched from RUNNING 
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2532>) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[1]Create/FlatMap(<lambda at core.py:2532>) (1/2) 
(475e028d0b96e9864403bed847366f01).
[[1]Create/FlatMap(<lambda at core.py:2532>) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [1]Create/FlatMap(<lambda at core.py:2532>) (1/2) 
(475e028d0b96e9864403bed847366f01) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
[1]Create/FlatMap(<lambda at core.py:2532>) 475e028d0b96e9864403bed847366f01.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[4]assert_that/{Create, Group} (1/2) (2338cb7b42d740ca1c89f9e26169ab29) 
switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[1]Create/FlatMap(<lambda at core.py:2532>) (1/2) 
(475e028d0b96e9864403bed847366f01) switched from RUNNING to FINISHED.
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:37257.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:40079
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client 
connected.
[[1]Create/FlatMap(<lambda at core.py:2532>) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at 
core.py:2532>) (2/2) (1e7fd2c0d44dfa72df51788a9340878a) switched from RUNNING 
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2532>) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[1]Create/FlatMap(<lambda at core.py:2532>) (2/2) 
(1e7fd2c0d44dfa72df51788a9340878a).
[[1]Create/FlatMap(<lambda at core.py:2532>) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [1]Create/FlatMap(<lambda at core.py:2532>) (2/2) 
(1e7fd2c0d44dfa72df51788a9340878a) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
[1]Create/FlatMap(<lambda at core.py:2532>) 1e7fd2c0d44dfa72df51788a9340878a.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[1]Create/FlatMap(<lambda at core.py:2532>) (2/2) 
(1e7fd2c0d44dfa72df51788a9340878a) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group} 
(2/2) (766b90a6e33608b89a25226d37ba23db) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[4]assert_that/{Create, Group} (2/2) (766b90a6e33608b89a25226d37ba23db).
[[4]assert_that/{Create, Group} (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [4]assert_that/{Create, Group} (2/2) 
(766b90a6e33608b89a25226d37ba23db) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
[4]assert_that/{Create, Group} 766b90a6e33608b89a25226d37ba23db.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[4]assert_that/{Create, Group} (2/2) (766b90a6e33608b89a25226d37ba23db) 
switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at 
fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) 
(3af0245cdc0f3fef5b1eb0167f56baba) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) 
(3af0245cdc0f3fef5b1eb0167f56baba).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) 
(3af0245cdc0f3fef5b1eb0167f56baba) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task [3]{Create, 
Map(<lambda at fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> 
ToKeyedWorkItem 3af0245cdc0f3fef5b1eb0167f56baba.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create, 
Map(<lambda at fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> 
ToKeyedWorkItem (1/2) (3af0245cdc0f3fef5b1eb0167f56baba) switched from RUNNING 
to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at 
fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) 
(c400b87d3cc39e24eeeabdab7be41ea8) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) 
(c400b87d3cc39e24eeeabdab7be41ea8).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:586>), 
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) 
(c400b87d3cc39e24eeeabdab7be41ea8) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task [3]{Create, 
Map(<lambda at fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> 
ToKeyedWorkItem c400b87d3cc39e24eeeabdab7be41ea8.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} (1/2) 
(92422ac2ea199b6bbb05c19f673958c1) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(1/2) (92422ac2ea199b6bbb05c19f673958c1).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task GroupByKey -> [5]{Map(<lambda at 
fn_api_runner_test.py:589>), assert_that} (1/2) 
(92422ac2ea199b6bbb05c19f673958c1) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
92422ac2ea199b6bbb05c19f673958c1.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create, 
Map(<lambda at fn_api_runner_test.py:586>), WindowInto(WindowIntoFn)} -> 
ToKeyedWorkItem (2/2) (c400b87d3cc39e24eeeabdab7be41ea8) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (1/2) (32ad37d80e57f69f4dfce9b22ef888c3) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (1/2) 
(32ad37d80e57f69f4dfce9b22ef888c3).
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (1/2) 
(32ad37d80e57f69f4dfce9b22ef888c3) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem 
32ad37d80e57f69f4dfce9b22ef888c3.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} (1/2) 
(92422ac2ea199b6bbb05c19f673958c1) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (1/2) 
(32ad37d80e57f69f4dfce9b22ef888c3) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} (2/2) 
(f22b3dd3677bf2d3d82c04ecc167b443) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(2/2) (f22b3dd3677bf2d3d82c04ecc167b443).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task GroupByKey -> [5]{Map(<lambda at 
fn_api_runner_test.py:589>), assert_that} (2/2) 
(f22b3dd3677bf2d3d82c04ecc167b443) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} 
f22b3dd3677bf2d3d82c04ecc167b443.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (2/2) (0a474efd90ddc2c5fa2f6bb18bbaf0a8) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (2/2) 
(0a474efd90ddc2c5fa2f6bb18bbaf0a8).
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (2/2) 
(0a474efd90ddc2c5fa2f6bb18bbaf0a8) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem 
0a474efd90ddc2c5fa2f6bb18bbaf0a8.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey 
-> [3]assert_that/{Group, Unkey, Match} (2/2) 
(dd6b25f600a43f700ca630d1fc21cfad) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) 
(dd6b25f600a43f700ca630d1fc21cfad).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task assert_that/Group/GroupByKey -> 
[3]assert_that/{Group, Unkey, Match} (2/2) (dd6b25f600a43f700ca630d1fc21cfad) 
[FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} 
dd6b25f600a43f700ca630d1fc21cfad.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
[5]{Map(<lambda at fn_api_runner_test.py:589>), assert_that} (2/2) 
(f22b3dd3677bf2d3d82c04ecc167b443) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (2/2) 
(0a474efd90ddc2c5fa2f6bb18bbaf0a8) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) 
(dd6b25f600a43f700ca630d1fc21cfad) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - 
Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:45465"

INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight 
requests to complete
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data 
channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state 
handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[grpc-default-executor-0] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown 
endpoint.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for 
unknown endpoint.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey 
-> [3]assert_that/{Group, Unkey, Match} (1/2) 
(b3b3f65bad4c2eb8b82225f2ff2a9d9c) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) 
(b3b3f65bad4c2eb8b82225f2ff2a9d9c).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task assert_that/Group/GroupByKey -> 
[3]assert_that/{Group, Unkey, Match} (1/2) (b3b3f65bad4c2eb8b82225f2ff2a9d9c) 
[FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} 
b3b3f65bad4c2eb8b82225f2ff2a9d9c.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) 
(b3b3f65bad4c2eb8b82225f2ff2a9d9c) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1574261270.729975 (1d23d8a63f579b6d31d816bc302cf95c) switched 
from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint 
coordinator for job 1d23d8a63f579b6d31d816bc302cf95c.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - 
Shutting down
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 
1d23d8a63f579b6d31d816bc302cf95c reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job 
test_windowing_1574261270.729975(1d23d8a63f579b6d31d816bc302cf95c).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot 
TaskSlot(index:1, state:ACTIVE, resource profile: 
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, 
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, 
networkMemoryInMB=2147483647, managedMemoryInMB=8127}, allocationId: 
3f4d8982e46d66cc96a199c9aaa5ed37, jobId: 1d23d8a63f579b6d31d816bc302cf95c).
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 
76449f6a6597eb0a1568d00cc6c5fba9: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot 
TaskSlot(index:0, state:ACTIVE, resource profile: 
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, 
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, 
networkMemoryInMB=2147483647, managedMemoryInMB=8127}, allocationId: 
81616427765cc8bd0fc70c3c540f5289, jobId: 1d23d8a63f579b6d31d816bc302cf95c).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect 
job manager a2af622831da2afb1ea8dfbb51f84232@akka://flink/user/jobmanager_61 
for job 1d23d8a63f579b6d31d816bc302cf95c from the resource manager.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 
1d23d8a63f579b6d31d816bc302cf95c from job leader monitoring.
[flink-runner-job-invoker] INFO 
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini 
Cluster
[flink-runner-job-invoker] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest 
endpoint.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager 
connection for job 1d23d8a63f579b6d31d816bc302cf95c.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager 
connection for job 1d23d8a63f579b6d31d816bc302cf95c.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to 
job 1d23d8a63f579b6d31d816bc302cf95c because it is not registered.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor 
akka://flink/user/taskmanager_60.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager 
connection 76449f6a6597eb0a1568d00cc6c5fba9.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader 
service.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing 
TaskExecutor connection accc763f-a752-4c0b-a224-311760ff4e1a because: The 
TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting 
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory /tmp/flink-io-6da85ab0-90a7-43f8-b3f9-c3588cd8113c
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the 
network environment and its components.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory 
/tmp/flink-netty-shuffle-0be59800-fd15-4876-b0cc-32ec7ef10c68
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the 
kvState service and its components.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader 
service.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.filecache.FileCache - removed file cache directory 
/tmp/flink-dist-cache-2a0fd808-16d3-41ae-9965-bdde5a218a42
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor 
akka://flink/user/taskmanager_60.
[ForkJoinPool.commonPool-worker-9] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache 
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down 
cluster because application is in CANCELED, diagnostics 
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all 
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing 
the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - 
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
 - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.beam.runners.flink.metrics.FileReporter - wrote metrics to 
/tmp/flinktest-conf596pw5pr/test-metrics.txt
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44487
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 249 
msecs
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : 
MetricQueryResults(Counters(26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:586>)_17}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:589>)_23}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2532>)_27}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_10}: 5, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_11}: 5, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_12}: 5, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 
0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2532>)_27}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:589>)_23}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 
0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 
0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_9}: 5, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 1, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:586>)_17}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_1}: 1, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_2}: 5, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 2, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2532>)_27}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2532>)_4}: 
0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:589>)_23}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:586>)_17}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_21}: 2, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:589>)_23}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2532>)_27}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at 
fn_api_runner_test.py:586>)_17}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 
0)Distributions(14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, 
count=1, min=13, max=13}, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=70, 
count=5, min=14, max=14}, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=100, 
count=5, min=20, max=20}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=60, 
count=2, min=29, max=31}, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=140, 
count=5, min=28, max=28}, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=80, 
count=5, min=16, max=16}, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19, 
count=1, min=19, max=19}, 
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=75, 
count=5, min=15, max=15}, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, 
count=1, min=15, max=15}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=75, 
count=2, min=36, max=39}, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, 
count=1, min=16, max=16}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=66, 
count=2, min=32, max=34}, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, 
count=1, min=13, max=13}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, 
count=1, min=14, max=14}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=47, 
count=1, min=47, max=47}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=39, 
count=1, min=39, max=39}, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, 
count=1, min=17, max=17}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=46, 
count=2, min=22, max=24}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=48, 
count=2, min=23, max=25}, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=52, 
count=2, min=25, max=27}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=59, 
count=1, min=59, max=59}))
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - 
Manifest at 
/tmp/flinktest3iewmhnb/job_33066f8f-baf4-42cf-9d41-b171370268de/MANIFEST has 0 
artifact locations
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 - Removed dir /tmp/flinktest3iewmhnb/job_33066f8f-baf4-42cf-9d41-b171370268de/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:__main__:removing conf dir: /tmp/flinktest-conf596pw5pr

----------------------------------------------------------------------
Ran 76 tests in 124.170s

OK (skipped=14)

FAILURE: Build failed with an exception.

* Where:
Script 
'<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>'
 line: 55

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py35:flinkCompatibilityMatrixBatchLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
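
[editor's note] For local triage, the failing task named above can be rerun with the diagnostic flags Gradle suggests. This is a minimal sketch, assuming a checkout of the Beam repository with the Gradle wrapper at its root; it is not taken from this log:

    # rerun the failed suite with a stack trace and verbose output
    ./gradlew :sdks:python:test-suites:portable:py35:flinkCompatibilityMatrixBatchLOOPBACK --stacktrace --info

Appending --scan (as suggested above) would additionally publish a build scan for the rerun.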

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 18m 2s
72 actionable tasks: 63 executed, 8 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/askcw2lecutai

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
