See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/3270/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-11211] parquetio_test using multiple pyarrow versions

[ajamato] [BEAM-11092] Add bigquery io request count metric, implementing

[Kenneth Knowles] Make UsesTestStream extend UsesUnboundedPCollections for 
exclusion in

[noreply] [BEAM-11303] Use sum as the post-agg for size (#13379)

[Boyuan Zhang] Update SDF programming guide.


------------------------------------------
[...truncated 1.04 MB...]
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was 
already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'flink_master' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'experiments' was already added
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
test_tagged_join (apache_beam.transforms.sql_test.SqlTransformTest) ... 
INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar'>
 '36591']
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at 
localhost:36591'
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:21 AM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external 
transforms: [beam:external:java:sql:v1, 
beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1:
 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f'
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36591.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:22 AM 
org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:23 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:25 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQL:'
INFO:apache_beam.utils.subprocess_server:b'SELECT `simple`.`id` AS `id`, 
`enrich`.`metadata` AS `metadata`'
INFO:apache_beam.utils.subprocess_server:b'FROM `beam`.`simple` AS `simple`'
INFO:apache_beam.utils.subprocess_server:b'INNER JOIN `beam`.`enrich` AS 
`enrich` ON `simple`.`id` = `enrich`.`id`'
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:25 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQLPlan>'
INFO:apache_beam.utils.subprocess_server:b'LogicalProject(id=[$0], 
metadata=[$4])'
INFO:apache_beam.utils.subprocess_server:b'  LogicalJoin(condition=[=($0, $3)], 
joinType=[inner])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, 
simple]])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, 
enrich]])'
INFO:apache_beam.utils.subprocess_server:b''
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:26 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: BEAMPlan>'
INFO:apache_beam.utils.subprocess_server:b'BeamCalcRel(expr#0..4=[{inputs}], 
id=[$t2], metadata=[$t1])'
INFO:apache_beam.utils.subprocess_server:b'  BeamCoGBKJoinRel(condition=[=($2, 
$0)], joinType=[inner])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, 
enrich]])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, 
simple]])'
INFO:apache_beam.utils.subprocess_server:b''
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f918c0c4158> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:42 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Create enrich/Impulse_3\n  Create 
enrich/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/FlatMap(<lambda at core.py:3036>)_4\n  Create enrich/FlatMap(<lambda at 
core.py:3036>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  
Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create enrich/Map(decode)_13\n  Create 
enrich/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/Impulse_15\n  Create simple/Impulse:beam:transform:impulse:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/FlatMap(<lambda at core.py:3036>)_16\n  Create simple/FlatMap(<lambda at 
core.py:3036>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n  Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n  
Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n
  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n  Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create simple/Map(decode)_25\n  Create 
simple/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_29\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:3036>)_30\n  assert_that/Create/FlatMap(<lambda at 
core.py:3036>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_34\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_38\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_41\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_42\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was 
already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'flink_master' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'experiments' was already added
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
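
The tagged-join test above drives the cross-language SqlTransform wrapper from the Python SDK. For reference, a minimal sketch of the kind of pipeline test_tagged_join runs (the actual test lives in apache_beam/transforms/sql_test.py; the row types and element values here are illustrative, and running this locally needs Java available for the SQL expansion service):

import typing

import apache_beam as beam
from apache_beam import coders
from apache_beam.transforms.sql import SqlTransform

# Illustrative schema-aware row types; the real fixtures in sql_test.py may differ.
SimpleRow = typing.NamedTuple('SimpleRow', [('id', int), ('str', str)])
Enrich = typing.NamedTuple('Enrich', [('id', int), ('metadata', str)])
coders.registry.register_coder(SimpleRow, coders.RowCoder)
coders.registry.register_coder(Enrich, coders.RowCoder)

with beam.Pipeline() as p:
  simple = p | 'Create simple' >> beam.Create(
      [SimpleRow(1, 'a'), SimpleRow(2, 'b')]).with_output_types(SimpleRow)
  enrich = p | 'Create enrich' >> beam.Create(
      [Enrich(1, 'first'), Enrich(2, 'second')]).with_output_types(Enrich)
  # The dict keys become the `simple` and `enrich` tables referenced by the
  # query, which the expansion service planned as the BeamCoGBKJoinRel shown
  # in the log above.
  joined = {'simple': simple, 'enrich': enrich} | SqlTransform(
      'SELECT simple.id AS id, enrich.metadata AS metadata '
      'FROM simple INNER JOIN enrich ON simple.id = enrich.id')
  joined | 'Print' >> beam.Map(print)
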
test_windowing_before_sql (apache_beam.transforms.sql_test.SqlTransformTest) 
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar'>
 '50835']
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at 
localhost:50835'
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:58:58 AM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external 
transforms: [beam:external:java:sql:v1, 
beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1:
 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f'
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50835.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:01 AM 
org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:03 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:07 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQL:'
INFO:apache_beam.utils.subprocess_server:b'SELECT COUNT(*) AS `count`'
INFO:apache_beam.utils.subprocess_server:b'FROM `beam`.`PCOLLECTION` AS 
`PCOLLECTION`'
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:08 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQLPlan>'
INFO:apache_beam.utils.subprocess_server:b'LogicalAggregate(group=[{}], 
count=[COUNT()])'
INFO:apache_beam.utils.subprocess_server:b'  LogicalProject($f0=[0])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])'
INFO:apache_beam.utils.subprocess_server:b''
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:08 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: BEAMPlan>'
INFO:apache_beam.utils.subprocess_server:b'BeamAggregationRel(group=[{}], 
count=[COUNT()])'
INFO:apache_beam.utils.subprocess_server:b'  BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])'
INFO:apache_beam.utils.subprocess_server:b''
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f918c0c4158> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:29 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Create/Impulse_3\n  
Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:3036>)_4\n  
Create/FlatMap(<lambda at core.py:3036>):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/Map(decode)_13\n  
Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at 
sql_test.py:174>)_14\n  Map(<lambda at 
sql_test.py:174>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_WindowInto(WindowIntoFn)_15\n  
WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_19\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:3036>)_20\n  assert_that/Create/FlatMap(<lambda at 
core.py:3036>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_22\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_24\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_26\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_27\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_28\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_29\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_31\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_32\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was 
already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'flink_master' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'experiments' was already added
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
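
test_windowing_before_sql checks that windowing applied on the Python side is respected by the external SQL aggregation. A minimal sketch of that shape, again with illustrative row type, timestamps, and window size rather than the actual test fixtures:

import typing

import apache_beam as beam
from apache_beam import coders
from apache_beam.transforms.sql import SqlTransform
from apache_beam.transforms.window import FixedWindows, TimestampedValue

SimpleRow = typing.NamedTuple('SimpleRow', [('id', int)])
coders.registry.register_coder(SimpleRow, coders.RowCoder)

with beam.Pipeline() as p:
  _ = (
      p
      | beam.Create([SimpleRow(i) for i in range(5)]).with_output_types(SimpleRow)
      # Assign event timestamps and a fixed window *before* the SQL transform,
      # so the COUNT(*) aggregation runs per window.
      | beam.Map(lambda row: TimestampedValue(row, row.id)).with_output_types(SimpleRow)
      | beam.WindowInto(FixedWindows(2))
      | SqlTransform('SELECT COUNT(*) AS `count` FROM PCOLLECTION')
      | beam.Map(print))
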
test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest) 
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.27.0-SNAPSHOT.jar'>
 '50393']
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at 
localhost:50393'
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:29 AM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external 
transforms: [beam:external:java:sql:v1, 
beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1:
 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f'
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
DEBUG:root:Waiting for grpc channel to be ready at localhost:50393.
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:30 AM 
org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Nov 19, 2020 12:59:31 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.6_sdk:2.27.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f918c0c4158> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:3036>)_6\n 
 assert_that/Create/FlatMap(<lambda at core.py:3036>):beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_17\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_18\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was 
already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' 
was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'flink_master' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 
'experiments' was already added
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 292.473s

OK

> Task :runners:flink:1.10:job-server:flinkJobServerCleanup
Stopping job server pid: 26403.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':runners:flink:1.10:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/runners/flink/1.10/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
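
For local debugging, the failing task can usually be rerun from a Beam checkout with something like ./gradlew :runners:flink:1.10:job-server:validatesCrossLanguageRunnerJavaUsingPython --stacktrace; the HTML report linked above has the per-test failure details.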

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 53s
189 actionable tasks: 168 executed, 17 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.
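On Linux build hosts this warning can typically be addressed by raising the limit (for example, sudo sysctl fs.inotify.max_user_watches=524288) or by passing --no-watch-fs to Gradle 6.7; it is a performance warning and separate from the test failure above.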

Publishing build scan...
https://gradle.com/s/do3aqwyklavd4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

