See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/269/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Use index for partitioning of elementwise operations with 
multiple

[Robert Bradshaw] [BEAM-10873] Introduce partitioning session for stronger 
testing.

[Robert Bradshaw] [BEAM-10873] Use partitioning session for tests.

[noreply] Merge pull request #12704 from [BEAM-10603] Implement the new Large

[noreply] write to file ability for java Nexmark suite (#12813)

[noreply] add readme file to python nexmark suites (#12808)

[noreply] Merge pull request #12807 from [BEAM-2855] implement query 10

[noreply] Merge pull request #12770 from [BEAM-10545] Assembled the extension 
with

[noreply] Document GroupBy transform. (#12834)

[noreply] [BEAM-10886] Fix Java Wordcount Direct Runner (windows-latest) 
(#12846)

[noreply] * [BEAM-10705] Extract and use the filename when downloading a remote


------------------------------------------
[...truncated 957.26 KB...]
INFO:apache_beam.utils.subprocess_server:SELECT `simple`.`id` AS `id`, 
`enrich`.`metadata` AS `metadata`
INFO:apache_beam.utils.subprocess_server:FROM `beam`.`simple` AS `simple`
INFO:apache_beam.utils.subprocess_server:INNER JOIN `beam`.`enrich` AS `enrich` 
ON `simple`.`id` = `enrich`.`id`
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:28:55 PM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQLPlan>
INFO:apache_beam.utils.subprocess_server:LogicalProject(id=[$0], metadata=[$4])
INFO:apache_beam.utils.subprocess_server:  LogicalJoin(condition=[=($0, $3)], 
joinType=[inner])
INFO:apache_beam.utils.subprocess_server:    BeamIOSourceRel(table=[[beam, 
simple]])
INFO:apache_beam.utils.subprocess_server:    BeamIOSourceRel(table=[[beam, 
enrich]])
INFO:apache_beam.utils.subprocess_server:
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:28:55 PM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: BEAMPlan>
INFO:apache_beam.utils.subprocess_server:BeamCalcRel(expr#0..4=[{inputs}], 
id=[$t2], metadata=[$t1])
INFO:apache_beam.utils.subprocess_server:  BeamCoGBKJoinRel(condition=[=($2, 
$0)], joinType=[inner])
INFO:apache_beam.utils.subprocess_server:    BeamIOSourceRel(table=[[beam, 
enrich]])
INFO:apache_beam.utils.subprocess_server:    BeamIOSourceRel(table=[[beam, 
simple]])
INFO:apache_beam.utils.subprocess_server:
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f094ff70938> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:42 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Create enrich/Impulse_3\n  Create 
enrich/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/FlatMap(<lambda at core.py:2876>)_4\n  Create enrich/FlatMap(<lambda at 
core.py:2876>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  
Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create enrich/Map(decode)_13\n  Create 
enrich/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/Impulse_15\n  Create simple/Impulse:beam:transform:impulse:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/FlatMap(<lambda at core.py:2876>)_16\n  Create simple/FlatMap(<lambda at 
core.py:2876>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n  Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n  
Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n
  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n  Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create simple/Map(decode)_25\n  Create 
simple/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_29\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2876>)_30\n  assert_that/Create/FlatMap(<lambda at 
core.py:2876>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_34\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_38\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_41\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_42\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 
0x7fdcb606ec80> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 
0x7fdcb606ed70> ====================
INFO:root:==================== <function pack_combiners at 0x7fdcb606ede8> 
====================
INFO:root:==================== <function lift_combiners at 0x7fdcb606ee60> 
====================
INFO:root:==================== <function expand_sdf at 0x7fdcb606eed8> 
====================
INFO:root:==================== <function expand_gbk at 0x7fdcb606ef50> 
====================
INFO:root:==================== <function sink_flattens at 0x7fdcb60700c8> 
====================
INFO:root:==================== <function greedily_fuse at 0x7fdcb6070140> 
====================
INFO:root:==================== <function read_to_impulse at 0x7fdcb60701b8> 
====================
INFO:root:==================== <function impulse_to_input at 0x7fdcb6070230> 
====================
INFO:root:==================== <function sort_stages at 0x7fdcb6070410> 
====================
INFO:root:==================== <function setup_timer_mapping at 0x7fdcb6070398> 
====================
INFO:root:==================== <function populate_data_channel_coders at 
0x7fdcb6070488> ====================
INFO:root:starting control server on port 32917
INFO:root:starting data server on port 35151
INFO:root:starting state server on port 37783
INFO:root:starting logging server on port 38013
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdc9f48aa10> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
264db1a29e06871daa89f0afc959e227f75ff330629b35c12cfb00bbdbdc1dc8, worker_id = 
worker_74
INFO:root:Running (ref_AppliedPTransform_Create 
simple/Impulse_15)+((ref_AppliedPTransform_Create simple/FlatMap(<lambda at 
core.py:2876>)_16)+((ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21)+(Create
 simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running (Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24)+((ref_AppliedPTransform_Create
 simple/Map(decode)_25)+(ref_PCollection_PCollection_1/Write))))
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdc9f441c90> for environment external_7beam:env:docker:v1 
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
6695dd7dae727ea77076ad2318151b9fd5d58a5ce1a90411a173c19d7a83c4a5, worker_id = 
worker_75
INFO:root:Running 
(((((ref_PCollection_PCollection_1/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)))+((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/1)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/1))
INFO:root:Running (ref_AppliedPTransform_Create 
enrich/Impulse_3)+((ref_AppliedPTransform_Create enrich/FlatMap(<lambda at 
core.py:2876>)_4)+((ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(Create
 enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running ((Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create
 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+((ref_AppliedPTransform_Create
 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12)+((ref_AppliedPTransform_Create
 enrich/Map(decode)_13)+(ref_PCollection_PCollection_2/Write)))
INFO:root:Running 
(((ref_AppliedPTransform_assert_that/Create/Impulse_29)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
 at 
core.py:2876>)_30))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_32))+((ref_AppliedPTransform_assert_that/Group/pair_with_0_36)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))
INFO:root:Running 
((((ref_PCollection_PCollection_2/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections))+((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable))))+((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/0)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/0))
INFO:root:Running 
(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Write)
INFO:root:Running 
((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)))+((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult))+(((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_17/Write)))
INFO:root:Running 
((ref_PCollection_PCollection_17/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33))+((ref_AppliedPTransform_assert_that/ToVoidKey_34)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_37)+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))))
INFO:root:Running 
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running 
(((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40))+(ref_AppliedPTransform_assert_that/Unkey_41))+(ref_AppliedPTransform_assert_that/Match_42)
INFO:root:Successfully completed job in 15.1636970043 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
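
For reference, the join planned above (SELECT simple.id, enrich.metadata ... INNER JOIN) is the shape you get by applying SqlTransform to a pair of tagged PCollections from the Python SDK. The sketch below is illustrative only, not the actual sql_test.py code: the SimpleRow/EnrichRow schemas, element values, and labels are placeholder assumptions, and running it needs the Java SQL expansion-service jar that this job builds (or one supplied via expansion_service=).

import typing

import apache_beam as beam
from apache_beam import coders
from apache_beam.transforms.sql import SqlTransform

# Hypothetical schemas for illustration; the real test may use different fields.
SimpleRow = typing.NamedTuple('SimpleRow', [('id', int), ('str', str)])
EnrichRow = typing.NamedTuple('EnrichRow', [('id', int), ('metadata', str)])
coders.registry.register_coder(SimpleRow, coders.RowCoder)
coders.registry.register_coder(EnrichRow, coders.RowCoder)

with beam.Pipeline() as p:
  simple = (
      p
      | 'Create simple' >> beam.Create(
          [SimpleRow(1, 'a'), SimpleRow(2, 'b')]).with_output_types(SimpleRow))
  enrich = (
      p
      | 'Create enrich' >> beam.Create(
          [EnrichRow(1, 'x'), EnrichRow(2, 'y')]).with_output_types(EnrichRow))

  # Tagged inputs become table names that the SQL can reference directly,
  # which is what produces the two BeamIOSourceRel inputs in the plan above.
  joined = {'simple': simple, 'enrich': enrich} | SqlTransform(
      "SELECT simple.id AS id, enrich.metadata AS metadata "
      "FROM simple INNER JOIN enrich ON simple.id = enrich.id")
  # joined should carry rows with fields (id, metadata), per the BEAMPlan above.
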
test_windowing_before_sql (apache_beam.transforms.sql_test.SqlTransformTest) 
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar'>
 '56807']
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
INFO:apache_beam.utils.subprocess_server:Starting expansion service at 
localhost:56807
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:16 PM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO:apache_beam.utils.subprocess_server:INFO: Registering external transforms: 
[beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
INFO:apache_beam.utils.subprocess_server:       beam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
INFO:apache_beam.utils.subprocess_server:       
beam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56807.
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:17 PM 
org.apache.beam.sdk.expansion.service.ExpansionService expand
INFO:apache_beam.utils.subprocess_server:INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:18 PM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig
INFO:apache_beam.utils.subprocess_server:WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach.
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:20 PM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQL:
INFO:apache_beam.utils.subprocess_server:SELECT COUNT(*) AS `count`
INFO:apache_beam.utils.subprocess_server:FROM `beam`.`PCOLLECTION` AS 
`PCOLLECTION`
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:20 PM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQLPlan>
INFO:apache_beam.utils.subprocess_server:LogicalAggregate(group=[{}], 
count=[COUNT()])
INFO:apache_beam.utils.subprocess_server:  LogicalProject($f0=[0])
INFO:apache_beam.utils.subprocess_server:    BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])
INFO:apache_beam.utils.subprocess_server:
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:20 PM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: BEAMPlan>
INFO:apache_beam.utils.subprocess_server:BeamAggregationRel(group=[{}], 
count=[COUNT()])
INFO:apache_beam.utils.subprocess_server:  BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])
INFO:apache_beam.utils.subprocess_server:
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f094ff70938> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:29 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Create/Impulse_3\n  
Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2876>)_4\n  
Create/FlatMap(<lambda at core.py:2876>):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/Map(decode)_13\n  
Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at 
sql_test.py:172>)_14\n  Map(<lambda at 
sql_test.py:172>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_WindowInto(WindowIntoFn)_15\n  
WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_19\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2876>)_20\n  assert_that/Create/FlatMap(<lambda at 
core.py:2876>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_22\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_24\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_26\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_27\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_28\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_29\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_31\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_32\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 
0x7fdcb606ec80> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 
0x7fdcb606ed70> ====================
INFO:root:==================== <function pack_combiners at 0x7fdcb606ede8> 
====================
INFO:root:==================== <function lift_combiners at 0x7fdcb606ee60> 
====================
INFO:root:==================== <function expand_sdf at 0x7fdcb606eed8> 
====================
INFO:root:==================== <function expand_gbk at 0x7fdcb606ef50> 
====================
INFO:root:==================== <function sink_flattens at 0x7fdcb60700c8> 
====================
INFO:root:==================== <function greedily_fuse at 0x7fdcb6070140> 
====================
INFO:root:==================== <function read_to_impulse at 0x7fdcb60701b8> 
====================
INFO:root:==================== <function impulse_to_input at 0x7fdcb6070230> 
====================
INFO:root:==================== <function sort_stages at 0x7fdcb6070410> 
====================
INFO:root:==================== <function setup_timer_mapping at 0x7fdcb6070398> 
====================
INFO:root:==================== <function populate_data_channel_coders at 
0x7fdcb6070488> ====================
INFO:root:starting control server on port 39189
INFO:root:starting data server on port 37547
INFO:root:starting state server on port 35857
INFO:root:starting logging server on port 42321
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdcb5bcdc50> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
96c0d32d4fe8280652a14644ae8cf4e0a3a56109eb8b60e9ab320d36ce037014, worker_id = 
worker_76
INFO:root:Running 
(ref_AppliedPTransform_assert_that/Create/Impulse_19)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
 at 
core.py:2876>)_20)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_22)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_26)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))))
INFO:root:Running 
(ref_AppliedPTransform_Create/Impulse_3)+((ref_AppliedPTransform_Create/FlatMap(<lambda
 at 
core.py:2876>)_4)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running 
(((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12)))+((ref_AppliedPTransform_Create/Map(decode)_13)+(ref_AppliedPTransform_Map(<lambda
 at 
sql_test.py:172>)_14)))+((ref_AppliedPTransform_WindowInto(WindowIntoFn)_15)+(ref_PCollection_PCollection_1/Write))
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdc9f4cca50> for environment external_8beam:env:docker:v1 
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
a9dad9684c8846672638de94321d6055ed7e6faeaae0ef914cbb6cd557a257aa, worker_id = 
worker_77
INFO:root:Running 
(((ref_PCollection_PCollection_1/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous))))+(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Write)
INFO:root:Running 
(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Read)+(((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)))+((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous))+(ref_PCollection_PCollection_11/Write)))
INFO:root:Running 
((ref_PCollection_PCollection_11/Read)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23)+((ref_AppliedPTransform_assert_that/ToVoidKey_24)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_27)+(assert_that/Group/Flatten/Transcode/1)))))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running 
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running 
((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30))+((ref_AppliedPTransform_assert_that/Unkey_31)+(ref_AppliedPTransform_assert_that/Match_32))
INFO:root:Successfully completed job in 12.1070480347 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
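
The test above windows its input on the Python side before handing it to the Java SQL expansion, so the COUNT(*) in the planned query runs per window. A rough sketch of that shape follows, under assumed data: the Row schema, timestamps, and window size are placeholders (not the test's actual values), and the lambda assigning timestamps stands in for the Map(<lambda at sql_test.py:172>) seen in the stage names.

import typing

import apache_beam as beam
from apache_beam import coders
from apache_beam.transforms import window
from apache_beam.transforms.sql import SqlTransform

# Hypothetical schema for illustration.
Row = typing.NamedTuple('Row', [('id', int)])
coders.registry.register_coder(Row, coders.RowCoder)

with beam.Pipeline() as p:
  counted = (
      p
      | beam.Create([Row(1), Row(2), Row(5)]).with_output_types(Row)
      # Assign event timestamps on the Python side, then window before SQL.
      | beam.Map(
          lambda row: window.TimestampedValue(row, row.id)
      ).with_output_types(Row)
      | beam.WindowInto(window.FixedWindows(4))
      # The Java expansion applies the aggregation per window.
      | SqlTransform("SELECT COUNT(*) AS `count` FROM PCOLLECTION"))
  # counted should hold one row per window with a single `count` field.
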
test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest) 
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar'>
 '54773']
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
INFO:apache_beam.utils.subprocess_server:Starting expansion service at 
localhost:54773
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:37 PM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO:apache_beam.utils.subprocess_server:INFO: Registering external transforms: 
[beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
INFO:apache_beam.utils.subprocess_server:       beam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
INFO:apache_beam.utils.subprocess_server:       
beam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
DEBUG:root:Waiting for grpc channel to be ready at localhost:54773.
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:38 PM 
org.apache.beam.sdk.expansion.service.ExpansionService expand
INFO:apache_beam.utils.subprocess_server:INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
INFO:apache_beam.utils.subprocess_server:Sep 15, 2020 6:29:39 PM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig
INFO:apache_beam.utils.subprocess_server:WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach.
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f094ff70938> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2876>)_6\n 
 assert_that/Create/FlatMap(<lambda at core.py:2876>):beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_17\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_18\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 
0x7fdcb606ec80> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 
0x7fdcb606ed70> ====================
INFO:root:==================== <function pack_combiners at 0x7fdcb606ede8> 
====================
INFO:root:==================== <function lift_combiners at 0x7fdcb606ee60> 
====================
INFO:root:==================== <function expand_sdf at 0x7fdcb606eed8> 
====================
INFO:root:==================== <function expand_gbk at 0x7fdcb606ef50> 
====================
INFO:root:==================== <function sink_flattens at 0x7fdcb60700c8> 
====================
INFO:root:==================== <function greedily_fuse at 0x7fdcb6070140> 
====================
INFO:root:==================== <function read_to_impulse at 0x7fdcb60701b8> 
====================
INFO:root:==================== <function impulse_to_input at 0x7fdcb6070230> 
====================
INFO:root:==================== <function sort_stages at 0x7fdcb6070410> 
====================
INFO:root:==================== <function setup_timer_mapping at 0x7fdcb6070398> 
====================
INFO:root:==================== <function populate_data_channel_coders at 
0x7fdcb6070488> ====================
INFO:root:starting control server on port 40825
INFO:root:starting data server on port 44815
INFO:root:starting state server on port 38981
INFO:root:starting logging server on port 46615
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdc9f4cc750> for environment external_9beam:env:docker:v1 
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
3745ffc9c358f07577ae5aec0e93f5049b965d2a4da4479e156e48ca37dc8769, worker_id = 
worker_78
INFO:root:Running 
(((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse)+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous)))+((SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitAndSizeRestriction)))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous).output_split/Write)
INFO:root:Running 
((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous).output_split/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/Process))+((external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc))+(ref_PCollection_PCollection_1/Write))
INFO:root:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fdc9f441b10> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up.Current status is running
INFO:root:Docker container is running. container_id = 
09c80ccc833b7a658958ac94740aed4fcce106c9eddb1dd78c8882e67af90713, worker_id = 
worker_79
INFO:root:Running 
((ref_PCollection_PCollection_1/Read)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9)+((ref_AppliedPTransform_assert_that/ToVoidKey_10)+(ref_AppliedPTransform_assert_that/Group/pair_with_1_13))))+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))
INFO:root:Running 
((ref_AppliedPTransform_assert_that/Create/Impulse_5)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
 at 
core.py:2876>)_6)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_8)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_12)+(assert_that/Group/Flatten/Transcode/0)))))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running 
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running 
(assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16)+((ref_AppliedPTransform_assert_that/Unkey_17)+(ref_AppliedPTransform_assert_that/Match_18)))
INFO:root:Successfully completed job in 13.6125879288 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
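
test_zetasql_generate_data exercises the ZetaSQL dialect of SqlTransform with no Python-side input: the BeamValuesRel / Create.Values / Read(CreateSource) stages above all come from the Java expansion. A sketch of that shape is below; the SELECT statement is an illustrative guess (the log does not show the test's actual query), and only the dialect argument is the point being shown.

import apache_beam as beam
from apache_beam.transforms.sql import SqlTransform

with beam.Pipeline() as p:
  # Applied directly to the pipeline (PBegin): the Java side generates the rows.
  generated = p | SqlTransform(
      """SELECT
           CAST(1 AS INT64) AS `int`,
           CAST('foo' AS STRING) AS `str`,
           CAST(3.14 AS FLOAT64) AS `flt`""",
      dialect="zetasql")
  # generated should hold a single row with fields (int, str, flt).
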

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 196.041s

OK

> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/apache_beam/__init__.py>:82:
 UserWarning: You are using the final Apache Beam release with Python 2 
support. New releases of Apache Beam will require Python 3.6 or a newer version.
  'You are using the final Apache Beam release with Python 2 support. '
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/azure/storage/blob/_shared/encryption.py>:15:
 CryptographyDeprecationWarning: Python 2 is no longer supported by the Python 
core team. Support for it is now deprecated in cryptography, and will be 
removed in a future release.
  from cryptography.hazmat.backends import default_backend
Killing process at 28156

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/test-suites/direct/xlang/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 52s
141 actionable tasks: 110 executed, 27 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/naahmhm2ioeye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

