See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/264/display/redirect>
Changes:
------------------------------------------
[...truncated 961.69 KB...]
INFO:apache_beam.utils.subprocess_server:SELECT `simple`.`id` AS `id`,
`enrich`.`metadata` AS `metadata`
INFO:apache_beam.utils.subprocess_server:FROM `beam`.`simple` AS `simple`
INFO:apache_beam.utils.subprocess_server:INNER JOIN `beam`.`enrich` AS `enrich`
ON `simple`.`id` = `enrich`.`id`
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:30:39 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQLPlan>
INFO:apache_beam.utils.subprocess_server:LogicalProject(id=[$0], metadata=[$4])
INFO:apache_beam.utils.subprocess_server: LogicalJoin(condition=[=($0, $3)],
joinType=[inner])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
simple]])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
enrich]])
INFO:apache_beam.utils.subprocess_server:
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:30:39 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: BEAMPlan>
INFO:apache_beam.utils.subprocess_server:BeamCalcRel(expr#0..4=[{inputs}],
id=[$t2], metadata=[$t1])
INFO:apache_beam.utils.subprocess_server: BeamCoGBKJoinRel(condition=[=($2,
$0)], joinType=[inner])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
enrich]])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
simple]])
INFO:apache_beam.utils.subprocess_server:
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f0c04defd70> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:42 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages:
['ref_AppliedPTransform_Create enrich/Impulse_3\n Create
enrich/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
enrich/FlatMap(<lambda at core.py:2876>)_4\n Create enrich/FlatMap(<lambda at
core.py:2876>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n Create
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create enrich/Map(decode)_13\n Create
enrich/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
simple/Impulse_15\n Create simple/Impulse:beam:transform:impulse:v1\n must
follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
simple/FlatMap(<lambda at core.py:2876>)_16\n Create simple/FlatMap(<lambda at
core.py:2876>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n Create
simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n
Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n
Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n Create
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create simple/Map(decode)_25\n Create
simple/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n
SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Impulse_29\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2876>)_30\n assert_that/Create/FlatMap(<lambda at
core.py:2876>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/ToVoidKey_34\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Flatten_38\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Unkey_41\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Match_42\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7f6c0e8b0b18> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7f6c0e8b0c08> ====================
INFO:root:==================== <function pack_combiners at 0x7f6c0e8b0c80>
====================
INFO:root:==================== <function lift_combiners at 0x7f6c0e8b0cf8>
====================
INFO:root:==================== <function expand_sdf at 0x7f6c0e8b0d70>
====================
INFO:root:==================== <function expand_gbk at 0x7f6c0e8b0de8>
====================
INFO:root:==================== <function sink_flattens at 0x7f6c0e8b0ed8>
====================
INFO:root:==================== <function greedily_fuse at 0x7f6c0e8b0f50>
====================
INFO:root:==================== <function read_to_impulse at 0x7f6c0e8b2050>
====================
INFO:root:==================== <function impulse_to_input at 0x7f6c0e8b20c8>
====================
INFO:root:==================== <function sort_stages at 0x7f6c0e8b22a8>
====================
INFO:root:==================== <function setup_timer_mapping at 0x7f6c0e8b2230>
====================
INFO:root:==================== <function populate_data_channel_coders at
0x7f6c0e8b2320> ====================
INFO:root:starting control server on port 43101
INFO:root:starting data server on port 45465
INFO:root:starting state server on port 46763
INFO:root:starting logging server on port 40101
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6be6a02250> for environment
ref_Environment_default_environment_1 (beam:env:docker:v1,
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
500b8d8e5cf1fca445a1e81c4e7b56ea2073f7079babfc6587dca8b69b24f937, worker_id =
worker_74
INFO:root:Running (ref_AppliedPTransform_Create
enrich/Impulse_3)+((ref_AppliedPTransform_Create enrich/FlatMap(<lambda at
core.py:2876>)_4)+((ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running ((Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+((ref_AppliedPTransform_Create
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12)+((ref_AppliedPTransform_Create
enrich/Map(decode)_13)+(ref_PCollection_PCollection_2/Write)))
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6be68da4d0> for environment external_7beam:env:docker:v1
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
59579d6cf4db34af6a764ef02a8ce3b907bb598fe3d577a7c2b464bd126dd180, worker_id =
worker_75
INFO:root:Running
((((ref_PCollection_PCollection_2/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections))+((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable))))+((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/0)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/0))
INFO:root:Running (ref_AppliedPTransform_Create
simple/Impulse_15)+((ref_AppliedPTransform_Create simple/FlatMap(<lambda at
core.py:2876>)_16)+((ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19)+((ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21)+(Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running (Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23)+((ref_AppliedPTransform_Create
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24)+((ref_AppliedPTransform_Create
simple/Map(decode)_25)+(ref_PCollection_PCollection_1/Write))))
INFO:root:Running
(((((ref_PCollection_PCollection_1/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)))+((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/1)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/1))
INFO:root:Running
(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Write)
INFO:root:Running
(((ref_AppliedPTransform_assert_that/Create/Impulse_29)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
at
core.py:2876>)_30))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_32))+((ref_AppliedPTransform_assert_that/Group/pair_with_0_36)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))
INFO:root:Running
((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)))+((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult))+(((external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_17/Write)))
INFO:root:Running
((ref_PCollection_PCollection_17/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33))+((ref_AppliedPTransform_assert_that/ToVoidKey_34)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_37)+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))))
INFO:root:Running
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running
(((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40))+(ref_AppliedPTransform_assert_that/Unkey_41))+(ref_AppliedPTransform_assert_that/Match_42)
INFO:root:Successfully completed job in 11.5453460217 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
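
[Editor's sketch] The join test that just completed expands a SQL INNER JOIN across the
Python/Java boundary; the `beam`.`simple` and `beam`.`enrich` tables in the planner
output above are the tagged Python PCollections handed to SqlTransform. Below is a
minimal sketch of that shape, assuming the apache_beam.transforms.sql.SqlTransform
API; the SimpleRow/EnrichRow NamedTuples and their data are illustrative stand-ins,
not the test's exact classes.

  from __future__ import print_function

  import typing

  import apache_beam as beam
  from apache_beam import coders
  from apache_beam.transforms.sql import SqlTransform

  # Illustrative row types (assumptions, not the test's exact schemas).
  SimpleRow = typing.NamedTuple('SimpleRow', [('id', int), ('str', str)])
  EnrichRow = typing.NamedTuple('EnrichRow', [('id', int), ('metadata', str)])
  coders.registry.register_coder(SimpleRow, coders.RowCoder)
  coders.registry.register_coder(EnrichRow, coders.RowCoder)

  with beam.Pipeline() as p:
    simple = (
        p | 'Create simple' >> beam.Create(
            [SimpleRow(1, 'a'), SimpleRow(2, 'b')]).with_output_types(SimpleRow))
    enrich = (
        p | 'Create enrich' >> beam.Create(
            [EnrichRow(1, 'x'), EnrichRow(2, 'y')]).with_output_types(EnrichRow))
    # The dict keys become the SQL table names, which is why the Calcite plan
    # above reads FROM `beam`.`simple` ... INNER JOIN `beam`.`enrich`.
    joined = {'simple': simple, 'enrich': enrich} | SqlTransform(
        'SELECT simple.id AS id, enrich.metadata AS metadata '
        'FROM simple INNER JOIN enrich ON simple.id = enrich.id')
    joined | beam.Map(print)

Running such a pipeline requires the Java SQL expansion-service jar, which is what the
subprocess_server lines above are starting before the SQL is planned.
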
test_windowing_before_sql (apache_beam.transforms.sql_test.SqlTransformTest)
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar'>
'56609']
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
INFO:apache_beam.utils.subprocess_server:Starting expansion service at
localhost:56609
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:30:57 PM
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO:apache_beam.utils.subprocess_server:INFO: Registering external transforms:
[beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
INFO:apache_beam.utils.subprocess_server: beam:external:java:sql:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
INFO:apache_beam.utils.subprocess_server:
beam:external:java:generate_sequence:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56609.
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:30:58 PM
org.apache.beam.sdk.expansion.service.ExpansionService expand
INFO:apache_beam.utils.subprocess_server:INFO: Expanding
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:30:59 PM
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
payloadToConfig
INFO:apache_beam.utils.subprocess_server:WARNING: Configuration class
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
has no schema registered. Attempting to construct with setter approach.
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:01 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQL:
INFO:apache_beam.utils.subprocess_server:SELECT COUNT(*) AS `count`
INFO:apache_beam.utils.subprocess_server:FROM `beam`.`PCOLLECTION` AS
`PCOLLECTION`
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:02 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: SQLPlan>
INFO:apache_beam.utils.subprocess_server:LogicalAggregate(group=[{}],
count=[COUNT()])
INFO:apache_beam.utils.subprocess_server: LogicalProject($f0=[0])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
PCOLLECTION]])
INFO:apache_beam.utils.subprocess_server:
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:02 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO:apache_beam.utils.subprocess_server:INFO: BEAMPlan>
INFO:apache_beam.utils.subprocess_server:BeamAggregationRel(group=[{}],
count=[COUNT()])
INFO:apache_beam.utils.subprocess_server: BeamIOSourceRel(table=[[beam,
PCOLLECTION]])
INFO:apache_beam.utils.subprocess_server:
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f0c04defd70> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:29 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages:
['ref_AppliedPTransform_Create/Impulse_3\n
Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2876>)_4\n
Create/FlatMap(<lambda at core.py:2876>):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/Map(decode)_13\n
Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at
sql_test.py:172>)_14\n Map(<lambda at
sql_test.py:172>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_WindowInto(WindowIntoFn)_15\n
WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Impulse_19\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2876>)_20\n assert_that/Create/FlatMap(<lambda at
core.py:2876>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Map(decode)_22\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/ToVoidKey_24\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_0_26\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_1_27\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Flatten_28\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/GroupByKey_29\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Unkey_31\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Match_32\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7f6c0e8b0b18> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7f6c0e8b0c08> ====================
INFO:root:==================== <function pack_combiners at 0x7f6c0e8b0c80>
====================
INFO:root:==================== <function lift_combiners at 0x7f6c0e8b0cf8>
====================
INFO:root:==================== <function expand_sdf at 0x7f6c0e8b0d70>
====================
INFO:root:==================== <function expand_gbk at 0x7f6c0e8b0de8>
====================
INFO:root:==================== <function sink_flattens at 0x7f6c0e8b0ed8>
====================
INFO:root:==================== <function greedily_fuse at 0x7f6c0e8b0f50>
====================
INFO:root:==================== <function read_to_impulse at 0x7f6c0e8b2050>
====================
INFO:root:==================== <function impulse_to_input at 0x7f6c0e8b20c8>
====================
INFO:root:==================== <function sort_stages at 0x7f6c0e8b22a8>
====================
INFO:root:==================== <function setup_timer_mapping at 0x7f6c0e8b2230>
====================
INFO:root:==================== <function populate_data_channel_coders at
0x7f6c0e8b2320> ====================
INFO:root:starting control server on port 35523
INFO:root:starting data server on port 40659
INFO:root:starting state server on port 38599
INFO:root:starting logging server on port 42671
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6be6894bd0> for environment
ref_Environment_default_environment_1 (beam:env:docker:v1,
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
a928d8520fda3ce99066cfe1627445204a5fa271d121372518abbad3706caae0, worker_id =
worker_76
INFO:root:Running
(ref_AppliedPTransform_Create/Impulse_3)+((ref_AppliedPTransform_Create/FlatMap(<lambda
at
core.py:2876>)_4)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running
(((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12)))+((ref_AppliedPTransform_Create/Map(decode)_13)+(ref_AppliedPTransform_Map(<lambda
at
sql_test.py:172>)_14)))+((ref_AppliedPTransform_WindowInto(WindowIntoFn)_15)+(ref_PCollection_PCollection_1/Write))
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6bec071290> for environment external_8beam:env:docker:v1
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
d7e24432f23e29fa61886b07bb4ae197063a837804de3a9c8ec5e338b88aa70a, worker_id =
worker_77
INFO:root:Running
(((ref_PCollection_PCollection_1/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_3/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/toRow/ParDo(Anonymous)/ParMultiDo(Anonymous))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous))))+(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Write)
INFO:root:Running
(ref_AppliedPTransform_assert_that/Create/Impulse_19)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
at
core.py:2876>)_20)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_22)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_26)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))))
INFO:root:Running
(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Read)+(((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)))+((external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous))+(ref_PCollection_PCollection_11/Write)))
INFO:root:Running
((ref_PCollection_PCollection_11/Read)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23)+((ref_AppliedPTransform_assert_that/ToVoidKey_24)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_27)+(assert_that/Group/Flatten/Transcode/1)))))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running
((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30))+((ref_AppliedPTransform_assert_that/Unkey_31)+(ref_AppliedPTransform_assert_that/Match_32))
INFO:root:Successfully completed job in 10.7718629837 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
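
[Editor's sketch] test_windowing_before_sql above applies WindowInto before handing the
PCollection to SqlTransform, so the COUNT(*) in the BEAMPlan is computed per window
rather than globally. Below is a minimal sketch of that shape; the element values,
timestamps, and window size are illustrative assumptions, not the test's exact data.

  from __future__ import print_function

  import typing

  import apache_beam as beam
  from apache_beam import coders
  from apache_beam.transforms import window
  from apache_beam.transforms.sql import SqlTransform

  # Illustrative single-field row type (an assumption, not the test's schema).
  SimpleRow = typing.NamedTuple('SimpleRow', [('id', int)])
  coders.registry.register_coder(SimpleRow, coders.RowCoder)

  with beam.Pipeline() as p:
    counts = (
        p
        | beam.Create(
            [SimpleRow(1), SimpleRow(2), SimpleRow(100)]).with_output_types(SimpleRow)
        # Use each row's id as its event timestamp, so FixedWindows(50) places
        # ids 1 and 2 in one window and id 100 in another.
        | beam.Map(
            lambda row: window.TimestampedValue(row, row.id)
        ).with_output_types(SimpleRow)
        | beam.WindowInto(window.FixedWindows(50))
        | SqlTransform('SELECT COUNT(*) AS `count` FROM PCOLLECTION'))
    counts | beam.Map(print)  # expect one count per window: 2 and 1
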
test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest)
... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar'>
'47035']
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
INFO:apache_beam.utils.subprocess_server:Starting expansion service at
localhost:47035
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:18 PM
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO:apache_beam.utils.subprocess_server:INFO: Registering external transforms:
[beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
INFO:apache_beam.utils.subprocess_server: beam:external:java:sql:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
INFO:apache_beam.utils.subprocess_server:
beam:external:java:generate_sequence:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
DEBUG:root:Waiting for grpc channel to be ready at localhost:47035.
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:19 PM
org.apache.beam.sdk.expansion.service.ExpansionService expand
INFO:apache_beam.utils.subprocess_server:INFO: Expanding
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
INFO:apache_beam.utils.subprocess_server:Sep 14, 2020 12:31:19 PM
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
payloadToConfig
INFO:apache_beam.utils.subprocess_server:WARNING: Configuration class
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
has no schema registered. Attempting to construct with setter approach.
DEBUG:root:Sending SIGINT to job_server
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.25.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f0c04defd70> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages:
['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous)\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Impulse_5\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2876>)_6\n
assert_that/Create/FlatMap(<lambda at core.py:2876>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/ToVoidKey_10\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Flatten_14\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Unkey_17\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Match_18\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7f6c0e8b0b18> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7f6c0e8b0c08> ====================
INFO:root:==================== <function pack_combiners at 0x7f6c0e8b0c80>
====================
INFO:root:==================== <function lift_combiners at 0x7f6c0e8b0cf8>
====================
INFO:root:==================== <function expand_sdf at 0x7f6c0e8b0d70>
====================
INFO:root:==================== <function expand_gbk at 0x7f6c0e8b0de8>
====================
INFO:root:==================== <function sink_flattens at 0x7f6c0e8b0ed8>
====================
INFO:root:==================== <function greedily_fuse at 0x7f6c0e8b0f50>
====================
INFO:root:==================== <function read_to_impulse at 0x7f6c0e8b2050>
====================
INFO:root:==================== <function impulse_to_input at 0x7f6c0e8b20c8>
====================
INFO:root:==================== <function sort_stages at 0x7f6c0e8b22a8>
====================
INFO:root:==================== <function setup_timer_mapping at 0x7f6c0e8b2230>
====================
INFO:root:==================== <function populate_data_channel_coders at
0x7f6c0e8b2320> ====================
INFO:root:starting control server on port 38055
INFO:root:starting data server on port 37441
INFO:root:starting state server on port 33255
INFO:root:starting logging server on port 42971
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6be6a22310> for environment
ref_Environment_default_environment_1 (beam:env:docker:v1,
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
2959497204fcf3ef8fae1dd5b6a2fc51a104f36cb0c74566cf4a9bbc5c415d21, worker_id =
worker_78
INFO:root:Running
((ref_AppliedPTransform_assert_that/Create/Impulse_5)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
at
core.py:2876>)_6)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_8)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_12)+(assert_that/Group/Flatten/Transcode/0)))))+(assert_that/Group/Flatten/Write/0)
INFO:root:Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f6be6885990> for environment external_9beam:env:docker:v1
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
INFO:root:Unable to pull image apache/beam_java_sdk:2.25.0.dev
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id =
8ef13931bdcbf49fd8fc885fb8a61f798aba6375dd0557626aff838e75c92a12, worker_id =
worker_79
INFO:root:Running
(((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse)+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous)))+((SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitAndSizeRestriction)))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous).output_split/Write)
INFO:root:Running
((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous).output_split/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/Process))+((external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc))+(ref_PCollection_PCollection_1/Write))
INFO:root:Running
((ref_PCollection_PCollection_1/Read)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9)+((ref_AppliedPTransform_assert_that/ToVoidKey_10)+(ref_AppliedPTransform_assert_that/Group/pair_with_1_13))))+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))
INFO:root:Running
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running
(assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16)+((ref_AppliedPTransform_assert_that/Unkey_17)+(ref_AppliedPTransform_assert_that/Match_18)))
INFO:root:Successfully completed job in 11.8926160336 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
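
[Editor's sketch] test_zetasql_generate_data above has no input PCollection at all: the
BeamValuesRel and BeamZetaSqlCalcRel stages in its fused plan correspond to rows
produced entirely by the query, evaluated with the ZetaSQL dialect. Below is a minimal
sketch, assuming SqlTransform's dialect argument; the literal values are illustrative.

  from __future__ import print_function

  import apache_beam as beam
  from apache_beam.transforms.sql import SqlTransform

  with beam.Pipeline() as p:
    rows = (
        p  # applied to the pipeline itself; the query generates the data
        | SqlTransform(
            'SELECT CAST(1 AS INT64) AS `int`, '
            "CAST('foo' AS STRING) AS `str`, "
            'CAST(3.14 AS FLOAT64) AS `flt`',
            dialect='zetasql'))
    rows | beam.Map(print)
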
----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 190.446s
OK
> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/apache_beam/__init__.py>:82:
UserWarning: You are using the final Apache Beam release with Python 2
support. New releases of Apache Beam will require Python 3.6 or a newer version.
'You are using the final Apache Beam release with Python 2 support. '
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/azure/storage/blob/_shared/encryption.py>:15:
CryptographyDeprecationWarning: Python 2 is no longer supported by the Python
core team. Support for it is now deprecated in cryptography, and will be
removed in a future release.
from cryptography.hazmat.backends import default_backend
Killing process at 11775
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at:
> file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/test-suites/direct/xlang/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 31m 15s
141 actionable tasks: 115 executed, 22 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/2ncrcizgrwd5y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]