See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/275/display/redirect?page=changes>

Changes:

[Robin Qiu] Support UNNEST of a (possibly nested) array field of a struct column


------------------------------------------
[...truncated 507.19 KB...]
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
apache_beam.utils.subprocess_server: INFO: Starting expansion service at 
localhost:49745
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:45:36 AM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
apache_beam.utils.subprocess_server: INFO: INFO: Registering external 
transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
apache_beam.utils.subprocess_server: INFO:      beam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
apache_beam.utils.subprocess_server: INFO:      
beam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49745.
root: WARNING: Waiting for grpc channel to be ready at localhost:49745.
root: WARNING: Waiting for grpc channel to be ready at localhost:49745.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:45:41 AM 
org.apache.beam.sdk.expansion.service.ExpansionService expand
apache_beam.utils.subprocess_server: INFO: INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:45:48 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig
apache_beam.utils.subprocess_server: INFO: WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:45:58 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQL:
apache_beam.utils.subprocess_server: INFO: SELECT `PCOLLECTION`.`a` * 
`PCOLLECTION`.`a` AS `s`, `LENGTH`(`PCOLLECTION`.`b`) AS `c`
apache_beam.utils.subprocess_server: INFO: FROM `beam`.`PCOLLECTION` AS 
`PCOLLECTION`
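(Editorial aside, not part of the captured log.) The projection the planner logs above — `SELECT a * a AS s, LENGTH(b) AS c FROM PCOLLECTION` — can be sanity-checked outside Beam with an in-memory SQLite table standing in for the PCollection. The column names `a` and `b` come from the logged query; the row values here are made up for illustration:

```python
import sqlite3

# Illustrative stand-in for the logged Beam SQL query, run against an
# in-memory SQLite table instead of a Beam PCollection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PCOLLECTION (a INTEGER, b TEXT)")
conn.executemany("INSERT INTO PCOLLECTION VALUES (?, ?)",
                 [(1, "a"), (26, "z")])  # hypothetical rows
rows = conn.execute(
    "SELECT a * a AS s, LENGTH(b) AS c FROM PCOLLECTION"
).fetchall()
# rows == [(1, 1), (676, 1)]
```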
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:45:59 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQLPlan>
apache_beam.utils.subprocess_server: INFO: LogicalProject(s=[*($0, $0)], 
c=[LENGTH($1)])
apache_beam.utils.subprocess_server: INFO:   BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])
apache_beam.utils.subprocess_server: INFO: 
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:46:00 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: BEAMPlan>
apache_beam.utils.subprocess_server: INFO: BeamCalcRel(expr#0..1=[{inputs}], 
expr#2=[*($t0, $t0)], expr#3=[LENGTH($t1)], s=[$t2], c=[$t3])
apache_beam.utils.subprocess_server: INFO:   BeamIOSourceRel(table=[[beam, 
PCOLLECTION]])
apache_beam.utils.subprocess_server: INFO: 
root: DEBUG: Sending SIGINT to job_server
root: WARNING: Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python2.7_sdk:2.25.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python2.7_sdk:2.25.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO: 
==================== <function lift_combiners at 0x7f6072bd1de8> 
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 23 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: 
['ref_AppliedPTransform_Create/Impulse_3\n  
Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2905>)_4\n  
Create/FlatMap(<lambda at core.py:2905>):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create/Map(decode)_13\n  
Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at 
sql_test.py:150>)_14\n  Map(<lambda at 
sql_test.py:150>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_6SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_2/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_2/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_6SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_18\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2905>)_19\n  assert_that/Create/FlatMap(<lambda at 
core.py:2905>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_23\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_27\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_30\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_31\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
RUNNING
root: INFO: ==================== <function annotate_downstream_side_inputs at 
0x7fed241510c8> ====================
root: INFO: ==================== <function fix_side_input_pcoll_coders at 
0x7fed241511b8> ====================
root: INFO: ==================== <function eliminate_common_key_with_none at 
0x7fed241512a8> ====================
root: INFO: ==================== <function pack_combiners at 0x7fed24151320> 
====================
root: INFO: ==================== <function lift_combiners at 0x7fed24151398> 
====================
root: INFO: ==================== <function expand_sdf at 0x7fed24151410> 
====================
root: INFO: ==================== <function expand_gbk at 0x7fed24151488> 
====================
root: INFO: ==================== <function sink_flattens at 0x7fed24151578> 
====================
root: INFO: ==================== <function greedily_fuse at 0x7fed241515f0> 
====================
root: INFO: ==================== <function read_to_impulse at 0x7fed24151668> 
====================
root: INFO: ==================== <function impulse_to_input at 0x7fed241516e0> 
====================
root: INFO: ==================== <function sort_stages at 0x7fed241518c0> 
====================
root: INFO: ==================== <function setup_timer_mapping at 
0x7fed24151848> ====================
root: INFO: ==================== <function populate_data_channel_coders at 
0x7fed24151938> ====================
root: INFO: starting control server on port 39277
root: INFO: starting data server on port 37771
root: INFO: starting state server on port 36427
root: INFO: starting logging server on port 45213
root: INFO: Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fed0c1ca550> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
root: INFO: Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
root: INFO: Waiting for docker to start up.Current status is running
root: INFO: Docker container is running. container_id = 
99c917764b2cfcc9757e1fbd6bb8e1e99c806d65b0bce4ff6f55e718835d07b6, worker_id = 
worker_74
root: INFO: Running 
(ref_AppliedPTransform_assert_that/Create/Impulse_18)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda
 at 
core.py:2905>)_19)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_21)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_25)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))))
root: INFO: Running 
(ref_AppliedPTransform_Create/Impulse_3)+((ref_AppliedPTransform_Create/FlatMap(<lambda
 at 
core.py:2905>)_4)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
root: INFO: Running 
((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12)))+((ref_AppliedPTransform_Create/Map(decode)_13)+((ref_AppliedPTransform_Map(<lambda
 at sql_test.py:150>)_14)+(ref_PCollection_PCollection_1/Write)))
root: INFO: Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fed0c16f910> for environment external_6beam:env:docker:v1 
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
root: INFO: Unable to pull image apache/beam_java_sdk:2.25.0.dev
root: INFO: Waiting for docker to start up.Current status is running
root: INFO: Docker container is running. container_id = 
4f7ebc401dafc34080242de55da750e901803ed5400e85bc31eec18d1e619e28, worker_id = 
worker_75
root: INFO: Running 
(((ref_PCollection_PCollection_1/Read)+(external_6SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_2/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_6SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_10/Write)
root: INFO: Running 
((ref_PCollection_PCollection_10/Read)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22)+((ref_AppliedPTransform_assert_that/ToVoidKey_23)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_26)+(assert_that/Group/Flatten/Transcode/1)))))+(assert_that/Group/Flatten/Write/1)
root: INFO: Running 
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
root: INFO: Running 
(((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29))+(ref_AppliedPTransform_assert_that/Unkey_30))+(ref_AppliedPTransform_assert_that/Match_31)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_tagged_join (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/transforms/sql_test.py>", line 143, in test_tagged_join
    assert_that(out, equal_to([(1, "a"), (26, "z"), (1, "a")]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/pipeline.py>", line 568, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    state = result.wait_until_finish()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 552, in wait_until_finish
    raise self._runtime_exception
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.DEADLINE_EXCEEDED
        details = "Deadline Exceeded"
        debug_error_string = "{"created":"@1600325394.342277993","description":"Error received from peer ipv4:127.0.0.1:18091","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"Deadline Exceeded","grpc_status":4}"
>
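(Editorial aside, not part of the captured log.) Note the test died with a gRPC DEADLINE_EXCEEDED while waiting for the job to finish, not with an assertion mismatch — the `assert_that(out, equal_to(...))` check never got to run. For context, Beam's `equal_to` compares the output as an unordered multiset: order is ignored, but duplicates such as the two `(1, "a")` rows must match. A minimal stdlib sketch of that comparison (a hypothetical stand-in, not Beam's implementation):

```python
from collections import Counter

def equal_to_multiset(expected):
    """Return a matcher that checks `actual` against `expected`,
    ignoring order but respecting duplicates (as Beam's equal_to does)."""
    def match(actual):
        if Counter(actual) != Counter(expected):
            raise AssertionError("%r != %r" % (actual, expected))
    return match

# The failing test's expected multiset, taken from the traceback above:
match = equal_to_multiset([(1, "a"), (26, "z"), (1, "a")])
match([(26, "z"), (1, "a"), (1, "a")])  # order-insensitive: no error
```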
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.25.0-SNAPSHOT.jar>' '41249']
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
apache_beam.utils.subprocess_server: INFO: Starting expansion service at 
localhost:41249
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:47:30 AM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
apache_beam.utils.subprocess_server: INFO: INFO: Registering external 
transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]
apache_beam.utils.subprocess_server: INFO:      beam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@626b2d4a
apache_beam.utils.subprocess_server: INFO:      
beam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/2121744517@5e91993f
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: DEBUG: Waiting for grpc channel to be ready at localhost:41249.
root: WARNING: Waiting for grpc channel to be ready at localhost:41249.
root: WARNING: Waiting for grpc channel to be ready at localhost:41249.
root: WARNING: Waiting for grpc channel to be ready at localhost:41249.
root: WARNING: Waiting for grpc channel to be ready at localhost:41249.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:47:38 AM 
org.apache.beam.sdk.expansion.service.ExpansionService expand
apache_beam.utils.subprocess_server: INFO: INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:47:48 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig
apache_beam.utils.subprocess_server: INFO: WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach.
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:48:05 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQL:
apache_beam.utils.subprocess_server: INFO: SELECT `simple`.`id` AS `id`, 
`enrich`.`metadata` AS `metadata`
apache_beam.utils.subprocess_server: INFO: FROM `beam`.`simple` AS `simple`
apache_beam.utils.subprocess_server: INFO: INNER JOIN `beam`.`enrich` AS 
`enrich` ON `simple`.`id` = `enrich`.`id`
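(Editorial aside, not part of the captured log.) The inner join the planner logs above can likewise be replayed against in-memory SQLite tables. Only the schema shape is taken from the logged plan (`simple.id` joined to `enrich.id`, projecting `id` and `metadata`); the row values below are chosen to mirror the expected output `[(1, "a"), (26, "z"), (1, "a")]` from the traceback:

```python
import sqlite3

# Illustrative replay of the logged join:
#   SELECT simple.id, enrich.metadata
#   FROM simple INNER JOIN enrich ON simple.id = enrich.id
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE simple (id INTEGER)")
conn.execute("CREATE TABLE enrich (id INTEGER, metadata TEXT)")
conn.executemany("INSERT INTO simple VALUES (?)", [(1,), (26,), (1,)])
conn.executemany("INSERT INTO enrich VALUES (?, ?)", [(1, "a"), (26, "z")])
rows = conn.execute(
    "SELECT simple.id AS id, enrich.metadata AS metadata "
    "FROM simple INNER JOIN enrich ON simple.id = enrich.id"
).fetchall()
# Result order is not guaranteed by SQL; compare as a sorted list.
```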
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:48:08 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQLPlan>
apache_beam.utils.subprocess_server: INFO: LogicalProject(id=[$0], 
metadata=[$4])
apache_beam.utils.subprocess_server: INFO:   LogicalJoin(condition=[=($0, $3)], 
joinType=[inner])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, 
simple]])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, 
enrich]])
apache_beam.utils.subprocess_server: INFO: 
apache_beam.utils.subprocess_server: INFO: Sep 17, 2020 6:48:10 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: BEAMPlan>
apache_beam.utils.subprocess_server: INFO: BeamCalcRel(expr#0..4=[{inputs}], 
id=[$t2], metadata=[$t1])
apache_beam.utils.subprocess_server: INFO:   BeamCoGBKJoinRel(condition=[=($2, 
$0)], joinType=[inner])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, 
enrich]])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, 
simple]])
apache_beam.utils.subprocess_server: INFO: 
root: DEBUG: Sending SIGINT to job_server
root: WARNING: Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python2.7_sdk:2.25.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python2.7_sdk:2.25.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO: 
==================== <function lift_combiners at 0x7f6072bd1de8> 
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 42 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: 
['ref_AppliedPTransform_Create enrich/Impulse_3\n  Create 
enrich/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/FlatMap(<lambda at core.py:2905>)_4\n  Create enrich/FlatMap(<lambda at 
core.py:2905>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create 
enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  
Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n
  Create 
enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create 
enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create enrich/Map(decode)_13\n  Create 
enrich/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/Impulse_15\n  Create simple/Impulse:beam:transform:impulse:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/FlatMap(<lambda at core.py:2905>)_16\n  Create simple/FlatMap(<lambda at 
core.py:2905>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n  Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n  
Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n
  Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n  Create 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Create simple/Map(decode)_25\n  Create 
simple/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n
  
SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Impulse_29\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at 
core.py:2905>)_30\n  assert_that/Create/FlatMap(<lambda at 
core.py:2905>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/ToVoidKey_34\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Flatten_38\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Unkey_41\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Match_42\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
RUNNING
root: INFO: ==================== <function annotate_downstream_side_inputs at 
0x7fed241510c8> ====================
root: INFO: ==================== <function fix_side_input_pcoll_coders at 
0x7fed241511b8> ====================
root: INFO: ==================== <function eliminate_common_key_with_none at 
0x7fed241512a8> ====================
root: INFO: ==================== <function pack_combiners at 0x7fed24151320> 
====================
root: INFO: ==================== <function lift_combiners at 0x7fed24151398> 
====================
root: INFO: ==================== <function expand_sdf at 0x7fed24151410> 
====================
root: INFO: ==================== <function expand_gbk at 0x7fed24151488> 
====================
root: INFO: ==================== <function sink_flattens at 0x7fed24151578> 
====================
root: INFO: ==================== <function greedily_fuse at 0x7fed241515f0> 
====================
root: INFO: ==================== <function read_to_impulse at 0x7fed24151668> 
====================
root: INFO: ==================== <function impulse_to_input at 0x7fed241516e0> 
====================
root: INFO: ==================== <function sort_stages at 0x7fed241518c0> 
====================
root: INFO: ==================== <function setup_timer_mapping at 
0x7fed24151848> ====================
root: INFO: ==================== <function populate_data_channel_coders at 
0x7fed24151938> ====================
root: INFO: starting control server on port 44817
root: INFO: starting data server on port 42435
root: INFO: starting state server on port 40907
root: INFO: starting logging server on port 39303
root: INFO: Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fed0c234090> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
'\n$apache/beam_python2.7_sdk:2.25.0.dev')
root: INFO: Unable to pull image apache/beam_python2.7_sdk:2.25.0.dev
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = 
ca734d17fecc22e52807920a4d543a190828a3004b905c04a7550fb066a2b276, worker_id = 
worker_76
root: INFO: Running (ref_AppliedPTransform_Create 
simple/Impulse_15)+((ref_AppliedPTransform_Create simple/FlatMap(<lambda at 
core.py:2905>)_16)+((ref_AppliedPTransform_Create 
simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21)+(Create
 simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
root: INFO: Running (Create 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23)+((ref_AppliedPTransform_Create
 
simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24)+((ref_AppliedPTransform_Create
 simple/Map(decode)_25)+(ref_PCollection_PCollection_1/Write))))
root: INFO: Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7fed0c298d90> for environment external_7beam:env:docker:v1 
(beam:env:docker:v1, '\n\x1fapache/beam_java_sdk:2.25.0.dev')
root: INFO: Unable to pull image apache/beam_java_sdk:2.25.0.dev
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = 
19281259148c05443a491337bf0376beafea0fa5542a96d25ee48737dd878e93, worker_id = 
worker_77
root: INFO: Running 
(((((ref_PCollection_PCollection_1/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)))+((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/1)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/1))
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 1026.124s

FAILED (errors=2)

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql FAILED

> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/apache_beam/__init__.py>:82:
 UserWarning: You are using the final Apache Beam release with Python 2 
support. New releases of Apache Beam will require Python 3.6 or a newer version.
  'You are using the final Apache Beam release with Python 2 support. '
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/azure/storage/blob/_shared/encryption.py>:15:
 CryptographyDeprecationWarning: Python 2 is no longer supported by the Python 
core team. Support for it is now deprecated in cryptography, and will be 
removed in a future release.
  from cryptography.hazmat.backends import default_backend

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 20s
140 actionable tasks: 103 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/konyfq6cg4o6m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

