See
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2205/display/redirect>
Changes:
------------------------------------------
[...truncated 995.39 KB...]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: java.lang.NullPointerException
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
FAILED
--------------------- >> end captured logging << ---------------------
======================================================================
ERROR: test_windowing_before_sql
(apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py", line 179, in test_windowing_before_sql
assert_that(out, equal_to([(1, ), (1, ), (1, )]))
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py", line 582, in __exit__
self.result = self.run()
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 112, in run
state = result.wait_until_finish()
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 602, in wait_until_finish
raise self._runtime_exception
RuntimeError: Pipeline
BeamApp-jenkins-0421123657-1fface43_83c2fa3b-111a-44b3-9952-e04326b02c4e failed
in state FAILED: java.lang.NullPointerException
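For context, a minimal sketch of the scenario this test exercises, reconstructed from the assertion above and the SQL logged below. The row type, field name, element values, and window size are illustrative assumptions, not the test's actual literals; SqlTransform is a cross-language transform, so running this needs the Java SQL expansion service that TestPipeline starts automatically (visible in the captured log).

    # Hypothetical reconstruction: windowing applied before SqlTransform.
    import typing
    import apache_beam as beam
    from apache_beam import coders
    from apache_beam.transforms.sql import SqlTransform
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # Assumed schema'd element type; SqlTransform needs a Row-coded input.
    SimpleRow = typing.NamedTuple('SimpleRow', [('id', int)])
    coders.registry.register_coder(SimpleRow, coders.RowCoder)

    with TestPipeline() as p:
      out = (
          p
          | beam.Create([SimpleRow(id=i) for i in range(3)])
          # Stamp elements so each falls in its own one-second fixed window.
          | beam.Map(lambda v: beam.window.TimestampedValue(v, v.id))
          | beam.WindowInto(beam.window.FixedWindows(1))
          | SqlTransform('SELECT COUNT(*) AS `count` FROM PCOLLECTION'))
      # One count per window, matching the assertion in the traceback.
      assert_that(out, equal_to([(1,), (1,), (1,)]))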
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar'
'https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar'
'49567']
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at
localhost:49567'
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:49 PM
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external
transforms: [beam:external:java:sql:v1,
beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO:
b'\tbeam:external:java:generate_sequence:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
root: DEBUG: Waiting for grpc channel to be ready at localhost:49567.
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:50 PM
org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:51 PM
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:52 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQL:'
apache_beam.utils.subprocess_server: INFO: b'SELECT COUNT(*) AS `count`'
apache_beam.utils.subprocess_server: INFO: b'FROM `beam`.`PCOLLECTION` AS
`PCOLLECTION`'
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:53 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQLPlan>'
apache_beam.utils.subprocess_server: INFO: b'LogicalAggregate(group=[{}],
count=[COUNT()])'
apache_beam.utils.subprocess_server: INFO: b' LogicalProject($f0=[0])'
apache_beam.utils.subprocess_server: INFO: b' BeamIOSourceRel(table=[[beam,
PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:53 PM
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO: b'BeamAggregationRel(group=[{}],
count=[COUNT()])'
apache_beam.utils.subprocess_server: INFO: b' BeamIOSourceRel(table=[[beam,
PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python
3.6 interpreter.
root: INFO: Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python
3.6 interpreter.
root: INFO: Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: Python SDK container image set to
"apache/beam_python3.6_sdk:2.30.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function lift_combiners at 0x7fcb88826d90>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 27 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['ref_AppliedPTransform_Create-Impulse_3\n
Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2955-_4\n
Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-Map-decode-_13\n
Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Map-lambda-at-sql_test-py-174-_14\n Map(<lambda at
sql_test.py:174>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_WindowInto-WindowIntoFn-_15\n
WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-sele\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-Grou\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-Combine-Pa\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToRow-ParM\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-mergeRecord-ParMultiDo-Anonymous-\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Impulse_19\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2955-_20\n
assert_that/Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Map-decode-_22\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_23\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-ToVoidKey_24\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_0_26\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_1_27\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Flatten_28\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-GroupByKey_29\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_30\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Unkey_31\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Match_32\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function sort_stages at 0x7fcb88827510>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 27 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['ref_AppliedPTransform_Create-Impulse_3\n
Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2955-_4\n
Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create-Map-decode-_13\n
Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Map-lambda-at-sql_test-py-174-_14\n Map(<lambda at
sql_test.py:174>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_WindowInto-WindowIntoFn-_15\n
WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-sele\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-Grou\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-Combine-Pa\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToRow-ParM\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-mergeRecord-ParMultiDo-Anonymous-\n
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Impulse_19\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2955-_20\n
assert_that/Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Map-decode-_22\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_23\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-ToVoidKey_24\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_0_26\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_1_27\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Flatten_28\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-GroupByKey_29\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_30\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Unkey_31\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Match_32\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_options' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
RUNNING
root: DEBUG: java.lang.NullPointerException
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:120)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: java.lang.NullPointerException
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
FAILED
--------------------- >> end captured logging << ---------------------
======================================================================
ERROR: test_zetasql_generate_data
(apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py", line 164, in test_zetasql_generate_data
assert_that(out, equal_to([(1, "foo", 3.14)]))
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py", line 582, in __exit__
self.result = self.run()
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 112, in run
state = result.wait_until_finish()
File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 602, in wait_until_finish
raise self._runtime_exception
RuntimeError: Pipeline
BeamApp-jenkins-0421123708-c005c4b9_0dff4fd1-0548-4a4b-9bc8-92d38fdf89d2 failed
in state FAILED: java.lang.NullPointerException
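Likewise, a minimal sketch of what this test covers, reconstructed from the assertion above and the BeamZetaSqlCalcRel plan logged below (the output fields int, str, and flt match the plan; the exact CAST spellings are assumptions):

    # Hypothetical reconstruction: ZetaSQL generating data with no input.
    from apache_beam.transforms.sql import SqlTransform
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
      # SqlTransform applied to the pipeline itself (PBegin): the query
      # produces its own single row rather than reading a PCollection.
      out = p | SqlTransform(
          "SELECT "
          "CAST(1 AS INT64) AS `int`, "
          "CAST('foo' AS STRING) AS `str`, "
          "CAST(3.14 AS FLOAT64) AS `flt`",
          dialect='zetasql')
      assert_that(out, equal_to([(1, 'foo', 3.14)]))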
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar'
'https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar'
'44301']
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at
localhost:44301'
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:58 PM
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external
transforms: [beam:external:java:sql:v1,
beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO:
b'\tbeam:external:java:generate_sequence:v1:
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
root: DEBUG: Waiting for grpc channel to be ready at localhost:44301.
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:36:59 PM
org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:37:00 PM
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Apr 21, 2021 12:37:03 PM
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
convertToBeamRelInternal'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO:
b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT],
expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1],
str=[$t2], flt=[$t3])"
apache_beam.utils.subprocess_server: INFO: b' BeamValuesRel(tuples=[[{ 0 }]])'
apache_beam.utils.subprocess_server: INFO: b''
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python
3.6 interpreter.
root: INFO: Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python
3.6 interpreter.
root: INFO: Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: Python SDK container image set to
"apache/beam_python3.6_sdk:2.30.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function lift_combiners at 0x7fcb88826d90>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--Impulse\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Outp\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Boun\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamZetaSqlCalcRel_17-ParDo-Calc--ParMultiDo-Calc-\n
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Impulse_5\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2955-_6\n
assert_that/Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Map-decode-_8\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_9\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-ToVoidKey_10\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_0_12\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_1_13\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Flatten_14\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-GroupByKey_15\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_16\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Unkey_17\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Match_18\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function sort_stages at 0x7fcb88827510>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--Impulse\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Outp\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Boun\n
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9SqlTransform-beam-external-java-sql-v1--BeamZetaSqlCalcRel_17-ParDo-Calc--ParMultiDo-Calc-\n
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Impulse_5\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2955-_6\n
assert_that/Create/FlatMap(<lambda at core.py:2955>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Create-Map-decode-_8\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_9\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-ToVoidKey_10\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_0_12\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-pair_with_1_13\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Flatten_14\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-GroupByKey_15\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_16\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Unkey_17\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that-Match_18\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_options' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
RUNNING
root: DEBUG: java.lang.NullPointerException
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:120)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: java.lang.NullPointerException
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 98.655s
FAILED (errors=9)
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:2:job-server:sparkJobServerCleanup
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
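All three failing tasks are named above; assuming a Beam checkout, each can be re-run locally via the Gradle wrapper, e.g. ./gradlew :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql.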
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 36m 41s
182 actionable tasks: 131 executed, 47 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
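(On Linux agents this warning is typically addressed by raising the fs.inotify.max_user_watches sysctl; the appropriate value is machine-specific.)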
Publishing build scan...
https://gradle.com/s/ekgaeib4dvwpk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure