See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2235/display/redirect?page=changes>
Changes:

[randomstep] [BEAM-12188] Bump snakeyaml to 1.28
[Brian Hulette] Use try-with-resources in ZipFiles
[samuelw] [BEAM-12229] StreamingModeExecutionContext invalidated
[aromanenko.dev] [BEAM-12243] TPC-DS: use SQL "substring()" instead of "substr()"
[Brian Hulette] Allow overwriting zip file in packageDirectoriesToStage
[noreply] Merge pull request #14615 from [BEAM-12206] Update BigQuery IO
[chamikaramj] Revert "[BEAM-11994] Refactor BigQueryTornadoes to make more options

------------------------------------------
[...truncated 1.59 MB...]
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'resource_hints' was already added
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok

======================================================================
ERROR: test_windowing_before_sql (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py",> line 176, in test_windowing_before_sql
    assert_that(out, equal_to([(1, ), (1, ), (1, )]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py",> line 582, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 116, in run
    state = result.wait_until_finish()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 599, in wait_until_finish
    raise self._runtime_exception
RuntimeError: Pipeline BeamApp-jenkins-0429010113-d8b098b9_c5ec3c20-3302-4c9c-becf-2ecf293b74da failed in state FAILED: java.lang.RuntimeException: Error received from SDK harness for instruction 4: java.lang.IllegalStateException
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:491)
        at org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:226)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:326)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:140)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:110)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.30.0-SNAPSHOT.jar'> '47189']
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at localhost:47189'
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:05 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
root: DEBUG: Waiting for grpc channel to be ready at localhost:47189.
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:06 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:07 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQL:'
apache_beam.utils.subprocess_server: INFO: b'SELECT COUNT(*) AS `count`'
apache_beam.utils.subprocess_server: INFO: b'FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`'
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQLPlan>'
apache_beam.utils.subprocess_server: INFO: b'LogicalAggregate(group=[{}], count=[COUNT()])'
apache_beam.utils.subprocess_server: INFO: b' LogicalProject($f0=[0])'
apache_beam.utils.subprocess_server: INFO: b' BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
apache_beam.utils.subprocess_server: INFO: b'Apr 29, 2021 1:01:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO: b'BeamAggregationRel(group=[{}], count=[COUNT()])'
apache_beam.utils.subprocess_server: INFO: b' BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.30.0.dev
root: INFO: Python SDK container image set to "apache/beam_python3.6_sdk:2.30.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function lift_combiners at 0x7f09cf700488> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create-Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2930-_4\n Create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-Map-decode-_13\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map-lambda-at-sql_test-py-171-_14\n Map(<lambda at sql_test.py:171>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_WindowInto-WindowIntoFn-_15\n WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-sele\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-Grou\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-Combine-Pa\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToRow-ParM\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-mergeRecord-ParMultiDo-Anonymous-\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-Impulse_19\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2930-_20\n assert_that/Create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-Map-decode-_22\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_23\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-ToVoidKey_24\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-pair_with_0_26\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-pair_with_1_27\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-Flatten_28\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-GroupByKey_29\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_30\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Unkey_31\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Match_32\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7f09cf700b70> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create-Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2930-_4\n Create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-Map-decode-_13\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map-lambda-at-sql_test-py-171-_14\n Map(<lambda at sql_test.py:171>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_WindowInto-WindowIntoFn-_15\n WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-sele\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToKvs-Grou\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-Combine-Pa\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-Group-CombineFieldsByFields-ToRow-ParM\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_8SqlTransform-beam-external-java-sql-v1--BeamAggregationRel_40-mergeRecord-ParMultiDo-Anonymous-\n SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-Impulse_19\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2930-_20\n assert_that/Create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Create-Map-decode-_22\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_23\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-ToVoidKey_24\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-pair_with_0_26\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-pair_with_1_27\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-Flatten_28\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-GroupByKey_29\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_30\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Unkey_31\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that-Match_32\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'environment_options' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'resource_hints' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG:
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: java.lang.IllegalStateException
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:491)
        at org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:226)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:326)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:140)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:110)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:73)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:104)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:92)
        at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:199)
        at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: java.lang.IllegalStateException
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:491)
        at org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:226)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:326)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:140)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:110)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:60)
        at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:504)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:555)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:204)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:229)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:142)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:80)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:105)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        ... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: java.lang.IllegalStateException
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:491)
        at org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:226)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:326)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:140)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:110)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:180)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        ... 3 more
root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 4: java.lang.IllegalStateException
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:491)
        at org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:226)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:326)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:140)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:110)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 175.878s

FAILED (errors=1)

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 11021.
Stopping expansion service pid: 11024.

> Task :runners:spark:2:job-server:sparkJobServerCleanup
Stopping job server pid: 6118.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 53s
173 actionable tasks: 35 executed, 138 up-to-date

Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/22sth6dhia4hk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
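
For context, the failing case is the "windowing before SQL" shape: a windowed Python PCollection is handed to the cross-language SqlTransform, whose Java expansion and Calcite plan appear in the captured logging above. A minimal sketch of that shape follows; the SQL string and the expected [(1, ), (1, ), (1, )] come from the log, while the row type, timestamps, and window size are illustrative assumptions rather than the actual sql_test.py body, and running it still needs the SQL expansion-service jar plus a portable runner such as the Spark job server used by this suite.

import typing

import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to
from apache_beam.transforms import window
from apache_beam.transforms.sql import SqlTransform

# Schema-aware element type so the external SQL transform can infer a Beam schema.
SimpleRow = typing.NamedTuple('SimpleRow', [('id', int)])
beam.coders.registry.register_coder(SimpleRow, beam.coders.RowCoder)

with beam.Pipeline() as p:
  out = (
      p
      | beam.Create([SimpleRow(1), SimpleRow(2), SimpleRow(3)])
      # Assumed timestamps: each element lands in its own 10-second window.
      | beam.Map(lambda row: window.TimestampedValue(row, row.id * 10)).with_output_types(SimpleRow)
      | beam.WindowInto(window.FixedWindows(10))
      # Expanded by the Java expansion service shown in the captured logging.
      | SqlTransform("SELECT COUNT(*) AS `count` FROM PCOLLECTION"))
  # One COUNT(*) of 1 per window, matching the assertion in the traceback.
  assert_that(out, equal_to([(1, ), (1, ), (1, )]))

To reproduce the failure itself, the Gradle task named above can be rerun locally, e.g. ./gradlew :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql --stacktrace, per the "Try:" hint in the Gradle output.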
