See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2362/display/redirect?page=changes>

Changes:

[zyichi] Minor fix to prebuilding sdk workflow timeout setting

[Ismaël Mejía] [BEAM-12423] Upgrade pyarrow to support version 4.0.0 too


------------------------------------------
[...truncated 1.89 MB...]
        at 
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.spark.SparkException: Only one SparkContext may be 
running in this JVM (see SPARK-2243). To ignore this error, set 
spark.driver.allowMultipleContexts = true. The currently running SparkContext 
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:118)
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
FAILED
--------------------- >> end captured logging << ---------------------
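
For context: the SparkException above names its own workaround, the Spark 
property spark.driver.allowMultipleContexts, which must be set on the 
SparkConf before the context is created. A minimal sketch of that property 
(pyspark shown for illustration only; in this job the context is created 
inside the Beam Spark job server JVM by SparkContextFactory, and the 
property was removed in Spark 3, so stopping or reusing the first context 
is the real fix rather than allowing a second one):

    from pyspark import SparkConf, SparkContext

    # Hypothetical illustration of the escape hatch named in the error
    # message; it does not address why two contexts were created here.
    conf = (SparkConf()
            .setMaster("local[2]")
            .setAppName("allow-multiple-contexts-demo")
            .set("spark.driver.allowMultipleContexts", "true"))
    sc = SparkContext(conf=conf)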

======================================================================
ERROR: test_zetasql_generate_data 
(apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py>", line 161, in test_zetasql_generate_data
    assert_that(out, equal_to([(1, "foo", 3.14)]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py>", line 585, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 116, in run
    state = result.wait_until_finish()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 600, in wait_until_finish
    raise self._runtime_exception
RuntimeError: Pipeline 
BeamApp-jenkins-0601183303-d20ebd6f_ab00fc5a-2592-4bae-9498-3cfa9eb48aad failed 
in state FAILED: org.apache.spark.SparkException: Only one SparkContext may be 
running in this JVM (see SPARK-2243). To ignore this error, set 
spark.driver.allowMultipleContexts = true. The currently running SparkContext 
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:118)
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.31.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.31.0-SNAPSHOT.jar>'
 '43833']
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at 
localhost:43833'
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
apache_beam.utils.subprocess_server: INFO: b'Jun 01, 2021 6:32:54 PM 
org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external 
transforms: [beam:external:java:sql:v1, 
beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO: 
b'\tbeam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
root: DEBUG: Waiting for grpc channel to be ready at localhost:43833.
apache_beam.utils.subprocess_server: INFO: b'Jun 01, 2021 6:32:55 PM 
org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding 
'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Jun 01, 2021 6:32:56 PM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Jun 01, 2021 6:32:59 PM 
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner 
convertToBeamRelInternal'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO: 
b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT], 
expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1], 
str=[$t2], flt=[$t3])"
apache_beam.utils.subprocess_server: INFO: b'  BeamValuesRel(tuples=[[{ 0 }]])'
apache_beam.utils.subprocess_server: INFO: b''
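
The lines above show the Python SDK auto-starting the Java SQL expansion 
service from the pre-built snapshot jar and expanding SqlTransform through 
it on port 43833. When a service is already running, the transform can be 
pointed at it explicitly instead of spawning a fresh jar per test; a 
minimal sketch, assuming the standard SqlTransform API (the address is the 
one this log happened to pick, and the query is a placeholder):

    from apache_beam.transforms.sql import SqlTransform

    # Reuse a running expansion service rather than letting the SDK start
    # one; 'localhost:43833' is the port from the log above.
    sql = SqlTransform(
        "SELECT 1 AS `int`",  # placeholder query
        dialect="zetasql",
        expansion_service="localhost:43833")
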
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
root: INFO: Default Python SDK image for environment is 
apache/beam_python3.6_sdk:2.31.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
root: INFO: Default Python SDK image for environment is 
apache/beam_python3.6_sdk:2.31.0.dev
root: INFO: Python SDK container image set to 
"apache/beam_python3.6_sdk:2.31.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: 
==================== <function pack_combiners at 0x7f802bb528c8> 
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: 
['external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--Impulse\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Outp\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Boun\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamZetaSqlCalcRel_17-ParDo-Calc--ParMultiDo-Calc-\n
  
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Impulse_5\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2962-_6\n  
assert_that/Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Map-decode-_8\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_9\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-ToVoidKey_10\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_0_12\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_1_13\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Flatten_14\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-GroupByKey_15\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_16\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Unkey_17\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Match_18\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: 
==================== <function lift_combiners at 0x7f802bb52950> 
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: 
['external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--Impulse\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Outp\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Boun\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamZetaSqlCalcRel_17-ParDo-Calc--ParMultiDo-Calc-\n
  
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Impulse_5\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2962-_6\n  
assert_that/Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Map-decode-_8\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_9\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-ToVoidKey_10\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_0_12\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_1_13\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Flatten_14\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-GroupByKey_15\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_16\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Unkey_17\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Match_18\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: 
==================== <function sort_stages at 0x7f802bb530d0> 
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: 
['external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--Impulse\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Outp\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamValuesRel_13-Create-Values-Read-CreateSource--ParDo-Boun\n
  
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'external_9SqlTransform-beam-external-java-sql-v1--BeamZetaSqlCalcRel_17-ParDo-Calc--ParMultiDo-Calc-\n
  
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Impulse_5\n  
assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2962-_6\n  
assert_that/Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Create-Map-decode-_8\n  
assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_9\n  
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-ToVoidKey_10\n  
assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_0_12\n  
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-pair_with_1_13\n  
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Flatten_14\n  
assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-GroupByKey_15\n  
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Group-Map-_merge_tagged_vals_under_key-_16\n 
 assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Unkey_17\n  
assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that-Match_18\n  
assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner' 
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'environment_options' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project' 
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone' 
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 
'resource_hints' was already added
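
The "Runner option ... was already added" lines come from the portable 
runner merging the SDK's standard options with option definitions the job 
server also advertises; they are harmless DEBUG noise. For reference, a 
minimal sketch of pipeline options using the option names seen here (the 
values are assumptions; 8099 is the Beam job server's default port, and 
this run used the Spark job server started by the Gradle task):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Option names match the runner options listed in the log above.
    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",      # assumed default port
        "--environment_type=DOCKER",          # the Docker env set above
        "--environment_cache_millis=10000",
    ])
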
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
RUNNING
root: DEBUG: org.apache.spark.SparkException: Only one SparkContext may be 
running in this JVM (see SPARK-2243). To ignore this error, set 
spark.driver.allowMultipleContexts = true. The currently running SparkContext 
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:118)
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
        at 
org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2489)
        at 
org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2485)
        at scala.Option.foreach(Option.scala:257)
        at 
org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2485)
        at 
org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2574)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
        at 
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at 
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
        at 
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
        at 
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:118)
        at 
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.spark.SparkException: Only one SparkContext may be 
running in this JVM (see SPARK-2243). To ignore this error, set 
spark.driver.allowMultipleContexts = true. The currently running SparkContext 
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:118)
org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
FAILED
--------------------- >> end captured logging << ---------------------
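
For readers without the source checked out, the test body whose pipeline 
failed (sql_test.py line 161 in the traceback) has roughly this shape, 
reconstructed from the traceback and the BeamZetaSqlCalcRel plan 
(int/str/flt columns) in the captured logging; the exact query text is an 
assumption:

    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to
    from apache_beam.transforms.sql import SqlTransform

    # Sketch of test_zetasql_generate_data: a ZetaSQL query with no input
    # PCollection, checked against the expected row from the traceback.
    with TestPipeline() as p:
        out = p | SqlTransform(
            "SELECT CAST(1 AS INT64) AS `int`, 'foo' AS `str`, "
            "3.14 AS `flt`",
            dialect="zetasql")
        assert_that(out, equal_to([(1, "foo", 3.14)]))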

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 120.971s

FAILED (errors=5)

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql 
> FAILED

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 22145.
Stopping expansion service pid: 22148.

> Task :runners:spark:2:job-server:sparkJobServerCleanup
Stopping job server pid: 331.

FAILURE: Build completed with 6 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingJava'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingJava/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerJavaUsingPythonOnly'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/2/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPythonOnly/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 49s
183 actionable tasks: 133 executed, 46 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5rvrhds7yi4so

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
