See
<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/517/display/redirect?page=changes>
Changes:
[github] [BEAM-4374] Short IDs for the Python SDK (#11286)
------------------------------------------
[...truncated 965.65 KB...]
java.lang.Thread.run(Thread.java:748)
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image:
apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local,
we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image:
apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local,
we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function lift_combiners at 0x7ff6d23f4c08>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 35 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['ref_AppliedPTransform_Create/Impulse_3\n
Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2644>)_4\n
Create/FlatMap(<lambda at core.py:2644>):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/Map(decode)_16\n
Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Partition(CallableWrapperPartitionFn)/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn)_5\n
ExternalTransform(beam:transforms:xlang:test:partition)/Partition(CallableWrapperPartitionFn)/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda
at expansion_service_test.py:216>)_6\n
ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at
expansion_service_test.py:216>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda
at expansion_service_test.py:217>)_7\n
ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at
expansion_service_test.py:217>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Create/Impulse_20\n
check_even/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Create/FlatMap(<lambda at core.py:2644>)_21\n
check_even/Create/FlatMap(<lambda at core.py:2644>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Create/Map(decode)_23\n
check_even/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/WindowInto(WindowIntoFn)_24\n
check_even/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/ToVoidKey_25\n
check_even/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Group/pair_with_0_27\n
check_even/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Group/pair_with_1_28\n
check_even/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Group/Flatten_29\n
check_even/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Group/GroupByKey_30\n
check_even/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Group/Map(_merge_tagged_vals_under_key)_34\n
check_even/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Unkey_35\n
check_even/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_even/Match_36\n
check_even/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Create/Impulse_39\n
check_odd/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Create/FlatMap(<lambda at core.py:2644>)_40\n
check_odd/Create/FlatMap(<lambda at core.py:2644>):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Create/Map(decode)_42\n
check_odd/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/WindowInto(WindowIntoFn)_43\n
check_odd/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/ToVoidKey_44\n
check_odd/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Group/pair_with_0_46\n
check_odd/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Group/pair_with_1_47\n
check_odd/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Group/Flatten_48\n
check_odd/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Group/GroupByKey_49\n
check_odd/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Group/Map(_merge_tagged_vals_under_key)_53\n
check_odd/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_check_odd/Unkey_54\n
check_odd/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Match_55\n
check_odd/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'files_to_stage' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
RUNNING
root: DEBUG: org.apache.spark.SparkException: Only one SparkContext may be
running in this JVM (see SPARK-2243). To ignore this error, set
spark.driver.allowMultipleContexts = true. The currently running SparkContext
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2483)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2479)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2479)
at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2568)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: org.apache.spark.SparkException: Only one SparkContext may be
running in this JVM (see SPARK-2243). To ignore this error, set
spark.driver.allowMultipleContexts = true. The currently running SparkContext
was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
FAILED
--------------------- >> end captured logging << ---------------------
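
The SparkException captured above is Spark's one-context-per-JVM guard (SPARK-2243). As an illustration only, not Beam's SparkContextFactory logic, the usual way around it is to share a single context instead of constructing a second one; the allowMultipleContexts flag named in the message merely downgrades the failure to a warning and is discouraged. A minimal sketch, assuming a local pyspark install:

    # Sketch: SparkContext.getOrCreate() returns the already-running context
    # instead of tripping the assertNoOtherContextIsRunning check seen in the
    # trace above. Master and app name are illustrative.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster('local[2]').setAppName('single-context-demo')
    sc_a = SparkContext.getOrCreate(conf)
    sc_b = SparkContext.getOrCreate(conf)  # same object back, no SparkException
    assert sc_a is sc_b
    sc_a.stop()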
======================================================================
ERROR: test_prefix
(apache_beam.transforms.validate_runner_xlang_test.ValidateRunnerXlangTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/validate_runner_xlang_test.py", line 154, in test_prefix
CrossLanguageTestPipelines().run_prefix(test_pipeline)
File "https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/validate_runner_xlang_test.py", line 58, in run_prefix
assert_that(res, equal_to(['0a', '0b']))
File "https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py", line 525, in __exit__
self.run().wait_until_finish()
File "https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
state = result.wait_until_finish()
File "https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 550, in wait_until_finish
(self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline
BeamApp-jenkins-0402231730-fe0081a5_dedbc267-2062-44d0-82b4-de1fdf570f1d failed
in state FAILED: org.apache.spark.SparkException: Job 0 cancelled because
SparkContext was shut down
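
The traceback shape is worth unpacking: assert_that/equal_to attach a verification transform, and leaving the TestPipeline 'with' block is what calls run().wait_until_finish(), so the assertion surfaces as a pipeline failure. A minimal sketch of the same test pattern on a local runner, with the cross-language transform replaced by an illustrative in-process Map:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # Exiting the 'with' block triggers run().wait_until_finish(), the same
    # __exit__ chain shown in the traceback above.
    with TestPipeline() as p:
        res = (p
               | beam.Create(['a', 'b'])
               # Stand-in for ExternalTransform(beam:transforms:xlang:test:prefix)
               | 'TestLabel' >> beam.Map(lambda s: '0' + s))
        assert_that(res, equal_to(['0a', '0b']))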
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image:
apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local,
we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image:
apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local,
we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO:
==================== <function lift_combiners at 0x7ff6d23f4c08>
====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 21 [1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages:
['ref_AppliedPTransform_Create/Impulse_3\n
Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2644>)_4\n
Create/FlatMap(<lambda at core.py:2644>):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_Create/Map(decode)_16\n
Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'external_10_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:prefix)/TestLabel_3\n
ExternalTransform(beam:transforms:xlang:test:prefix)/TestLabel:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Impulse_20\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2644>)_21\n assert_that/Create/FlatMap(<lambda at
core.py:2644>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Map(decode)_23\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/ToVoidKey_25\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_0_27\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_1_28\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Flatten_29\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/GroupByKey_30\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Unkey_35\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Match_36\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
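
The stage names in this list come from the Python SDK's expansion of composite transforms: Create unfolds into Impulse, a FlatMap over a generator lambda (the core.py:2644 references), an optional Reshuffle, and Map(decode). A quick way to see that expansion, as a sketch against the public to_runner_api() proto:

    import apache_beam as beam

    p = beam.Pipeline()
    _ = p | beam.Create(['a', 'b'])
    # The runner-API proto lists every expanded sub-transform; the
    # unique_name values mirror the Create/Impulse, Create/FlatMap(...),
    # and Create/Map(decode) stages logged above.
    proto = p.to_runner_api()
    for transform in proto.components.transforms.values():
        print(transform.unique_name)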
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone'
was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'files_to_stage' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option
'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
RUNNING
root: DEBUG: java.lang.RuntimeException: org.apache.spark.SparkException: Job 0
cancelled because SparkContext was shut down
at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:58)
at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:75)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:102)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:90)
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:139)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:933)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:931)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:931)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2130)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2043)
at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.RDD.foreach(RDD.scala:970)
at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:351)
at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
at org.apache.beam.runners.spark.translation.BoundedDataset.action(BoundedDataset.java:124)
at org.apache.beam.runners.spark.translation.SparkTranslationContext.computeOutputs(SparkTranslationContext.java:82)
at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:126)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
root: ERROR: org.apache.spark.SparkException: Job 0 cancelled because
SparkContext was shut down
apache_beam.runners.portability.portable_runner: INFO: Job state changed to
FAILED
--------------------- >> end captured logging << ---------------------
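
From caller code, the state transitions logged above (STOPPED, STARTING, RUNNING, FAILED) are observed through the PipelineResult returned by run(); wait_until_finish() blocks until a terminal state, and the portable runner raises the RuntimeError seen in the traceback when that state is FAILED. A minimal sketch of that contract, using the default local runner with illustrative error handling:

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    p = beam.Pipeline()
    _ = p | beam.Create([1, 2, 3])
    result = p.run()                    # states stream back asynchronously
    state = result.wait_until_finish()  # blocks until a terminal state
    if state != PipelineState.DONE:
        # Mirrors portable_runner.py raising once the job reports FAILED.
        raise RuntimeError('Pipeline finished in state %s' % state)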
----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 38.294s
FAILED (errors=7)
> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:spark:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:spark:job-server:sparkJobServerCleanup
FAILURE: Build completed with 4 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:job-server:validatesCrossLanguageRunnerJavaUsingJava'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingJava/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 18m 30s
104 actionable tasks: 79 executed, 23 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/dey4mvd7i6zxo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]