See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/2282/display/redirect?page=changes>

Changes:

[noreply] Update release notes for 2.29.0


------------------------------------------
[...truncated 1.55 MB...]
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        ... 3 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 3: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/transforms/sql_test.py", line 26, in <module>
    import pytest
ModuleNotFoundError: No module named 'pytest'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 292, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 365, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 610, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 641, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 470, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 869, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 926, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 925, in <listcomp>
    get_operation(transform_id))) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 905, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 905, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 910, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1199, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1547, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1583, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/transforms/sql_test.py", line 26, in <module>
    import pytest
ModuleNotFoundError: No module named 'pytest'

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:180)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        ... 3 more

root: ERROR: java.lang.RuntimeException: Error received from SDK harness for instruction 3: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/transforms/sql_test.py", line 26, in <module>
    import pytest
ModuleNotFoundError: No module named 'pytest'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 292, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 365, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 610, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 641, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 470, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 869, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 926, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 925, in <listcomp>
    get_operation(transform_id))) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 905, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 907, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 905, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 813, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 910, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1199, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1547, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1583, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/transforms/sql_test.py", line 26, in <module>
    import pytest
ModuleNotFoundError: No module named 'pytest'

apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 304.029s

FAILED (errors=2)

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED

> Task :runners:spark:2:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 25071.
Stopping expansion service pid: 25074.

> Task :runners:spark:2:job-server:sparkJobServerCleanup
Stopping job server pid: 22984.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 20s
183 actionable tasks: 133 executed, 46 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/fwi2ienffo3ce

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

