See <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/4516/display/redirect?page=changes>

Changes:

[Elliotte Rusty Harold] use toMinutes

[Elliotte Rusty Harold] update test

[noreply] Change kafka table provider properties structure. (#14507)


------------------------------------------
[...truncated 647.42 KB...]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
ERROR:root:java.lang.NullPointerException
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:316 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
ERROR    root:portable_runner.py:569 java.lang.NullPointerException
_______________ SparkRunnerTest.test_windowed_pardo_state_timers _______________

self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest testMethod=test_windowed_pardo_state_timers>

    def test_windowed_pardo_state_timers(self):
>     self._run_pardo_state_timers(windowed=True)

apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:425: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:490: in _run_pardo_state_timers
    assert_that(actual, is_buffered_correctly)
apache_beam/pipeline.py:580: in __exit__
    self.result.wait_until_finish()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <apache_beam.runners.portability.portable_runner.PipelineResult object at 0x7fa836759400>
duration = None

    def wait_until_finish(self, duration=None):
      """
      :param duration: The maximum time in milliseconds to wait for the result of
      the execution. If None or zero, will wait until the pipeline finishes.
      :return: The result of the pipeline, i.e. PipelineResult.
      """
      def read_messages():
        # type: () -> None
        previous_state = -1
        for message in self._message_stream:
          if message.HasField('message_response'):
            logging.log(
                MESSAGE_LOG_LEVELS[message.message_response.importance],
                "%s",
                message.message_response.message_text)
          else:
            current_state = message.state_response.state
            if current_state != previous_state:
              _LOGGER.info(
                  "Job state changed to %s",
                  self._runner_api_state_to_pipeline_state(current_state))
              previous_state = current_state
          self._messages.append(message)
    
      message_thread = threading.Thread(
          target=read_messages, name='wait_until_finish_read')
      message_thread.daemon = True
      message_thread.start()
    
      if duration:
        state_thread = threading.Thread(
            target=functools.partial(self._observe_state, message_thread),
            name='wait_until_finish_state_observer')
        state_thread.daemon = True
        state_thread.start()
        start_time = time.time()
        duration_secs = duration / 1000
        while (time.time() - start_time < duration_secs and
               state_thread.is_alive()):
          time.sleep(1)
      else:
        self._observe_state(message_thread)
    
      if self._runtime_exception:
>       raise self._runtime_exception
E       RuntimeError: Pipeline test_windowed_pardo_state_timers_1618855633.7357237_64ef2e18-2788-43c5-b7c2-4ea68dc0b5ef failed in state FAILED: java.lang.NullPointerException

apache_beam/runners/portability/portable_runner.py:602: RuntimeError
----------------------------- Captured stderr call -----------------------------
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_aa3c20ba-affd-43ee-b73c-7a7234592b15.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_aa3c20ba-affd-43ee-b73c-7a7234592b15.ref_Environment_default_environment_1.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 artifacts for job_aa3c20ba-affd-43ee-b73c-7a7234592b15.null.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_aa3c20ba-affd-43ee-b73c-7a7234592b15.
21/04/19 18:07:14 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowed_pardo_state_timers_1618855633.7357237_64ef2e18-2788-43c5-b7c2-4ea68dc0b5ef
21/04/19 18:07:14 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation test_windowed_pardo_state_timers_1618855633.7357237_64ef2e18-2788-43c5-b7c2-4ea68dc0b5ef
21/04/19 18:07:14 ERROR org.apache.beam.runners.jobsubmission.JobInvocation: Error during job invocation test_windowed_pardo_state_timers_1618855633.7357237_64ef2e18-2788-43c5-b7c2-4ea68dc0b5ef.
java.lang.NullPointerException
        at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:120)
        at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
ERROR:root:java.lang.NullPointerException
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:316 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
ERROR    root:portable_runner.py:569 java.lang.NullPointerException
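
For context on the traceback above: wait_until_finish is the call that re-raises the job's stored exception once the Spark job reaches the FAILED state, which is how the NullPointerException surfaces as a RuntimeError in these tests. Below is a minimal, illustrative sketch of how that API is typically driven against a portable runner; the job endpoint and the pipeline contents are assumptions for illustration only, not part of this build.

    # Minimal sketch (assumptions: a Beam job server is already listening on
    # localhost:8099 and the locally built Python SDK container is available).
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',   # assumed endpoint, not from this log
        '--environment_type=LOOPBACK',
    ])

    p = beam.Pipeline(options=options)
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)

    result = p.run()
    # duration is in milliseconds; None (or 0) blocks until a terminal state.
    # If the job failed, the stored runtime exception is re-raised here, which is
    # exactly the RuntimeError shown in the test failures above.
    result.wait_until_finish(duration=60 * 1000)
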
________________________ SparkRunnerTest.test_windowing ________________________

self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest testMethod=test_windowing>

    def test_windowing(self):
      with self.create_pipeline() as p:
        res = (
            p
            | beam.Create([1, 2, 100, 101, 102])
            | beam.Map(lambda t: window.TimestampedValue(('k', t), t))
            | beam.WindowInto(beam.transforms.window.Sessions(10))
            | beam.GroupByKey()
            | beam.Map(lambda k_vs1: (k_vs1[0], sorted(k_vs1[1]))))
>       assert_that(res, equal_to([('k', [1, 2]), ('k', [100, 101, 102])]))

apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:777: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:580: in __exit__
    self.result.wait_until_finish()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <apache_beam.runners.portability.portable_runner.PipelineResult object at 0x7fa836486640>
duration = None

    def wait_until_finish(self, duration=None):
      """
      :param duration: The maximum time in milliseconds to wait for the result of
      the execution. If None or zero, will wait until the pipeline finishes.
      :return: The result of the pipeline, i.e. PipelineResult.
      """
      def read_messages():
        # type: () -> None
        previous_state = -1
        for message in self._message_stream:
          if message.HasField('message_response'):
            logging.log(
                MESSAGE_LOG_LEVELS[message.message_response.importance],
                "%s",
                message.message_response.message_text)
          else:
            current_state = message.state_response.state
            if current_state != previous_state:
              _LOGGER.info(
                  "Job state changed to %s",
                  self._runner_api_state_to_pipeline_state(current_state))
              previous_state = current_state
          self._messages.append(message)
    
      message_thread = threading.Thread(
          target=read_messages, name='wait_until_finish_read')
      message_thread.daemon = True
      message_thread.start()
    
      if duration:
        state_thread = threading.Thread(
            target=functools.partial(self._observe_state, message_thread),
            name='wait_until_finish_state_observer')
        state_thread.daemon = True
        state_thread.start()
        start_time = time.time()
        duration_secs = duration / 1000
        while (time.time() - start_time < duration_secs and
               state_thread.is_alive()):
          time.sleep(1)
      else:
        self._observe_state(message_thread)
    
      if self._runtime_exception:
>       raise self._runtime_exception
E       RuntimeError: Pipeline test_windowing_1618855634.3258946_4430f286-26ab-4ffd-8c29-f8d6eb993fef failed in state FAILED: java.lang.NullPointerException

apache_beam/runners/portability/portable_runner.py:602: RuntimeError
----------------------------- Captured stderr call -----------------------------
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_23916304-01f7-43e3-a09b-ab63464f9356.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_23916304-01f7-43e3-a09b-ab63464f9356.ref_Environment_default_environment_1.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 artifacts for job_23916304-01f7-43e3-a09b-ab63464f9356.null.
21/04/19 18:07:14 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_23916304-01f7-43e3-a09b-ab63464f9356.
21/04/19 18:07:14 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1618855634.3258946_4430f286-26ab-4ffd-8c29-f8d6eb993fef
21/04/19 18:07:14 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation test_windowing_1618855634.3258946_4430f286-26ab-4ffd-8c29-f8d6eb993fef
21/04/19 18:07:14 ERROR org.apache.beam.runners.jobsubmission.JobInvocation: Error during job invocation test_windowing_1618855634.3258946_4430f286-26ab-4ffd-8c29-f8d6eb993fef.
java.lang.NullPointerException
        at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:120)
        at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
ERROR:root:java.lang.NullPointerException
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:316 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
ERROR    root:portable_runner.py:569 java.lang.NullPointerException
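
The test_windowing failure above exercises session windows: with Sessions(10), timestamps 1 and 2 fall within the 10-second gap and merge into one session, while 100, 101, and 102 form a second session, giving the expected groups ('k', [1, 2]) and ('k', [100, 101, 102]). As an illustrative sketch only, the same logic from the test can be checked on the default local runner instead of the Spark job server:

    # Sketch: the session-window behaviour asserted by test_windowing, run locally.
    import apache_beam as beam
    from apache_beam.transforms import window
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
      res = (
          p
          | beam.Create([1, 2, 100, 101, 102])
          | beam.Map(lambda t: window.TimestampedValue(('k', t), t))
          | beam.WindowInto(window.Sessions(10))   # 10-second session gap
          | beam.GroupByKey()
          | beam.Map(lambda kv: (kv[0], sorted(kv[1]))))
      assert_that(res, equal_to([('k', [1, 2]), ('k', [100, 101, 102])]))
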
=============================== warnings summary ===============================
target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
========= 33 failed, 1 passed, 17 skipped, 1 warnings in 23.83 seconds =========
ERROR: InvocationError for command <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test apache_beam/runners/portability/spark_runner_test.py '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.30.0-SNAPSHOT.jar> --environment_type=LOOPBACK' (exited with code 1)
spark-runner-test run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
ERROR:   spark-runner-test: commands failed

> Task :sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 58s
70 actionable tasks: 46 executed, 24 from cache

Publishing build scan...
https://gradle.com/s/gbhenwghx25ye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
