See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/2462/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #28656: Update Google Cloud Java Libraries BOM from


------------------------------------------
[...truncated 202.23 KB...]
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', ...],)
kwargs = {'stderr': -2, 'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'Collecting pyhamcrest!=1.10.0,<2.0.0 (from -r /tmp/tmpw_5xz1ve/tmp_requirements.txt (line 1))\n  Using cached PyHamc...092db01ec946d6bee086ab8c\n             Got        e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\n\n'
stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False,
            **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those
        attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order
        to capture them, or pass capture_output=True to capture both.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return
        code in the returncode attribute, and output & stderr attributes if
        those streams were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this
        argument you may not also use the Popen constructor's "stdin" argument,
        as it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input"
        should be bytes, and the stdout and stderr will be bytes. If in text
        mode, any "input" should be a string, and stdout and stderr will be
        strings decoded according to locale encoding, or by "encoding" if set.
        Text mode is triggered by setting any of text, encoding, errors or
        universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpw_5xz1ve/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp311', '--platform', 'manylinux2014_x86_64']' returned non-zero exit status 1.

/usr/lib/python3.11/subprocess.py:571: CalledProcessError
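[Editor's note: for readers unfamiliar with the stdlib code quoted above, `check_output` delegates to `run(..., check=True)`, so any non-zero exit status surfaces as a `CalledProcessError` carrying the return code and any captured output. A minimal sketch, not taken from this log, assuming a POSIX system with the conventional `false` executable:]

```python
import subprocess

# run() with check=True raises CalledProcessError on a non-zero exit status;
# the exception carries the return code and (when captured) the output.
try:
    subprocess.run(["false"], check=True, capture_output=True)
except subprocess.CalledProcessError as exc:
    print(exc.returncode)
```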

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT testMethod=test_bigquery_tornadoes_it>

    @pytest.mark.examples_postcommit
    @pytest.mark.it_postcommit
    def test_bigquery_tornadoes_it(self):
      test_pipeline = TestPipeline(is_integration_test=True)
    
      # Set extra options to the pipeline for test purpose
      project = test_pipeline.get_option('project')
    
      dataset = 'BigQueryTornadoesIT'
      table = 'monthly_tornadoes_%s' % int(round(time.time() * 1000))
      output_table = '.'.join([dataset, table])
      query = 'SELECT month, tornado_count FROM `%s`' % output_table
    
      pipeline_verifiers = [
          PipelineStateMatcher(),
          BigqueryMatcher(
              project=project, query=query, checksum=self.DEFAULT_CHECKSUM)
      ]
      extra_opts = {
          'output': output_table,
          'on_success_matcher': all_of(*pipeline_verifiers)
      }
    
      # Register cleanup before pipeline execution.
      # Note that actual execution happens in reverse order.
      self.addCleanup(utils.delete_bq_table, project, dataset, table)
    
      # Get pipeline options from command argument: --test-pipeline-options,
      # and start pipeline job by calling pipeline main function.
>     bigquery_tornadoes.run(test_pipeline.get_full_options_as_args(**extra_opts))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py:71: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/cookbook/bigquery_tornadoes.py:90: in run
    with beam.Pipeline(argv=pipeline_args) as p:
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:560: in run
    self._options).run(False)
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:240: in create_job_resources
    (
apache_beam/utils/retry.py:275: in wrapper
    return fun(*args, **kwargs)
apache_beam/runners/portability/stager.py:763: in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', ...],)
kwargs = {'stderr': -2}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
>         raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
E         RuntimeError: Full traceback: Traceback (most recent call last):
E           File "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py", line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           File "/usr/lib/python3.11/subprocess.py", line 466, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           File "/usr/lib/python3.11/subprocess.py", line 571, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpw_5xz1ve/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp311', '--platform', 'manylinux2014_x86_64']' returned non-zero exit status 1.
E          
E          Pip install failed for package: -r
E          Output from execution of subprocess: b'Collecting pyhamcrest!=1.10.0,<2.0.0 (from -r /tmp/tmpw_5xz1ve/tmp_requirements.txt (line 1))\n  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)\nERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.\n    pyhamcrest!=1.10.0,<2.0.0 from https://files.pythonhosted.org/packages/92/1e/a87588eb4e301f7d7a945f7ca6aa430d9fc3f30022aa3984691f6e310162/PyHamcrest-1.10.1-py3-none-any.whl (from -r /tmp/tmpw_5xz1ve/tmp_requirements.txt (line 1)):\n        Expected sha256 d31a6793c6e5fa47137d7be3e796df3645b0afa0092db01ec946d6bee086ab8c\n              Got        e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\n\n'

apache_beam/utils/processes.py:94: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpw_5xz1ve/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp311', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.io.gcp.tests.utils:utils.py:93 Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1697482856244.
=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py311-xdist.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it - RuntimeError: Full traceback: Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py", line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/subprocess.py", line 466, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596099/bin/python3.11', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpw_5xz1ve/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp311', '--platform', 'manylinux2014_x86_64']' returned non-zero exit status 1.
 
 Pip install failed for package: -r
 Output from execution of subprocess: b'Collecting pyhamcrest!=1.10.0,<2.0.0 (from -r /tmp/tmpw_5xz1ve/tmp_requirements.txt (line 1))\n  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)\nERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.\n    pyhamcrest!=1.10.0,<2.0.0 from https://files.pythonhosted.org/packages/92/1e/a87588eb4e301f7d7a945f7ca6aa430d9fc3f30022aa3984691f6e310162/PyHamcrest-1.10.1-py3-none-any.whl (from -r /tmp/tmpw_5xz1ve/tmp_requirements.txt (line 1)):\n        Expected sha256 d31a6793c6e5fa47137d7be3e796df3645b0afa0092db01ec946d6bee086ab8c\n              Got        e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\n\n'
ERROR apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it - apache_beam.io.gcp.tests.utils.GcpTestIOError: BigQuery table does not exist: apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1697482856244
= 1 failed, 14 passed, 18 skipped, 13 warnings, 1 error in 1427.00s (0:23:46) ==
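
[Editor's note on the root cause: the "Got" digest in the pip hash-mismatch error above is exactly the SHA-256 of empty input, which suggests pip hashed a zero-byte (truncated or corrupted) cached PyHamcrest wheel rather than a genuinely tampered package. This can be checked directly:]

```python
import hashlib

# SHA-256 of zero bytes; compare against the "Got" digest reported by pip.
empty_digest = hashlib.sha256(b"").hexdigest()
print(empty_digest)
# e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```

[If that is the cause, clearing /tmp/dataflow-requirements-cache on the worker would likely let the next run re-download an intact wheel.]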

> Task :sdks:python:test-suites:dataflow:py311:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 220

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 29m 7s
11 actionable tasks: 10 executed, 1 from cache

Publishing build scan...
https://ge.apache.org/s/tabb7i2ukrjv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
