See
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/942/display/redirect?page=changes>
Changes:
[zyichi] Allow Nexmark launcher to publish human-readable events to pubsub.
[irvi.fa] [BEAM-10753] Add Slack link invitation on README
[Luke Cwik] [BEAM-10756] Fix empty pull response to not ack and to not throw
[noreply] Merge pull request #12597: [BEAM-10685] Added integration test for
[ettarapp] clarifying unclear comments
[noreply] Update README.md
[noreply] [BEAM-3301] Adding SDF Go Dataflow translation. (#12629)
[noreply] [BEAM-10752] Use TestPubsubSignal in PubsubToBigqueryIT (#12625)
------------------------------------------
[...truncated 5.27 MB...]
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s20"
},
"serialized_fn": "ref_AppliedPTransform_assert_that/Match_30",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: u'2020-08-19T18:59:48.351369Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-08-19_11_59_47-2065660926116299971'
location: u'us-central1'
name: u'beamapp-jenkins-0819185938-071243'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-08-19T18:59:48.351369Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-08-19_11_59_47-2065660926116299971]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-08-19_11_59_47-2065660926116299971
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_59_47-2065660926116299971?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely
for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-08-19_11_59_47-2065660926116299971 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:47.347Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-08-19_11_59_47-2065660926116299971.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:47.347Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-08-19_11_59_47-2065660926116299971. The number of workers will be between
1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:47.347Z:
JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine.
Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:52.876Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:53.652Z:
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:53.715Z:
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:53.933Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:53.980Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.008Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.044Z:
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.088Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.179Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.287Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.352Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.394Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.417Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/WriteStream, through flatten
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.450Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream
into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.486Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.523Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.573Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.601Z:
JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.630Z:
JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.661Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.684Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.711Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.749Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets
into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.784Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.816Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.851Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.876Z:
JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2826>)
into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.913Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at
core.py:2826>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.935Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into
assert_that/Create/FlatMap(<lambda at core.py:2826>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.965Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:54.990Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at
core.py:2826>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.026Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.058Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.119Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.184Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.243Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:55.303Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:57.873Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:57.928Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T18:59:57.966Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:00:22.919Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that
the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:00:24.553Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:01:01.590Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:01:01.625Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:02.063Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:02.125Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:02.166Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:02.221Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:02.255Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:56.067Z:
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on
low average worker CPU utilization, and the pipeline having sufficiently low
backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:56.112Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:07:56.146Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-08-19_11_59_47-2065660926116299971 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:13:33.317Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:13:33.416Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:13:33.470Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:13:33.525Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:13:33.558Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:14:15.329Z:
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on
low average worker CPU utilization, and the pipeline having sufficiently low
backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:14:15.394Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-19T19:14:15.484Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-08-19_11_51_16-4637314372768157291 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps
(apache_beam.transforms.util_test.ReshuffleTest) ... ok
======================================================================
ERROR: test_iterable_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py", line 196, in test_iterable_side_input
    pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py", line 521, in run
    allow_proto_holders=True).run(False)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py", line 534, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 479, in run_pipeline
    artifacts=environments.python_sdk_dependencies(options)))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/environments.py", line 613, in python_sdk_dependencies
    staged_name in stager.Stager.create_job_resources(options, tmp_dir))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 172, in create_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 558, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/utils/processes.py", line 99, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/utils/processes.py", line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1
Pip install failed for package: -r
Output from execution of subprocess: DEPRECATION: Python 2.7 reached the end
of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is
no longer maintained. pip 21.0 will drop support for Python 2.7 in January
2021. More details about Python 2 support in pip can be found at
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting pyhamcrest!=1.10.0,<2.0.0
File was already downloaded
/tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz
Collecting mock<3.0.0
File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
ERROR: Command errored out with exit status 1:
command:
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/bin/python
-c 'import sys, setuptools, tokenize; sys.argv[0] =
'"'"'/tmp/pip-download-UNwiY5/mock/setup.py'"'"';
__file__='"'"'/tmp/pip-download-UNwiY5/mock/setup.py'"'"';f=getattr(tokenize,
'"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"',
'"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info
--egg-base /tmp/pip-pip-egg-info-0I0slc
cwd: /tmp/pip-download-UNwiY5/mock/
Complete output (12 lines):
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/manager.py:395:
RuntimeWarning: Unable to load plugin beam_test_plugin =
test_config:BeamTestPlugin: No module named test_config
RuntimeWarning)
ERROR:root:Error parsing
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pbr/core.py", line 96, in pbr
    attrs = util.cfg_to_args(path, dist.script_args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pbr/util.py", line 273, in cfg_to_args
    kwargs = setup_cfg_to_setup_kwargs(config, script_args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/pbr/util.py", line 335, in setup_cfg_to_setup_kwargs
    description_file = io.open(filename, encoding='utf-8')
IOError: [Errno 2] No such file or directory: 'README.rst'
error in setup command: Error parsing /tmp/pip-download-UNwiY5/mock/setup.cfg: IOError: [Errno 2] No such file or directory: 'README.rst'
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check
the logs for full command output.
-------------------- >> begin captured logging << --------------------
apache_beam.runners.portability.stager: INFO: Executing command:
['https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/bin/python',
'-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r',
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py27.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2452.019s
FAILED (errors=1)
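The RuntimeError in the failing test above is Beam's subprocess wrapper folding pip's captured stderr/stdout into the exception message, which is why the real cause (pbr failing on the missing README.rst in the mock sdist) is visible in the log at all. A minimal sketch of that pattern, with hypothetical names and not Beam's actual `apache_beam.utils.processes` implementation:

```python
import subprocess
import traceback


def check_output_with_context(args):
    """Run a command, raising a RuntimeError that includes its output.

    Plain subprocess.check_output only reports "returned non-zero exit
    status N"; re-raising with error.output attached surfaces the
    subprocess's own error text (e.g. pip's "No such file or directory:
    'README.rst'") directly in the CI log.
    """
    try:
        # stderr=STDOUT merges both streams so error text is captured too.
        return subprocess.check_output(args, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        raise RuntimeError(
            'Full traceback: {}\nCommand failed near argument: {}\n'
            'Output from execution of subprocess: {}'.format(
                traceback.format_exc(), args[-1], error.output))
```

The index into `args` in the original log (`args[0][6]`) picks out the `-r` flag of the pip command; the sketch simply reports the last argument instead.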
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_03-17928302912384551122?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_42_37-13870694828124691244?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_51_16-4637314372768157291?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_02-1777669174619017140?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_42_36-18308924120840762979?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_51_16-15729121810703375010?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_59_47-2065660926116299971?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_02-2590283008582156702?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_41_55-6462671989878904527?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_49_30-3011089566372207975?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_01-842684077288773637?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_44_21-6057857156988794721?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_03-3792594628270338755?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_42_36-960974216227116747?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_50_12-2887463791027147253?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_00-797747120068818378?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_42_35-10502759086909132774?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_50_56-1782863025528057357?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_04-13196309150460909240?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_44_30-7659070136748731510?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_34_01-13096680862334625095?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_42_35-10645879251236557957?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_51_01-17802348734991135227?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 146
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 175
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 14m 8s
65 actionable tasks: 47 executed, 18 from cache
Publishing build scan...
https://gradle.com/s/clsxcoadlmsio
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]