See
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/923/display/redirect?page=changes>
Changes:
[Luke Cwik] [BEAM-10688] Euphoria assumes that all type descriptors are
resolvable
[Luke Cwik] [BEAM-10670] Use fraction of remainder if consumed fraction is
unknown
[Luke Cwik] [BEAM-10670] Improve splitting logic to prefer splits upto the the
[Luke Cwik] [BEAM-10670] Fix passing forward the self-checkpoint from the
[noreply] [BEAM-9547] Implement some methods for deferred Series. (#12534)
------------------------------------------
[...truncated 5.26 MB...]
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "_equal"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
},
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s20"
},
"serialized_fn": "ref_AppliedPTransform_assert_that/Match_30",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: u'2020-08-15T01:05:16.664632Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-08-14_18_05_15-4080485319493451453'
location: u'us-central1'
name: u'beamapp-jenkins-0815010506-296338'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-08-15T01:05:16.664632Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-08-14_18_05_15-4080485319493451453]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-08-14_18_05_15-4080485319493451453
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_18_05_15-4080485319493451453?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely
for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-08-14_18_05_15-4080485319493451453 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:15.289Z:
JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine.
Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:15.289Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-08-14_18_05_15-4080485319493451453. The number of workers will be between
1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:15.289Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-08-14_18_05_15-4080485319493451453.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:19.942Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.629Z:
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.652Z:
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.730Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.764Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.799Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.844Z:
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.887Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:20.998Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.113Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.167Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.212Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.244Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/WriteStream, through flatten
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.283Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream
into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.313Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.372Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.407Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.439Z:
JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.473Z:
JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.509Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.545Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.576Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.600Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets
into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.638Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.677Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.713Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.754Z:
JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2826>)
into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.785Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at
core.py:2826>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.825Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into
assert_that/Create/FlatMap(<lambda at core.py:2826>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.865Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.901Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at
core.py:2826>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.930Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:21.959Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:22.004Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:22.038Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:22.075Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:22.108Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:24.343Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:24.392Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:24.435Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:34.008Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
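The warning above explains the mechanism: each unique user-defined metric name mints a new descriptor, and once the project holds 100 Dataflow-created descriptors, Stackdriver stops creating new ones. Pruning old descriptors through the Monitoring API would look roughly like the minimal sketch below, assuming the pre-2.0 google-cloud-monitoring client matching this log's Python 2.7 environment; the project id and the custom.googleapis.com/ prefix are illustrative assumptions, not taken from this log:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = client.project_path("my-gcp-project")  # hypothetical project

    # List only user-defined (custom) descriptors, then delete each one.
    # The type prefix in this filter is an assumption for illustration.
    descriptor_filter = 'metric.type = starts_with("custom.googleapis.com/")'
    for descriptor in client.list_metric_descriptors(
            project_name, filter_=descriptor_filter):
        client.delete_metric_descriptor(descriptor.name)
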
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:05:49.836Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that
the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:06:27.432Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:06:27.540Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:11:27.565Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:11:27.646Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:11:27.676Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:11:27.704Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:11:27.736Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:12:15.512Z:
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on
low average worker CPU utilization, and the pipeline having sufficiently low
backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:12:15.565Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-15T01:12:15.612Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-08-14_18_05_15-4080485319493451453 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
======================================================================
ERROR: Runs streaming Dataflow job and verifies that user metrics are reported
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_streaming_metrics_pipeline_test.py", line 76, in setUp
    self.input_topic.name)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/pubsub_v1/_gapic.py", line 40, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw) # noqa
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/pubsub_v1/gapic/subscriber_client.py", line 439, in create_subscription
    request, retry=retry, timeout=timeout, metadata=metadata
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/six.py", line 738, in raise_from
    raise value
ResourceExhausted: 429 Your project has exceeded a limit:
(type="subscriptions-per-project", current=10000, maximum=10000).
-------------------- >> begin captured logging << --------------------
google.auth._default: DEBUG: Checking None for explicit credentials as part of
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth
process...
google.auth._default: DEBUG: No App Engine library was found so cannot
authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 221
google.auth._default: DEBUG: Checking None for explicit credentials as part of
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth
process...
google.auth._default: DEBUG: No App Engine library was found so cannot
authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 221
--------------------- >> end captured logging << ---------------------
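At the fixture level, registering the deletion with unittest's addCleanup keeps a failing run from leaking its subscription in the first place. A sketch under the same pre-2.0 Pub/Sub client assumption (the class, project, topic, and subscription names are hypothetical, not the test's actual code):

    import unittest

    from google.cloud import pubsub_v1

    class StreamingMetricsTest(unittest.TestCase):
        # Hypothetical stand-in for the setUp at line 76 of the traceback.
        def setUp(self):
            self.subscriber = pubsub_v1.SubscriberClient()
            self.sub_path = self.subscriber.subscription_path(
                "my-gcp-project", "beamapp-input-sub")
            self.subscriber.create_subscription(
                self.sub_path, "projects/my-gcp-project/topics/beamapp-input")
            # Registered right after creation: addCleanup still runs when a
            # later setUp step or the test body fails, so the subscription
            # cannot leak and slowly exhaust the per-project quota.
            self.addCleanup(self.subscriber.delete_subscription, self.sub_path)
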
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py27.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 1981.909s
FAILED (errors=1)
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_43-7619780989977771338?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_48_14-9321412400040695747?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_56_39-11497762367432892351?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_18_05_15-4080485319493451453?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_45-3178901820521395188?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_47_25-3276458250327492080?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_55_46-2139997719142111578?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_44-613383891648937501?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_48_14-13513029610767278821?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_47-15055453421888892453?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_47_30-3398938505571489717?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_54_46-171640633285444119?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_43-2554452871484579530?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_48_09-5958195668841176831?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_45-4684731116773864216?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_47_17-2476780533437668563?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_54_37-7952477232672941115?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_45-591302360421614386?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_47_18-11746670671879749814?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_55_49-5999382574662268400?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_39_43-14455970737641100221?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_47_21-7971679903524092256?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_55_53-5382887801545461811?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 173
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 11m 24s
65 actionable tasks: 47 executed, 18 from cache
Publishing build scan...
https://gradle.com/s/q6ceptjxrlisk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]