See
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5084/display/redirect>
Changes:
------------------------------------------
[...truncated 649.89 KB...]
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "generate_metrics.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s1"
},
"serialized_fn": "ref_AppliedPTransform_generate_metrics_4",
"user_name": "generate_metrics"
}
},
{
"kind": "ParallelWrite",
"name": "s3",
"properties": {
"display_data": [],
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"format": "pubsub",
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s2"
},
"pubsub_topic":
"projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output0b1e9a94-5f1b-47bb-a2f4-fcc0b54649fd",
"user_name": "dump_to_pub/Write/NativeWrite"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
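For context, the job graph above describes a three-step streaming pipeline: a Pub/Sub read (s1), a ParDo named generate_metrics (s2), and a Pub/Sub write (s3, dump_to_pub/Write/NativeWrite). A minimal sketch of such a pipeline in the Beam Python SDK is shown below; the topic names and the DoFn body are illustrative placeholders, not the actual test code:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative topics; the real test creates uniquely suffixed ones.
    INPUT_TOPIC = 'projects/<project>/topics/exercise_streaming_metrics_topic_input'
    OUTPUT_TOPIC = 'projects/<project>/topics/exercise_streaming_metrics_topic_output'

    class GenerateMetricsDoFn(beam.DoFn):
        """Hypothetical stand-in for the test's generate_metrics step."""
        def process(self, element):
            # The real step updates user-defined Beam metrics here.
            yield element

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(topic=INPUT_TOPIC)  # s1
         | 'generate_metrics' >> beam.ParDo(GenerateMetricsDoFn())        # s2
         | 'dump_to_pub' >> beam.io.WriteToPubSub(OUTPUT_TOPIC))          # s3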
root: INFO: Create job: <Job
createTime: '2019-11-14T12:36:05.237527Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2019-11-14_04_36_03-6083148794610535148'
location: 'us-central1'
name: 'beamapp-jenkins-1114123552-747210'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2019-11-14T12:36:05.237527Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-11-14_04_36_03-6083148794610535148]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_03-6083148794610535148?project=apache-beam-testing
root: INFO: Job 2019-11-14_04_36_03-6083148794610535148 is in state
JOB_STATE_RUNNING
root: INFO: 2019-11-14T12:36:07.621Z: JOB_MESSAGE_DETAILED: Checking
permissions granted to controller Service Account.
root: INFO: 2019-11-14T12:36:08.507Z: JOB_MESSAGE_BASIC: Worker configuration:
n1-standard-4 in us-central1-f.
root: INFO: 2019-11-14T12:36:09.092Z: JOB_MESSAGE_DETAILED: Expanding
SplittableParDo operations into optimizable parts.
root: INFO: 2019-11-14T12:36:09.094Z: JOB_MESSAGE_DETAILED: Expanding
CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-11-14T12:36:09.101Z: JOB_MESSAGE_DETAILED: Expanding
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-11-14T12:36:09.109Z: JOB_MESSAGE_DETAILED: Expanding
SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-11-14T12:36:09.111Z: JOB_MESSAGE_DETAILED: Expanding
GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-11-14T12:36:09.114Z: JOB_MESSAGE_DEBUG: Annotating graph with
Autotuner information.
root: INFO: 2019-11-14T12:36:09.129Z: JOB_MESSAGE_DETAILED: Fusing adjacent
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-11-14T12:36:09.132Z: JOB_MESSAGE_DETAILED: Fusing consumer
generate_metrics into ReadFromPubSub/Read
root: INFO: 2019-11-14T12:36:09.134Z: JOB_MESSAGE_DETAILED: Fusing consumer
dump_to_pub/Write/NativeWrite into generate_metrics
root: INFO: 2019-11-14T12:36:09.141Z: JOB_MESSAGE_DEBUG: Adding StepResource
setup and teardown to workflow graph.
root: INFO: 2019-11-14T12:36:09.168Z: JOB_MESSAGE_DEBUG: Adding workflow start
and stop steps.
root: INFO: 2019-11-14T12:36:09.218Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-11-14T12:36:09.324Z: JOB_MESSAGE_DEBUG: Executing wait step
start2
root: INFO: 2019-11-14T12:36:09.337Z: JOB_MESSAGE_DEBUG: Starting worker pool
setup.
root: INFO: 2019-11-14T12:36:09.342Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-11-14T12:36:11.693Z: JOB_MESSAGE_BASIC: Executing operation
ReadFromPubSub/Read+generate_metrics+dump_to_pub/Write/NativeWrite
root: INFO: 2019-11-14T12:36:20.980Z: JOB_MESSAGE_WARNING: Your project already
contains 100 Dataflow-created metric descriptors and Stackdriver will not
create new Dataflow custom metrics for this job. Each unique user-defined
metric name (independent of the DoFn in which it is defined) produces a new
metric descriptor. To delete old / unused metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
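As the warning explains, each unique user-defined metric name creates one Stackdriver metric descriptor, and the project has hit the limit. Stale descriptors can be listed and deleted through the Monitoring v3 API linked above; a minimal sketch with the Google API discovery client follows (the metric-type filter is an assumption, so inspect the listed output before deleting anything):

    from googleapiclient import discovery

    PROJECT = 'projects/apache-beam-testing'
    monitoring = discovery.build('monitoring', 'v3')

    # Custom Dataflow metrics are assumed to live under the
    # custom.googleapis.com prefix; verify against the listed output.
    resp = monitoring.projects().metricDescriptors().list(
        name=PROJECT,
        filter='metric.type = starts_with("custom.googleapis.com")',
    ).execute()

    for md in resp.get('metricDescriptors', []):
        print(md['name'])
        # Uncomment to delete a descriptor once confirmed unused:
        # monitoring.projects().metricDescriptors().delete(name=md['name']).execute()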
root: INFO: 2019-11-14T12:36:39.881Z: JOB_MESSAGE_DEBUG: Executing input step
topology_init_attach_disk_input_step
root: INFO: 2019-11-14T12:36:39.882Z: JOB_MESSAGE_DETAILED: Checking
permissions granted to controller Service Account.
root: INFO: 2019-11-14T12:36:40.676Z: JOB_MESSAGE_BASIC: Worker configuration:
n1-standard-4 in us-central1-f.
root: INFO: 2019-11-14T12:36:52.324Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: WARNING: Timing out on waiting for job
2019-11-14_04_36_03-6083148794610535148 after 60 seconds
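This timeout is expected behavior of the test harness, not a job failure by itself: a streaming job never finishes on its own, so the test waits a bounded time and then checks its assertions. In the Beam Python SDK the pattern looks roughly like this (the 60-second bound matches the warning above; the rest is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(streaming=True))
    # ... build the streaming pipeline as in the sketch above ...
    result = p.run()

    # wait_until_finish takes a duration in milliseconds; for a streaming
    # job it simply returns after the bound instead of blocking forever.
    result.wait_until_finish(duration=60 * 1000)
    # The harness would then verify metrics and cancel the job.
    result.cancel()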
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------
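The google.auth and urllib3 DEBUG lines in the captured log are Application Default Credentials resolving the service account from the GCE metadata server (metadata.google.internal). In client code the whole exchange is hidden behind one call; a minimal sketch:

    import google.auth

    # On GCE/Dataflow this probes the metadata server exactly as in the
    # DEBUG lines above and returns the attached service account's
    # credentials plus the project id.
    credentials, project_id = google.auth.default()
    print(project_id)  # e.g. 'apache-beam-testing'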
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_03-6083148794610535148?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_39_53-10382941511261364010?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_48_06-3869301231493290259?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_00-4703701126939857491?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_08-18306182445778208444?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_03-8427434135949482194?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_43_52-6339302246416370626?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_51_59-15477622580701012998?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_59_43-6688981187662912523?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_35_59-2157872546678394661?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_13-2207198706619950592?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_01-12142459032801693249?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_43_58-17938719062304673562?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_01-11747917204616969228?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_09-8765951026323428344?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_01-17818886890293702764?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_19-864794886441319303?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_01-188223436287008350?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_43_50-5768474887411681135?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 19 tests in 1950.364s
FAILED (errors=1)
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_37_01-8480721275842032581?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_49-10628881534503321096?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_58-13586007500537450815?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_57-18220272700391309908?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_52_40-16753875097295488646?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_05_01_09-3207326212540512822?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_59-292124824033869437?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_45_02-6502779518750252457?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_58-4517681700020006905?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_45_27-16777695156200682467?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_59-5344406437957276136?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_32-9222378029725286389?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_52_25-3649663656418191066?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_37_00-11018404492710422398?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_52-6971656248989384699?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_36_59-13382451399412910095?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_45_08-10131580624076586828?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_37_00-9236123149880167635?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-14_04_44_52-8523256350877108253?project=apache-beam-testing
test_multiple_empty_outputs
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest)
... ok
test_par_do_with_multiple_outputs_and_using_return
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok
test_as_singleton_without_unique_labels
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok
test_as_list_and_as_dict_side_inputs
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_window_assignment_idempotency
(apache_beam.transforms.window_test.WindowTest) ... ok
test_window_assignment_through_multiple_gbk_idempotency
(apache_beam.transforms.window_test.WindowTest) ... ok
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 19 tests in 1927.963s
OK
FAILURE: Build completed with 4 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
line: 78
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 78
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 111
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 101
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 8m 31s
74 actionable tasks: 57 executed, 17 from cache
Publishing build scan...
https://gradle.com/s/7pe7teurubvju
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure