See
<https://builds.apache.org/job/beam_PerformanceTests_Python/763/display/redirect?page=changes>
Changes:
[iemejia] [BEAM-3187] Ensure that teardown is called in case of Exception on
[iemejia] [BEAM-3187] Enable PardoLifecycleTest for the Spark runner
------------------------------------------
[...truncated 90.47 KB...]
{
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
}
]
},
"output_name": "out",
"user_name":
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s8"
},
"user_name":
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)"
}
},
{
"kind": "CollectionToSingleton",
"name": "SideInput-s16",
"properties": {
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
}
]
},
"output_name": "out",
"user_name":
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s14"
},
"user_name":
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)"
}
},
{
"kind": "ParallelDo",
"name": "s17",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "_finalize_write"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {
"SideInput-s15": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "SideInput-s15"
},
"SideInput-s16": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "SideInput-s16"
}
},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "write/Write/WriteImpl/FinalizeWrite.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn": "<string of 2420 bytes>",
"user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: u'2018-01-07T12:10:37.910596Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2018-01-07_04_10_37-17261477608655891931'
location: u'us-central1'
name: u'beamapp-jenkins-0107121034-978576'
projectId: u'apache-beam-testing'
stageStates: []
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-01-07_04_10_37-17261477608655891931]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-01-07_04_10_37-17261477608655891931?project=apache-beam-testing
root: INFO: Job 2018-01-07_04_10_37-17261477608655891931 is in state
JOB_STATE_PENDING
root: INFO: 2018-01-07T12:10:37.312Z: JOB_MESSAGE_DETAILED: (ef8d166e0a4b27d7):
Autoscaling is enabled for job 2018-01-07_04_10_37-17261477608655891931. The
number of workers will be between 1 and 15.
root: INFO: 2018-01-07T12:10:37.330Z: JOB_MESSAGE_DETAILED: (ef8d166e0a4b2a56):
Autoscaling was automatically enabled for job
2018-01-07_04_10_37-17261477608655891931.
root: INFO: 2018-01-07T12:10:39.443Z: JOB_MESSAGE_DETAILED: (d77250a7fe5821cc):
Checking required Cloud APIs are enabled.
root: INFO: 2018-01-07T12:10:40.484Z: JOB_MESSAGE_DETAILED: (d77250a7fe582066):
Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-01-07T12:10:40.497Z: JOB_MESSAGE_DEBUG: (d77250a7fe582b87):
Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey
not followed by a combiner.
root: INFO: 2018-01-07T12:10:40.506Z: JOB_MESSAGE_DEBUG: (d77250a7fe582d9d):
Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2018-01-07T12:10:40.515Z: JOB_MESSAGE_DETAILED: (d77250a7fe582fb3):
Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-01-07T12:10:40.525Z: JOB_MESSAGE_DETAILED: (d77250a7fe5821c9):
Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-01-07T12:10:40.541Z: JOB_MESSAGE_DEBUG: (d77250a7fe58280b):
Annotating graph with Autotuner information.
root: INFO: 2018-01-07T12:10:40.562Z: JOB_MESSAGE_DETAILED: (d77250a7fe582c37):
Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-01-07T12:10:40.574Z: JOB_MESSAGE_DETAILED: (d77250a7fe582e4d):
Fusing consumer split into read/Read
root: INFO: 2018-01-07T12:10:40.586Z: JOB_MESSAGE_DETAILED: (d77250a7fe582063):
Fusing consumer group/Write into group/Reify
root: INFO: 2018-01-07T12:10:40.595Z: JOB_MESSAGE_DETAILED: (d77250a7fe582279):
Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2018-01-07T12:10:40.607Z: JOB_MESSAGE_DETAILED: (d77250a7fe58248f):
Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into
write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2018-01-07T12:10:40.617Z: JOB_MESSAGE_DETAILED: (d77250a7fe5826a5):
Fusing consumer write/Write/WriteImpl/GroupByKey/Write into
write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2018-01-07T12:10:40.628Z: JOB_MESSAGE_DETAILED: (d77250a7fe5828bb):
Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into
write/Write/WriteImpl/Pair
root: INFO: 2018-01-07T12:10:40.638Z: JOB_MESSAGE_DETAILED: (d77250a7fe582ad1):
Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into
write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2018-01-07T12:10:40.649Z: JOB_MESSAGE_DETAILED: (d77250a7fe582ce7):
Fusing consumer pair_with_one into split
root: INFO: 2018-01-07T12:10:40.658Z: JOB_MESSAGE_DETAILED: (d77250a7fe582efd):
Fusing consumer group/Reify into pair_with_one
root: INFO: 2018-01-07T12:10:40.668Z: JOB_MESSAGE_DETAILED: (d77250a7fe582113):
Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2018-01-07T12:10:40.678Z: JOB_MESSAGE_DETAILED: (d77250a7fe582329):
Fusing consumer write/Write/WriteImpl/Pair into
write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2018-01-07T12:10:40.687Z: JOB_MESSAGE_DETAILED: (d77250a7fe58253f):
Fusing consumer format into count
root: INFO: 2018-01-07T12:10:40.697Z: JOB_MESSAGE_DETAILED: (d77250a7fe582755):
Fusing consumer write/Write/WriteImpl/Extract into
write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2018-01-07T12:10:40.706Z: JOB_MESSAGE_DETAILED: (d77250a7fe58296b):
Fusing consumer count into group/GroupByWindow
root: INFO: 2018-01-07T12:10:40.715Z: JOB_MESSAGE_DETAILED: (d77250a7fe582b81):
Fusing consumer write/Write/WriteImpl/InitializeWrite into
write/Write/WriteImpl/DoOnce/Read
root: INFO: 2018-01-07T12:10:40.725Z: JOB_MESSAGE_DEBUG: (d77250a7fe582d97):
Workflow config is missing a default resource spec.
root: INFO: 2018-01-07T12:10:40.734Z: JOB_MESSAGE_DEBUG: (d77250a7fe582fad):
Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-01-07T12:10:40.742Z: JOB_MESSAGE_DEBUG: (d77250a7fe5821c3):
Adding workflow start and stop steps.
root: INFO: 2018-01-07T12:10:40.751Z: JOB_MESSAGE_DEBUG: (d77250a7fe5823d9):
Assigning stage ids.
root: INFO: 2018-01-07T12:10:40.816Z: JOB_MESSAGE_DEBUG: (e4efc74a147919ff):
Executing wait step start25
root: INFO: 2018-01-07T12:10:40.841Z: JOB_MESSAGE_BASIC: (e4efc74a14791441):
Executing operation
write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-01-07T12:10:40.851Z: JOB_MESSAGE_BASIC: (8dd6a31c2425b536):
Executing operation group/Create
root: INFO: 2018-01-07T12:10:40.865Z: JOB_MESSAGE_DEBUG: (d1a7cd361e5e0cec):
Starting worker pool setup.
root: INFO: 2018-01-07T12:10:40.874Z: JOB_MESSAGE_BASIC: (d1a7cd361e5e0b12):
Starting 1 workers in us-central1-f...
root: INFO: 2018-01-07T12:10:40.904Z: JOB_MESSAGE_DEBUG: (8dd6a31c2425b968):
Value "group/Session" materialized.
root: INFO: 2018-01-07T12:10:40.924Z: JOB_MESSAGE_BASIC: (8dd6a31c2425bb95):
Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: Job 2018-01-07_04_10_37-17261477608655891931 is in state
JOB_STATE_RUNNING
root: INFO: 2018-01-07T12:10:48.989Z: JOB_MESSAGE_DETAILED: (814cc8e79686478d):
Autoscaling: Raised the number of workers to 0 based on the rate of progress in
the currently running step(s).
root: INFO: 2018-01-07T12:11:23.121Z: JOB_MESSAGE_ERROR: (814cc8e796864764):
Startup of the worker pool in zone us-central1-f failed to bring up any of the
desired 1 workers. QUOTA_EXCEEDED: Quota 'DISKS_TOTAL_GB' exceeded. Limit:
21000.0 in region us-central1.
root: INFO: 2018-01-07T12:11:23.130Z: JOB_MESSAGE_ERROR: (814cc8e796864762):
Workflow failed.
root: INFO: 2018-01-07T12:11:23.359Z: JOB_MESSAGE_DETAILED: (d77250a7fe58253c):
Cleaning up.
root: INFO: 2018-01-07T12:11:23.385Z: JOB_MESSAGE_DEBUG: (d77250a7fe582968):
Starting worker pool teardown.
root: INFO: 2018-01-07T12:11:23.393Z: JOB_MESSAGE_BASIC: (d77250a7fe582b7e):
Stopping worker pool...
root: INFO: 2018-01-07T12:11:31.619Z: JOB_MESSAGE_DEBUG: (d77250a7fe5825ec):
Tearing down pending resources...
root: INFO: Job 2018-01-07_04_10_37-17261477608655891931 is in state
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 2 tests in 210.139s
FAILED (errors=2)
2018-01-07 12:11:43,136 305386de MainThread beam_integration_benchmark(1/1)
ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 601, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 504, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-01-07 12:11:43,137 305386de MainThread beam_integration_benchmark(1/1)
INFO Cleaning up benchmark beam_integration_benchmark
2018-01-07 12:11:43,139 305386de MainThread beam_integration_benchmark(1/1)
ERROR Benchmark 1/1 beam_integration_benchmark (UID:
beam_integration_benchmark0) failed. Execution will continue.
2018-01-07 12:11:43,191 305386de MainThread INFO Benchmark run statuses:
---------------------------------------------------------------
Name                        UID                          Status
---------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-01-07 12:11:43,192 305386de MainThread INFO Complete logs can be found
at: /tmp/perfkitbenchmarker/runs/305386de/pkb.log
2018-01-07 12:11:43,192 305386de MainThread INFO Completion statuses can be
found at: /tmp/perfkitbenchmarker/runs/305386de/completion_statuses.json
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user [email protected]