See
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5454/display/redirect?page=changes>
Changes:
[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency
[iemejia] [BEAM-8701] Update commons-io to version 2.6
------------------------------------------
[...truncated 703.67 KB...]
"serialized_fn": "ref_AppliedPTransform_Start/Map(decode)_13",
"user_name": "Start/Map(decode)"
}
},
{
"kind": "ParallelDo",
"name": "s9",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallSequenceEnforcingDoFn",
"type": "STRING",
"value":
"apache_beam.transforms.dofn_lifecycle_test.CallSequenceEnforcingDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
},
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "Do.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s8"
},
"serialized_fn": "ref_AppliedPTransform_Do_14",
"user_name": "Do"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
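The pipeline JSON above shows the failing job runs a ParDo over `apache_beam.transforms.dofn_lifecycle_test.CallSequenceEnforcingDoFn`, a DoFn that verifies its lifecycle methods fire in the expected order. The enforcement idea can be sketched without Beam as a plain state machine (a hypothetical illustration of the concept, not the actual test code from the Beam repo):

```python
class CallSequenceEnforcingDoFn:
    """Tracks DoFn lifecycle calls and raises if they arrive out of order.

    Hypothetical sketch of the idea behind Beam's dofn_lifecycle_test;
    the real class lives in apache_beam.transforms.dofn_lifecycle_test.
    """

    def __init__(self):
        self._setup_done = False
        self._bundle_open = False

    def setup(self):
        assert not self._setup_done, "setup() must run exactly once"
        self._setup_done = True

    def start_bundle(self):
        assert self._setup_done, "start_bundle() before setup()"
        assert not self._bundle_open, "nested start_bundle()"
        self._bundle_open = True

    def process(self, element):
        assert self._bundle_open, "process() outside a bundle"
        return [element]

    def finish_bundle(self):
        assert self._bundle_open, "finish_bundle() without start_bundle()"
        self._bundle_open = False

    def teardown(self):
        assert self._setup_done and not self._bundle_open, \
            "teardown() with a bundle still open"
```

A runner driving this in the order setup → start_bundle → process* → finish_bundle → teardown passes silently; any other ordering trips an assertion, which is how a ValidatesRunner suite can detect a runner that skips or reorders lifecycle calls.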
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: '2020-01-07T16:20:51.234957Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-01-07_08_20_49-16263618190933252273'
location: 'us-central1'
name: 'beamapp-jenkins-0107162020-675235'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-01-07T16:20:51.234957Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2020-01-07_08_20_49-16263618190933252273]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_49-16263618190933252273?project=apache-beam-testing
apache_beam.runners.dataflow.test_dataflow_runner: WARNING: Waiting
indefinitely for streaming job.
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-01-07_08_20_49-16263618190933252273 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:53.525Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:54.306Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.162Z:
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable
parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.165Z:
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into
optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.172Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.184Z:
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into
optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.186Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write
steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.192Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.214Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.217Z:
JOB_MESSAGE_DETAILED: Fusing consumer Start/FlatMap(<lambda at core.py:2591>)
into Start/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.219Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/AddRandomKeys into Start/FlatMap(<lambda at
core.py:2591>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.221Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Start/MaybeReshuffle/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.224Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.227Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.230Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.232Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Start/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.235Z:
JOB_MESSAGE_DETAILED: Fusing consumer Start/Map(decode) into
Start/MaybeReshuffle/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.237Z:
JOB_MESSAGE_DETAILED: Fusing consumer Do into Start/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.249Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.272Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.285Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.433Z:
JOB_MESSAGE_DEBUG: Executing wait step start2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.475Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:55.480Z:
JOB_MESSAGE_BASIC: Starting 1 workers...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:57.806Z:
JOB_MESSAGE_BASIC: Executing operation
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Start/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Start/Map(decode)+Do
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:20:57.806Z:
JOB_MESSAGE_BASIC: Executing operation Start/Impulse+Start/FlatMap(<lambda at
core.py:2591>)+Start/MaybeReshuffle/Reshuffle/AddRandomKeys+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:16.999Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
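The warning above concerns Stackdriver's per-project cap on Dataflow-created metric descriptors; the two linked APIs Explorer pages drive the Monitoring v3 REST methods `projects.metricDescriptors.list` and `projects.metricDescriptors.delete`. As a sketch of the resource paths involved (URL construction only — authentication and actually sending the requests are omitted, and the project/metric names are illustrative):

```python
# Build Cloud Monitoring API v3 URLs for listing and deleting metric
# descriptors, the operations the warning above points at.
from urllib.parse import quote

MONITORING_V3 = "https://monitoring.googleapis.com/v3"

def list_descriptors_url(project_id, metric_prefix=None):
    """URL for projects.metricDescriptors.list, optionally filtered by type prefix."""
    url = f"{MONITORING_V3}/projects/{project_id}/metricDescriptors"
    if metric_prefix:
        # Monitoring API filter syntax: metric.type = starts_with("...")
        url += "?filter=" + quote(f'metric.type = starts_with("{metric_prefix}")')
    return url

def delete_descriptor_url(project_id, metric_type):
    """URL for projects.metricDescriptors.delete for one custom metric type."""
    return (f"{MONITORING_V3}/projects/{project_id}"
            f"/metricDescriptors/{quote(metric_type, safe='')}")
```

Deleting descriptors for stale, unused metric names frees quota so Stackdriver can create descriptors for new custom metrics on subsequent jobs.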
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:26.314Z:
JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:26.314Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:27.007Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.666Z:
JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.692Z:
JOB_MESSAGE_BASIC: Finished operation Start/Impulse+Start/FlatMap(<lambda at
core.py:2591>)+Start/MaybeReshuffle/Reshuffle/AddRandomKeys+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.692Z:
JOB_MESSAGE_BASIC: Finished operation
Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Start/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Start/Map(decode)+Do
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.861Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.886Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.890Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.892Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:37.898Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:21:40.108Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-07T16:23:11.357Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-01-07_08_20_49-16263618190933252273 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_50-18099685570275384164?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_06-14831978760780519172?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_35_20-995801420735531225?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_48-15109189360554525070?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_48-15590665062021303441?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_36_13-12765219893115240145?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_51-502064313294775483?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_42-11083980542379184386?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_49-16263618190933252273?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_23_50-1000870339621491635?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_31_04-6357429243989867734?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_48-1149421288009276600?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_36-8642364254578023337?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_36_17-13019833417344582712?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_44_12-10710463749163565756?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_48-15948994369707590329?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_41-5699669252959694693?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_36_15-5278237671258441514?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_49-17033792202151930755?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_29_14-4271285007400884745?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_20_48-4397119725350443555?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_08_28_45-4482720574956939940?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 26 tests in 1873.258s
FAILED (errors=1)
> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests FAILED
FAILED
FAILURE: Build completed with 6 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
line: 107
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 111
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 78
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
line: 130
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 134
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
line: 101
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
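Each failure's "Try:" hint suggests re-running with extra diagnostics. A re-run of a single failing suite would look like this (task name and flags taken from the failures above; the command is echoed rather than executed, since running it requires a Beam checkout and Dataflow credentials):

```shell
# Re-run one failing suite with a stack trace and verbose logging,
# per Gradle's "Try:" hints. Echoed only; run from the Beam repo root.
TASK=':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'
echo "./gradlew $TASK --stacktrace --info"
```

Adding `--scan` instead publishes a build scan like the one linked at the end of this log.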
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 15m 57s
74 actionable tasks: 73 executed, 1 from cache
Publishing build scan...
https://gradle.com/s/4aftkgjndowbu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]