See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5261/display/redirect?page=changes>
Changes:
[kcweaver] [BEAM-8870] Skip test_sdf_with_watermark_tracking for Spark
[kcweaver] [BEAM-8870] Cache environment in spark_runner_test.
[kcweaver] Don't override default for environment_cache_millis
[ehudm] [BEAM-3713] pytest migration: py27-gcp-pytest
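For context on the BEAM-8870 changes above: a runner-specific test is typically disabled with a standard unittest skip decorator. A minimal sketch follows; the class name, module layout, and skip message are illustrative assumptions, not the contents of the actual commit.

import unittest

class SparkRunnerTest(unittest.TestCase):

    # Illustrative sketch only; the real change presumably lives in
    # spark_runner_test.py, and the skip reason below is assumed.
    @unittest.skip('BEAM-8870: SDF watermark tracking not yet supported on Spark.')
    def test_sdf_with_watermark_tracking(self):
        pass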
------------------------------------------
[...truncated 13.37 MB...]
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "assert:even/Unkey.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s43"
},
"serialized_fn": "ref_AppliedPTransform_assert:even/Unkey_62",
"user_name": "assert:even/Unkey"
}
},
{
"kind": "ParallelDo",
"name": "s45",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "_equal"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
},
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "assert:even/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s44"
},
"serialized_fn": "ref_AppliedPTransform_assert:even/Match_63",
"user_name": "assert:even/Match"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: '2019-12-10T03:10:30.472219Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2019-12-09_19_10_29-15349047076494753803'
location: 'us-central1'
name: 'beamapp-jenkins-1210031013-592629'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2019-12-10T03:10:30.472219Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-09_19_10_29-15349047076494753803]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_29-15349047076494753803?project=apache-beam-testing
apache_beam.runners.dataflow.test_dataflow_runner: WARNING: Waiting indefinitely for streaming job.
--------------------- >> end captured logging << ---------------------
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_18-4967808566962451827?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_49-10607904999738883562?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_02-15922080724732498682?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_34-8263334294251674567?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_11_01-14169399489941828571?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_09_58-8315192763666211118?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_28-14170589144301490569?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_54-7138094695332837081?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_16-16721522594263154132?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_46-12481396828610070808?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_09_59-14608067401760795796?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_29-15349047076494753803?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_09_59-174984405797775964?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_34-14513313407343586639?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_11_00-7177607390735957729?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_09_58-6387930242214044086?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-09_19_10_33-12231860628197066720?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 24 tests in 4566.677s
FAILED (errors=3, failures=15)
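The summary above (24 tests, 3 errors, 15 failures) is also recorded in the nosetests xunit report named earlier. A small illustrative Python sketch for reading those counts back; the report path is assumed to be the file named above, and the attribute names follow the standard xunit schema that nose emits.

import xml.etree.ElementTree as ET

# The root element of a nose xunit report is <testsuite> with summary attributes.
suite = ET.parse('nosetests-validatesRunnerStreamingTests-df-py36.xml').getroot()
print('tests:', suite.get('tests'),
      'errors:', suite.get('errors'),
      'failures:', suite.get('failures'))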
> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 8 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 78
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 78
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 111
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 107
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 130
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
6: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 101
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
7: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 134
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
8: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 101
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 26m 58s
74 actionable tasks: 57 executed, 17 from cache
Publishing build scan...
https://scans.gradle.com/s/zse65kz6dsub6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure