See
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1080/display/redirect>
------------------------------------------
[...truncated 249.73 KB...]
"@type": "kind:interval_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "format.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn": "ref_AppliedPTransform_format_10",
"user_name": "format"
}
},
{
"kind": "ParallelDo",
"name": "s9",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "<lambda>"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:interval_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "encode.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s8"
},
"serialized_fn": "ref_AppliedPTransform_encode_11",
"user_name": "encode"
}
},
{
"kind": "ParallelWrite",
"name": "s10",
"properties": {
"display_data": [],
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"format": "pubsub",
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s9"
},
"pubsub_topic":
"projects/apache-beam-testing/topics/wc_topic_outputa38928dd-4f21-48f9-8121-0fc0897e9552",
"user_name": "WriteToPubSub/Write/NativeWrite"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
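
The step names in this job graph and in the fusing messages below (decode, split, pair_with_one, WindowInto(WindowIntoFn), group, count, format, encode, WriteToPubSub/Write/NativeWrite) match the streaming wordcount integration test. For orientation, a minimal sketch of a pipeline that would produce this graph; the topic names and the 15-second window size are placeholders, not the actual test source:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms.window import FixedWindows

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             topic='projects/<project>/topics/<input_topic>')  # placeholder topic
         | 'decode' >> beam.Map(lambda msg: msg.decode('utf-8'))
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'WindowInto' >> beam.WindowInto(FixedWindows(15))  # window size assumed
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'encode' >> beam.Map(lambda line: line.encode('utf-8'))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output_topic>'))  # placeholder topic
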
root: INFO: Create job: <Job
createTime: '2019-06-08T00:01:44.223402Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2019-06-07_17_01_42-12289124887969079501'
location: 'us-central1'
name: 'beamapp-jenkins-0608000128-661840'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2019-06-08T00:01:44.223402Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-07_17_01_42-12289124887969079501]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_42-12289124887969079501?project=apache-beam-testing
root: INFO: Job 2019-06-07_17_01_42-12289124887969079501 is in state
JOB_STATE_RUNNING
root: INFO: 2019-06-08T00:01:46.840Z: JOB_MESSAGE_DETAILED: Checking
permissions granted to controller Service Account.
root: INFO: 2019-06-08T00:01:47.805Z: JOB_MESSAGE_BASIC: Worker configuration:
n1-standard-4 in us-central1-c.
root: INFO: 2019-06-08T00:01:48.349Z: JOB_MESSAGE_DETAILED: Expanding
SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-08T00:01:48.351Z: JOB_MESSAGE_DETAILED: Expanding
CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-08T00:01:48.360Z: JOB_MESSAGE_DETAILED: Expanding
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-08T00:01:48.367Z: JOB_MESSAGE_DETAILED: Expanding
SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-08T00:01:48.369Z: JOB_MESSAGE_DETAILED: Expanding
GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-08T00:01:48.375Z: JOB_MESSAGE_DEBUG: Annotating graph with
Autotuner information.
root: INFO: 2019-06-08T00:01:48.387Z: JOB_MESSAGE_DETAILED: Fusing adjacent
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-08T00:01:48.389Z: JOB_MESSAGE_DETAILED: Fusing consumer
decode into ReadFromPubSub/Read
root: INFO: 2019-06-08T00:01:48.392Z: JOB_MESSAGE_DETAILED: Fusing consumer
pair_with_one into split
root: INFO: 2019-06-08T00:01:48.395Z: JOB_MESSAGE_DETAILED: Fusing consumer
count into group/MergeBuckets
root: INFO: 2019-06-08T00:01:48.397Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-06-08T00:01:48.399Z: JOB_MESSAGE_DETAILED: Fusing consumer
encode into format
root: INFO: 2019-06-08T00:01:48.401Z: JOB_MESSAGE_DETAILED: Fusing consumer
group/MergeBuckets into group/ReadStream
root: INFO: 2019-06-08T00:01:48.403Z: JOB_MESSAGE_DETAILED: Fusing consumer
format into count
root: INFO: 2019-06-08T00:01:48.405Z: JOB_MESSAGE_DETAILED: Fusing consumer
group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-06-08T00:01:48.407Z: JOB_MESSAGE_DETAILED: Fusing consumer
split into decode
root: INFO: 2019-06-08T00:01:48.410Z: JOB_MESSAGE_DETAILED: Fusing consumer
WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-06-08T00:01:48.419Z: JOB_MESSAGE_DEBUG: Adding StepResource
setup and teardown to workflow graph.
root: INFO: 2019-06-08T00:01:48.437Z: JOB_MESSAGE_DEBUG: Adding workflow start
and stop steps.
root: INFO: 2019-06-08T00:01:48.449Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-08T00:01:48.559Z: JOB_MESSAGE_DEBUG: Executing wait step
start2
root: INFO: 2019-06-08T00:01:48.572Z: JOB_MESSAGE_DEBUG: Starting worker pool
setup.
root: INFO: 2019-06-08T00:01:48.577Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-08T00:01:50.752Z: JOB_MESSAGE_BASIC: Executing operation
group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-06-08T00:01:50.752Z: JOB_MESSAGE_BASIC: Executing operation
ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: INFO: 2019-06-08T00:02:33.539Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2019-06-08T00:02:42.215Z: JOB_MESSAGE_DEBUG: Executing input step
topology_init_attach_disk_input_step
root: WARNING: Timing out on waiting for job
2019-06-07_17_01_42-12289124887969079501 after 184 seconds
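The "Timing out on waiting for job ... after 184 seconds" warning above is the test harness ending its bounded wait on the still-running streaming job, not the job itself failing. A minimal sketch of that kind of bounded wait through the PipelineResult API; the 184-second figure is copied from the log, everything else is a placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    p = beam.Pipeline(options=options)
    p | beam.Create([1, 2, 3])  # placeholder transform so the pipeline is non-empty

    result = p.run()
    # wait_until_finish takes a duration in milliseconds; after the wait expires,
    # a streaming job keeps running on the service unless it is cancelled.
    result.wait_until_finish(duration=184 * 1000)  # 184 s, matching the log's timeout
    result.cancel()
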
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 176
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_43-2139874303237858434?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_17_49-6499562686224581357?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_26_06-7532604195143244874?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_33_43-11582694502594110728?project=apache-beam-testing.
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_40-16923392122375489080?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_24_59-1466145956049519208?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_33_09-13264382725917828400?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
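The BigQuerySink deprecation warning above (and its repeats below) comes from test pipelines still constructing the old sink; the WriteToBigQuery replacement it points to looks roughly like this sketch, with placeholder project, dataset, table, schema, and rows:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'CreateRows' >> beam.Create([{'word': 'beam', 'count': 1}])  # placeholder rows
         | 'WriteToBQ' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # placeholder table spec
             schema='word:STRING, count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
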
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_41_55-11109088730021433999?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
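The MatchAll and ReadMatches FutureWarnings only flag that those fileio transforms are experimental; the quoted 'GetPath' and 'Checksums' steps come from pipelines shaped roughly like this sketch, where the file pattern and hash helper are placeholders rather than the fileio_test.py source:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Placeholder helper: read the whole matched file and return its SHA-1.
        with readable_file.open() as f:
            return hashlib.sha1(f.read()).hexdigest()

    with beam.Pipeline() as p:
        matches = (p
                   | 'MatchPattern' >> beam.Create(['/tmp/input*.txt'])  # placeholder glob
                   | 'MatchAll' >> fileio.MatchAll())
        paths = matches | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
        hashes = (matches
                  | 'ReadMatches' >> fileio.ReadMatches()
                  | 'Checksums' >> beam.Map(compute_hash))
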
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_47-5983965498102569932?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
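The recurring "options is deprecated" warnings are raised inside the SDK itself, which still reads options back off the pipeline object (p.options), not by the tests. In user code the same warning is avoided by keeping a reference to the PipelineOptions, as in this sketch with a placeholder temp_location:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(temp_location='gs://my-bucket/tmp')  # placeholder bucket
    p = beam.Pipeline(options=options)

    # Deprecated pattern: reading options back off the pipeline emits the
    # BeamDeprecationWarning seen in this log.
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

    # Preferred pattern: query the PipelineOptions object you constructed.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
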
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_15_03-9739172857792500142?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_23_39-13095010632493641331?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_32_19-16595435388230958432?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_39-9765384331573326663?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_21_41-17177895666245847228?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_30_09-3329810871185207543?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_38_07-14393639080782924900?project=apache-beam-testing.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_40-16072282876448216450?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_10_10-18176788779743474345?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_19_22-15919978854092154330?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_29_04-6840500456591769805?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_37_15-11600006984779837832?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_46_17-7122029537438411930?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_39-4641872674174065339?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_09_22-1876391115009484204?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_18_04-695602214433978352?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_26_20-13737438695420786558?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
or p.options.view_as(GoogleCloudOptions).temp_location)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_42-12289124887969079501?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_11_38-7074900660928783502?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_20_41-7790661302122248732?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
or p.options.view_as(GoogleCloudOptions).temp_location)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_29_06-8500455229608768391?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_01_41-9950885534732561500?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_10_20-9277522728864431438?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_17_42-7030281459520832936?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_25_10-7489680361018894929?project=apache-beam-testing.
kms_key=kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_17_34_27-6206923792453577271?project=apache-beam-testing.
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3163.768s
FAILED (SKIP=5, failures=1)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 78
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 53m 51s
77 actionable tasks: 60 executed, 17 from cache
Publishing build scan...
https://gradle.com/s/msd42xe2eucfk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]