See
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1312/display/redirect?page=changes>
Changes:
[ryan] Consider Elasticsearch as one word in camelCase.
------------------------------------------
[...truncated 908.86 KB...]
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s19"
},
"serialized_fn":
"ref_AppliedPTransform_WriteUserScoreSums/WriteToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_34",
"user_name":
"WriteUserScoreSums/WriteToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
after exception HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Tue, 09 Jul 2019 09:18:03 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 429,
"message": "Quota exceeded for quota metric
'dataflow.googleapis.com/create_requests' and limit
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
{
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
{
"description": "Google developer console API key",
"url":
"https://console.developers.google.com/project/844138762903/apiui/credential"
}
]
}
]
}
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
after exception HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Tue, 09 Jul 2019 09:18:05 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 429,
"message": "Quota exceeded for quota metric
'dataflow.googleapis.com/create_requests' and limit
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
{
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
{
"description": "Google developer console API key",
"url":
"https://console.developers.google.com/project/844138762903/apiui/credential"
}
]
}
]
}
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
after exception HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Tue, 09 Jul 2019 09:18:09 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 429,
"message": "Quota exceeded for quota metric
'dataflow.googleapis.com/create_requests' and limit
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
{
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
{
"description": "Google developer console API key",
"url":
"https://console.developers.google.com/project/844138762903/apiui/credential"
}
]
}
]
}
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
after exception HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Tue, 09 Jul 2019 09:18:19 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 429,
"message": "Quota exceeded for quota metric
'dataflow.googleapis.com/create_requests' and limit
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
{
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
{
"description": "Google developer console API key",
"url":
"https://console.developers.google.com/project/844138762903/apiui/credential"
}
]
}
]
}
}
>
root: ERROR: HTTP status 429 trying to create job at dataflow service endpoint
https://dataflow.googleapis.com
root: CRITICAL: details of server error: HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Tue, 09 Jul 2019 09:18:37 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 429,
"message": "Quota exceeded for quota metric
'dataflow.googleapis.com/create_requests' and limit
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
{
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
{
"description": "Google developer console API key",
"url":
"https://console.developers.google.com/project/844138762903/apiui/credential"
}
]
}
]
}
}
>
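The repeated 429 responses above come from the Dataflow 'CreateRequestsPerMinutePerUser' quota on job creation; the client backs off and retries several times before the final ERROR/CRITICAL entries. A minimal sketch of that retry pattern, assuming a hypothetical create_dataflow_job() callable that raises an exception exposing the HTTP status (it is not a real Beam helper):

    import random
    import time

    def create_with_backoff(create_dataflow_job, max_attempts=5):
        # Retry only on HTTP 429 (quota exhausted), with exponential backoff plus jitter.
        for attempt in range(max_attempts):
            try:
                return create_dataflow_job()
            except Exception as e:  # e.g. an HttpError-like object exposing status_code
                status = getattr(e, 'status_code', None)
                if status != 429 or attempt == max_attempts - 1:
                    raise
                time.sleep(2 ** (attempt + 1) + random.random())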
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE
/bigquery/v2/projects/apache-beam-testing/datasets/leader_board_it_dataset1562663866?deleteContents=true
HTTP/1.1" 204 0
--------------------- >> end captured logging << ---------------------
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_18_04-13190204651245717707?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_33_04-15100100234424660818?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
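The BeamDeprecationWarning above concerns reading options back off the pipeline object (<pipeline>.options); the supported pattern is to build PipelineOptions up front and pass them to the Pipeline. A minimal sketch, with placeholder project and bucket values (not taken from this build):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions()  # flags or keyword arguments could be passed here
    gcp_options = options.view_as(GoogleCloudOptions)
    gcp_options.project = 'my-project'                # placeholder
    gcp_options.temp_location = 'gs://my-bucket/tmp'  # placeholder

    with beam.Pipeline(options=options) as p:
        # Trivial pipeline body; real jobs would read/write GCP sources and sinks.
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)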
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_41_39-11817177095907249717?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_17_57-17658698038327860311?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_40_32-17388058138561245782?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
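The warning above flags the deprecated BigQuerySink and suggests WriteToBigQuery. A minimal sketch of the suggested replacement (table spec and schema are placeholders, not from this build):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'user': 'alice', 'score': 10}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.scores',
                 schema='user:STRING,score:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))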
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_49_57-8546624512124323314?project=apache-beam-testing.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_18_47-7974726312663942031?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_28_20-18085250319104283034?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_37_36-16647650447775876112?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_45_47-17854825402874086884?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_17_57-16552751979417398128?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_40_01-2598207693128433100?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_48_45-1094118293359040066?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_18_31-10758761609267504401?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_27_47-11779700095118979551?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_37_15-4438284583753947715?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_46_15-5741591706095574732?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_54_57-11839084137426662109?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_17_55-14517046111167565093?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_27_22-10525049470924542020?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_35_48-13759297424565321120?project=apache-beam-testing.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_45_23-3731660066302348267?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_55_08-2114938021697625323?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_03_03_59-14124274139929969270?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_18_02-955158317500953097?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_27_50-8540945083221760764?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_37_26-4869735784269044126?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_46_47-11121728809969352362?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_55_22-2377333402061627956?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_17_59-13807675527996451612?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_28_10-9301170464691070195?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_37_45-838275390051280059?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_46_17-6214481377569985733?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_02_54_39-10415259387883050161?project=apache-beam-testing.
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3261.767s
FAILED (SKIP=5, errors=1)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 78
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 55m 34s
77 actionable tasks: 60 executed, 17 from cache
Publishing build scan...
https://gradle.com/s/ttmhlzxmontfe
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure