See <https://builds.apache.org/job/beam_PostCommit_Python37/2039/display/redirect?page=changes>
Changes:
[rohde.samuel] Add dependency comment in streaming cache
[ehudm] [BEAM-1894] Remove obsolete EagerRunner test
------------------------------------------
[...truncated 10.53 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:40.063Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 01 Apr 2020 19:23:38 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
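The 403 above means the credentials used by the worker lack bigquery.jobs.create in the clouddataflow-readonly project, so the export job for the BigQuery source cannot be started there. A minimal sketch for checking that permission outside the pipeline, assuming the google-cloud-bigquery client library is installed and the same credentials are active (only the project name is taken from the log; everything else is illustrative):

    # Hedged sketch: attempt a trivial job in the project named in the log.
    # If the caller lacks bigquery.jobs.create there, this raises Forbidden
    # (HTTP 403), mirroring the HttpForbiddenError above.
    from google.api_core.exceptions import Forbidden
    from google.cloud import bigquery

    client = bigquery.Client(project="clouddataflow-readonly")  # project from the log
    try:
        client.query("SELECT 1").result()  # issues a jobs.insert call
        print("bigquery.jobs.create is allowed")
    except Forbidden as exc:
        print("missing bigquery.jobs.create:", exc)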
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:43.465Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 01 Apr 2020 19:23:41 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:46.893Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 01 Apr 2020 19:23:44 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
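The same traceback appears three times a few seconds apart because the failing call goes through the wrapper frame in apache_beam/utils/retry.py and the Dataflow service re-attempts the failed work item before giving up. A minimal sketch of how that retry decorator is typically applied, with illustrative parameters (the real call sites in bigquery_tools.py choose their own):

    # Hedged sketch: Beam wraps flaky service calls in a retry decorator; the
    # parameters below are placeholders, not the values used by bigquery_tools.py.
    from apache_beam.utils import retry

    @retry.with_exponential_backoff(num_retries=3, initial_delay_secs=1.0)
    def start_export_job():
        # e.g. a jobs.Insert call like the one failing in the traceback above
        ...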
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.193Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.273Z: JOB_MESSAGE_DEBUG: Executing failure step failure22
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.303Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., Internal Issue (da34bd359f08fecf): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.433Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.516Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:23:47.548Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:25:42.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:25:42.572Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T19:25:42.616Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-01_12_18_00-6593033114547578053 is in state JOB_STATE_FAILED
apache_beam.io.gcp.tests.utils: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1585768665497.
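For context, JOB_STATE_FAILED surfaces in the test as an exception when the caller blocks on the pipeline result. A minimal sketch of how a Beam Python caller typically observes this, assuming placeholder project, region, and bucket names:

    # Hedged sketch: exiting the Pipeline context calls wait_until_finish(),
    # which raises (DataflowRuntimeException on this runner) when the job ends
    # in JOB_STATE_FAILED; that is how a postcommit test turns a job failure
    # into a test error. All option values below are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])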
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1585768665497 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
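The DELETE above returns 404 because the output table was never created: the export job feeding the pipeline failed first. A minimal sketch of tolerant cleanup, assuming the google-cloud-bigquery client library; the project, dataset, and table identifiers are taken from the log:

    # Hedged sketch: delete the test table but tolerate its absence, so cleanup
    # does not itself fail when the pipeline never got far enough to create it.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    client.delete_table(
        "apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1585768665497",
        not_found_ok=True,  # swallow the 404 seen in the log above
    )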
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_04-14028599578326749670?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_32_35-9110211431515818972?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_43_28-1442509688693516071?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_53_49-7184476579352209485?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_01_44-6764583833564008681?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_08_50-17139953363349911730?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_16_54-16990614190816856576?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_01-14790106227578462364?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_41_32-9489958784277573602?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_49_43-16464069933624101327?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_58_32-8251242721379068492?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_09_47-12066452783337673378?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_03-17043543861242910930?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_30_39-12376513808991833125?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_39_07-17497809255442900306?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_47_31-5945644158706662501?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_55_28-10600502155714985416?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_05_32-4850176955288074697?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_00-15476426502976009892?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_38_38-5069079288902960518?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_47_29-10467656784680547522?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_55_42-2522917993727299752?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_06_05-4069253802143750645?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_02-3332578078450359963?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_27_40-8944470559386818497?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_36_22-1292572562518612244?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_44_57-17669341026800557749?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_53_39-14627307381215331204?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_02_16-5399084830977770010?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_11_10-2634708503845721040?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_00-6593033114547578053?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_26_08-11361350250557035692?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_36_07-5899259539079840118?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_44_23-13614202702743079668?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_53_05-15694011663909596521?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_01_13-9042595448977202706?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_09_36-17072816354602944476?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_01-7649518617345909171?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_28_42-6121586960571170226?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_41_05-6476177102324458056?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_49_53-10923407634305293504?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_07_17-14120073321598644242?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_18_02-7661920334308289637?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_26_32-1657383727353383281?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_36_00-715557318717280244?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_44_42-12808319952876913898?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_12_53_59-8018651847716669798?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_02_51-16856253077851258850?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_13_11_11-11597714979802512290?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4090.760s
FAILED (SKIP=9, errors=2)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle'> line: 60
* What went wrong:
Execution failed for task
':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 89
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 10m 12s
87 actionable tasks: 70 executed, 17 from cache
Publishing build scan...
https://gradle.com/s/xnheexxgg34ow
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]