See <https://builds.apache.org/job/beam_PostCommit_Python37/2041/display/redirect?page=changes>
Changes:
[daniel.o.programmer] [BEAM-9642] Create runtime invokers for SDF methods.
------------------------------------------
[...truncated 10.60 MB...]
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 02 Apr 2020 00:09:30 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:36.070Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 02 Apr 2020 00:09:34 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:39.468Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 02 Apr 2020 00:09:37 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '478', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>
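All three JOB_MESSAGE_ERROR tracebacks above report the same failure: Beam's export-based BigQuery read inserts an extract job into the project that owns the source table, clouddataflow-readonly, and the worker's credentials lack bigquery.jobs.create there. Below is a minimal sketch, not part of this build, that reproduces the failing call directly with google-cloud-bigquery; the source table and destination bucket are assumptions chosen for illustration:

    # Hypothetical repro of the 403 above, outside of Beam. Assumes
    # google-cloud-bigquery is installed and application-default
    # credentials are available.
    from google.api_core.exceptions import Forbidden
    from google.cloud import bigquery

    SOURCE_PROJECT = "clouddataflow-readonly"  # owns the source table

    # Binding the client to SOURCE_PROJECT makes the extract job run in,
    # and therefore require bigquery.jobs.create in, that project. This
    # mirrors the POST to .../projects/clouddataflow-readonly/jobs above.
    client = bigquery.Client(project=SOURCE_PROJECT)
    try:
        job = client.extract_table(
            "clouddataflow-readonly.samples.weather_stations",  # assumed input
            "gs://your-bucket/extract/*.csv",  # hypothetical destination
        )
        job.result()
    except Forbidden as err:
        print(err)  # same accessDenied / PERMISSION_DENIED payload as above

If this sketch also raises Forbidden, the problem is on the IAM side (the worker service account needs job-creation rights, or the extract must run in a project it can bill) rather than in the pipeline code.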
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:39.725Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:39.793Z: JOB_MESSAGE_DEBUG: Executing failure step failure22
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:39.828Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., Internal Issue (15b9b4d273bcdc88): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:39.945Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:40.014Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:09:40.044Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:11:01.745Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:11:01.791Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:11:01.824Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-01_17_03_20-18141929969744206621 is in state JOB_STATE_FAILED
apache_beam.io.gcp.tests.utils: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1585785783751.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1585785783751 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
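The 404 on the DELETE above only means the output table was never created; the pipeline failed before writing it, so there was nothing to clean up. A minimal sketch, not part of this build, of an idempotent form of that cleanup using google-cloud-bigquery, with the identifiers taken from the log above:

    # Hypothetical idempotent cleanup. Assumes google-cloud-bigquery and
    # credentials that can act on the apache-beam-testing project.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    # not_found_ok=True turns the 404 seen above into a no-op instead
    # of a NotFound exception.
    client.delete_table(
        "apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1585785783751",
        not_found_ok=True,
    )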
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_23-4293643747873812303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_18_25-3829934759088899037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_28_29-16021213691005215335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_39_07-15770231523358567611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_47_46-18207142610237952868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_56_08-4094199511029948182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_22-13080022148470492895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_24_44-15214232488302168690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_33_09-148085585630745074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_42_08-4173371825262205275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_50_37-17321364239952874153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_22-5996091940044777466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_15_48-5682490637927335817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_24_08-18124130215590412043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_32_52-9774380557503849533?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_42_26-11020940522250782947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_50_57-44368199949027362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_19-176812890515253339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_23_00-7815144600622382679?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_31_43-149793535680941996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_39_47-16208401818994197974?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_48_45-1136185130034878588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_57_19-9753977026388750400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_19-4202815732225949290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_12_57-15394655354763958336?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_20_59-13007763249096591666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_29_28-8104285311511870923?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_38_29-14503895907299193866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_47_13-13979756676682071504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_55_44-11114429123861448354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_20-18141929969744206621?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_11_29-531043163608516393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_21_18-6458281623582962157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_30_18-10922295875471232543?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_39_56-1349639879942605176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_49_15-17520100584554329228?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_20-6401221834356145538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_14_22-16552348276819985921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_27_20-10411460450968517210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_36_27-9723891158136909235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_53_54-249545836635523519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_03_22-4756296791764076246?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_11_56-16217422808432840044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_21_36-6382096803327498067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_30_39-5154050661751483711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_39_44-1183734462554400750?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_47_53-8324945921357720351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_17_55_33-10105619775840495191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_18_03_43-4457734835319036511?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4172.015s
FAILED (SKIP=9, errors=2)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 60
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 51
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 10m 45s
87 actionable tasks: 65 executed, 22 from cache
Publishing build scan...
https://gradle.com/s/av7ld6rskv6ui
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]