See 
<https://builds.apache.org/job/beam_PostCommit_Python2/2114/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-9642] Create runtime invokers for SDF methods.


------------------------------------------
[...truncated 11.17 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:29.158Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:29.184Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:29.242Z: 
JOB_MESSAGE_BASIC: Finished operation monthly count/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:29.314Z: 
JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:29.384Z: 
JOB_MESSAGE_BASIC: Executing operation 
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months
 with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly 
count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:54.294Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
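
[Editor's note: the warning above is unrelated to this build's failure, but it is actionable. Stale descriptors can be removed with the two Monitoring v3 methods it links (metricDescriptors.list and metricDescriptors.delete). Below is a minimal cleanup sketch using google-api-python-client and application-default credentials; the project id and the custom.googleapis.com/dataflow metric-type prefix are assumptions, so check the filter's output before deleting anything.

    # Hedged sketch: bulk-delete old Dataflow custom-metric descriptors via the
    # Monitoring v3 methods linked in the warning above. PROJECT and the
    # metric-type prefix are illustrative, not taken from this build.
    from googleapiclient.discovery import build

    PROJECT = "projects/apache-beam-testing"  # assumed project id

    monitoring = build("monitoring", "v3")
    descriptors = monitoring.projects().metricDescriptors()

    request = descriptors.list(
        name=PROJECT,
        filter='metric.type = starts_with("custom.googleapis.com/dataflow")')
    while request is not None:
        response = request.execute()
        for descriptor in response.get("metricDescriptors", []):
            # Delete names look like projects/<id>/metricDescriptors/<metric.type>.
            name = "%s/metricDescriptors/%s" % (PROJECT, descriptor["type"])
            print("Deleting %s" % name)
            descriptors.delete(name=name).execute()
        request = descriptors.list_next(request, response)
]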
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:56:58.411Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:58:25.899Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T23:58:25.929Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:06.281Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", 
line 218, in execute
    self._split_task)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", 
line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", 
line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", 
line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", 
line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", 
line 236, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py",
 line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", 
line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", 
line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", 
line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpForbiddenError: HttpError accessing 
<https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>:
 response: <{'status': '403', 'content-length': '478', 'x-xss-protection': '0', 
'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 
'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 
'cache-control': 'private', 'date': 'Thu, 02 Apr 2020 00:02:04 GMT', 
'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; 
charset=UTF-8'}>, content <{
  "error": {
    "code": 403,
    "message": "Access Denied: Project clouddataflow-readonly: User does not 
have bigquery.jobs.create permission in project clouddataflow-readonly.",
    "errors": [
      {
        "message": "Access Denied: Project clouddataflow-readonly: User does 
not have bigquery.jobs.create permission in project clouddataflow-readonly.",
        "domain": "global",
        "reason": "accessDenied"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}
>

[The identical HttpForbiddenError traceback is logged three more times, at 2020-04-02T00:02:09.618Z, 00:02:12.914Z and 00:02:16.206Z, as Dataflow retries the failing source-split work item before giving up.]

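[Editor's note: all four attempts fail with the same 403. Per the request URL in the traceback, perform_extract_job inserts the BigQuery export job under projects/clouddataflow-readonly, the project that owns the source table, and the worker's service account has no bigquery.jobs.create there. Extract jobs have to be created in a project where the caller holds that permission. A standalone sketch with the google-cloud-bigquery client illustrates the distinction; the staging bucket and job project below are assumptions.

    # Hedged repro sketch, assuming google-cloud-bigquery is installed and the
    # credentials are the same service account the Dataflow workers run as.
    # Extract jobs are created in the client's project, so the client must
    # point at a project where the account holds bigquery.jobs.create;
    # pointing it at clouddataflow-readonly reproduces the 403 above.
    from google.cloud import bigquery

    SOURCE_TABLE = "clouddataflow-readonly.samples.weather_stations"
    JOB_PROJECT = "apache-beam-testing"  # assumed: a project we can run jobs in
    DESTINATION = "gs://some-staging-bucket/weather_stations-*.json"  # hypothetical

    client = bigquery.Client(project=JOB_PROJECT)  # jobs.insert happens here

    job_config = bigquery.ExtractJobConfig()
    job_config.destination_format = bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON

    job = client.extract_table(SOURCE_TABLE, DESTINATION, job_config=job_config)
    job.result()  # raises Forbidden (403) if jobs.create is missing in JOB_PROJECT
]
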
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.489Z: 
JOB_MESSAGE_BASIC: Finished operation 
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months
 with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly 
count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.581Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure22
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.614Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: 
S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months
 with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly 
count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., Internal Issue 
(37f66cc157969883): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.746Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.825Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:02:16.853Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:04:18.497Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:04:18.552Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-02T00:04:18.604Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-04-01_16_56_20-5934095131814958478 is in state JOB_STATE_FAILED
apache_beam.io.gcp.tests.utils: INFO: Clean up a BigQuery table with project: 
apache-beam-testing, dataset: BigQueryTornadoesIT, table: 
monthly_tornadoes_1585785364402.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE 
/bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1585785364402
 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4562.348s

FAILED (SKIP=8, errors=2)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'>
 line: 81

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'>
 line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 32s
126 actionable tasks: 99 executed, 24 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/i55q42p4ir3to

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]