See <https://builds.apache.org/job/beam_PostCommit_Python2/2113/display/redirect?page=changes>
Changes: [robertwb] [BEAM-9340] Populate requirement for timer families.

------------------------------------------
[...truncated 11.19 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:13.404Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:13.446Z: JOB_MESSAGE_BASIC: Finished operation monthly count/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:13.517Z: JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:13.581Z: JOB_MESSAGE_BASIC: Executing operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:34.211Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:44:39.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:46:26.176Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:46:26.216Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:50:13.179Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
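The JOB_MESSAGE_WARNING above means no new custom metrics will be created once a project holds 100 Dataflow-created metric descriptors. A hedged cleanup sketch, assuming a pre-2.0 google-cloud-monitoring client; the "custom.googleapis.com/dataflow" prefix is my assumption about where Dataflow registers user-defined metrics, so run the list call and inspect before deleting anything:

    from google.cloud import monitoring_v3

    # Sketch: enumerate and delete stale Dataflow-created custom metric
    # descriptors, as the warning's apis-explorer links suggest doing.
    # The filter prefix is an assumption; verify the listed types first.
    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"
    for descriptor in client.list_metric_descriptors(
            name=project_name,
            filter_='metric.type = starts_with("custom.googleapis.com/dataflow")'):
        print("deleting", descriptor.type)
        client.delete_metric_descriptor(name=descriptor.name)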
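For orientation, the fused step names above ("read", "months with tornadoes", "monthly count") come from Beam's BigQuery tornadoes example. A minimal sketch of that pipeline's shape, assuming the public sample table the example reads by default (not the exact test code):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Shape sketch of the tornadoes pipeline whose step names appear in
    # the log; modeled on Beam's bigquery_tornadoes example.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
             'clouddataflow-readonly:samples.weather_stations'))
         | 'months with tornadoes' >> beam.FlatMap(
             lambda row: [(int(row['month']), 1)] if row['tornado'] else [])
         | 'monthly count' >> beam.CombinePerKey(sum)
         # The real example writes these dicts to BigQuery; write omitted here.
         | 'format' >> beam.Map(
             lambda kv: {'month': kv[0], 'tornado_count': kv[1]}))

On Dataflow, splitting this source first exports the table to GCS via a BigQuery extract job; the jobs.Insert call in the traceback below is that export, and per the 403 it was issued against the table's own project, clouddataflow-readonly.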
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:50:52.239Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 218, in execute
    self._split_task)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 226, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", line 669, in split
    schema, metadata_list = self._export_files(bq)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", line 728, in _export_files
    include_header=False)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 715, in perform_extract_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/clouddataflow-readonly/jobs?alt=json>: response: <{'status': '403', 'content-length': '478', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 01 Apr 2020 21:50:50 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{ "error": { "code": 403, "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.", "errors": [ { "message": "Access Denied: Project clouddataflow-readonly: User does not have bigquery.jobs.create permission in project clouddataflow-readonly.", "domain": "global", "reason": "accessDenied" } ], "status": "PERMISSION_DENIED" } } >
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:50:55.607Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
[... same traceback and 403 "Access Denied" response as above ...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:50:58.923Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
[... same traceback and 403 "Access Denied" response as above ...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.292Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
[... same traceback and 403 "Access Denied" response as above ...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.586Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.660Z: JOB_MESSAGE_DEBUG: Executing failure step failure22
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.693Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., Internal Issue (b9d0ad8cb3911286): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.814Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.884Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:51:02.925Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:52:27.553Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:52:27.603Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-01T21:52:27.635Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-01_14_44_06-4490921440965606978 is in state JOB_STATE_FAILED
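All four attempts above fail the same way: the worker cannot create the BigQuery extract job because the controller service account lacks bigquery.jobs.create in clouddataflow-readonly. A small diagnostic sketch, assuming a dry-run query exercises the same jobs.insert permission (a hypothetical probe, not part of the test suite):

    from google.api_core.exceptions import Forbidden
    from google.cloud import bigquery

    # A dry-run query costs nothing but (to my understanding) still needs
    # bigquery.jobs.create in the target project, so it makes a cheap
    # probe for the permission the traceback above says is missing.
    client = bigquery.Client(project="clouddataflow-readonly")
    try:
        client.query("SELECT 1", job_config=bigquery.QueryJobConfig(dry_run=True))
        print("bigquery.jobs.create looks OK in clouddataflow-readonly")
    except Forbidden as exc:
        print("403, matching the worker log: %s" % exc)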
apache_beam.io.gcp.tests.utils: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1585777425942.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1585777425942 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4352.300s

FAILED (SKIP=8, errors=2)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'> line: 81

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 15s
126 actionable tasks: 101 executed, 22 from cache, 3 up-to-date
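Note the 404 on the cleanup DELETE in the captured logging above: the run failed before the output table was ever created, so there was nothing to delete. A sketch of an idempotent cleanup, assuming a google-cloud-bigquery version recent enough that delete_table accepts not_found_ok:

    from google.cloud import bigquery

    # Idempotent cleanup sketch: swallow the 404 the log's DELETE hit,
    # since a failed run may never have created the table at all.
    client = bigquery.Client(project="apache-beam-testing")
    client.delete_table(
        "apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1585777425942",
        not_found_ok=True)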
Publishing build scan...
https://gradle.com/s/exxaipagxw5pu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
