See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2923/display/redirect?page=changes>
Changes:

[Luke Cwik] [BEAM-10670] Update Samza to be opt-out for SplittableDoFn.

[noreply] [BEAM-10616] Add missing ParDo test cases for streaming/Flink (#12848)

[noreply] Bump versions of protobuf, shadow, other gradle plugins. (#12821)

[Luke Cwik] [BEAM-10670] Update Jet runner to be opt-out for splittable DoFn

[Luke Cwik] Update runners/jet/build.gradle

[noreply] [BEAM-7523] Fix starting Kafka container twice in KafkaCSVTableIT


------------------------------------------
[...truncated 21.48 MB...]
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-09-16T19:03:20.657747Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-09-16_12_03_19-13061421580333274470'
 location: u'us-central1'
 name: u'beamapp-jenkins-0916190312-661386'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-09-16T19:03:20.657747Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-09-16_12_03_19-13061421580333274470]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-09-16_12_03_19-13061421580333274470
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_12_03_19-13061421580333274470?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_12_03_19-13061421580333274470 is in state JOB_STATE_RUNNING
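The apiclient messages above are what a Python pipeline run with the Dataflow runner emits at submission time. A minimal sketch of such a submission (project and region are taken from this log; the temp bucket and transform labels are assumptions, not this job's):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Project and region match the log above; the temp_location bucket is a
    # placeholder and must be one you own.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://example-bucket/tmp',
    ])

    # Exiting the context submits the job, producing the "Create job",
    # "Submitted job", and JOB_STATE_RUNNING messages seen here.
    with beam.Pipeline(options=options) as p:
        (p
         | 'Create' >> beam.Create([1, 2, 3])
         | 'metrics' >> beam.Map(lambda x: ('key', x)))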
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:19.386Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-09-16_12_03_19-13061421580333274470.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:19.386Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-09-16_12_03_19-13061421580333274470. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:23.142Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.691Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.729Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.764Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.792Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.860Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.905Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.940Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.963Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:24.998Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.031Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.065Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.100Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.136Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.169Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.204Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.243Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.446Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.511Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.551Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.589Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.629Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.700Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:25.761Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:33.349Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:33.405Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:33.444Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:38.348Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
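As the message above suggests, stale custom metric descriptors can be listed and deleted through the Cloud Monitoring API. A hedged, read-only sketch with a recent google-cloud-monitoring client (the project id is the one from this log; only run deletions against a project you own):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # project from the log

    # Print the Dataflow-created custom descriptors; the delete call stays
    # commented out so this sketch cannot remove anything by default.
    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith('custom.googleapis.com/'):
            print(descriptor.type)
            # client.delete_metric_descriptor(name=descriptor.name)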
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_11_56_02-8326692103376168110 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16002825499380.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/edf84fe5-fc28-4c86-a418-6ad00e9c0c92?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anona78c008348a121807f8f171de5f1b2f48fa57bbd/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [('xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), ('\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), ('abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), ('\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16002825499380 in project apache-beam-testing
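The matcher's verification above amounts to running the query and comparing the rows it returns. A minimal sketch with the google-cloud-bigquery client (credentials are assumed to come from the environment, as the metadata-server lookups above show; the dataset is transient and exists only during the test run):

    from google.cloud import bigquery

    # Project and query text are taken verbatim from the log above.
    client = bigquery.Client(project='apache-beam-testing')
    query = ('SELECT bytes, date, time '
             'FROM python_write_to_table_16002825499380.python_no_schema_table')
    rows = list(client.query(query).result())
    print(rows)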
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:51.199Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:51.239Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:51.267Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:03:56.441Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_11_55_58-3293263625150351268 is in state JOB_STATE_DONE
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:05:36.766Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:05:36.830Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:01.721Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:04.911Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:04.980Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:05.019Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:05.157Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:14.415Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:14.503Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:14.575Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:14.608Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:14.632Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:20.614Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_3004012064630235218". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_3004012064630235218".
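The same status check can be done programmatically instead of with the quoted bq command; a hedged sketch with google-cloud-bigquery (the job id is from the log, while location='US' is an assumption, since the log does not state it):

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    # Equivalent to: bq show -j --project_id=apache-beam-testing <job_id>
    job = client.get_job('dataflow_job_3004012064630235218', location='US')
    print(job.job_type, job.state)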
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:31.509Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_3004012064630235218" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:32.269Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:32.344Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:32.400Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:32.682Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:06:32.709Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:05.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:05.656Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:05.686Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_12_00_33-10872625824351828739 is in state JOB_STATE_DONE
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:25.620Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:25.649Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:07:25.677Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_12_00_32-1465042597692064724 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16002828214120.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/aae5703e-4be5-4867-91cb-a80c9cf3892f?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon6ade1f7f096895915a8ea75475ab77220c548d48/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16002828214120.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 3b2cefe89863bf492d48f7d4da960f2999802a89
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
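The matcher reduced the two result rows logged above to the checksum 3b2cefe89863bf492d48f7d4da960f2999802a89. A hedged sketch of one deterministic way to hash query results (the matcher's exact row stringification and hashing scheme may differ):

    import hashlib

    # Hypothetical stringified rows; sorting makes the digest order-independent.
    rows = ['apple', 'orange']
    digest = hashlib.sha1('\n'.join(sorted(rows)).encode('utf-8')).hexdigest()
    print(digest)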
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:09.017Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:09.076Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:09.121Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:09.204Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:17.358Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:17.434Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:17.512Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:17.565Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:09:17.600Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:10:06.198Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:10:06.236Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-16T19:10:06.267Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-16_12_03_19-13061421580333274470 is in state JOB_STATE_DONE
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
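The metrics tests above exercise user-defined metrics of the kind sketched below (the DoFn and metric names are illustrative, not the test's own):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class CountingFn(beam.DoFn):
        """Increments a user counter that Dataflow reports as a job metric."""

        def __init__(self):
            self.elements = Metrics.counter(self.__class__, 'elements')

        def process(self, element):
            self.elements.inc()
            yield element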
======================================================================
ERROR: test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>", line 325, in test_transform_on_gcs
    label='Assert Checksums')
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 568, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 518, in run
    allow_proto_holders=True).run(False)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 547, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 479, in run_pipeline
    artifacts=environments.python_sdk_dependencies(options)))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/transforms/environments.py>", line 613, in python_sdk_dependencies
    staged_name in stager.Stager.create_job_resources(options, tmp_dir))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 173, in create_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 559, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py>", line 99, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py>", line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1

Pip install failed for package: -r
Output from execution of subprocess:
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
Collecting pyhamcrest!=1.10.0,<2.0.0
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz
Collecting mock<3.0.0
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting parameterized<0.8.0,>=0.7.1
  File was already downloaded /tmp/dataflow-requirements-cache/parameterized-0.7.4.tar.gz
Collecting six
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.15.0.tar.gz
Collecting funcsigs>=1
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-5.5.0.tar.gz
ERROR: Command errored out with exit status 1:
 command: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python> -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-download-nt2iUE/pbr/setup.py'"'"'; __file__='"'"'/tmp/pip-download-nt2iUE/pbr/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-k4rimf
     cwd: /tmp/pip-download-nt2iUE/pbr/
Complete output (3 lines):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
IOError: [Errno 2] No such file or directory: '/tmp/pip-download-nt2iUE/pbr/setup.py'
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
-------------------- >> begin captured logging << --------------------
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
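The failing stager step is just a pip subprocess. A minimal sketch of the same invocation (the cache directory and requirements file are this CI run's; substitute your own to reproduce): '--no-binary :all:' forces source downloads, which is where the missing pbr setup.py in the shared cache directory surfaced.

    import subprocess
    import sys

    # Mirrors the command shown in the captured logging above.
    cmd = [
        sys.executable, '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ]
    subprocess.check_output(cmd, stderr=subprocess.STDOUT)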
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3968.227s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 6s

177 actionable tasks: 137 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/eauuse2yj2rpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
