See <https://builds.apache.org/job/beam_PostCommit_Python35/1121/display/redirect>
Changes:
------------------------------------------
[...truncated 625.37 KB...]
 location: 'us-central1'
 name: 'beamapp-jenkins-1202061617-413784'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-02T06:16:52.186225Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-01_22_16_50-18286273281165676842]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_16_50-18286273281165676842?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-01_22_16_50-18286273281165676842 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:50.562Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-01_22_16_50-18286273281165676842. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:50.562Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-01_22_16_50-18286273281165676842.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:55.095Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:56.719Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:57.762Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:57.806Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:57.844Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:57.883Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:57.916Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.014Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.064Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.095Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.126Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.162Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.194Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.232Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.257Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.292Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.317Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
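The "Combiner lifting skipped" messages above come from the Dataflow optimizer: when a GroupByKey is immediately followed by a combiner, partial combining can be "lifted" ahead of the shuffle so each worker ships partial aggregates instead of raw elements; here the GroupByKeys had no trailing combiner, so the optimization did not apply. A stdlib-only toy sketch of the idea (not Beam code; the bundle contents are made up):

```python
from collections import Counter

# Two "bundles" of elements, as two workers might see them.
bundle_1 = ['cat', 'dog', 'cat']
bundle_2 = ['dog', 'dog']

# Without lifting, all five ('word', 1) pairs cross the shuffle.
unlifted = [(w, 1) for w in bundle_1 + bundle_2]

# With lifting, each bundle pre-combines locally and ships only one
# partial count per key; the shuffle then merges the partials.
partials = [Counter(bundle_1), Counter(bundle_2)]
merged = sum(partials, Counter())

print(len(unlifted), dict(merged))
```

The shuffled volume drops from one record per element to one record per key per bundle, which is why the optimizer looks for this pattern.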
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.351Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.388Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.423Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.480Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.517Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.551Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.587Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.637Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.671Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.699Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:58.991Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.179Z: JOB_MESSAGE_DEBUG: Executing wait step start26
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.275Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.303Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.345Z: JOB_MESSAGE_BASIC: Executing operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.346Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.387Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.462Z: JOB_MESSAGE_BASIC: Finished operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.462Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.528Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.563Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:16:59.611Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:17:16.437Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor.
To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:12.928Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded. Limit: 1250.0 in region us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:12.962Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.052Z: JOB_MESSAGE_BASIC: Finished operation read/Read+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.303Z: JOB_MESSAGE_WARNING: S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.346Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.461Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.550Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:13.588Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:31.624Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T06:18:31.665Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
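The QUOTA_EXCEEDED failure above means the shared us-central1 'CPUS' quota (limit 1250) was already consumed by concurrent jobs, so this job could not start even one n1-standard-1 worker. One mitigation is to cap each test job's CPU footprint via standard Dataflow pipeline options; a hedged sketch of such an invocation (the example module, bucket, and worker counts are placeholders, not values from this build):

```shell
python -m apache_beam.examples.wordcount \
    --runner=DataflowRunner \
    --project=apache-beam-testing \
    --region=us-central1 \
    --temp_location=gs://some-bucket/tmp \
    --machine_type=n1-standard-1 \
    --num_workers=1 \
    --max_num_workers=10
```

`--max_num_workers` bounds autoscaling, which the log shows otherwise defaults to a range of 1 to 1000 workers per job, so a single job cannot absorb the region's whole CPU quota.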
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-01_22_16_50-18286273281165676842 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1575267376698/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1575267376698/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1575267376698\\/results[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.04036903381347656 seconds.
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_23-2458066308275811404?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_28_01-9687875242062961353?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_46_05-10750673706617559447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_54_56-1029533836203455219?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_27-7559119831702185005?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_20_55-7417642844209688856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_29_45-891165082415730460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_38_36-16334719473976796751?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_23-7321920604760682193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_30_38-9618635213757341318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_39_09-17255226419978242223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_48_07-2148967418348942049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_25-15395314300692250677?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_17_50-11038815780644020939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_27_00-12843351118967025790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_36_34-15374756506836365046?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_47_43-9731943736354834340?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_25-5434763728835571206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_17_44-7641226882133499323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_27_06-18379818116713431065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_35_44-10100006181016200651?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_44_16-12474514461453233858?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0.
Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_23-4683850823639264806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_16_50-18286273281165676842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_19_16-17062997443487093030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_29_01-8499354791546001136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_39_24-7397368390294920180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_48_57-7173729895039232563?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_08_24-10553289772568381477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_17_32-12170603702908025106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_27_28-14273180656420485991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_37_43-7847175398780310542?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_47_29-9182195440841475678?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-01_22_57_04-17370259764046392366?project=apache-beam-testing
======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/nose/plugins/multiprocess.py>", line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/nose/case.py>", line 46, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/nose/case.py>", line 134, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/nose/case.py>", line 152, in runTest
    test(result)
  File "/usr/lib/python3.5/unittest/case.py", line 648, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.5/unittest/case.py", line 600, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>", line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py>", line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py>", line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py>", line 1924, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py>", line 1595, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py>", line 1533, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.5/http/client.py", line 1213, in getresponse
    response.begin()
  File "/usr/lib/python3.5/http/client.py", line 307, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.5/http/client.py", line 268, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.5/socket.py", line 575, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.5/ssl.py", line 929, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.5/ssl.py", line 791, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.5/ssl.py", line 575, in read
    v = self._sslobj.read(len, buffer)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/nose/plugins/multiprocess.py>", line 276, in signalhandler
    raise TimedOutException()
nose.plugins.multiprocess.TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5396.795s

FAILED (SKIP=7, errors=2)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 22s
83 actionable tasks: 62 executed, 21 from cache

Publishing build scan...
https://scans.gradle.com/s/pe6hwhgn5vik2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org