See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/493/display/redirect?page=changes>
Changes:
[lcwik] [BEAM-5433] Deprecate environment url field.
[robert] [BEAM-5381] Fix Duplicate CoGBK node names.
[25622840+adude3141] [BEAM-7016] guard DataflowWorkerLoggingInitializer.reset
[github] [BEAM-7011] Update Beam SDKs to use the StandardSideInputType enums.
------------------------------------------
[...truncated 510.97 KB...]
root: INFO: 2019-04-09T00:43:00.578Z: JOB_MESSAGE_DETAILED: Fusing consumer count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Write into count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-04-09T00:43:00.640Z: JOB_MESSAGE_DETAILED: Fusing consumer count_per_key/CombinePerKey(CountCombineFn)/Combine into count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-04-09T00:43:00.694Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/Reshard/ReshufflePerKey/GroupByKey/GroupByWindow into read/read/Reshard/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-04-09T00:43:00.728Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/Reshard/ReshufflePerKey/GroupByKey/Reify into read/read/Reshard/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-04-09T00:43:00.778Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/Reshard/AddRandomKeys into read/read/ExpandIntoRanges
root: INFO: 2019-04-09T00:43:00.825Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/Reshard/ReshufflePerKey/GroupByKey/Write into read/read/Reshard/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-04-09T00:43:00.868Z: JOB_MESSAGE_DETAILED: Fusing consumer sum_globally/InjectDefault/InjectDefault into sum_globally/DoOnce/Read
root: INFO: 2019-04-09T00:43:00.904Z: JOB_MESSAGE_DETAILED: Fusing consumer sum_globally/CombinePerKey/GroupByKey/Reify into sum_globally/CombinePerKey/GroupByKey+sum_globally/CombinePerKey/Combine/Partial
root: INFO: 2019-04-09T00:43:00.936Z: JOB_MESSAGE_DETAILED: Fusing consumer sum_globally/KeyWithVoid into get_number
root: INFO: 2019-04-09T00:43:00.970Z: JOB_MESSAGE_DETAILED: Fusing consumer count_per_key/CombinePerKey(CountCombineFn)/Combine/Extract into count_per_key/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-04-09T00:43:01.009Z: JOB_MESSAGE_DETAILED: Fusing consumer validate_number into sum_globally/InjectDefault/InjectDefault
root: INFO: 2019-04-09T00:43:01.059Z: JOB_MESSAGE_DETAILED: Fusing consumer sum_globally/CombinePerKey/Combine into sum_globally/CombinePerKey/GroupByKey/Read
root: INFO: 2019-04-09T00:43:01.104Z: JOB_MESSAGE_DETAILED: Fusing consumer validate_name into count_per_key/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-04-09T00:43:01.151Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/ReadRange into read/read/Reshard/RemoveRandomKeys
root: INFO: 2019-04-09T00:43:01.201Z: JOB_MESSAGE_DETAILED: Fusing consumer read/read/Reshard/ReshufflePerKey/FlatMap(restore_timestamps) into read/read/Reshard/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-04-09T00:43:01.272Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-09T00:43:01.315Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-09T00:43:01.359Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-09T00:43:01.409Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-09T00:43:01.633Z: JOB_MESSAGE_DEBUG: Executing wait step start90
root: INFO: 2019-04-09T00:43:01.719Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-04-09T00:43:01.762Z: JOB_MESSAGE_BASIC: Executing operation read/read/Reshard/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-04-09T00:43:01.776Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-09T00:43:01.816Z: JOB_MESSAGE_BASIC: Executing operation sum_globally/CombinePerKey/GroupByKey/Create
root: INFO: 2019-04-09T00:43:01.816Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-04-09T00:43:01.879Z: JOB_MESSAGE_BASIC: Executing operation count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-04-09T00:43:01.911Z: JOB_MESSAGE_BASIC: Executing operation reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-04-09T00:43:01.961Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-04-09T00:43:02.007Z: JOB_MESSAGE_DEBUG: Value "read/read/Reshard/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T00:43:02.050Z: JOB_MESSAGE_DEBUG: Value "sum_globally/CombinePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T00:43:02.104Z: JOB_MESSAGE_DEBUG: Value "count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T00:43:02.152Z: JOB_MESSAGE_DEBUG: Value "reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T00:43:02.200Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T00:43:14.888Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-09T00:44:15.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-09T00:45:27.672Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-09T00:45:27.719Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-09T00:47:43.505Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
root: INFO: 2019-04-09T00:47:43.554Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out" materialized.
root: INFO: 2019-04-09T00:47:43.653Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-04-09T00:47:43.701Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-04-09T00:47:43.748Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-04-09T00:47:43.787Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
root: INFO: 2019-04-09T00:47:43.834Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
root: INFO: 2019-04-09T00:47:43.872Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
root: INFO: 2019-04-09T00:47:43.924Z: JOB_MESSAGE_BASIC: Executing operation create/Read+produce+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-09T00:48:01.911Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
root: INFO: 2019-04-09T00:48:02.011Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
root: INFO: 2019-04-09T00:48:13.463Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/Extract.out" materialized.
root: INFO: 2019-04-09T00:48:13.561Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
root: INFO: 2019-04-09T00:48:13.606Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
root: INFO: 2019-04-09T00:48:13.676Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output" materialized.
root: INFO: 2019-04-09T00:48:13.738Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output" materialized.
root: INFO: 2019-04-09T00:48:13.842Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2019-04-09T00:48:19.520Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize.out" materialized.
root: INFO: 2019-04-09T00:48:19.607Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
root: INFO: 2019-04-09T00:48:19.726Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output" materialized.
root: INFO: 2019-04-09T00:48:19.828Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite+reshuffle/AddRandomKeys+reshuffle/ReshufflePerKey/Map(reify_timestamps)+reshuffle/ReshufflePerKey/GroupByKey/Reify+reshuffle/ReshufflePerKey/GroupByKey/Write+read/read/ExpandIntoRanges+read/read/Reshard/AddRandomKeys+read/read/Reshard/ReshufflePerKey/Map(reify_timestamps)+read/read/Reshard/ReshufflePerKey/GroupByKey/Reify+read/read/Reshard/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-09T00:48:23.422Z: JOB_MESSAGE_BASIC: Executing operation read/read/Reshard/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-09T00:48:23.521Z: JOB_MESSAGE_BASIC: Executing operation read/read/Reshard/ReshufflePerKey/GroupByKey/Read+read/read/Reshard/ReshufflePerKey/GroupByKey/GroupByWindow+read/read/Reshard/ReshufflePerKey/FlatMap(restore_timestamps)+read/read/Reshard/RemoveRandomKeys+read/read/ReadRange+get_number+make_pair+count_per_key/CombinePerKey(CountCombineFn)/GroupByKey+count_per_key/CombinePerKey(CountCombineFn)/Combine/Partial+count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Reify+count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Write+sum_globally/KeyWithVoid+sum_globally/CombinePerKey/GroupByKey+sum_globally/CombinePerKey/Combine/Partial+sum_globally/CombinePerKey/GroupByKey/Reify+sum_globally/CombinePerKey/GroupByKey/Write
root: INFO: 2019-04-09T00:48:32.987Z: JOB_MESSAGE_BASIC: Executing operation count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Close
root: INFO: 2019-04-09T00:48:33.022Z: JOB_MESSAGE_BASIC: Executing operation sum_globally/CombinePerKey/GroupByKey/Close
root: INFO: 2019-04-09T00:48:33.089Z: JOB_MESSAGE_BASIC: Executing operation count_per_key/CombinePerKey(CountCombineFn)/GroupByKey/Read+count_per_key/CombinePerKey(CountCombineFn)/Combine+count_per_key/CombinePerKey(CountCombineFn)/Combine/Extract+validate_name+reshuffle/AddRandomKeys+reshuffle/ReshufflePerKey/Map(reify_timestamps)+reshuffle/ReshufflePerKey/GroupByKey/Reify+reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-09T00:48:33.144Z: JOB_MESSAGE_BASIC: Executing operation sum_globally/CombinePerKey/GroupByKey/Read+sum_globally/CombinePerKey/Combine+sum_globally/CombinePerKey/Combine/Extract+sum_globally/UnKey
root: INFO: 2019-04-09T00:48:43.291Z: JOB_MESSAGE_DEBUG: Value "sum_globally/UnKey.out" materialized.
root: INFO: 2019-04-09T00:48:43.368Z: JOB_MESSAGE_BASIC: Executing operation sum_globally/InjectDefault/_UnpickledSideInput(UnKey.out.0)
root: INFO: 2019-04-09T00:48:43.474Z: JOB_MESSAGE_DEBUG: Value "sum_globally/InjectDefault/_UnpickledSideInput(UnKey.out.0).output" materialized.
root: INFO: 2019-04-09T00:48:43.557Z: JOB_MESSAGE_BASIC: Executing operation sum_globally/DoOnce/Read+sum_globally/InjectDefault/InjectDefault+validate_number+reshuffle/AddRandomKeys+reshuffle/ReshufflePerKey/Map(reify_timestamps)+reshuffle/ReshufflePerKey/GroupByKey/Reify+reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-09T00:48:51.498Z: JOB_MESSAGE_BASIC: Executing operation reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-09T00:48:51.586Z: JOB_MESSAGE_BASIC: Executing operation reshuffle/ReshufflePerKey/GroupByKey/Read+reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+reshuffle/RemoveRandomKeys+cleanup
root: INFO: 2019-04-09T00:48:58.892Z: JOB_MESSAGE_DEBUG: Executing success step success88
root: INFO: 2019-04-09T00:48:59.067Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-09T00:48:59.127Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-09T00:48:59.179Z: JOB_MESSAGE_BASIC: Stopping worker pool...
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_45-12180847373103478740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_41_04-11490832026115259622?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_42-2515831188158438987?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_46_02-7232553086949331486?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_44-6943692399007416708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_38_37-14065737689845992051?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_48_18-13760083537726604062?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_43-10271610569295686880?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_44_51-14044350704885656478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_53_24-7236764947987745415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_43-12046588152546211931?project=apache-beam-testing.
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_33_20-9315454686006537202?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_42_52-1714624128897039904?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-08_17_42_52-1714624128897039904/messages?alt=json&startTime=2019-04-09T00%3A48%3A59.179Z>: response: <{'-content-encoding': 'gzip', 'server': 'ESF', 'date': 'Tue, 09 Apr 2019 00:51:21 GMT', 'cache-control': 'private', 'x-frame-options': 'SAMEORIGIN', 'status': '404', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'content-length': '279', 'x-content-type-options': 'nosniff', 'x-xss-protection': '1; mode=block', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(1105677783fe5172): Information about job 2019-04-08_17_42_52-1714624128897039904 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_42-16550365527119311520?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_32_31-9158600450905106260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_42_10-3449935410520169325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_44-1322259001744181196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_32_56-1347630802152325620?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_45_35-15941132535909634067?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_22_43-15101916477404857848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_31_25-14177395294227043765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_41_08-13359813421686746765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_17_50_13-15965023880946528059?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2397.868s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED
> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests

>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_41-7646267434757600647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_12_53-13028428130797428882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_42-5854813696830089457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_16_24-7899593149486727206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_41-17939276551357217533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_12_04-9200599395402189637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_42-1339407263728672457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_11_54-4544339877142534574?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_41-11064784774160074068?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_13_08-12846821762054543107?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_42-3527127596355115142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_12_30-4011160619169664922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_41-16889361337823326071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_11_53-1528799888367334502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_02_42-5008102957572011829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_18_11_04-2703421053267357937?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1398.116s

OK

FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 8s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/dbiwqeapfozqg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
