See <https://builds.apache.org/job/beam_PostCommit_Python35/53/display/redirect?page=changes>
Changes: [rohde.samuel] Add hot key detection protos to Windmill
------------------------------------------
[...truncated 79.68 KB...]
root: INFO: ==================== <function expand_sdf at 0x7fb37a23f1e0> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7fb37a23f268> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7fb37a23f378> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7fb37a23f400> ====================
root: DEBUG: 1 [3]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n create/Read:beam:transform:read:v1\nwrite/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7fb37a23f488> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7fb37a23f510> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 0x7fb37a23f6a8> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7fb37a23f730> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 0x7fb37a23f7b8> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
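The "==================== <function ...> ====================" blocks above are the portable runner's pipeline-translation phases (SDF and GBK expansion, greedy fusion, read-to-impulse rewriting, and so on) applied to a three-transform test pipeline. As a rough orientation only, not the actual code of the failing integration test, the shape of the pipeline being translated looks roughly like the sketch below; table, dataset, and project names are taken from the log, while the row contents are invented:

    # Minimal sketch of a pipeline with the fused shape traced above:
    # create/Read -> write/AppendDestination -> write/StreamInsertRows.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'create' >> beam.Create(
             [{'bytes': b'xyw', 'date': '2011-01-01',
               'time': '23:59:59.999999'}])  # invented row, for illustration
         | 'write' >> beam.io.WriteToBigQuery(
             table='python_no_schema_table',
             dataset='python_write_to_table_15640882173648',
             project='apache-beam-testing',
             # No schema is passed, matching "Creating or getting table ...
             # with schema None" in the log.
             create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))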
root: INFO: Running ((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)
root: DEBUG: start <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Connecting using Google Application Default Credentials.
root: DEBUG: start <DoOperation write/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference datasetId: 'python_write_to_table_15640882173648' projectId: 'apache-beam-testing' tableId: 'python_no_schema_table'> with schema None.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15640882173648.python_no_schema_table. Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_12 to finish.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
root: INFO: Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15640882173648.python_no_schema_table to BQ
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/c7387961-dc58-4696-9e47-9564e351706d?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/c7387961-dc58-4696-9e47-9564e351706d?location=US HTTP/1.1" 200 None
root: INFO: Result of query is: <google.cloud.bigquery.table.RowIterator object at 0x7fb379f12e48>
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon9f237b8a4e427c34658c3d91f04c4634a5579325/data HTTP/1.1" 200 None
root: INFO: Deleting dataset python_write_to_table_15640882173648 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
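The verification step in the captured logging above ("Result of query is: <google.cloud.bigquery.table.RowIterator object at ...>") is issued through the google-cloud-bigquery client. A minimal sketch of that check, assuming Application Default Credentials are available; this is not the test harness's exact code:

    # Run the same verification query the log shows and materialize the rows.
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    query = ('SELECT bytes, date, time FROM '
             'python_write_to_table_15640882173648.python_no_schema_table')
    # client.query() returns a QueryJob; result() blocks until the job
    # finishes and yields the RowIterator seen in the log.
    rows = list(client.query(query).result())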
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
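The BeamDeprecationWarning repeated above ("References to <pipeline>.options will not be supported") is triggered by reading options back off the pipeline object. The supported pattern is to keep a handle on the PipelineOptions you constructed and call view_as() on that instead. A minimal sketch; the bucket path is invented:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/temp'])

    with beam.Pipeline(options=options) as p:
        # Deprecated: p.options.view_as(GoogleCloudOptions).temp_location
        temp_location = options.view_as(GoogleCloudOptions).temp_location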
----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 24.209s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py35:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.15.0-SNAPSHOT.jar>
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:178: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_21-5920675336037002426?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_14_04-18083712001542943826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_23_11-1598338729190593288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_32_38-9318869419435459489?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_43_00-14727098644868107451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_14-2545535496535916822?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_23_48-15954727130650247526?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_33_34-12823743317047835545?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
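The FutureWarnings above come from the then-experimental apache_beam.io.fileio transforms (MatchAll, ReadMatches) exercised by fileio_test. A minimal usage sketch with an invented glob; the real test computes file checksums via a compute_hash helper, stubbed here as a simple line count:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | 'Match' >> fileio.MatchFiles('gs://my-bucket/input/*.txt')
         | 'Read' >> fileio.ReadMatches()
         # Each element is a ReadableFile; read_utf8() loads its contents.
         | 'Checksums' >> beam.Map(
             lambda f: len(f.read_utf8().splitlines())))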
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_22-17574973387741131827?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_11_38-1421549066493510752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_21_13-9538454807915486028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_30_45-16663274398757274855?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_21-4420785453580084421?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_19_44-3023303341462591903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_29_17-4157319962942552378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_39_10-8407419571361164643?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_29-10864911891100935483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_09_22-14528410016116168360?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_19_10-16018574555429090080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_28_39-3718461782641835549?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_42_02-16409934719530291515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_13-17563387796678801785?project=apache-beam-testing.
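The other deprecation repeated above names its replacement directly: BigQuerySink is deprecated since 2.11.0 in favor of WriteToBigQuery. A minimal before/after sketch; the table spec and schema below are invented for illustration:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'f': 1}])
        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table'))
        # Replacement named by the warning:
        rows | 'write' >> beam.io.WriteToBigQuery(
            'my_project:my_dataset.my_table', schema='f:INTEGER')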
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_09_00-12029205162682863991?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_19_23-5697948875180401397?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:565: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_29_44-2725971973411710860?project=apache-beam-testing.
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_21-2828472055692537338?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_09_03-9106543358350563664?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_18_45-7111109336806370873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_29_13-8790464233660442072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_39_10-9236721528340213261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_49_33-7138627036780123943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_13_57_15-14358415783294566292?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_09_51-7922512693891889038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_21_09-400023400920011770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-25_14_30_08-7720370964270902488?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3670.657s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 49

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 6s

63 actionable tasks: 46 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ikrmxedq4zgmm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
