See <https://builds.apache.org/job/beam_PostCommit_Python35/905/display/redirect?page=changes>
Changes:

[aryan.naraghi] Fix a bug related to zero-row responses

------------------------------------------
[...truncated 334.81 KB...]
 fields: ['language']>
root: INFO: Matching ['language'] to ['language']
root: INFO: Attempting to perform query SELECT name, language FROM python_bq_streaming_inserts_15729397952671.output_table1 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/454d528e-c27c-4884-9713-ff8ef4ac0da2?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/454d528e-c27c-4884-9713-ff8ef4ac0da2?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon3dee4dc6_aabb_4a7d_9f70_9a8b5ed8211d/data HTTP/1.1" 200 None
root: INFO: Result of query is: [('spark', 'scala'), ('beam', 'go'), ('flink', 'scala'), ('spark', 'py'), ('beam', 'java'), ('beam', 'py'), ('flink', 'java'), ('spark', 'scala')]
root: INFO: Attempting to perform query SELECT name, language FROM python_bq_streaming_inserts_15729397952671.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/21147188-5b59-40b3-8832-a724fa11f4fe?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/21147188-5b59-40b3-8832-a724fa11f4fe?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon8a9cd30d19a2585a5823f151c0f6c12a2a58ac09/data HTTP/1.1" 200 None
root: INFO: Result of query is: [('beam', 'go'), ('beam', 'py'), ('spark', 'py'), ('beam', 'java'), ('flink', 'java'), ('flink', 'scala'), ('spark', 'scala'), ('spark', 'scala'), ('beam', 'go'), ('beam', 'py'), ('spark', 'py'), ('beam', 'java'), ('flink', 'java'), ('flink', 'scala'), ('spark', 'scala'), ('spark', 'scala')]
root: INFO: Start verify Bigquery table properties.
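The two "Result of query is" lists above contain the same rows in different orders (the second table additionally holds every row twice), so any check against them must be order-insensitive. A minimal stdlib-only sketch of that kind of comparison, using the rows from the log (a simplified illustration with a hypothetical helper name, not Beam's actual result matcher):

```python
from collections import Counter

def rows_match(actual, expected):
    """Compare query results as multisets, ignoring row order.
    Hypothetical simplified check; duplicates still have to agree."""
    return Counter(actual) == Counter(expected)

# The first query's rows, in the two different orders seen in the log.
expected = [('spark', 'scala'), ('beam', 'go'), ('flink', 'scala'),
            ('spark', 'py'), ('beam', 'java'), ('beam', 'py'),
            ('flink', 'java'), ('spark', 'scala')]
shuffled = [('beam', 'go'), ('beam', 'py'), ('spark', 'py'),
            ('beam', 'java'), ('flink', 'java'), ('flink', 'scala'),
            ('spark', 'scala'), ('spark', 'scala')]

assert rows_match(shuffled, expected)       # same rows, different order
assert not rows_match(shuffled, expected[:4])  # missing rows are detected
```

Counting rows rather than sorting them also catches the doubled-rows case: a table with every row duplicated compares unequal to the single-copy expectation.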
root: INFO: Table proto is <Table clustering: <Clustering fields: ['language']> creationTime: 1572940186501 etag: 'FwriYkZejWQ/pJ3IbtbHIw==' id: 'apache-beam-testing:python_bq_streaming_inserts_15729397952671.output_table1' kind: 'bigquery#table' lastModifiedTime: 1572940186865 location: 'US' numBytes: 0 numLongTermBytes: 0 numRows: 0 schema: <TableSchema fields: [<TableFieldSchema fields: [] mode: 'NULLABLE' name: 'name' type: 'STRING'>, <TableFieldSchema fields: [] mode: 'NULLABLE' name: 'language' type: 'STRING'>]> selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_bq_streaming_inserts_15729397952671/tables/output_table1' streamingBuffer: <Streamingbuffer estimatedBytes: 98 estimatedRows: 8 oldestEntryTime: 1572940140000> tableReference: <TableReference datasetId: 'python_bq_streaming_inserts_15729397952671' projectId: 'apache-beam-testing' tableId: 'output_table1'> timePartitioning: <TimePartitioning type: 'DAY'> type: 'TABLE'>
root: INFO: Matching {'type': 'DAY'} to <TimePartitioning type: 'DAY'>
root: INFO: Matching DAY to DAY
root: INFO: Matching {'fields': ['language']} to <Clustering fields: ['language']>
root: INFO: Matching ['language'] to ['language']
root: INFO: Start verify Bigquery table properties.
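The "Matching {'type': 'DAY'} to <TimePartitioning type: 'DAY'>" lines above come from the test verifying expected table properties against the table metadata returned by BigQuery. The shape of that check can be sketched as follows (function name and dict-based structure are hypothetical simplifications, not Beam's proto-based implementation):

```python
def match_properties(expected, actual):
    """Return a list of (key, want, got) mismatches between expected
    table properties and actual table metadata.
    Hypothetical simplified verifier; extra metadata keys are ignored."""
    mismatches = []
    for key, want in expected.items():
        got = actual.get(key)
        if got != want:
            mismatches.append((key, want, got))
    return mismatches

# The properties asserted in the log above, rendered as plain dicts.
expected = {
    'timePartitioning': {'type': 'DAY'},
    'clustering': {'fields': ['language']},
}
actual = {
    'timePartitioning': {'type': 'DAY'},
    'clustering': {'fields': ['language']},
    'numRows': 16,  # additional metadata, not part of the assertion
}

assert match_properties(expected, actual) == []
```

An empty mismatch list corresponds to the all-"Matching" output seen in the log; any non-empty list would name the property that diverged.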
root: INFO: Table proto is <Table clustering: <Clustering fields: ['language']> creationTime: 1572940212914 etag: '/Zmapq0TI1PIIVqw1if6zQ==' id: 'apache-beam-testing:python_bq_streaming_inserts_15729397952671.output_table2' kind: 'bigquery#table' lastModifiedTime: 1572940256638 location: 'US' numBytes: 196 numLongTermBytes: 0 numRows: 16 schema: <TableSchema fields: [<TableFieldSchema fields: [] mode: 'NULLABLE' name: 'language' type: 'STRING'>, <TableFieldSchema fields: [] mode: 'NULLABLE' name: 'name' type: 'STRING'>]> selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_bq_streaming_inserts_15729397952671/tables/output_table2' tableReference: <TableReference datasetId: 'python_bq_streaming_inserts_15729397952671' projectId: 'apache-beam-testing' tableId: 'output_table2'> timePartitioning: <TimePartitioning type: 'DAY'> type: 'TABLE'>
root: INFO: Matching {'type': 'DAY'} to <TimePartitioning type: 'DAY'>
root: INFO: Matching DAY to DAY
root: INFO: Matching {'fields': ['language']} to <Clustering fields: ['language']>
root: INFO: Matching ['language'] to ['language']
root: INFO: Deleting dataset python_bq_streaming_inserts_15729397952671 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_15-13537271386000457211?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_39_22-1022427178222739500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_47_45-3965101904621668901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_56_27-17176634836699198175?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_00_06_15-2439973028163061940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_06-2198069739804349269?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_44_21-4777200507885094057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_00_02_01-10223242523810686016?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_09-2411906782838632241?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_36_48-14585878209576403229?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_45_24-15273070847238918224?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_53_48-230912937545129335?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_00_02_38-16735673900607434305?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_06-8258902040348755623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_43_46-11255101140955773620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_53_23-8871825963305584532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_00_02_56-7045557668369165829?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_08-4255817087714859260?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_33_03-11754640634174920647?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_41_42-15458359391037309438?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_50_46-14748082237677507938?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_58_07-8837663057305231912?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_08-955073700020475108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_33_41-18144455581658394307?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_41_26-15256939022038119836?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_49_30-8429929031942654415?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_58_39-9662410232227029318?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_14-5858684838697357116?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_33_44-267617375943240626?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_43_38-4923135240129192553?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_53_29-17494783628736918399?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_24_07-2773266948618739737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_33_35-14921032219605780990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_41_37-9124111700153259065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_49_54-7993711728384503238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-04_23_59_33-1080706748696864136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_00_07_53-739121102622919485?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3139.375s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 22s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/5nilhesy3xycs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org