See
<https://ci-beam.apache.org/job/beam_PostCommit_Python35/2716/display/redirect?page=changes>
Changes:
[piotr.szuberski] [BEAM-10136] Add JdbcIO Write Cross-language transform
[piotr.szuberski] [BEAM-10135] Add JdbcIO Read Cross-language transform
[piotr.szuberski] [BEAM-10135][BEAM-10136] Add Python wrapper for
Cross-language JdbcIO
[piotr.szuberski] [BEAM-10135][BEAM-10136] Add integration tests for JdbcIO
python wrapper
[piotr.szuberski] [BEAM-10135][BEAM-10136] Add JdbcIO python wrapper integration
tests to
[piotr.szuberski] [BEAM-10171] Update the website with JdbcIO cross-language
support
[je.ik] [BEAM-10533] Remove watermark hold from RequiresTimeSortedInput
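The JdbcIO changes listed above add a Python wrapper around the Java
cross-language Jdbc transforms. A minimal usage sketch follows; the module path
and parameter names (table_name, driver_class_name, jdbc_url, username,
password) follow the usual Beam cross-language pattern and are assumptions
rather than something taken from this log, and the transforms still expand to
Java via an expansion service at pipeline construction time.

    # Hypothetical sketch of the cross-language JdbcIO Python wrapper.
    # Module path and parameter names are assumptions; verify against the
    # Beam release in use. A Java expansion service must be reachable.
    import apache_beam as beam
    from apache_beam.io.jdbc import ReadFromJdbc, WriteToJdbc

    with beam.Pipeline() as p:
        rows = p | 'ReadFromPostgres' >> ReadFromJdbc(
            table_name='jdbc_external_test_read',
            driver_class_name='org.postgresql.Driver',
            jdbc_url='jdbc:postgresql://localhost:5432/postgres',
            username='postgres',
            password='postgres')
        _ = rows | 'WriteBack' >> WriteToJdbc(
            table_name='jdbc_external_test_write',
            driver_class_name='org.postgresql.Driver',
            jdbc_url='jdbc:postgresql://localhost:5432/postgres',
            username='postgres',
            password='postgres')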
------------------------------------------
[...truncated 15.13 MB...]
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
},
{
"@type":
"FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_6"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "m_out.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s4"
},
"serialized_fn":
"QlpoOTFBWSZTWUQD3m0AAp9/8H/////////////cwr///+ZrwCAAAEBAAvcWNkDakNTRJtJMymgNGmT0CAPQmgAAA0aaAA0aaDRoABp6NQaINTCain4SbSaTyRoAZNANBoA0AAeoAGmgDRkNNNAcAAAAAAAAAAAAAAAAAAAAyBKEECaaCYp+omg0aD1AAAAAAANAGgaA9NQHpAYj/sBK0pyllVVzsqTq6HUNODFSxqqoUtAIaZGAX4LZfRYO3IyCSYL1pkeqoPRUSNDxyFRUyYNHSR1CFydBbMYpqI6EsKUMj4QIOc5znORzMMMyGkzL7ceP7yvxIY2VYrN9XvmnQVUXErKLoMyrUMIjKPUTbbdq8/BsuFDk3+94vP60ArukDK8dmoowDMHpILzWJOpZzafdY4nGO4ZWEYOy8ygEA5TxkKyjoiEcfJHJclYp4vfNt1aNFWPGloXeTCM8yAieuLkoqZBeoy8zSmr8HHzz8aAIfiy/kHpbszb05r3UncULa1ubjc976+hUH3FRneFNK3DQ6LIMUtWNHIRABvRxLk9ZhsBYJVILHRhnUHRg0xNGQgNhdQgcqAadrlx/NhAmplHc0I9UFXfuksgQKWZ23fcMo1qMaYARkvqouOD56lf2d6dD34nts/NKwBIU3ckGoIBNj6FREgfs33Qc5xUYJ5CyIE8EkL2E0dj4G4I9wDWvKRrYhQSQ82wo0IQSXRIYH1GdbloIZBno4i0o8Uvl4rBqrVwHFmaYHOkwBJixadkCmY1d4qtTl2LAwhDKUfE/Bllbrl3IS5hnqyVmbYulucS4Vly7bEK12iz3syy17AZghM9ZN7hCeMVWKIa8Wp5KW7kXESC2gl6UUIq6KJMUso944sIpMCwUFzqSjDSX63SjC2owRs2wGXej5410oKlJqrYGoDLZFnUvmVJJPfYB7uDJbbr0EZZw5N1TnUdCFWUooDD451mk2bearXPGILiij2uRY1VLqbbInktahxrXPJ9aJkWQYx60YyZ5mDAy7+3c0KDAPBSbJJ0DXLKVEmdBlmM+ZFotEQEKNd664fvEBUCYmU9D4RRdYoggUYiTLc8mvhcFwBwaosoMrjGCE0ax53LTNpgBODiu2BEYWbYkMdHFGN92XAUquC1m4RavSWH0MIpkoRAKyCHUMjEQdiQobAf1WnF7gwSao3WrK1Gqtb4c+en/F3JFOFCQRAPebQ==",
"user_name": "m_out"
}
}
],
"type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query
SELECT bytes, date, time FROM
python_write_to_table_15952497241371.python_no_schema_table to BQ
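The bigquery_matcher above verifies the pipeline output by running a plain
BigQuery query. For reference, an equivalent manual check with the
google-cloud-bigquery client looks roughly like the following; the project,
dataset, and table names are taken from the log line above, the rest is a
generic sketch (the test dataset is deleted at the end of the run, so this only
works while it still exists).

    # Manual re-run of the verification query (sketch, not part of the test code).
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    sql = ('SELECT bytes, date, time '
           'FROM python_write_to_table_15952497241371.python_no_schema_table')
    for row in client.query(sql).result():
        print(row.bytes, row.date, row.time)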
DEBUG:google.auth.transport._http_client:Making request: GET
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1):
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1):
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: '2020-07-20T13:02:36.847516Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-07-20_06_02_35-9156268729681842718'
location: 'us-central1'
name: 'beamapp-jenkins-0720130224-799846'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-07-20T13:02:36.847516Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-07-20_06_02_35-9156268729681842718]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-07-20_06_02_35-9156268729681842718
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-20_06_02_35-9156268729681842718?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-20_06_02_35-9156268729681842718?project=apache-beam-testing
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/b46b95c8-a237-4407-b625-09749d09d114?maxResults=0&location=US
HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon891e6d3f37aa3dbb9e911a99b9789b7051a6c19e/data
HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is:
[(b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23,
59, 59)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)),
(b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'xyw',
datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset
python_write_to_table_15952497241371 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-20_06_02_35-9156268729681842718 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:35.742Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-07-20_06_02_35-9156268729681842718. The number of workers will be between
1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:35.742Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-07-20_06_02_35-9156268729681842718.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:39.750Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.512Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.547Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.583Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.611Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.676Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.725Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.766Z:
JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.791Z:
JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.828Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.859Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.897Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into
GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.923Z:
JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:41.973Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.022Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.072Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.101Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.238Z:
JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.312Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.359Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.391Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.442Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.497Z:
JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:02:42.579Z:
JOB_MESSAGE_BASIC: Executing operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:03:07.571Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:03:08.426Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:01.621Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:03.775Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+ExternalTransform(simple)/Map(<lambda at
external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:03.833Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:03.882Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:03.952Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:13.239Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:13.347Z:
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:13.467Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:13.513Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:13.542Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:42.351Z:
JOB_MESSAGE_BASIC: Executing BigQuery import job
"dataflow_job_9541914550399538353". You can check its status with the bq tool:
"bq show -j --project_id=apache-beam-testing dataflow_job_9541914550399538353".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:48.176Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:48.215Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:53.160Z:
JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_9541914550399538353" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:53.961Z:
JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:54.036Z:
JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:54.146Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:54.377Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:04:54.414Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:07.715Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:07.754Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:07.785Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-20_05_58_02-4811876975144812283 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:43.465Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:43.514Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:05:43.551Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-20_05_59_12-1375211542480014097 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query
SELECT fruit from `python_query_to_table_15952499382163.output_table`; to BQ
DEBUG:google.auth.transport._http_client:Making request: GET
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1):
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1):
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/42a5593a-682d-4bdd-9340-cdcfc6498ea6?maxResults=0&location=US
HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon87e288cc11a55361e81a7a11d25c6551b4ad81cc/data
HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT
fruit from `python_query_to_table_15952499382163.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum:
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:36.071Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:36.154Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:36.226Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:36.315Z:
JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:45.395Z:
JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:45.457Z:
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:45.580Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:45.637Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:08:45.668Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:09:37.107Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:09:37.147Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-20T13:09:37.182Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-20_06_02_35-9156268729681842718 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_streaming_wordcount_debugging_it
(apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT)
... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_read_via_table
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_spanner_error
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_spanner_update
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_write_batches
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax
(apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution
(apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ...
SKIP: The "TestDataflowRunner", does not support the TestStream transform.
Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint
(apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_label_detection_with_video_context
(apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ...
ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_avro
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 3968.390s
OK (SKIP=7)
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/common.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
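To reproduce the failure outside Jenkins, the usual approach is to run the same
Gradle task from a checkout of the Beam repository with the extra logging the
message above suggests, roughly:

    # Sketch; the task name is taken from the failure above, the flags are
    # standard Gradle options.
    ./gradlew :sdks:python:test-suites:direct:py35:postCommitIT --stacktrace --info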
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 9m 27s
130 actionable tasks: 98 executed, 31 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/hyybw2lxpkcaa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]