See
<https://ci-beam.apache.org/job/beam_PostCommit_Python35/9/display/redirect?page=changes>
Changes:
[noreply] [BEAM-9615] Add String UTF8 coder. (#11989)
[noreply] [BEAM-7163] Correcting godoc for passert.Sum (#11999)
------------------------------------------
[...truncated 10.71 MB...]
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "m_out.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s4"
},
"serialized_fn":
"eNqFU2tz3DQU1WbzqttCGt7vFlrYALWBUgq0tMCGtqnLtuNmpv6S0Witu7tKZMlXkrvNDDsDw6TLV/4FPxNZm5QuEBiPbelKOveco3t/bncKVrFiBLQPrIydYcoOtCltXGgDUZdJyfoSHhpWVWA29U0VIdn4BVsTXOjkJwkh1O1XQEdCOYvtggspY9p8I1oYYA6oYiVwV1cSIlwMR7bubfszt8ORpRARqqpdQLK4nJ/yEV27v0Ir+XITMmIoFK5O8UR+1s/niPt9gUPMwTNnThuL0RRPZniq19vB01N8LsPn83bDeKBw7d+YDmpVOKG9xDOduXWpGQ9UIlzPVz1EV3NoNOALB/hihi910lZK/BulD7pbvEX4AuFtwhcJXyKOkN0W2fWRZfITIb8uPBNZOTYyXCUP8OVOz4MupO10MV3KFxsPVAH4Sr7kh24s/PhVh6+FlcYGfD0YVT1isgZ8Ixi5zYZD4PeCnfjmFN/KWz76GN8+wHfy3/0wGekSkl1Qe0LZo/9FK9kjSMba7FlvMyQNPL2vrevqshSO3t93I60uXU6sKRLL92xShUjyzK0kplYKjE04c2wg9fjpgMJjMIWwQEtwRhSWVqICKRTE1T6eDR5fk6zsc3Ydz6V/dNdIq/30WfcPvrux4fC9DM/P3dQQHGXOmQgvBJB+LaTzevD9fMVP/XKzih88wU6GG3NHRVlp42iped2U6of5jb/V2KGa+EhE/P9q8KMD/DjDi4EL9YkKRynGTzDJ8JPRuV5vgp/+o4s+G4WSvTTFzzO8bHsOv8jwSrhMX6K+Ic3QVlDgl72gqTK6AGvxq9GVur+DX0/w6g5e+8+2figU12OhhhF+45v5+gRvdPIzjdyiqMtasqYLGicAv01bM4LCUg4DVkuH3/0W9IwDim+m749LNtsR3ZK6z+QsqY2w61Nu5utNCYsSrGNlRQtd9r1fBn/w+U40S77Zh2A8+s3j0A+3RJszWtuHU7zl8W/na40c6W8HOPWCQDUebc1h1U5ID3NEItqsDZv1/x0PkU7wbminUhRGW/wxJbZf9x324j8BtCatzQ==",
"user_name": "m_out"
}
}
],
"type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: '2020-06-12T21:33:21.025313Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-06-12_14_33_19-16982964132928080057'
location: 'us-central1'
name: 'beamapp-jenkins-0612213309-784519'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-06-12T21:33:21.025313Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-06-12_14_33_19-16982964132928080057]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-06-12_14_33_19-16982964132928080057
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_33_19-16982964132928080057?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-06-12_14_33_19-16982964132928080057 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:19.798Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-06-12_14_33_19-16982964132928080057.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:19.798Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-06-12_14_33_19-16982964132928080057. The number of workers will be between
1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:23.673Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.468Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.505Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.540Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.575Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.650Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.707Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.747Z:
JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.782Z:
JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.822Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.862Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.894Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into
GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.930Z:
JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.966Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:25.995Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.034Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.066Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.195Z:
JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.262Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.308Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.345Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.388Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.448Z:
JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:26.523Z:
JOB_MESSAGE_BASIC: Executing operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:38.726Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:41.939Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+ExternalTransform(simple)/Map(<lambda at
external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:42.020Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:42.088Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:42.161Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.749Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.365Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.440Z:
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.601Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.656Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:33:50.683Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:34:04.069Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:35:51.540Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:35:51.573Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:35:58.267Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:35:58.318Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:35:58.362Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-06-12_14_28_02-9083963659914578808 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:13.930Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:14.015Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:14.071Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:14.150Z:
JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:23.265Z:
JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:23.335Z:
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:23.441Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:23.486Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:39:23.513Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:41:06.957Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:41:07.002Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-12T21:41:07.029Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-06-12_14_33_19-16982964132928080057 is in state JOB_STATE_DONE
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_29-8707245887595677424?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_43_59-5649514252666479454?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_52_15-11072324969408712914?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_00_15-11513366309112185664?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_09_29-6663147020901588303?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_17_01-11350928675995882562?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_25_08-15747719226351431688?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_33_19-16982964132928080057?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_25-4637386936069211079?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_48_47-1455775487775872039?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_57_04-5184078937303510417?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_05_08-5514004173149806368?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_14_27-12093998463203798659?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_24_04-15870080650627319109?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_28-12524979632216940688?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_41_50-15930656722472507771?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_49_57-11556492080569579197?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_58_50-10462610679016569841?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_07_11-8364571625865994575?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_15_02-11417078648603609549?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_23_18-9555578083859822884?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_25-14527886378884438245?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_38_04-15757925808081261435?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_47_09-13770817912929567528?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_55_05-6661706322245417643?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_03_56-14308623647275863112?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_11_45-7611629407264840237?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_20_02-3618620068478105941?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_23-6271080564716672532?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_49_32-8617974952573047282?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_58_08-14393702114859870288?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_06_29-18427385550494013088?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_15_01-13552570295761320578?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_22_53-5536318358676368818?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_25-16788566058847341374?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_38_13-2543302444534692804?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_47_14-2111890226024941882?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_55_52-8304435085904941968?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_03_45-612190351868800785?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_11_38-3745128791546534911?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_19_39-14207058677074712689?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_28_02-9083963659914578808?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_26-12282689126647086846?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_39_09-13174777774729491969?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_48_04-7658086743398980458?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_58_29-6413289378979647305?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_07_36-3248681431746063297?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_25_27-8155039090847483665?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_29_25-13051553858871806303?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_40_17-14308195995382587783?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_13_51_51-8244794907918093050?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_00_17-14016465101729084265?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_08_19-11692806585392859534?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_16_42-9539714195255409100?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-12_14_24_29-3478169139853379508?project=apache-beam-testing
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_streaming_wordcount_debugging_it
(apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT)
... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_read_via_table
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_avro_file_load
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_spanner_error
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_spanner_update
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_write_batches
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_analyzing_syntax
(apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_basic_execution
(apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ...
SKIP: The "TestDataflowRunner", does not support the TestStream transform.
Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_text_detection_with_language_hint
(apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_label_detection_with_video_context
(apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ...
ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 63 tests in 4339.902s
OK (SKIP=7)
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/common.gradle'>
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
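A minimal sketch of how the failing task above could be re-run locally with the
suggested diagnostics, assuming a checkout of the Beam repository with its
Gradle wrapper (./gradlew) at the repo root; the task path and flags are the
ones reported in this log, everything else is an assumption:

    # Re-run only the failing post-commit suite with extra diagnostics.
    # Task path and flags are taken from the log above; running from the
    # root of a local Beam checkout (where ./gradlew lives) is an assumption.
    ./gradlew :sdks:python:test-suites:direct:py35:postCommitIT \
        --stacktrace --info --scan

    # Surface the individual Gradle deprecation warnings mentioned below.
    ./gradlew :sdks:python:test-suites:direct:py35:postCommitIT --warning-mode all

The --scan run would publish a build scan link similar to the one shown further
below in this log.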
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 13m 56s
86 actionable tasks: 63 executed, 23 from cache
Publishing build scan...
https://gradle.com/s/doihlqxhdft4y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure