See 
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/7858/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7010] MAX/MIN(STRING)

------------------------------------------
[...truncated 773.10 KB...]
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s32"
        }, 
        "serialized_fn": "<string of 976 bytes>", 
        "user_name": 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s34", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DeleteTablesFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s33"
        }, 
        "serialized_fn": "<string of 412 bytes>", 
        "user_name": 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-09T09:02:00.917836Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-09_02_01_59-15614925420298458514'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409090152-091594'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-09T09:02:00.917836Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-09_02_01_59-15614925420298458514]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_01_59-15614925420298458514?project=apache-beam-testing
root: INFO: Job 2019-04-09_02_01_59-15614925420298458514 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-04-09T09:01:59.968Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2019-04-09_02_01_59-15614925420298458514. The number of workers 
will be between 1 and 1000.
root: INFO: 2019-04-09T09:02:00.024Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2019-04-09_02_01_59-15614925420298458514.
root: INFO: 2019-04-09T09:02:03.111Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-04-09T09:02:04.243Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-04-09T09:02:04.913Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-09T09:02:04.959Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables:
 GroupByKey not followed by a combiner.
root: INFO: 2019-04-09T09:02:04.997Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations: 
GroupByKey not followed by a combiner.
root: INFO: 2019-04-09T09:02:05.052Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not 
followed by a combiner.
root: INFO: 2019-04-09T09:02:05.097Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-04-09T09:02:05.140Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
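[Editor's note: the "Combiner lifting skipped" messages above refer to an optimization in which a combiner is applied partially before the shuffle, so that only one partial result per key per bundle is moved across workers; the optimizer skips it here because the GroupByKey is not followed by a combiner. A minimal pure-Python sketch of the idea, not Beam's or Dataflow's actual implementation:]

```python
from collections import defaultdict

def group_by_key(pairs):
    """Naive GroupByKey: shuffle every element into its key's group."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return dict(groups)

def lifted_combine(bundles, combine=sum):
    """Combiner lifting: pre-combine within each bundle before the
    shuffle, so only one partial value per key per bundle is shuffled,
    then merge the partials after the GroupByKey."""
    partials = []
    for bundle in bundles:
        local = defaultdict(list)
        for k, v in bundle:
            local[k].append(v)
        partials.extend((k, combine(vs)) for k, vs in local.items())
    return {k: combine(vs) for k, vs in group_by_key(partials).items()}

bundles = [[("a", 1), ("b", 2), ("a", 3)], [("a", 4), ("b", 5)]]
print(lifted_combine(bundles))  # {'a': 8, 'b': 7}
```

[Without the lifting step, every raw element would cross the shuffle boundary; with it, at most one partial per key per bundle does, which is why the optimizer only applies it when a combiner directly follows the GroupByKey.]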
root: INFO: 2019-04-09T09:02:05.302Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-04-09T09:02:05.347Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-09T09:02:05.389Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s17 for input s11.out_WrittenFiles
root: INFO: 2019-04-09T09:02:05.439Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify,
 through flatten 
WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion, into 
producer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-09T09:02:05.485Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-04-09T09:02:05.533Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-04-09T09:02:05.574Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s17-u31 for input s18-reify-value9-c29
root: INFO: 2019-04-09T09:02:05.629Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write,
 through flatten 
WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1,
 into producer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-09T09:02:05.680Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-04-09T09:02:05.731Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/AppendDestination into Create/Read
root: INFO: 2019-04-09T09:02:05.788Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow into 
Create/Read
root: INFO: 2019-04-09T09:02:05.826Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-09T09:02:05.873Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
 into WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-04-09T09:02:05.923Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
 into WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-04-09T09:02:05.970Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-04-09T09:02:06.022Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-04-09T09:02:06.066Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into 
WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-09T09:02:06.110Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow 
into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-04-09T09:02:06.154Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-04-09T09:02:06.195Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-04-09T09:02:06.232Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-09T09:02:06.275Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-04-09T09:02:06.316Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue 
into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-04-09T09:02:06.365Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-04-09T09:02:06.407Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs 
into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-04-09T09:02:06.456Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-04-09T09:02:06.494Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-04-09T09:02:06.540Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames 
into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-04-09T09:02:06.579Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs 
into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-04-09T09:02:06.624Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-04-09T09:02:06.674Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at 
bigquery_file_loads.py:534>) into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-04-09T09:02:06.726Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-04-09T09:02:06.777Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
 into 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
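[Editor's note: the "Fusing consumer ... into ..." messages above describe producer-consumer fusion: adjacent per-element steps are collapsed into one worker-side function so the intermediate collections between them are never materialized. A conceptual sketch of the idea, not the Dataflow optimizer itself:]

```python
def fuse(*fns):
    """Fuse adjacent per-element transforms into a single function, so
    intermediate results flow element-by-element instead of being
    written out between steps."""
    def fused(x):
        for fn in fns:
            x = fn(x)
        return x
    return fused

# Two steps that would otherwise each produce a full intermediate list.
parse = int
square = lambda n: n * n

records = ["1", "2", "3"]
fused_step = fuse(parse, square)
print([fused_step(r) for r in records])  # [1, 4, 9]
```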
root: INFO: 2019-04-09T09:02:06.830Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-04-09T09:02:07.045Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-04-09T09:02:07.085Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-04-09T09:02:07.134Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-09T09:02:07.345Z: JOB_MESSAGE_DEBUG: Executing wait step 
start44
root: INFO: 2019-04-09T09:02:07.453Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda
 at bigquery_file_loads.py:534>)
root: INFO: 2019-04-09T09:02:07.495Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-04-09T09:02:07.507Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-04-09T09:02:07.534Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-04-09T09:02:07.544Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-04-09T09:02:07.578Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-04-09T09:02:07.610Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-04-09T09:02:07.651Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Session" 
materialized.
root: INFO: 2019-04-09T09:02:07.693Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session"
 materialized.
root: INFO: 2019-04-09T09:02:07.739Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session"
 materialized.
root: INFO: 2019-04-09T09:02:19.170Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 0 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-04-09T09:03:17.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3317.682s

FAILED (SKIP=1, errors=1, failures=2)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_17-10294040118327849826?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_25-10527935848214457851?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_03_15-3760874532431609100?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_15-10756775351034802543?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_57_37-7283812850353785701?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_16-11310505292680567814?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_51_01-259742641084850611?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_58_27-15037618760383771734?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_05_47-16863214475889784685?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_15-6688279407558628324?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_55_24-3512828576659218760?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_02_27-8378040666783617086?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_16-3851847634992654514?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_45_54-17324936666409032368?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_23-5413973423668309007?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_01_14-10135054538153194660?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_09_24-9071466353186785579?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_14-4273658469908236263?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_44_16-2303789562772934983?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_54_17-3006576387194126342?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_01_59-15614925420298458514?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_15-3965214052300967786?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_46_09-10697882893233748574?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_56_20-9837254960422244177?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_04_21-12006533701893758231?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_37_15-5963165260339057959?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_46_12-11305386608722673583?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_54_28-5702529234034709307?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_01_58-13388498740666764113?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_09_12-9443206757191962559?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_16_56-17024120205619242532?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_24_56-12065450123942746433?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 229

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 41s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/ncu5gl7voud6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
