See 
<https://builds.apache.org/job/beam_PostCommit_Python37/1155/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8962] Add option to disable the metric container accumulator


------------------------------------------
[...truncated 1.06 MB...]
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s24"
        },
        "serialized_fn": 
"eNq1VOtzFEUQ3927kLCEIIlCEB8Hil4UbhVUEAlCLgmPkyUukRvUuM7uzt1usq+enSVJ1V2VipfK32CVlvrJP9OeuQsxInyzpvbRr1/P/Lqnf6jUfZpTP2Sux2jSEJymRSfjSdHwM87MJo1j6sWszWmeM76YLacmaHM/gt4Ho04qmqa5nRQqfhDFccOVb9P1OaOCuZ0y9UWUYUC1fsAeZzRwxXbOTBgjEwjRzAK2ijIcGsC4AxP1lt7S8DFaM01zV+tpT/Su/kCDw/YAzDmiY8gWHBnAJEnw1wqzhFnrLN2I0mLve6GI6WNmbWZ8o8DzMUsez13JCtHMkiQS7sq2CLP00mWr4L5VBBuFlSuN9Q86rH06LElHI9+Go2rH12KaeAG9DlP3fhlranCMGKhFJl4awPE5AdMOzBw4c5cJlwrBTXhZAXhlFAvcJ7xCxlFEs7TCiR046cDsgdAoyTMu3CQLyhgpO0VOY8ALigavDuC0A6+pPC6C+MJ14fUdeMOBN8khqWRQ0hhq9n+VzWcowJmwWg9HhRhrTWIh/hLaLhair2+fF7r6M4QuS9Ov9IxeZaPCLwoj0OV/xziB+p/0tpamVU0YAjUbJv9T1+TautHTFrW1K/3q9lRP/73aq/6h61pbAxt9x9Dv56GfRJPF30N7hB4En7bW0/lvz1pTg2iBhm1y1iYn8ZTLNIpZUKNFwbi4WjvHa/Pz+Ia3duDtOqmiRxwVAs4pSgqkmAXwDplBYQFZvanClrZ8lssmhnfJYbTILl3iPONQV2GcJdljBnPEROEhjcuR9T0B7w89qC8k1+fJURTYVs58zOOqzBfIsaeZ3T0TNJTnSDuKtlSTsJglLBXwgYAPyfr/3PmswPbsWqWIYtn2F8Na69fmWU2fOGToE2oZ+vT4tG7idwqXoc/qVXzDJdV3T4/z0QA+xgvxiQOXw9nwFJn9d/MOEzVkIrgygE8duBpis37mwLWwZodn1mC+pdEBXHfg8wHc6MNNckT2sBwgbhilooCFgzMMDUrfCBjeByoyXph37svS3ZZqE5o4wBbtPizVFVSU5qVQeAUs22QSVVkp9nW37HIHbnuFLeCOA3cH0HLgiwHc64NdDxdCiXYf0Vbq4bIdKucvveEeKe8WyIMckU54t0SABw6sqgw4HnG0Du3wla0KnPPMZ0UBD8PVZ87YVnkI5nm0n+drr/TW4Js+fLsGay8c5O0oDbJNJNqE7xDH7cP3dVWoTWXADdLnxQ89zFtx5tF4iIMceojiqzsheNTtMo4QwfMgRi7mIuvQMharIxEYgnTIcXVN/DIpYyrvmhx1DLotnUxL+CjBFqFJ7vpZ4kUp4xCiSfETFW4whIRot/QErDf+BjRSKEI=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-16T17:21:34.908615Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-16_09_21_33-13645221479653486648'
 location: 'us-central1'
 name: 'beamapp-jenkins-1216165035-960244'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-16T17:21:34.908615Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: 
[2019-12-16_09_21_33-13645221479653486648]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_21_33-13645221479653486648?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2019-12-16_09_21_33-13645221479653486648 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:33.434Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2019-12-16_09_21_33-13645221479653486648.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:33.434Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2019-12-16_09_21_33-13645221479653486648. The number of workers will be between 
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:37.252Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:38.401Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.145Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.242Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.477Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a 
combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.678Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.803Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:39.986Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.038Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.078Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read 
from datastore/UserQuery/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.109Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/AddRandomKeys into read from datastore/SplitQuery
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.137Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into read from 
datastore/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.160Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify into read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.188Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.222Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.256Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.294Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/RemoveRandomKeys into read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.323Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from 
datastore/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.359Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from 
datastore/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.394Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
 into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.434Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.470Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.501Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.526Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.555Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/UnKey into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.578Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s21 for input s19.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.614Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, 
into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.648Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.692Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.730Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into 
assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.767Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.793Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s21-u40 for input s22-reify-value18-c38
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.817Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Write, through flatten 
assert_that/Group/Flatten/Unzipped-1, into producer 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.851Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into 
assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.888Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into 
assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.914Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.942Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into 
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:40.968Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.001Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into 
assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.031Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into 
assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.061Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.084Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.109Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.164Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.533Z: 
JOB_MESSAGE_DEBUG: Executing wait step start51
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.658Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.708Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.709Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.743Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.743Z: 
JOB_MESSAGE_BASIC: Executing operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.777Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.791Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.791Z: 
JOB_MESSAGE_BASIC: Finished operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.851Z: 
JOB_MESSAGE_DEBUG: Value 
"Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.891Z: 
JOB_MESSAGE_DEBUG: Value "read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.929Z: 
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:41.967Z: 
JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read 
from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:42.005Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:21:59.409Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
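The warning above is about Cloud Monitoring's per-project cap on Dataflow-created metric descriptors: each unique user-defined metric name becomes one descriptor, and stale ones must be deleted through the Monitoring v3 `metricDescriptors` API it links to. A minimal sketch of the resource names those list/delete calls take (the helper functions and the example project/metric values are ours, for illustration only):

```python
# Sketch: building Cloud Monitoring v3 resource names for metric-descriptor
# cleanup. The name formats follow the Monitoring v3 REST API; the project id
# and metric type below are placeholders, and these helpers are our own,
# not part of any client library.

def project_name(project_id: str) -> str:
    """Parent resource passed to projects.metricDescriptors.list."""
    return f"projects/{project_id}"


def descriptor_name(project_id: str, metric_type: str) -> str:
    """Full resource name passed to projects.metricDescriptors.delete."""
    return f"projects/{project_id}/metricDescriptors/{metric_type}"


# Example: a hypothetical Dataflow user-defined counter's descriptor.
name = descriptor_name(
    "apache-beam-testing",
    "custom.googleapis.com/dataflow/some_user_counter",
)
```

Listing descriptors under `project_name(...)` and deleting the unused ones by `descriptor_name(...)` is what the two linked API-explorer pages do interactively.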
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:22:13.887Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:23:51.068Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:23:51.104Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.377Z: 
JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.482Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.483Z: 
JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read 
from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.741Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.821Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:24:53.857Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:26:13.168Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:26:13.207Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:26:13.241Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2019-12-16_09_21_33-13645221479653486648 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_50-1355784197792544752?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_56_47-16788430876644093879?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_04_18-15802810351609425677?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_11_43-663915550958645301?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_19_12-17694078789363603095?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_46-11975819821930149688?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
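This `BigQuerySink` deprecation warning recurs throughout the run; the `kms_key=transform.kms_key` line is the deprecated call site. A hedged migration sketch: the old and new call shapes are shown as comments (argument values are placeholders), and the small parser below is our own illustration of the `project:dataset.table` spec that `WriteToBigQuery` accepts, not part of the Beam API:

```python
# Migration shape (hedged; table/schema/kms_key values are placeholders):
#   old: pcoll | beam.io.Write(beam.io.BigQuerySink(table, schema=schema,
#                                                   kms_key=kms_key))
#   new: pcoll | beam.io.WriteToBigQuery(table, schema=schema,
#                                        kms_key=kms_key)
#
# WriteToBigQuery takes a table spec like "project:dataset.table", with the
# project part optional. This helper splits such a spec into its components.

def parse_table_spec(spec: str):
    """Split 'project:dataset.table' (project optional) into a 3-tuple."""
    project, _, rest = spec.rpartition(":")
    dataset, _, table = rest.partition(".")
    return (project or None, dataset, table)
```

The omitted-project form resolves against the pipeline's default GCP project at run time.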
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_03_39-11480900550776276851?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_11_21-14625495783319475219?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_18_28-1343968486568312510?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_48-18060171705386539406?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_54_47-13487906923651822324?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
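This warning fires wherever code reaches through `<pipeline>.options`; the replacement is to keep a `PipelineOptions` object yourself and request a typed view of the flags you need, as the `view_as(DebugOptions)` line does. A minimal pure-Python illustration of that view pattern (these classes are simplified stand-ins written for this sketch, not Beam's implementation, and `use_runner_v2` is just an example flag value):

```python
# Simplified stand-in for Beam's PipelineOptions.view_as pattern: one bag of
# parsed flags, exposed through small typed "view" classes so each piece of
# code only sees the options it declares an interest in.

class PipelineOptions:
    def __init__(self, **flags):
        self._flags = flags

    def view_as(self, view_cls):
        # Every view shares the same underlying flag dict.
        return view_cls(self._flags)


class DebugOptions:
    def __init__(self, flags):
        self.experiments = flags.get("experiments")


opts = PipelineOptions(experiments=["use_runner_v2"])
experiments = opts.view_as(DebugOptions).experiments or []
```

The `or []` on the last line mirrors the warning's call site: a view attribute that was never set comes back as `None`.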
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_02_52-5413592477998166013?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_47-12288373851444492884?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_00_34-7449919149234413630?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_09_51-6257619808012766156?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_18_42-6645626837139459513?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_47-87289199619895020?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_50_47-5782292728428170444?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_01-16354948594141862559?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_05_45-15162386945739033961?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_13_54-14547877452020866396?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_21_33-13645221479653486648?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_46-16499429962431162518?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_50_22-9707862042395983143?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_29-11455279054647206642?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_07_37-1878601733944552755?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_15_15-18366415265484953272?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_48-10812952069920643670?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_51_28-2716449684306203722?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_00_12-13550927083243018470?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_09_10-4482429689613593757?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_17_45-8968327171360628269?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_42_48-6679695131146974847?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_50_59-14828359123925395129?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_58_05-12178049772616806131?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_05_25-9462185267694802295?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_12_56-3590271210428579250?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 2675.434s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle'>
 line: 61

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'>
 line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 45m 40s
84 actionable tasks: 63 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/2wkxah2msxnp4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]