See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/6/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-9577] Remove use of legacy artifact service in Python.

[Robert Bradshaw] Simplify Python on Flink runner instructions.

[Robert Bradshaw] Fix stray paragraph, separate and rework python.

[tysonjh] [BEAM-9999] Remove Gearpump runner.

[Robert Bradshaw] Expand note on runner selection.

[Rui Wang] [BEAM-10230] @Ignore: BYTES works with LIKE.

[Robert Bradshaw] Move Beam Compatibility table below instructions.

[noreply] [BEAM-9742] Add Configurable FluentBackoff to JdbcIO Write (#11396)

[noreply] Finalize CHANGES.md for 2.22.0 (#11973)

[noreply] [BEAM-9679] Add CombinePerKey to Core Transforms Go Katas (#11936)


------------------------------------------
[...truncated 10.87 MB...]
            },
            "output_name": "None",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s24"
        },
        "serialized_fn": 
"eNq1Vlt3FEUQnpm9ZDPcw1VQiUF0g7CroKAIKGwIl5UlTiIZlDjOzvTuDDu36u4hic7IJWwSXvwJHjye44v/wp/gow+++eC/sLo3IUbBN0+f6d6q6qquy9fV+6BQdezEdjxitYkd1ji1I9aJachqTkyJ3rCDwG4HZJbaSULoRDwZ6aCMPwQ1B61qFhRFsToRFBzXD4KaJWbdciixObE6aeRwP0aFYnWTPIht1+KLCdGhZFbQRCN2yQzSUO7DkAGValNtKs0CjsONkVUlUzpqpn6vZdp3aledVmC41Qd93FRRdQG2mEVcbdplsNUs48/evCS2rcB2M0S67sUhqd8lUc+P2Pp6ggX2PVKfj2mPYfykLsK3pmLGG3EY+tyaWuReHJ06U2fUqTO3x+qJ5NT/lq76RrrqIl21ZBF2yIjOBXbYdu0LsPPGH6WGArtMDbmYqZE+7B7nsMeAvZty0iXcsjmnOuyTBtqpH3D0E/abQ0iiWEjhwDK8ZMDBTap+mMSUW2HspgGm9JB5SOTjxUWFl/vwigGvynMsNOJwy4LDyzBqwGsygxaB1A5grPW8sjoECTjiFaseFkrDQm1pVpq7G79zZVVZUnMtU9l1V3W1x0quLo5yVXILXM3EWswKWbGn0ZOuMq3MKtGeosK1nk6vqooYt5VIM5WukpcWWpkyocxN5mW3kA8t1Hgh05ZU5Izd0fJKPpyp2XBWXtJoxy3mQ5n2tJhVflBVZUqBzqySDbGZrPS0lJUFD2aQU2I/ZRr9zS1JQGn7lFzPSvSXTHfLWWmfsl/QGv0VabQ2oHk505fUR6rwFV6vttYD1nBMNyqZkqk9DUVHEY9vDPBI4E252lBdhnHz7v+MP8IQJN16yv1AgO+Yt7P5s7drHBH2lgHHzWPSI6ylxePauSB27IBdqA3Ku0E/A+uJJ6vmflSZtP2AuKM2Y4Tys6NH6ej58zhDzTyA0uOjaUQWEuJw3EMCEpKIMyGum3ulOPQZQ682yd7m8E5V3tTAZxxOSpwxxC1x4ZQ5jIRoAJcpjSm8K2kSpSGhiDh4T/aZJE7gtFQTrShy4Yy5G4lLmIeL0tHLCw5JRMOB9zl8UB1sdbhA8tmBRRE2E/3qQ3OboNeCsKRL58wdz1yy1kVwXu5c465Zu2DqyNtIAnwk7+hauPCxWRJ3PU7Rx4sSDD5ckus9aMhQQtaFCQ6XvWPeWPPHxmlFrZSLakXV8BuM7Wp5qKjqalEdkbOulnHejpKtuA4+mJQ3+JkXV/pwFQt/zYDr3kHvkKzVpjYwAEtNgAWaffjEgBseXvuWATe9MYTwlHdkDj698WfB7oNhwHQfZnL4zNwiOoJo15bnYzXh1vO6QmSHxOVpIjrQrFS5dlNU9KpUMSXHj5KUS0sMbptbkROnfIP1uaxYTP2uH8EXK3DHHP1nBLhP+lBzCXYym8eUwdwKfGmA1WrNwVcrYBvQZi0OjgFuH4gBnT50c/CqgzDEy4DpEhjwPde75Um9uyvQMyBIUS80IJKu4fOFT99gO8QtWeGExg5hDBIv+ldWYGCKrgAzgKftOUhzuDcH8//5vs76kRvPY1V0WMBXdTGHr6uyqvNSgG5+8yL9wQ79ShC37WBgh+mQoZVcop1jIruEoolvX2RibYs+QTp2GvCZNRLuo5EH5i55f5w0TANbXCvxwhB42FTNnUISBPG8uDtY/Ugk5dGmUwTI8AA/RNDZYaJPpNQe/BlYQuOPc+jLaoe+Q2MGy02Ftc0R4fW6huXEYduPCIUVPFFm22eWO/AUVlfTNocntb8ADhu+tA==",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-06-11T04:47:37.163475Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-06-10_21_47_35-15826432664999544410'
 location: 'us-central1'
 name: 'beamapp-jenkins-0611043108-804251'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-06-11T04:47:37.163475Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: 
[2020-06-10_21_47_35-15826432664999544410]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 
2020-06-10_21_47_35-15826432664999544410
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_47_35-15826432664999544410?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-06-10_21_47_35-15826432664999544410 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:35.784Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-06-10_21_47_35-15826432664999544410.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:35.784Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-06-10_21_47_35-15826432664999544410. The number of workers will be between 
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:43.430Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:44.379Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:44.418Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:44.486Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a 
combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:44.539Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:44.932Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:45.451Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.014Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.055Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read 
from datastore/UserQuery/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.218Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/AddRandomKeys into read from datastore/SplitQuery
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.255Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into read from 
datastore/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.291Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify into read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.324Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.364Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.403Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.440Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from 
datastore/Reshuffle/RemoveRandomKeys into read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.474Z: 
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from 
datastore/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.508Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from 
datastore/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.546Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
 into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.589Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.616Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.661Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.798Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:46.947Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/UnKey into 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.138Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s21 for input s19.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.430Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, 
into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.561Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.725Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.822Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into 
assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.926Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:47.962Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s21-u40 for input s22-reify-value18-c38
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.060Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Write, through flatten 
assert_that/Group/Flatten/Unzipped-1, into producer 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.137Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into 
assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.171Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into 
assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.204Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.244Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into 
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.282Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.321Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into 
assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.358Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into 
assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.402Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.427Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.464Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:48.548Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.282Z: 
JOB_MESSAGE_DEBUG: Executing wait step start51
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.376Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.409Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.422Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.455Z: 
JOB_MESSAGE_BASIC: Executing operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.475Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.554Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.583Z: 
JOB_MESSAGE_BASIC: Finished operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.583Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.655Z: 
JOB_MESSAGE_DEBUG: Value 
"Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.700Z: 
JOB_MESSAGE_DEBUG: Value "read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.755Z: 
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.796Z: 
JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read 
from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:47:49.843Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:48:02.165Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
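(Aside, not part of the captured log: a minimal sketch of pruning old descriptors with the google-cloud-monitoring Python client, since the warning above only links the raw REST methods. The client calls are standard Cloud Monitoring v3 calls; the type prefix and project id below are placeholders, not values taken from this job.)

    from google.cloud import monitoring_v3

    project_id = "apache-beam-testing"        # project named in the log above
    assumed_prefix = "custom.googleapis.com/" # assumption: adjust to match your unused descriptors

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(name=f"projects/{project_id}"):
        if descriptor.type.startswith(assumed_prefix):
            # Deleting a descriptor frees one of the per-project custom-metric slots.
            client.delete_metric_descriptor(name=descriptor.name)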
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:48:28.557Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:50:05.690Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:50:05.731Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:25.571Z: 
JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read 
from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from 
datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:25.661Z: 
JOB_MESSAGE_BASIC: Executing operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:25.740Z: 
JOB_MESSAGE_BASIC: Finished operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:25.815Z: 
JOB_MESSAGE_BASIC: Executing operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from 
datastore/Reshuffle/RemoveRandomKeys+read from 
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:32.291Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:39.087Z: 
JOB_MESSAGE_BASIC: Finished operation read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from 
datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from 
datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from 
datastore/Reshuffle/RemoveRandomKeys+read from 
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:39.152Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:39.204Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:39.262Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:50.779Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:50.860Z: 
JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/UnKey.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:50.926Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:50.984Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:51.041Z: 
JOB_MESSAGE_DEBUG: Value 
"Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:51.120Z: 
JOB_MESSAGE_BASIC: Executing operation 
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:56.704Z: 
JOB_MESSAGE_BASIC: Finished operation 
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:56.763Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:56.837Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:53:56.904Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:54:04.108Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:54:04.173Z: 
JOB_MESSAGE_DEBUG: Executing success step success49
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:54:04.291Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:54:04.359Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:54:04.385Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:55:35.573Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:55:35.653Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-06-11T04:55:35.703Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-06-10_21_47_35-15826432664999544410 is in state JOB_STATE_DONE
apache_beam.io.gcp.datastore.v1new.datastore_write_it_pipeline: INFO: Deleting 
entities.
apache_beam.runners.portability.stager: INFO: Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python3.7_sdk:2.23.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should 
not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/pipeline.pb
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/requirements.txt
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/parameterized-0.7.4.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/mock-2.0.0.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/six-1.15.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/six-1.15.0.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/funcsigs-1.0.2.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/pbr-5.4.5.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/dataflow_python_sdk.tar
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0611043108-804251.1591849868.804402/beamapp-jenkins-0611043108-804251.1591850340.301929/beamapp-jenkins-0611043108-804251.1591850845.672089/beamapp-jenkins-0611043108-804251.1591851347.151776/dataflow-worker.jar...
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url 
https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA
 after exception HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA>:
 response: <{'content-type': 'text/plain; charset=utf-8', 
'x-guploader-uploadid': 
'AAANsUlxKFhO9cFBbM5xVZPGnkw6Cp5NzTTxt8Q0HL9s9UoSZuUPHQDZzTD_kOcakz1Od3SuCWAuJUD8n5yUUcZxpIQ',
 'content-length': '19', 'date': 'Thu, 11 Jun 2020 04:56:48 GMT', 'server': 
'UploadServer', 'status': '503'}>, content <Service Unavailable>
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url 
https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA
 after exception HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA>:
 response: <{'content-type': 'text/plain; charset=utf-8', 
'x-guploader-uploadid': 
'AAANsUmo8xcAyMRxIkbgwDYhOK9ldi1epMMGAnap8qQr8WmIrHFxbwwniwLWlOeXXDXc7QuABaUrOAUjJQNidDdwSSA',
 'content-length': '19', 'date': 'Thu, 11 Jun 2020 04:57:20 GMT', 'server': 
'UploadServer', 'status': '503'}>, content <Service Unavailable>
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url 
https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA
 after exception HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA>:
 response: <{'content-type': 'text/plain; charset=utf-8', 
'x-guploader-uploadid': 
'AAANsUlGGLiPgEQ4rK8ehFdKMcxHq5Y-lHMSHfQksCPfg6PWN7Q434D2bKq7QbFWSgIwN0bcwJ6Y-xA7SMMkZV0hbmQ',
 'content-length': '19', 'date': 'Thu, 11 Jun 2020 04:57:55 GMT', 'server': 
'UploadServer', 'status': '503'}>, content <Service Unavailable>
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url 
https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA
 after exception HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0611043108-804251.1591849868.804402%2Fbeamapp-jenkins-0611043108-804251.1591850340.301929%2Fbeamapp-jenkins-0611043108-804251.1591850845.672089%2Fbeamapp-jenkins-0611043108-804251.1591851347.151776%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AAANsUneRgKs9EAvxLT0Mh8n6wkQoXmNd6A5-bXc_SYfTpfmhYpOmpayN3_UFyvweIinJxhcjR4LVBqUj1XplzWsXdA>:
 response: <{'content-type': 'text/plain; charset=utf-8', 
'x-guploader-uploadid': 
'AAANsUm_SckrEiAoMVJjcL2rgIkg4H6T1T_UHXmPEOROGKmx0AV40yYF2rOe-Q9i4jA5vAn2NSJ3aCJOGd2pgQZ43P-zKAlMkw',
 'content-length': '19', 'date': 'Thu, 11 Jun 2020 04:58:32 GMT', 'server': 
'UploadServer', 'status': '503'}>, content <Service Unavailable>
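(Aside, not part of the captured log: the repeated 503 responses above are retried by the upload client. A minimal sketch of that retry-on-503 pattern, with exponential backoff and jitter, is shown below; upload_chunk is a hypothetical callable standing in for one resumable-upload request, and this is not the actual apitools/Beam implementation.)

    import random
    import time

    def upload_with_retry(upload_chunk, max_attempts=5):
        status = None
        for attempt in range(max_attempts):
            status = upload_chunk()
            if status != 503:                  # 503 = Service Unavailable, retry
                return status
            # Back off exponentially, with jitter, before re-issuing the same request.
            time.sleep((2 ** attempt) + random.random())
        return status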
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_41-4027948102077989856?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_33_00-10978155864660858378?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_41_10-14303894870774277881?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_49_53-18381379187569566062?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_01_54-4412269797440005382?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_10_47-9475246555551096191?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_39-1814884672949687835?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_40_13-8728823640595317870?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_48_37-15327145696055283743?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_56_23-7645694935556428618?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_04_33-17207011565957522313?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_12_37-13338751496977191664?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_43-9449273775332749293?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_31_19-7464371138016063055?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_39_09-15512948660399727891?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_47_35-15826432664999544410?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_00_36-3109026534985375860?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_09_16-3077531035678999347?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_16_49-10393138576760763753?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_38-18172003479959312186?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_38_31-4473886310256704477?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_47_03-12688943564372552158?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_55_21-4422512959899887900?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_04_02-13848152122735705801?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_12_43-12752366333224783073?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_40-13539614085662747973?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_27_18-15016068914431266268?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_36_40-8805508589474832368?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_44_54-615941180503444303?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_52_52-9775266835299413199?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_00_41-18302866890171412369?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_09_13-2223375654053592777?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_37-17338737767934545372?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_27_16-16408402747244642204?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_36_19-266081193957637924?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_43_59-11981804397609528684?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_51_41-9150003725838929889?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_00_11-2510690189700642?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_08_32-1452069660690330216?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_39-11312134934281810468?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_27_56-3547184069902251155?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_37_22-5526843512879203900?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_47_31-376409247140895665?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_56_06-9237082362758245773?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_13_23-5474102236874191743?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_21_27-16912148042867090522?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_18_38-4628953173983393928?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_28_54-6553084489984013056?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_39_57-554220627429177291?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_47_53-9820862669435820153?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_21_55_56-149587066727031262?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_07_11-17236229419799651158?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_22_15_25-14781480851821500563?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 63 tests in 4279.178s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
 line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 0s
87 actionable tasks: 64 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/t2ngx7eoo345o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
