See 
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/75/display/redirect?page=changes>

Changes:

[herohde] [BEAM-3817] Switch BQ write to not use side input

[herohde] Add TODO to revert Go IO to use side input

[axelmagn] Fix StateRequestHandler interface to be idiomatic

[herohde] Add Go support for universal runners, incl Flink

[herohde] CR: Fixed comments for job service helper functions

[iemejia] Add missing ASF license to ExecutableStageTranslation file

[yifanzou] [BEAM-3840] Get python mobile-gaming automating on core runners

[sidhom] [BEAM-3565] Clean up ExecutableStage

[wcn] Fix incorrect read of atomic counter.

[herohde] [BEAM-3893] Add fallback to unauthenticated access for GCS IO

[robertwb] [BEAM-3865] Fix watermark hold handling bug.

[robertwb] [BEAM-2927] Python support for dataflow portable side inputs over Fn API

[herohde] CR: fix typo

[aaltay] [BEAM-3861] Improve test infra in Python SDK for streaming end-to-end

------------------------------------------
[...truncated 152.72 KB...]
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "pair_with_one.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "ref_AppliedPTransform_pair_with_one_5", 
        "user_name": "pair_with_one"
      }
    }, 
    {
      "kind": "GroupByKey", 
      "name": "s4", 
      "properties": {
        "display_data": [], 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": 
"StrUtf8Coder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlzBJUWhJWkWziAeVyGDZmMhY20hU5IeAAajEkY=",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "kind:stream", 
                      "component_encodings": [
                        {
                          "@type": 
"VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==",
 
                          "component_encodings": []
                        }
                      ], 
                      "is_stream_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "group_and_sum/GroupByKey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": 
"%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
 
        "user_name": "group_and_sum/GroupByKey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s5", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CombineValuesDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CombineValuesDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "group_and_sum/Combine/ParDo(CombineValuesDoFn).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "serialized_fn": 
"ref_AppliedPTransform_group_and_sum/Combine/ParDo(CombineValuesDoFn)_9", 
        "user_name": "group_and_sum/Combine/ParDo(CombineValuesDoFn)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s6", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "format_result"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s5"
        }, 
        "serialized_fn": "ref_AppliedPTransform_format_10", 
        "user_name": "format"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-21T05:15:52.117537Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-20_22_15_51-246540274952956440'
 location: u'us-central1'
 name: u'beamapp-jenkins-0321051550-149742'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-20_22_15_51-246540274952956440]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_22_15_51-246540274952956440?project=apache-beam-testing
root: INFO: Job 2018-03-20_22_15_51-246540274952956440 is in state 
JOB_STATE_PENDING
root: INFO: 2018-03-21T05:15:51.303Z: JOB_MESSAGE_WARNING: Job 
2018-03-20_22_15_51-246540274952956440 might autoscale up to 250 workers.
root: INFO: 2018-03-21T05:15:51.322Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2018-03-20_22_15_51-246540274952956440. The number of workers 
will be between 1 and 250.
root: INFO: 2018-03-21T05:15:51.343Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2018-03-20_22_15_51-246540274952956440.
root: INFO: 2018-03-21T05:15:53.762Z: JOB_MESSAGE_DETAILED: Checking required 
Cloud APIs are enabled.
root: INFO: 2018-03-21T05:15:54.054Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2018-03-21T05:15:54.436Z: JOB_MESSAGE_DETAILED: Expanding 
CollectionToSingleton operations into optimizable parts.
root: INFO: 2018-03-21T05:15:54.467Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T05:15:54.496Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step group_and_sum/GroupByKey: GroupByKey not followed by a 
combiner.
root: INFO: 2018-03-21T05:15:54.527Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T05:15:54.560Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2018-03-21T05:15:54.600Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-21T05:15:54.625Z: JOB_MESSAGE_DETAILED: Fusing consumer 
split into read/Read
root: INFO: 2018-03-21T05:15:54.656Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group_and_sum/GroupByKey/Reify into pair_with_one
root: INFO: 2018-03-21T05:15:54.679Z: JOB_MESSAGE_DETAILED: Fusing consumer 
format into group_and_sum/Combine/ParDo(CombineValuesDoFn)
root: INFO: 2018-03-21T05:15:54.712Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group_and_sum/Combine/ParDo(CombineValuesDoFn) into 
group_and_sum/GroupByKey/GroupByWindow
root: INFO: 2018-03-21T05:15:54.745Z: JOB_MESSAGE_DETAILED: Fusing consumer 
pair_with_one into split
root: INFO: 2018-03-21T05:15:54.776Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group_and_sum/GroupByKey/Write into group_and_sum/GroupByKey/Reify
root: INFO: 2018-03-21T05:15:54.799Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group_and_sum/GroupByKey/GroupByWindow into group_and_sum/GroupByKey/Read
root: INFO: 2018-03-21T05:15:54.821Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2018-03-21T05:15:54.844Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2018-03-21T05:15:54.876Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2018-03-21T05:15:54.898Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-21T05:15:55.034Z: JOB_MESSAGE_DEBUG: Executing wait step 
start13
root: INFO: 2018-03-21T05:15:55.089Z: JOB_MESSAGE_BASIC: Executing operation 
group_and_sum/GroupByKey/Create
root: INFO: 2018-03-21T05:15:55.131Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2018-03-21T05:15:55.161Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
root: INFO: 2018-03-21T05:15:55.228Z: JOB_MESSAGE_DEBUG: Value 
"group_and_sum/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-20_22_15_51-246540274952956440 is in state 
JOB_STATE_RUNNING
root: INFO: 2018-03-21T05:15:55.282Z: JOB_MESSAGE_BASIC: Executing operation 
read/Read+split+pair_with_one+group_and_sum/GroupByKey/Reify+group_and_sum/GroupByKey/Write
root: INFO: 2018-03-21T05:16:02.954Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 0 based on the rate of progress in the currently 
running step(s).
root: INFO: 2018-03-21T05:16:29.383Z: JOB_MESSAGE_ERROR: Startup of the worker 
pool in zone us-central1-f failed to bring up any of the desired 1 workers. 
QUOTA_EXCEEDED: Quota 'DISKS_TOTAL_GB' exceeded.  Limit: 21000.0 in region 
us-central1.
root: INFO: 2018-03-21T05:16:29.406Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2018-03-21T05:16:29.509Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-21T05:16:29.545Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2018-03-21T05:16:29.573Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-21T05:16:41.122Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-21T05:16:41.161Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2018-03-20_22_15_51-246540274952956440 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 1 test in 63.530s

FAILED (errors=1)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_22_15_51-246540274952956440?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user axelm...@gmail.com
Not sending mail to unregistered user yifan...@yifanzou-linuxworkstation.sea.corp.google.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user aal...@gmail.com