See 
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/518/display/redirect>

------------------------------------------
[...truncated 156.69 KB...]
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s7"
        },
        "serialized_fn": 
"eNrNV/l/G8UV10rOwZKmAUogpIdISbumSIqhcbBLIamSgFGsuGuDt0dYRrsjzca7O/tmZq2YWlypHEPvu9C79L7pfR9/R/+ZvhlJTgU2zW/p52Pvrt7MezPv+77vO7vPlpyAZCRg1G9RklSVIKlsc5HIasAFteskjkkrpsuCZBkVp/nZ1IbC5HNg9aDoeKVCoeC3UyiNBYk4/rWIpLbfjlISR09RvysiRW2Y8PYZlxzDio7MaAC7mt4etGWCB1RK2M0mvBv1HLWWUZ9FqZKwZ3yPOGDs1ZDiJoniQtpz55fQ/Ig227AXN3hDswe2Y0JFaZYrE0/CjU2zA56rq7Z9zXwD3tTKWxdgv/PaTBS9pCJu+0t4X4zSFRvejNEP9OAmxzuAkdpRTP2MKOZngrajS3DzWASeoXcqq6skznGe4KtRSIW9qIiKgse1cWFkg1sw8Ft6cKvj2Rh44KF3CAeDMIrjqq+vth9zEhq7DbeZCkgl4PY+HHLhDm/XyBMOe+fxuSNnazVFk6wiESfSoRWsboWmYUXxwY1KJWvZWiVSlSDmeVgbYFObOj49c2LqxLGZ+2emj9cElXmMpXhr3rqadkoS6su8rdN+G9Pbf3sP3uGwg+wOdtjDWQUo43ydTZonvmREhBLubBTMNgOOWcORMbiMTQ5v9hJfVCJKO3Vu8HknLnBXD462vJt01jraYAuasETBu7y70F456gyH8mTy2PGwwttourq+tsG7vYPjIToxb43iOGawcve2npMm/YAnWG4psbaDEt3tTaCZ5IrDe7wb8DGJkmH17jEAaB7VsphEKVS8Q5rguqNSrKQgUYxJ+int4h35WN00HeEzSjRCtWbeGmdlqqjAvqrmKortU6KTJzRVCzEJKOOxgeoYQjXF9PXehmXu99EevPcCHHe8SZ05Mu5YxTRlbfnqdS7J4trZYc8aC0yPLZ0Zbtn+Y2kWBSsxDRcx0JzuLxtO9OB+x8AQEkVgZjvHremncYoNs7ix9/XgAcfbr9ELtALoZtL5wfu9w2jU3rN6u75p41nck9CKNLs6BQ8aInSjNORdP0E8NYwoRw/tpGk6jgkjbVNxEvuv87bhpIF/NaJdHezUWPMFghKFjMvTQDe2DR9w2G3eXpyvSao1COp9OO3CGadhNQr4X2rcUrc3C+uFTetycbEAZ5t9eHjSuIxygUf6MOcJtNQYT2jtIk1XolSO7hUZk1Va63KxIjEvWtNp+QtUGL6mAV3SPewvcxHWeZ6quSV/Ye2+6ZoUQU2GK7q7FeNp7b9AqQ3qUc3W4FGzlQdikrRC8iA05s9b9QKc827VXS544gsMqcm8tdt5I6EGn6G8QXMDzk8qWHDhg2NwdajysZzISNcs08qjWGFKsGgwxmE9Cksb8JgLj4+5RknGhfITHuYxSt2yd7OmyOsoBV4fPuTCh014H30D5fvwkQ34qAsXWKO5XfUCij/gCYa1wzIVsUylxp5Gvf6kKijrYnG9EGKpLlu94npRHlm3LpbCopq4XNDlU7vWcSQsrRTF9HoxnDhUQNvucNfArvYMbMOnkn5qlw7i9XkrxJngO81G0aQe0jZBRYUnNU28p9GyUOdxTA2tyrxdltiC5aNhuRspVk7wMC4rRnAkpWUaU93x5UHD0LBMZJmgQ9qJqUJvXZpq+WwkpCqrLh/Nl2WaBpogVGgfjHjkqDxyj7lWAYWvNWjfOJIKAqNiuuaK81hC6O3Wv2UcBRSoOXewetA2nXsmydTaVm9DxwzHNAVmtM+cc2eE4AIidruCi17RxIYVA8WIRLFZXaseJFcgZY8aHfa3JyGff9WqHygUD1gT1n5rn7XXKlrFImSTSENwQbALjDfxHUA2FSgX8j6sutBlT/Tg0g6qscZOyj485cLHNmC9Bz10fNqFZ/IWe4idzK/As1vKOXVNyvkcQ0V83mEzTMvc5R583GH/W9P6ejV2asjNgYRYjcX6xLrmzwaKx5VJhnqxeb304oVxvXhx/t8
WO6dR/4QLn0TUX9SofwrB+7QLn9kC77Nb4N17TeB9ToP3+RF4X+jBF68FvC9tA95r9ffLCOFXDIRfvV4QvjQO4csouewcm2eool9DIL/uwjcQyJeb7HpI1De1RLH/G1n6loJvOyxgIaOszTqMMSMh32ErLGYoE99lL72RTLyyvUx8TxP2+y78AHF+RRP2h0jYH7nw4z78xIWfapn42Q4y8XMjE79w4Zcb8KsevIqOv3bhN1tM/y19wy+qZRMQY9nwO2T273vwB8fQYbASLvDHnfwHM+yHzYvLIA5+7PwJo/zZiDW+K3c6SNEU/rJTiOEU+/Tg9Fka/oS/YpC/mYSxeHmSx0TXWx++FP7esMzZq4GViiSZj2+/LXxTFfAPHDIfa5H0RwfaPzfzloJ/Vf8Dd23S9w==",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-09-29T00:31:43.016526Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-28_17_31_41-15694561143181347133'
 location: 'us-central1'
 name: 'beamapp-jenkins-0929003139-979459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-09-29T00:31:43.016526Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-28_17_31_41-15694561143181347133]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-28_17_31_41-15694561143181347133?project=apache-beam-testing
root: INFO: Job 2019-09-28_17_31_41-15694561143181347133 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-09-29T00:31:45.054Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-09-29T00:31:45.476Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-f.
root: INFO: 2019-09-29T00:31:46.023Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-29T00:31:46.059Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a 
combiner.
root: INFO: 2019-09-29T00:31:46.094Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2019-09-29T00:31:46.121Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-09-29T00:31:46.141Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-29T00:31:46.222Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-09-29T00:31:46.268Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-29T00:31:46.303Z: JOB_MESSAGE_DETAILED: Fusing consumer 
split into read/Read
root: INFO: 2019-09-29T00:31:46.331Z: JOB_MESSAGE_DETAILED: Fusing consumer 
pair_with_one into split
root: INFO: 2019-09-29T00:31:46.365Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group/Reify into pair_with_one
root: INFO: 2019-09-29T00:31:46.401Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group/Write into group/Reify
root: INFO: 2019-09-29T00:31:46.434Z: JOB_MESSAGE_DETAILED: Fusing consumer 
group/GroupByWindow into group/Read
root: INFO: 2019-09-29T00:31:46.469Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count into group/GroupByWindow
root: INFO: 2019-09-29T00:31:46.506Z: JOB_MESSAGE_DETAILED: Fusing consumer 
format into count
root: INFO: 2019-09-29T00:31:46.538Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2019-09-29T00:31:46.573Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-09-29T00:31:46.608Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2019-09-29T00:31:46.632Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/GroupByKey/Reify into 
write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-09-29T00:31:46.666Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/GroupByKey/Write into 
write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-09-29T00:31:46.700Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/GroupByKey/GroupByWindow into 
write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-09-29T00:31:46.731Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Extract into 
write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-09-29T00:31:46.762Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-09-29T00:31:46.796Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-09-29T00:31:46.832Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-09-29T00:31:46.857Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-09-29T00:31:46.893Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-29T00:31:47.011Z: JOB_MESSAGE_DEBUG: Executing wait step 
start26
root: INFO: 2019-09-29T00:31:47.075Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-29T00:31:47.103Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-29T00:31:47.115Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-09-29T00:31:47.131Z: JOB_MESSAGE_BASIC: Executing operation 
group/Create
root: INFO: 2019-09-29T00:31:47.146Z: JOB_MESSAGE_BASIC: Starting 10 workers in 
us-central1-f...
root: INFO: 2019-09-29T00:31:47.195Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-09-29T00:31:47.195Z: JOB_MESSAGE_BASIC: Finished operation 
group/Create
root: INFO: 2019-09-29T00:31:47.252Z: JOB_MESSAGE_DEBUG: Value "group/Session" 
materialized.
root: INFO: 2019-09-29T00:31:47.279Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-09-29T00:31:47.314Z: JOB_MESSAGE_BASIC: Executing operation 
read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-29T00:32:13.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 9 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-09-29T00:32:13.125Z: JOB_MESSAGE_DETAILED: Resized worker pool 
to 9, though goal was 10.  This could be a quota issue.
root: INFO: 2019-09-29T00:32:18.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 10 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-09-29T00:32:37.786Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-09-29T00:32:37.824Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-09-29T00:36:49.613Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", line 773, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", line 489, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 410, in load_session
    module = unpickler.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given

root: INFO: The identical JOB_MESSAGE_ERROR traceback was logged again at
2019-09-29T00:36:51.673Z, 00:36:53.745Z, 00:36:54.984Z, 00:36:55.810Z,
00:36:56.048Z, and 00:36:56.868Z.

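The TypeError above is raised while dill rebuilds functions from the pickled main session on the worker; it typically indicates that the dill version that wrote the session differs from the one installed on the worker, changing the arity of internal helpers like _create_function. A minimal sketch of the same serialize-then-deserialize round trip, using the standard library's pickle as a stand-in for dill (the data here is illustrative, not Beam's session format):

```python
# Illustrative round trip of a pickled "main session" object, with the
# stdlib pickle standing in for dill (which the worker log above uses).
# When the library version that dumps the payload differs from the one
# that loads it, reconstruction helpers can disagree on argument counts,
# producing errors like the TypeError in the log.
import io
import pickle

session = {"counter": 0, "words": ["a", "b"]}

buf = io.BytesIO()
pickle.dump(session, buf)      # submitter side: serialize the session
buf.seek(0)
restored = pickle.load(buf)    # worker side: deserialize it

assert restored == session
```

Pinning the same pickler version on the submitting environment and the workers is the usual way to avoid this class of mismatch.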
root: INFO: 2019-09-29T00:36:57.167Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-09-29T00:36:57.224Z: JOB_MESSAGE_DEBUG: Executing failure step 
failure25
root: INFO: 2019-09-29T00:36:57.256Z: JOB_MESSAGE_ERROR: Workflow failed. 
Causes: 
S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite 
failed., Internal Issue (93eb8727af2a6290): 63963027:24514
root: INFO: 2019-09-29T00:36:57.360Z: JOB_MESSAGE_BASIC: Finished operation 
read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-09-29T00:36:57.463Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-29T00:36:57.510Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-09-29T00:36:57.536Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-29T00:38:51.257Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Reduced the number of workers to 0 based on the rate of progress in the 
currently running step(s).
root: INFO: 2019-09-29T00:38:51.301Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-29T00:38:51.321Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2019-09-28_17_31_41-15694561143181347133 is in state 
JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569717098965/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1569717098965/results*'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1569717098965\\/results[^/\\\\]*'
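The translate_pattern debug line above shows the glob results* being compiled into a regular expression in which '*' cannot cross a path separator. A hedged sketch of that translation (translate_glob is an illustrative helper, not Beam's actual implementation, which also escapes every literal character as shown in the log):

```python
# Sketch of glob-to-regex translation in the spirit of the
# translate_pattern debug output above: every literal character is
# escaped, and '*' becomes a run of non-separator characters.
import re

def translate_glob(pattern):
    """Translate a simple glob into a regex; '*' matches any run of
    characters that does not include '/' or '\\'."""
    out = []
    for ch in pattern:
        if ch == "*":
            out.append(r"[^/\\]*")
        else:
            out.append(re.escape(ch))
    return "".join(out)

rx = translate_glob("gs://bucket/output/results*")
assert re.fullmatch(rx, "gs://bucket/output/results-00000-of-00001")
assert not re.fullmatch(rx, "gs://bucket/output/other")
```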
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 0 files in 0.035767555236816406 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 444.740s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py36:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 8m 5s

2019-09-29 00:39:04,962 b35fd78d MainThread beam_integration_benchmark(1/1) 
ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
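The SubmitJob frame above reduces the whole integration test to a bare assertion on the child process's exit code. A small sketch of that pattern (submit_job here is illustrative, not PerfKitBenchmarker's API):

```python
# Hedged sketch of turning a child process's exit status into a hard
# failure, in the spirit of the SubmitJob assertion in the traceback
# above. submit_job is an illustrative helper, not PKB's real method.
import subprocess
import sys

def submit_job(cmd):
    retcode = subprocess.call(cmd)  # run the command, capture exit code
    assert retcode == 0, "Integration Test Failed."
    return retcode

# A trivially succeeding child process:
assert submit_job([sys.executable, "-c", "pass"]) == 0
```

An AssertionError here carries no diagnostics of its own, which is why the underlying cause has to be read from the Dataflow job log earlier in this output.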
2019-09-29 00:39:04,964 b35fd78d MainThread beam_integration_benchmark(1/1) 
INFO     Cleaning up benchmark beam_integration_benchmark
2019-09-29 00:39:04,967 b35fd78d MainThread beam_integration_benchmark(1/1) 
ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 841, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 687, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 91, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-09-29 00:39:04,967 b35fd78d MainThread beam_integration_benchmark(1/1) 
ERROR    Benchmark 1/1 beam_integration_benchmark (UID: 
beam_integration_benchmark0) failed. Execution will continue.
2019-09-29 00:39:04,968 b35fd78d MainThread beam_integration_benchmark(1/1) 
INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-09-29 00:39:04,968 b35fd78d MainThread beam_integration_benchmark(1/1) 
INFO     Complete logs can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/b35fd78d/pkb.log>
2019-09-29 00:39:04,968 b35fd78d MainThread beam_integration_benchmark(1/1) 
INFO     Completion statuses can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py36/ws/runs/b35fd78d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
