See 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/7/display/redirect>

------------------------------------------
[...truncated 41.71 KB...]
TypeError: int() argument must be a string or a number, not 'NoneType'
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/load_test HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/load_test/tables/python_dataflow_batch_gbk_6
 HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: testGroupByKey 
(apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 65, in tearDown
    result = self.pipeline.run()
  File "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 109, in run
    state = result.wait_until_finish()
  File "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 1325, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Runnable workflow has no steps specified.
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/load_test HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/load_test/tables/python_dataflow_batch_gbk_6
 HTTP/1.1" 200 None
root: INFO: Defaulting to the temp_location as staging_location: 
gs://temp-storage-for-perf-tests/loadtests
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/pipeline.pb
 in 0 seconds.
root: INFO: Copying Beam SDK "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build/apache-beam.tar.gz" to staging location.
root: INFO: Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/dataflow_python_sdk.tar...
root: INFO: Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/dataflow_python_sdk.tar
 in 0 seconds.
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=true', 
'--metrics_dataset=load_test', '--metrics_table=python_dataflow_batch_gbk_6', 
'--input_options={"numRecords": 20000000,"keySizeBytes": 10,"valueSizeBytes": 
90,"numHotKeys": 200,"hotKeyFraction": 1}', '--fanout=1', '--iterations=4']
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=true', 
'--metrics_dataset=load_test', '--metrics_table=python_dataflow_batch_gbk_6', 
'--input_options={"numRecords": 20000000,"keySizeBytes": 10,"valueSizeBytes": 
90,"numHotKeys": 200,"hotKeyFraction": 1}', '--fanout=1', '--iterations=4']
root: DEBUG: JOB: {
  "environment": {
    "clusterManagerApiService": "compute.googleapis.com", 
    "dataset": "bigquery.googleapis.com/cloud_dataflow", 
    "sdkPipelineOptions": {
      "display_data": [
        {
          "key": "sdk_location", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build/apache-beam.tar.gz"
        }, 
        {
          "key": "autoscaling_algorithm", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "NONE"
        }, 
        {
          "key": "num_workers", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "INTEGER", 
          "value": 5
        }, 
        {
          "key": "runner", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "DataflowRunner"
        }, 
        {
          "key": "staging_location", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": 
"gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643"
        }, 
        {
          "key": "project", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "apache-beam-testing"
        }, 
        {
          "key": "max_num_workers", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "INTEGER", 
          "value": 5
        }, 
        {
          "key": "temp_location", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": 
"gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643"
        }, 
        {
          "key": "beam_plugins", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "['apache_beam.io.filesystem.FileSystem', 
'apache_beam.io.hadoopfilesystem.HadoopFileSystem', 
'apache_beam.io.localfilesystem.LocalFileSystem', 
'apache_beam.io.gcp.gcsfilesystem.GCSFileSystem']"
        }, 
        {
          "key": "job_name", 
          "namespace": "apache_beam.options.pipeline_options.PipelineOptions", 
          "type": "STRING", 
          "value": "load-tests-python-dataflow-batch-gbk-6-0524133458"
        }
      ], 
      "options": {
        "autoscaling_algorithm": "NONE", 
        "beam_plugins": [
          "apache_beam.io.filesystem.FileSystem", 
          "apache_beam.io.hadoopfilesystem.HadoopFileSystem", 
          "apache_beam.io.localfilesystem.LocalFileSystem", 
          "apache_beam.io.gcp.gcsfilesystem.GCSFileSystem"
        ], 
        "dataflow_endpoint": "https://dataflow.googleapis.com", 
        "direct_runner_bundle_repeat": 0, 
        "direct_runner_use_stacked_bundle": true, 
        "dry_run": false, 
        "enable_streaming_engine": false, 
        "environment_cache_millis": 0, 
        "job_name": "load-tests-python-dataflow-batch-gbk-6-0524133458", 
        "max_num_workers": 5, 
        "no_auth": false, 
        "num_workers": 5, 
        "pipelineUrl": "gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/pipeline.pb", 
        "pipeline_type_check": true, 
        "profile_cpu": false, 
        "profile_memory": false, 
        "profile_sample_rate": 1.0, 
        "project": "apache-beam-testing", 
        "region": "us-central1", 
        "runner": "DataflowRunner", 
        "runtime_type_check": false, 
        "save_main_session": false, 
        "sdk_location": "https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build/apache-beam.tar.gz", 
        "sdk_worker_parallelism": 0, 
        "staging_location": "gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643", 
        "streaming": false, 
        "temp_location": "gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643", 
        "type_check_strictness": "DEFAULT_TO_ANY", 
        "update": false
      }
    }, 
    "tempStoragePrefix": "storage.googleapis.com/temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643", 
    "userAgent": {
      "name": "Apache Beam Python 2.7 SDK", 
      "version": "2.14.0.dev"
    }, 
    "version": {
      "job_type": "PYTHON_BATCH", 
      "major": "7"
    }, 
    "workerPools": [
      {
        "autoscalingSettings": {
          "algorithm": "AUTOSCALING_ALGORITHM_NONE", 
          "maxNumWorkers": 5
        }, 
        "kind": "harness", 
        "numWorkers": 5, 
        "packages": [
          {
            "location": "storage.googleapis.com/temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0524133458.1558707904.024643/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": 
"gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190509"
      }
    ]
  }, 
  "name": "load-tests-python-dataflow-batch-gbk-6-0524133458", 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-05-24T14:25:05.697107Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-05-24_07_25_04-8375761516176570908'
 location: u'us-central1'
 name: u'load-tests-python-dataflow-batch-gbk-6-0524133458'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-05-24T14:25:05.697107Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-05-24_07_25_04-8375761516176570908]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_07_25_04-8375761516176570908?project=apache-beam-testing
root: INFO: Job 2019-05-24_07_25_04-8375761516176570908 is in state 
JOB_STATE_PENDING
root: INFO: 2019-05-24T14:25:04.548Z: JOB_MESSAGE_WARNING: The requested max 
number of workers (5) is ignored as autoscaling is explicitly disabled 
(autoscalingAlgorithm=NONE).
root: INFO: 2019-05-24T14:25:07.245Z: JOB_MESSAGE_ERROR: Runnable workflow has 
no steps specified.
root: INFO: Job 2019-05-24_07_25_04-8375761516176570908 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 16.951s

FAILED (errors=2)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 52

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 4s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/kjoixk2kfdjn6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
