See 
<https://builds.apache.org/job/beam_PostCommit_Python2/968/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8594] Remove unnecessary error check in DataFlow Runner

[sunjincheng121] fixup

[sunjincheng121] fix codestyle


------------------------------------------
[...truncated 1.47 MB...]
            "value": 
"projects/apache-beam-testing/subscriptions/psit_subscription_inputa6547c4c-b0fc-4600-a1b4-5c176af24c52"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_id_label": "id", 
        "pubsub_subscription": 
"projects/apache-beam-testing/subscriptions/psit_subscription_inputa6547c4c-b0fc-4600-a1b4-5c176af24c52",
 
        "pubsub_timestamp_label": "timestamp", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "modify_data"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "modify_data.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_modify_data_4", 
        "user_name": "modify_data"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "pubsub_id_label": "id", 
        "pubsub_timestamp_label": "timestamp", 
        "pubsub_topic": 
"projects/apache-beam-testing/topics/psit_topic_outputa6547c4c-b0fc-4600-a1b4-5c176af24c52",
 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
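
[Editor's note] For orientation, the job graph above describes a three-step streaming pipeline: ReadFromPubSub/Read from the test subscription, a modify_data ParDo, and WriteToPubSub/Write/NativeWrite to the test topic, with both the "id" attribute and a custom "timestamp" attribute configured (the pubsub_id_label and pubsub_timestamp_label fields). A minimal Python sketch of a pipeline that would produce this graph follows; it is an illustration only, not the actual test code (which is not shown in the truncated log), and modify_data here is a placeholder.

import apache_beam as beam
from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
from apache_beam.options.pipeline_options import PipelineOptions


def modify_data(data):
    # Placeholder for the test's real transform, which the truncated log
    # does not show.
    return data


# Streaming pipeline shaped like the job graph above.
# (id_label/timestamp_attribute are honored by the Dataflow runner.)
options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> ReadFromPubSub(
         subscription='projects/apache-beam-testing/subscriptions/'
                      'psit_subscription_inputa6547c4c-b0fc-4600-a1b4-5c176af24c52',
         id_label='id',
         timestamp_attribute='timestamp')
     | 'modify_data' >> beam.Map(modify_data)
     | 'WriteToPubSub' >> WriteToPubSub(
         topic='projects/apache-beam-testing/topics/'
               'psit_topic_outputa6547c4c-b0fc-4600-a1b4-5c176af24c52',
         id_label='id',
         timestamp_attribute='timestamp'))

The timestamp_attribute setting is what triggers the additional tracking subscription mentioned further down in the log.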
root: INFO: Create job: <Job
 createTime: u'2019-11-12T17:25:39.666155Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-11-12_09_25_37-14345361942927811346'
 location: u'us-central1'
 name: u'beamapp-jenkins-1112172527-474761'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-11-12T17:25:39.666155Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-11-12_09_25_37-14345361942927811346]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_25_37-14345361942927811346?project=apache-beam-testing
root: INFO: Job 2019-11-12_09_25_37-14345361942927811346 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-11-12T17:25:43.045Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-11-12T17:25:43.737Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-4 in us-central1-f.
root: INFO: 2019-11-12T17:25:44.525Z: JOB_MESSAGE_DETAILED: Expanding 
SplittableParDo operations into optimizable parts.
root: INFO: 2019-11-12T17:25:44.529Z: JOB_MESSAGE_DETAILED: Expanding 
CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-11-12T17:25:44.538Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-11-12T17:25:44.553Z: JOB_MESSAGE_DETAILED: Expanding 
SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-11-12T17:25:44.557Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-11-12T17:25:44.561Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-11-12T17:25:44.581Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-11-12T17:25:44.584Z: JOB_MESSAGE_DETAILED: Fusing consumer 
modify_data into ReadFromPubSub/Read
root: INFO: 2019-11-12T17:25:44.587Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToPubSub/Write/NativeWrite into modify_data
root: INFO: 2019-11-12T17:25:44.598Z: JOB_MESSAGE_BASIC: The pubsub read for: 
projects/apache-beam-testing/subscriptions/psit_subscription_inputa6547c4c-b0fc-4600-a1b4-5c176af24c52
 is configured to compute input data watermarks based on custom timestamp 
attribute timestamp. Cloud Dataflow has created an additional tracking 
subscription to do this, which will be cleaned up automatically. For details, 
see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
root: INFO: 2019-11-12T17:25:44.603Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-11-12T17:25:44.642Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-11-12T17:25:44.692Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-11-12T17:25:44.853Z: JOB_MESSAGE_DEBUG: Executing wait step 
start2
root: INFO: 2019-11-12T17:25:44.872Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-11-12T17:25:44.880Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-11-12T17:25:46.070Z: JOB_MESSAGE_DETAILED: Pub/Sub resources 
set up for topic 
'projects/apache-beam-testing/topics/psit_topic_inputa6547c4c-b0fc-4600-a1b4-5c176af24c52'.
root: INFO: 2019-11-12T17:25:48.666Z: JOB_MESSAGE_BASIC: Executing operation 
ReadFromPubSub/Read+modify_data+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-11-12T17:26:15.530Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-11-12T17:26:15.536Z: JOB_MESSAGE_DEBUG: Executing input step 
topology_init_attach_disk_input_step
root: INFO: 2019-11-12T17:26:16.334Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-4 in us-central1-f.
root: INFO: 2019-11-12T17:26:18.845Z: JOB_MESSAGE_WARNING: Your project already 
contains 100 Dataflow-created metric descriptors and Stackdriver will not 
create new Dataflow custom metrics for this job. Each unique user-defined 
metric name (independent of the DoFn in which it is defined) produces a new 
metric descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
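
[Editor's note] On the metric-descriptor warning above: the linked monitoring.projects.metricDescriptors.list/delete methods can be driven from Python with the google-cloud-monitoring client. The sketch below is an illustration under stated assumptions (google-cloud-monitoring 2.x, and that the Dataflow-created descriptors live under custom.googleapis.com/); inspect the listing before deleting anything.

from google.cloud import monitoring_v3

project_id = 'apache-beam-testing'
client = monitoring_v3.MetricServiceClient()

# Assumption: Dataflow-created custom metrics live under custom.googleapis.com/.
descriptors = client.list_metric_descriptors(
    request={
        'name': 'projects/%s' % project_id,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })

for descriptor in descriptors:
    print(descriptor.type)
    # Uncomment only after confirming a descriptor is genuinely unused:
    # client.delete_metric_descriptor(name=descriptor.name)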
root: INFO: 2019-11-12T17:26:28.922Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: WARNING: Timing out on waiting for job 
2019-11-12_09_25_37-14345361942927811346 after 182 seconds
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3425.585s

FAILED (SKIP=4, errors=1)
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_29-3149003326771714583?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_09_01-17201992166099224927?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_16_37-12350578197207506726?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_23_21-13471072587083761142?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_30_10-1608753879955555974?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_37_29-4839223710935032460?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_44_33-12010631388735176095?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_51_19-947063100993079382?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_31-16959093996271339792?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_16_23-5985018124356451428?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_24_40-11878762943264897893?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_32_51-18109179498649971267?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_34-2697712949236132070?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_20_55-17430478902863836803?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_28_25-6248043904211629243?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_34_59-765816068826669969?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_32-5741233666483044081?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_14_27-15838920801609285532?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_21_38-11678902170388619171?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_28_18-6438451559571698582?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_35_14-16174435225944307443?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_30-14395795956129405529?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_17_35-6290055846401228869?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_25_37-14345361942927811346?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_31_17-17806944174573139301?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_31-1294342331896100378?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_08_47-2191734989772417643?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_17_34-7506565589598728051?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_25_38-16343354175432602586?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_32_42-12210789908743447504?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_39_36-18038508595237212262?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_30-3558373506729293118?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_09_33-12592495050530172093?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_17_19-5897409475943397466?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_23_59-13104329371154842224?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_30_19-96938745690284207?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_01_30-10208448686258203092?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_10_05-15486207910692613535?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_19_55-193645615439949111?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-12_09_37_32-10629925168491801373?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 21s
117 actionable tasks: 94 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/rizexjbqqmbzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
