See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2800/display/redirect>

Changes:


------------------------------------------
[...truncated 17.88 MB...]
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s39"
        }, 
        "user_name": 
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)",
 
        "windowing_strategy": 
"%0A%88%11%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1%2A%C3%10%0A%25ref_Environment_default_environment_1%12%99%10%12%12beam%3Aenv%3Adocker%3Av1%1A%3B%0A9gcr.io/cloud-dataflow/v1beta3/python%3Abeam-master-20200630%2A%14beam%3Acoder%3Avarint%3Av1%2A%13beam%3Acoder%3Abytes%3Av1%2A%13beam%3Acoder%3Atimer%3Av1%2A%1Bbeam%3Acoder%3Aglobal_window%3Av1%2A%1Dbeam%3Acoder%3Ainterval_window%3Av1%2A%16beam%3Acoder%3Aiterable%3Av1%2A%23beam%3Acoder%3Astate_backed_iterable%3Av1%2A%1Cbeam%3Acoder%3Awindowed_value%3Av1%2A%22beam%3Acoder%3Aparam_windowed_value%3Av1%2A%14beam%3Acoder%3Adouble%3Av1%2A%19beam%3Acoder%3Astring_utf8%3Av1%2A%1Bbeam%3Acoder%3Alength_prefix%3Av1%2A%12beam%3Acoder%3Abool%3Av1%2A%10beam%3Acoder%3Akv%3Av1%2A%11beam%3Acoder%3Arow%3Av1%2A%23beam%3Aprotocol%3Aprogress_reporting%3Av0%2A%1Ebeam%3Aprotocol%3Aworker_status%3Av1%2A%3Abeam%3Aversion%3Asdk_base%3Aapache/beam_python2.7_sdk%3A2.25.0.dev2q%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%12%1D%0A%1Bpostcommit_requirements.txt%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%12%0A%10requirements.txt2%9B%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%12%3D%0A%3B/tmp/dataflow-requirements-cache/parameterized-0.7.4.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%1C%0A%1Aparameterized-0.7.4.tar.gz2%89%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%124%0A2/tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%13%0A%11mock-2.0.0.tar.gz2%89%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%124%0A2/tmp/dataflow-requirements-cache/six-1.15.0.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%13%0A%11six-1.15.0.tar.gz2%91%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%128%0A6/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%17%0A%15funcsigs-1.0.2.tar.gz2%87%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%123%0A1/tmp/dataflow-requirements-cache/pbr-5.4.5.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%12%0A%10pbr-5.4.5.tar.gz2%97%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%12%3B%0A9/tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%1A%0A%18PyHamcrest-1.10.1.tar.gz2%C3%01%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%12h%0Af<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build/apache-beam.tar.gz%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%19%0A%17dataflow_python_sdk.tar2%9B%02%0A%1Abeam%3Aartifact%3Atype%3Afile%3Av1%12%C3%01%0A%C0%01/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.25.0-SNAPSHOT.jar%1A%20beam%3Aartifact%3Arole%3Astaging_to%3Av1%22%15%0A%13dataflow-worker.jarjx%0A%22%0A%20beam%3Awindow_fn%3Aglobal_windows%3Av1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01Z%25ref_Environment_default_environment_1";>
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s41", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "WaitForBQJobs", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.WaitForBQJobs"
          }
        ], 
        "non_parallel_inputs": {
          
"python_side_input0-write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s40"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_5"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_5"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": 
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 1276 bytes>", 
        "user_name": 
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs"
      }
    }, 
    {
      "kind": "Flatten", 
      "name": "s42", 
      "properties": {
        "display_data": [], 
        "inputs": [
          {
            "@type": "OutputReference", 
            "output_name": "None", 
            "step_name": "s39"
          }, 
          {
            "@type": "OutputReference", 
            "output_name": "None", 
            "step_name": "s25"
          }
        ], 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_5"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_5"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/Flatten.out"
          }
        ], 
        "user_name": "write/BigQueryBatchFileLoads/Flatten"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-08-17T00:11:00.197608Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-08-16_17_10_58-8320176444431885644'
 location: u'us-central1'
 name: u'beamapp-jenkins-0817001047-935605'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-08-17T00:11:00.197608Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-08-16_17_10_58-8320176444431885644]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-08-16_17_10_58-8320176444431885644
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_10_58-8320176444431885644?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
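For orientation, the truncated job graph above (the write/BigQueryBatchFileLoads/* steps such as WaitForDestinationLoadJobs and the final Flatten) is the shape of graph the Beam Python SDK generates for a batch WriteToBigQuery that uses the FILE_LOADS method on the Dataflow runner. The sketch below is illustrative only, not the failing postcommit test itself; the project, bucket, table, and schema names are placeholders.

import apache_beam as beam
from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder pipeline options; a real postcommit run also stages
# requirements and a Dataflow worker jar, as listed in the job payload above.
options = PipelineOptions(
    runner='DataflowRunner',
    project='my-gcp-project',             # placeholder
    region='us-central1',
    temp_location='gs://my-bucket/temp',  # placeholder
)

with beam.Pipeline(options=options) as p:
    _ = (
        p
        | 'Create' >> beam.Create([{'name': 'a', 'value': 1}])
        # The 'write' label matches the write/BigQueryBatchFileLoads/... step
        # names seen in the graph; FILE_LOADS triggers BigQuery load jobs and
        # a WaitForDestinationLoadJobs step that waits on them.
        | 'write' >> WriteToBigQuery(
            'my-gcp-project:my_dataset.my_table',  # placeholder table spec
            schema='name:STRING,value:INTEGER',
            method=WriteToBigQuery.Method.FILE_LOADS,
            create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=BigQueryDisposition.WRITE_APPEND))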

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3330.019s

FAILED (SKIP=6, errors=10, failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_53-12352280740364145555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_20_40-2950212004276856473?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_27_54-7216477644595193567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_34_45-3821921539300197111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_42_29-5756479147685882942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_52-5050775501988803715?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_11_11-8275273016453853602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_13_50-1534222860241034356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_21_38-18041215076540644483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_28_46-13930513394300957639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_36_36-10399981445308645459?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_44_05-5671635534086540570?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_51_00-13036111499325979556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_53-923892069514823040?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_10_49-4350792244373642669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_18_14-8692573255859819793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_25_09-3215623499383196018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_32_02-5216943840522071608?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_39_37-15764763610906820420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_52-12693848953131613440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_19_57-10629428035894046560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_27_53-1519651388669688343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_35_17-2073170089260081749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_42_26-3980828217160640718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_53-11240381908758701072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_10_43-8438324651697591137?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_18_46-2742419711708180045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_26_09-9735000967724747394?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_34_10-18340613571672327609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_41_34-1196226002031252234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_54-10692583158662591160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_10_40-16080165229660200582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_17_54-8094294301690784076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_25_29-4539292049139792279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_32_48-5675168987817430289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_39_33-1084442588012697068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_51-11165406862503273322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_10_58-8320176444431885644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_11_30-3711049002346121419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_19_12-15355299681874566072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_26_52-4823648990039949419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_33_41-15836622248867048235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_41_25-5722391388977096748?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_02_53-8598911045931417607?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_11_51-17971446844684260477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_21_17-3416837983009866678?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_21_45-3426810319255603634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_28_47-4395858767774280366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_30_55-12999101192049619602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_38_34-4770907757730731875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_46_11-9375499353574257854?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 50

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
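As the "Try" hints above suggest, one way to dig into either failure locally is to rerun the failing task with extra diagnostics from a checkout of the Beam repository, for example (task path taken from the failure message, flags as listed above):

./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --stacktrace --info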

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 26s
159 actionable tasks: 122 executed, 35 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wpqwkggtfeaig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
