See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python2/2808/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-9956] Remove unbalanced code markup.

[zyichi] Bump up advanceProcessingTime duration in ParDoTest

[tobiasz.kedzierski] [BEAM-10623] Add workflow to run python tests on Linux/Windows/Mac

[tobiasz.kedzierski] [BEAM-10624] dtype explicit for the numpy arrays

[Steve Niemitz] [BEAM-10523] Add support for custom DatumWriters to AvroIO.Write

[Luke Cwik] [BEAM-10040, BEAM-6804] Increase wait times to reduce flakiness.

[noreply] [BEAM-10752] add use_deprecated_read experiment for testSimpleInsert,


------------------------------------------
[...truncated 21.11 MB...]
            "type": "STRING", 
            "value": "{}"
          }, 
          {
            "key": "ignore_insert_ids", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn", 
            "type": "STRING", 
            "value": "False"
          }, 
          {
            "key": "max_batch_size", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn", 
            "type": "INTEGER", 
            "value": 500
          }, 
          {
            "key": "write_disposition", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn", 
            "type": "STRING", 
            "value": "WRITE_APPEND"
          }, 
          {
            "key": "max_buffered_rows", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn", 
            "type": "INTEGER", 
            "value": 2000
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "BigQueryWriteFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQueryWriteFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": 
"WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out"
          }, 
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": 
"ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "FailedRows", 
            "user_name": 
"WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).FailedRows"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s8"
        }, 
        "serialized_fn": 
"ref_AppliedPTransform_WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_15",
 
        "user_name": 
"WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-08-19T00:38:08.094698Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-08-18_17_38_07-6339018435563490238'
 location: u'us-central1'
 name: u'beamapp-jenkins-0819003756-765482'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-08-19T00:38:08.094698Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-08-18_17_38_07-6339018435563490238]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-08-18_17_38_07-6339018435563490238
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_38_07-6339018435563490238?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
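
For context on the captured job graph above: the failing step's display data describes a streaming BigQueryWriteFn with write_disposition WRITE_APPEND, max_batch_size 500, and max_buffered_rows 2000. The following is only a minimal sketch of how such a WriteToBigQuery step is typically configured in a Beam Python pipeline; the project, dataset, table, and schema below are hypothetical placeholders, not values from the failing test.

# Minimal sketch only: a streaming WriteToBigQuery step whose display data
# would resemble the BigQueryWriteFn entries captured above. The table and
# schema are hypothetical placeholders, not the failing test's values.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # job above is JOB_TYPE_STREAMING

with beam.Pipeline(options=options) as p:
    (p
     | 'CreateRows' >> beam.Create([{'name': 'example', 'value': 1}])
     | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
         table='my-project:my_dataset.my_table',                      # hypothetical
         schema='name:STRING,value:INTEGER',                          # hypothetical
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,  # matches display data
         method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS))
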

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3817.772s

FAILED (SKIP=7, errors=1)
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_11-16798952272761327291?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_17_41-13127942308055587152?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_24_21-14284765973695230942?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_31_28-7171829251369985803?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_39_02-12737695677665035583?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_46_37-13187747541387850004?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_53_44-9236635769872979128?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_08-7043277594575225128?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_22_53-7205735766698466585?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_30_16-6371590197046854973?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_37_39-16504226082051789855?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_45_20-12577113502722523006?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_52_11-17925956378876304760?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_11-164340393300169951?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_15_55-5047141096410803594?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_22_57-17506260179280951809?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_30_23-12466508921812155689?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_37_58-2856082441038646160?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_44_53-9723067899883170947?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_51_58-10786305737496375906?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_06-8805994524452464756?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_24_30-11553436078196948701?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_32_00-15604793153185679696?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_40_24-8048555031384457023?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_46_34-16957780281045430851?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_52_49-17833065098530944812?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_59_28-16160677836456761068?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_07-11790065535336284912?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_11_50-15333233405726438108?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_21_49-1710664058582195448?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_30_36-16043877606690984822?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_38_07-6339018435563490238?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_38_34-14344565719958805118?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_46_01-2981296538352357333?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_53_23-1115031397151397636?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_06-6655376823183902576?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_13_12-878927870419123939?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_23_12-8982642726920421229?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_30_32-12134131434546324273?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_37_26-10798346959148988028?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_44_53-15178342861166407920?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_52_43-13423415916302488354?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_06-9099170169589056787?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_11_58-16075523412185347932?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_19_38-10524159223216589554?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_26_53-14625845064228799902?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_33_33-10609257512264402133?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_40_47-7860385856766499203?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_48_06-15571258045116286085?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_55_20-10106253301713297437?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_04_04-14187894961840050575?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_11_54-14158936117695442541?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_20_25-14152445548544388775?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_28_10-10596029146134886218?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_35_55-11437639537607655791?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_43_15-5307211966424406377?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_50_25-11241407080631372461?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_57_50-10008054280384642388?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 81

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:py2:docker'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 17s
156 actionable tasks: 131 executed, 23 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dyme5uhfp4ook

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
