See 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3828/display/redirect>

------------------------------------------
[...truncated 224.09 KB...]
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
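
For reference, the step names in this job graph (start/Read, compute/compute, and the assert_that/* subgraph) follow the usual shape of a ValidatesRunner test. A minimal sketch, with illustrative element values and transform labels rather than the exact failing test, would be:

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

# Transform labels mirror the step names in the graph above; the data and the
# lambda are placeholders, not taken from the failing test.
with TestPipeline() as p:
    result = (
        p
        | 'start' >> beam.Create([1, 2, 3])
        | 'compute' >> beam.Map(lambda x: x * x))
    assert_that(result, equal_to([1, 4, 9]))

assert_that expands into the Create, WindowInto(WindowIntoFn), ToVoidKey, Group, Unkey and Match steps that the fusion messages below refer to.
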
root: INFO: Create job: <Job
 createTime: u'2019-06-18T12:12:01.796194Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-06-18_05_11_59-22430636337191646'
 location: u'us-central1'
 name: u'beamapp-jenkins-0618121152-932863'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-06-18T12:12:01.796194Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-18_05_11_59-22430636337191646]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_11_59-22430636337191646?project=apache-beam-testing
root: INFO: Job 2019-06-18_05_11_59-22430636337191646 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-18T12:12:00.229Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-18_05_11_59-22430636337191646. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-18T12:12:00.448Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-18_05_11_59-22430636337191646.
root: INFO: 2019-06-18T12:12:33.871Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Workflow failed due to internal error. Please contact Google Cloud Support.
root: INFO: 2019-06-18T12:12:34.407Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-18T12:12:35.217Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-06-18T12:12:35.868Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-18T12:12:35.948Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-18T12:12:36Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-18T12:12:36.042Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-18T12:12:36.109Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-18T12:12:36.177Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-18T12:12:36.213Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10 for input s8.out
root: INFO: 2019-06-18T12:12:36.253Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-06-18T12:12:36.288Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-06-18T12:12:36.330Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-06-18T12:12:36.363Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-06-18T12:12:36.421Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-06-18T12:12:36.469Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10-u13 for input s11-reify-value0-c11
root: INFO: 2019-06-18T12:12:36.516Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-06-18T12:12:36.555Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-06-18T12:12:36.594Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-06-18T12:12:36.689Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-06-18T12:12:36.788Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2019-06-18T12:12:36.872Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-06-18T12:12:36.972Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-06-18T12:12:37.019Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2019-06-18T12:12:37.063Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-18T12:12:37.104Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-18T12:12:37.168Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-18T12:12:37.219Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-18T12:12:37.468Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-18T12:12:37.545Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-06-18T12:12:37.589Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-06-18_05_11_59-22430636337191646 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1456.199s

FAILED (errors=1)
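
The single error here appears to be the job above reaching JOB_STATE_FAILED ("Workflow failed due to internal error"), not an assertion failure inside the pipeline. As a hedged sketch of how a run surfaces that terminal state through the public PipelineResult API (the pipeline below is a stand-in submitted to the local runner, not the failing test):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.runner import PipelineState

# Stand-in pipeline; the ValidatesRunner tests submit via TestDataflowRunner.
p = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
_ = p | 'start' >> beam.Create([1, 2, 3]) | 'compute' >> beam.Map(lambda x: x + 1)

result = p.run()
result.wait_until_finish()              # blocks until a terminal state
print(result.state)                     # DONE here; FAILED in the log above
print(PipelineState.is_terminal(result.state))
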
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_27-618401895387329081?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_08_37-10734647084180618209?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_30-14082249786564930452?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_10_49-1376588009832064757?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_29-15556760337267972360?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_09_24-17983662938306680440?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_29-6456708657336900181?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_09_59-5904006064556271477?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_30-254952194818018071?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_09_34-11622065484708716874?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_28-14159128451522549867?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_10_27-16365277074834014418?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_28-11941054660260190991?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_10_37-13630331602338669378?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_01_30-6135120362771033748?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_11_59-22430636337191646?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_12_59-1797692726422115535?project=apache-beam-testing.

> Task :sdks:python:validatesRunnerBatchTests FAILED

> Task :sdks:python:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=build/apache-beam.tar.gz 
>>> --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
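
The options echoed above are ordinary Beam pipeline options. As a minimal sketch of how a few of them parse (values copied verbatim from the list above; this is not the harness's actual invocation, and the remaining flags are omitted):

from apache_beam.options.pipeline_options import (
    GoogleCloudOptions, PipelineOptions, StandardOptions)

opts = PipelineOptions([
    '--runner=TestDataflowRunner',
    '--project=apache-beam-testing',
    '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    '--streaming',
])
print(opts.view_as(GoogleCloudOptions).project)   # apache-beam-testing
print(opts.view_as(StandardOptions).streaming)    # True
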
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 1257.282s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_40-1125260336640079161?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_40-13030099435432450219?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_34_00-7112413110124242114?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_41-5418688616208059214?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_35_10-6334209023126791583?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_39-12520168901138302936?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_34_53-8053495491266054333?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_41-5400462426703532639?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_33_55-9883999104810442397?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_40-16355605719642038372?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_35_59-9429713315580556585?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_40-9712905806626668537?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_34_05-11597197756790023298?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_25_40-14947906685750939658?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-18_05_33_59-10260929599224012126?project=apache-beam-testing.

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 67

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 87

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 97

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 67

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle'> line: 197

* What went wrong:
Execution failed for task ':sdks:python:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 16s
76 actionable tasks: 59 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/bq2qi3czrvogu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
