See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5015/display/redirect?page=changes>

Changes:

[github] Revert "[BEAM-8427] Create a table and a table provider for MongoDB"


------------------------------------------
[...truncated 267.25 KB...]
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "ReadFromPubSub/Read.out"
          }
        ],
        "pubsub_subscription": 
"projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute711bcfc-ce5f-41b2-a70a-6c9f14d5b227",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "StreamingUserMetricsDoFn",
            "type": "STRING",
            "value": 
"apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "generate_metrics.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4",
        "user_name": "generate_metrics"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_topic": 
"projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outpute711bcfc-ce5f-41b2-a70a-6c9f14d5b227",
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
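
The graph above is a three-step streaming job: a Pub/Sub read, a metrics-emitting ParDo, and a Pub/Sub write. A minimal Beam Python sketch that produces this shape is below; the DoFn body is an illustrative stand-in for the real apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn, and the Pub/Sub resource names are placeholders.

import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.options.pipeline_options import PipelineOptions


class StreamingUserMetricsDoFn(beam.DoFn):
    """Stand-in DoFn: bumps a user counter for every element it sees."""

    def __init__(self):
        # Each unique user metric name creates one Stackdriver metric
        # descriptor, which is what the quota warning further down is about.
        self.element_count = Metrics.counter(self.__class__, 'element_count')

    def process(self, element):
        self.element_count.inc()
        yield element


options = PipelineOptions(streaming=True)  # produces JOB_TYPE_STREAMING
pipeline = beam.Pipeline(options=options)
_ = (pipeline
     | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
         subscription='projects/<project>/subscriptions/<input-subscription>')
     | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
     | 'dump_to_pub' >> beam.io.WriteToPubSub(
         topic='projects/<project>/topics/<output-topic>'))
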
root: INFO: Create job: <Job
 createTime: '2019-11-07T00:05:00.726907Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-11-06_16_04_59-16877846735182616126'
 location: 'us-central1'
 name: 'beamapp-jenkins-1107000447-494650'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-11-07T00:05:00.726907Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-11-06_16_04_59-16877846735182616126]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_59-16877846735182616126?project=apache-beam-testing
root: INFO: Job 2019-11-06_16_04_59-16877846735182616126 is in state JOB_STATE_RUNNING
root: INFO: 2019-11-07T00:05:03.888Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-11-07T00:05:04.579Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-11-07T00:05:05.234Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-11-07T00:05:05.240Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-11-07T00:05:05.257Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-11-07T00:05:05.278Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-11-07T00:05:05.286Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-11-07T00:05:05.293Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-11-07T00:05:05.329Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-11-07T00:05:05.336Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
root: INFO: 2019-11-07T00:05:05.342Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
root: INFO: 2019-11-07T00:05:05.360Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-11-07T00:05:05.430Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-11-07T00:05:05.462Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-11-07T00:05:05.731Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-11-07T00:05:05.767Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-11-07T00:05:05.778Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-11-07T00:05:08.010Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+generate_metrics+dump_to_pub/Write/NativeWrite
root: INFO: 2019-11-07T00:05:34.417Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-11-07T00:05:34.419Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2019-11-07T00:05:35.044Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-11-07T00:05:55.277Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-11-07T00:05:56.483Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
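
The two apis-explorer links in that warning are the Monitoring metricDescriptors.list and metricDescriptors.delete methods. A sketch of pruning old Dataflow custom metric descriptors with the google-cloud-monitoring client follows; the filter prefix is an assumption about where Dataflow publishes user counters.

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()

request = monitoring_v3.ListMetricDescriptorsRequest(
    name='projects/apache-beam-testing',
    # Assumed prefix for Dataflow-created custom metrics.
    filter='metric.type = starts_with("custom.googleapis.com/dataflow")',
)
for descriptor in client.list_metric_descriptors(request=request):
    # Deletion is permanent; inspect descriptor.type before enabling this.
    client.delete_metric_descriptor(name=descriptor.name)
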
root: WARNING: Timing out on waiting for job 2019-11-06_16_04_59-16877846735182616126 after 61 seconds
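
That 61-second timeout is the harness bounding its wait on a streaming job, which never finishes on its own. A sketch of the pattern, reusing the pipeline object from the sketch above (the cancel step is an assumption, not necessarily what the harness does):

from apache_beam.runners.runner import PipelineState

result = pipeline.run()
# wait_until_finish takes a duration in milliseconds in the Python SDK and
# returns once the timeout elapses, even if the job is still running.
result.wait_until_finish(duration=61 * 1000)
if result.state != PipelineState.DONE:
    result.cancel()  # tear down the streaming job rather than block forever
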
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------
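
The google.auth and urllib3 DEBUG lines in the captured log show Application Default Credentials being resolved from the GCE metadata server. The same flow, condensed into a sketch:

import google.auth
from google.auth.transport.requests import Request

# On a GCE/Dataflow worker this queries metadata.google.internal,
# exactly the requests logged above.
credentials, project_id = google.auth.default()
credentials.refresh(Request())  # GET .../service-accounts/default/token
print(project_id, bool(credentials.token))
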
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_59-16877846735182616126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_12_04-14249790676232956189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_19_59-5483898815507589811?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_55-2029314030689242672?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_13_19-2404607660068307368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_58-12718319832158682076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_13_28-4674890969126360230?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_57-1506844672589998852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_12_55-12081564022504508838?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_57-13546920436018863044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_12_50-12761235429800855733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_58-14652523157144527436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_12_21-7761978138709645550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_58-7824542567790488005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_13_38-5385346120596127504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_04_58-8752468722686410718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_12_55-10172254288709780251?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1396.940s

FAILED (failures=1)

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_58-3997094119190844903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_13_22-7228160902339540726?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_21_16-13744476762798018879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_54-2805413829636350875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_13_38-1456644747527104739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_55-1812959645209797835?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_00-2280911151062246302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_54-15405444377553867059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_15-791429135078069985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_06_00-2235890373495730818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_16-9543400231068066212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_55-9763713904238773787?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_03-15867059390604109259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_56-5757329441537706886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_20-13434611635156859058?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_05_56-16265907745143954760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-06_16_14_06-10496914245564364361?project=apache-beam-testing
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1401.590s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 101

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
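
For reference, either failing suite can be re-run locally with the suggested flags from the root of a Beam checkout, e.g. ./gradlew :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests --stacktrace --info (task name taken from the failure above).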

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 24s
74 actionable tasks: 57 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/kto3ks6gnbj36

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
