See 
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/3592/display/redirect?page=changes>

Changes:

[je.ik] Log exception caught during UnboundedSource#split

[David Morávek] [BEAM-11435] Reuse already set timers.


------------------------------------------
[...truncated 1.49 MB...]
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "encode.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s8"
        },
        "serialized_fn": "ref_AppliedPTransform_encode_11",
        "user_name": "encode"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s10",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "bytes_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s9"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_13",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s11",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s10"
        },
        "pubsub_serialized_attributes_fn": "",
        "pubsub_topic": 
"projects/apache-beam-testing/topics/wc_topic_output25845c0a-7692-40d2-ab44-4130e2dc9f0f",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
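
For context, the JSON above is the job description the Python SDK uploads to Dataflow at submission time. A minimal sketch, assuming placeholder bucket and flag values (the test harness assembles its own options), of how such a streaming job is configured:

    # Sketch only: options like these make DataflowRunner build and upload a
    # streaming job graph such as the JSON above.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://placeholder-bucket/temp',  # placeholder bucket
        '--streaming',
    ])
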
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-12-10T18:39:03.132160Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-12-10_10_39_01-16049295868312848391'
 location: 'us-central1'
 name: 'beamapp-jenkins-1210183853-577654'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-12-10T18:39:03.132160Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-12-10_10_39_01-16049295868312848391]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-12-10_10_39_01-16049295868312848391
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-10_10_39_01-16049295868312848391?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-12-10_10_39_01-16049295868312848391 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:01.561Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-12-10_10_39_01-16049295868312848391. The number of workers will be between 
1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:01.561Z: 
JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. 
Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:01.561Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-12-10_10_39_01-16049295868312848391.
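
Per the warning above, Streaming Engine autoscaling defaults to a ceiling of 100 workers unless maxNumWorkers is set. A minimal sketch of pinning that ceiling from the Python SDK (the cap value is a placeholder):

    # Sketch: cap streaming autoscaling explicitly instead of relying on
    # the default 1-to-100 range noted in the warning above.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--autoscaling_algorithm=THROUGHPUT_BASED',
        '--max_num_workers=10',  # placeholder cap; the default here is 100
    ])
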
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:05.993Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.703Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.739Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.808Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.843Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not 
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.896Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.918Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:06.986Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.061Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.109Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.141Z: 
JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.181Z: 
JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.220Z: 
JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.257Z: 
JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into 
pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.322Z: 
JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into 
WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.364Z: 
JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.404Z: 
JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.443Z: 
JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.473Z: 
JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.512Z: 
JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into encode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.543Z: 
JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into 
WriteToPubSub/ToProtobuf
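
The fused consumer chain above mirrors the streaming wordcount pipeline under test. A minimal sketch of a pipeline with the same transform labels (topic names and the 15-second window are assumptions, not read from this log):

    import re

    import apache_beam as beam
    from apache_beam.io import ReadFromPubSub, WriteToPubSub
    from apache_beam.transforms import window

    def build(pipeline, input_topic, output_topic):
        # Labels match the step names in the fusion messages above.
        _ = (
            pipeline
            | ReadFromPubSub(topic=input_topic)
            | 'decode' >> beam.Map(lambda data: data.decode('utf-8'))
            | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
            | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
            | beam.WindowInto(window.FixedWindows(15))  # assumed window size
            | 'group' >> beam.GroupByKey()
            | 'count' >> beam.MapTuple(lambda word, ones: (word, sum(ones)))
            | 'format' >> beam.MapTuple(lambda word, total: '%s: %d' % (word, total))
            | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
            | WriteToPubSub(topic=output_topic))
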
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.579Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.609Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.646Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:07.684Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:08.806Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:08.829Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:08.864Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:25.766Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
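
If the custom-metric ceiling matters, the message above suggests pruning old descriptors via the Monitoring API. A hedged sketch using the google-cloud-monitoring client (project name taken from the log; deletion left commented out):

    # Sketch only: enumerate custom metric descriptors and delete unused ones.
    # Requires the google-cloud-monitoring package.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith('custom.googleapis.com/'):
            print(descriptor.type)
            # Uncomment to delete a descriptor that is no longer needed:
            # client.delete_metric_descriptor(
            #     name='%s/metricDescriptors/%s' % (project_name, descriptor.type))
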

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:22.548Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that 
the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:57.864Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:57.891Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:39:37.622Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that 
the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:40:09.697Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-10T18:40:09.732Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2020-12-10_10_38_44-17308328964785011692 after 360 seconds
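
This warning is what the runner emits when a job is waited on with a bounded duration, as the IT harness does here. In user code that corresponds roughly to the following (note that the duration argument is in milliseconds):

    # Sketch: bounded wait on a streaming job; on expiry the runner logs the
    # "Timing out on waiting" warning seen above.
    result = pipeline.run()
    result.wait_until_finish(duration=360 * 1000)  # milliseconds
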
DEBUG:google.auth._default:Checking None for explicit credentials as part of 
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth 
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using 
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth 
process...
DEBUG:google.auth._default:No App Engine library was found, so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2020-12-10_10_39_01-16049295868312848391 after 360 seconds
DEBUG:google.auth._default:Checking None for explicit credentials as part of 
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth 
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using 
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth 
process...
DEBUG:google.auth._default:No App Engine library was found, so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-10_10_39_01-16049295868312848391?project=apache-beam-testing
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 514.730s

OK

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-10_10_38_44-17308328964785011692?project=apache-beam-testing
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 544.910s

OK

> Task :sdks:python:test-suites:dataflow:preCommitIT_V2

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:tox:py38:testPy38CloudCoverage'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 25s
101 actionable tasks: 74 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/j4inzasdowluy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
